Most Recent Links
By KENNETH CHANG. This occasional column explores topics covered in Science Times 25 years ago to see what has changed — and what has not. The claim about babies was startling: A test administered to infants as young as 6 months could predict their score on an intelligence test years later, when they started school. “Why not test infants and find out which of them could take more in terms of stimulation?” Joseph F. Fagan III, the psychologist at Case Western Reserve University in Cleveland who developed the test, was quoted as saying in an article by Gina Kolata on April 4, 1989. “It’s not going to hurt anybody, that’s for sure.” In the test, the infant looks at a series of photographs — first a pair of identical faces, then the same face paired with one the baby hasn’t seen. The researchers measure how long the baby looks at the new face. “On the surface, it tests novelty preference,” said Douglas K. Detterman, a colleague of Dr. Fagan’s at Case Western. For reasons not quite understood, babies of below-average intelligence do not exhibit the same attraction to novelty. Dr. Fagan suggested that the test could be used to identify children with above-average intelligence in poorer families so they could be exposed to enrichment programs more readily available to wealthier families. But his primary motivation, said Cynthia R. Holland, his wife and longtime collaborator, was to look for babies at the other end of the intelligence curve, those who would fall behind as they grew up. “His hope always was to identify early on, in the first year of life, kids who were at risk, cognitively, so we could focus our resources on them and help them out,” said Dr. Holland, a professor of psychology at Cuyahoga Community College. 25 YEARS LATER: For the most part, the validity of the Fagan test holds up. Indeed, Dr. Fagan (who died last August) and Dr. Holland revisited infants they had tested in the 1980s, and found that the Fagan scores were predictive of I.Q. and academic achievement two decades later, when these babies turned 21. © 2014 The New York Times Company
Keyword: Intelligence; Development of the Brain
Link ID: 19456 - Posted: 04.08.2014
By Sam Kean. Kent Cochrane, the amnesiac known throughout the world of neuroscience and psychology as K.C., died last week at age 62 in his nursing home in Toronto, probably of a stroke or heart attack. Although not as celebrated as the late American amnesiac H.M., for my money K.C. taught us more important and poignant things about how memory works. He showed how we make memories personal and personally meaningful. He also had a heck of a life story. During a wild and extended adolescence, K.C. jammed in rock bands, partied at Mardi Gras, played cards till all hours, and got into fights in bars; he was also knocked unconscious twice, once in a dune-buggy accident, once when a bale of hay conked him on the head. In October 1981, at age 30, he skidded off an exit ramp on his motorcycle. He spent a month in intensive care and lost, among other brain structures, both his hippocampuses. As H.M.’s case demonstrated in the early 1950s, the hippocampus—you have one in each hemisphere of your brain—helps form and store new memories and retrieve old ones. Without a functioning hippocampus, names, dates, and other information fall straight through the mind like a sieve. At least that’s what’s supposed to happen. K.C. proved that that’s not quite true—memories can sometimes bypass the hippocampus. After the motorcycle accident, K.C. lost most of his past memories and could make almost no new memories. But a neuroscientist named Endel Tulving began studying K.C., and he determined that K.C. could remember certain things from his past life just fine. Oddly, though, everything K.C. remembered fell within one restricted category: It was all stuff you could look up in reference books, like the difference between stalactites and stalagmites or between spares and strikes in bowling. Tulving called these bare facts “semantic memories,” memories devoid of all context and emotion. © 2014 The Slate Group LLC
Keyword: Learning & Memory
Link ID: 19455 - Posted: 04.08.2014
By NICHOLAS BAKALAR. A new study adds to the evidence that the use of antidepressants during pregnancy is associated with a higher risk of premature birth, though many factors most likely play a role and the relationship is complex. Researchers reviewed data from 41 studies, some of which controlled for factors like smoking, alcohol or coffee drinking, weight gain during pregnancy, and other behavioral and health issues. They found no increase in the risk of early birth with the use of antidepressants during the first trimester, but a 53 percent higher risk over all and a 96 percent higher risk with antidepressant use late in pregnancy. Depression itself is a risk factor for premature births, and a few studies tried to account for this by using, as a control, a group of women with a diagnosis of depression who did not take antidepressants during their pregnancy. Generally, researchers still found a higher, though diminished, risk from taking antidepressants. The review was published in March in PLOS One. Does this mean that all pregnant women should avoid these drugs? No, said the senior author, Dr. Adam C. Urato, an assistant professor of maternal-fetal medicine at Tufts University. Risks and benefits have to be balanced, he said. “It’s very complex, and depends on the severity of the disease,” Dr. Urato added. “The point is that we have to get the right information out so that we can let pregnant women make an informed decision.” © 2014 The New York Times Company
Keyword: Depression; Development of the Brain
Link ID: 19454 - Posted: 04.08.2014
By ANA GANTMAN and JAY VAN BAVEL. Take a close look at your breakfast. Is that Jesus staring out at you from your toast? Such apparitions can be as lucrative as they are seemingly miraculous. In 2004, a Florida woman named Diane Duyser sold a decade-old grilled cheese sandwich that bore a striking resemblance to the Virgin Mary. She got $28,000 for it on eBay. The psychological phenomenon of seeing something significant in an ambiguous stimulus is called pareidolia. Virgin Mary grilled cheese sandwiches and other pareidolia remind us that almost any object is open to multiple interpretations. Less understood, however, is what drives some interpretations over others. In a forthcoming paper in the journal Cognition, we hope to shed some light on that question. In a series of experiments, we examined whether awareness of perceptually ambiguous stimuli was enhanced by the presence of moral content. We quickly flashed strings of letters on a computer screen and asked participants to indicate whether they believed each string formed a word or not. To ensure that the letter strings were perceptually ambiguous, we flashed them for between approximately 40 and 70 milliseconds. (When they were presented for too long, people easily saw all the letter strings and demonstrated close to 100 percent accuracy. When they were presented too quickly, people were unable to see the words and performed “at chance,” around 50 percent accuracy.) Some of the strings of letters we flashed were words, others were not. Importantly, some of the words we flashed had moral content (virtue, steal, God) and others did not (virtual, steel, pet). Over the course of three experiments, we found that participants correctly identified strings of letters as words more often when they formed moral words (69 percent accuracy) than when they formed nonmoral words (65 percent accuracy). This suggested that moral content gave a “boost” to perceptually ambiguous stimuli — a shortcut to conscious awareness. We call this phenomenon the “moral pop-out effect.” © 2014 The New York Times Company
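To make the procedure concrete, here is a minimal, purely illustrative sketch in Python (using the PsychoPy library) of a brief lexical-decision trial of the kind described above. It is not the authors' code: the word list, the fixed 50-millisecond exposure, and the response keys are assumptions made for the example.

    from psychopy import core, event, visual
    import random

    win = visual.Window(color="grey", fullscr=False)

    # Hypothetical stimuli: (letter string, is it a real word?)
    stimuli = [("virtue", True), ("virtual", True), ("vitrue", False)]
    random.shuffle(stimuli)
    correct = []

    for text, is_word in stimuli:
        visual.TextStim(win, text=text).draw()
        win.flip()                    # letter string appears
        core.wait(0.050)              # ~50 ms exposure; the study used roughly 40-70 ms
        win.flip()                    # screen goes blank again
        keys = event.waitKeys(keyList=["y", "n"])   # y = "it was a word", n = "it was not"
        correct.append((keys[0] == "y") == is_word)

    win.close()
    print(f"Accuracy: {sum(correct) / len(correct):.0%}")

In the actual experiments, this per-trial accuracy would be computed separately for moral and nonmoral words to test for the pop-out effect.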
Keyword: Attention
Link ID: 19453 - Posted: 04.07.2014
By BARBARA EHRENREICH. My atheism is hard-core, rooted in family tradition rather than adolescent rebellion. According to family legend, one of my 19th-century ancestors, a dirt-poor Irish-American woman in Montana, expressed her disgust with the church by vehemently refusing last rites when she lay dying in childbirth. From then on, we were atheists and rationalists, a stance I perpetuated by opting, initially, for a career in science. How else to understand the world except as the interaction of tiny bits of matter and mathematically predictable forces? There were no gods or spirits, just our own minds pressing up against the unknown. But something happened when I was 17 that shook my safely rationalist worldview and left me with a lifelong puzzle. Years later, I learned that this sort of event is usually called a mystical experience, and I can see in retrospect that the circumstances had been propitious: Thanks to a severely underfunded and poorly planned skiing trip, I was sleep-deprived and probably hypoglycemic that morning in 1959 when I stepped out alone, walked into the streets of Lone Pine, Calif., and saw the world — the mountains, the sky, the low scattered buildings — suddenly flame into life. There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with “the All,” as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, too vast and violent to hold on to, too heartbreakingly beautiful to let go of. It seemed to me that whether you start as a twig or a gorgeous tapestry, you will be recruited into the flame and made indistinguishable from the rest of the blaze. I felt ecstatic and somehow completed, but also shattered. © 2014 The New York Times Company
Keyword: Schizophrenia; Emotions
Link ID: 19452 - Posted: 04.07.2014
By GRETCHEN REYNOLDS. Age-related vision loss is common and devastating. But new research suggests that physical activity might protect our eyes as we age. There have been suggestions that exercise might reduce the risk of macular degeneration, which occurs when neurons in the central part of the retina deteriorate. The disease robs millions of older Americans of clear vision. A 2009 study of more than 40,000 middle-aged distance runners, for instance, found that those covering the most miles had the least likelihood of developing the disease. But the study did not compare runners to non-runners, limiting its usefulness. It also did not try to explain how exercise might affect the incidence of an eye disease. So, more recently, researchers at Emory University in Atlanta and the Atlanta Veterans Administration Medical Center in Decatur, Ga., took up that question for a study published last month in The Journal of Neuroscience. Their interest was motivated in part by animal research at the V.A. medical center. That work had determined that exercise increases the levels of substances known as growth factors in the animals’ bloodstream and brains. These growth factors, especially one called brain-derived neurotrophic factor, or B.D.N.F., are known to contribute to the health and well-being of neurons and consequently, it is thought, to improvements in brain health and cognition after regular exercise. But the brain is not the only body part to contain neurons, as the researchers behind the new study knew. The retina does as well, and the researchers wondered whether exercise might raise levels of B.D.N.F. there, too, potentially affecting retinal health and vision. © 2014 The New York Times Company
Keyword: Vision
Link ID: 19451 - Posted: 04.07.2014
by Clare Wilson. A genetic tweak can make light work of some nervous disorders. Using flashes of light to stimulate modified neurons can restore movement to paralysed muscles. A study demonstrating this, carried out in mice, lays the path for using such "optogenetic" approaches to treat nerve disorders ranging from spinal cord injury to epilepsy and motor neuron disease. Optogenetics has been hailed as one of the most significant recent developments in neuroscience. It involves genetically modifying neurons so they produce a light-sensitive protein, which makes them "fire", or send an electrical signal, when exposed to light. So far optogenetics has mainly been used to explore how the brain works, but some groups are exploring using it as therapy. One stumbling block has been fears about irreversibly genetically manipulating the brain. In the latest study, a team led by Linda Greensmith of University College London altered mouse stem cells in the lab before transplanting them into nerves in the leg – this means they would be easier to remove if something went wrong. "It's a very exciting approach that has a lot of potential," says Ziv Williams of Harvard Medical School in Boston. Greensmith's team inserted an algal gene that codes for a light-responsive protein into mouse embryonic stem cells. They then added signalling molecules to make the stem cells develop into motor neurons, the cells that carry signals from the spinal cord to the rest of the body. They implanted these into the sciatic nerve – which runs from the spinal cord to the lower limbs – of mice whose original nerves had been cut. © Copyright Reed Business Information Ltd.
Keyword: Movement Disorders
Link ID: 19450 - Posted: 04.05.2014
By Deborah Serani. Sometimes I work with children and adults who can’t put words to their feelings and thoughts. It’s not that they don’t want to – it’s more that they don’t know how. The clinical term for this experience is alexithymia, defined as the inability to recognize emotions and their subtleties and textures [1]. Alexithymia throws a monkey wrench into a person’s ability to know their own self-experience or understand the intricacies of what others feel and think. Here are a few examples of what those with alexithymia experience: difficulty identifying different types of feelings; limited understanding of what causes feelings; difficulty expressing feelings; difficulty recognizing facial cues in others; limited or rigid imagination; a constricted style of thinking; hypersensitivity to physical sensations; and a detached or tentative connection to others. Alexithymia was first mentioned as a psychological construct in 1976 and was viewed as a deficit in emotional awareness [2]. Research suggests that approximately 8% of males and 2% of females experience alexithymia, and that it can come in mild, moderate and severe intensities [3]. Studies also show that alexithymia has two dimensions – a cognitive dimension, where a child or adult struggles to identify, interpret and verbalize feelings (the “thinking” part of our emotional experience). And an affective dimension, where difficulties arise in reacting, expressing, feeling and imagining (the “experiencing” part of our emotional experience) [4]. © 2014 Scientific American
Keyword: Emotions; Language
Link ID: 19449 - Posted: 04.05.2014
By LISA SANDERS, M.D. On Thursday, we challenged Well readers to solve the mystery of a 23-year-old man with episodes of aggressive, manic behavior that couldn’t be controlled. Nearly 1,000 readers wrote in with their take on this terrifying case. More than 300 of you got the right class of disease, and 21 of you nailed the precise form of the disorder. Amazing! The correct diagnosis is … Variegate porphyria The first person with the correct answer was Francis Graziano, a 23-year-old recent graduate of the University of Michigan. His major in neuroscience really gave him a leg up on this case, he told me. He recalled a case he read of a young Vietnam veteran with symptoms of porphyria. He’s a surgical technician right now, waiting to hear where he’ll be going to medical school next year. Strong work, Dr.-to-be Graziano! The Diagnosis: The word porphyria comes from the ancient Greek word for purple, “porphyra,” because patients with this disease can have purplish-red urine, tears or saliva. The porphyrias are a group of rare genetic diseases that develop in patients born without the machinery to make certain essential body chemicals, including one of the most important parts of blood known as heme. This compound makes up the core of the blood component hemoglobin. (The presence of heme is why blood is red.) Patients who can’t make heme correctly end up with too much of its chemical precursors, known as porphyrins. The excess porphyrins injure tissues throughout the body, but especially in the nervous system. The disorder is characterized by frequent episodes of debilitating back or abdominal pain and is often accompanied by severe psychiatric symptoms. Patients with porphyria do not respond to most psychiatric medications. Indeed, many of these drugs make the symptoms of porphyria worse. © 2014 The New York Times Company
Keyword: Schizophrenia
Link ID: 19448 - Posted: 04.05.2014
David Adam. The day the Brazilian racing driver Ayrton Senna died in a crash, I was stuck in the toilet of a Manchester swimming pool. The door was open, but my thoughts blocked the way out. It was May 1994. I was 22 and hungry. After swimming a few lengths of the pool, I had lifted myself from the water and headed for the locker rooms. Going down the steps, I had scraped the back of my heel on the sharp edge of the final step. It left a small graze through which blood bulged into a blob that hung from my broken skin. I transferred the drop to my finger and a second swelled to take its place. I pulled a paper towel from above the sink to press to my wet heel. The blood on my finger ran with the water as it dripped down my arm. My eyes followed the blood. And the anxiety, of course, rushed back, ahead even of the memory. My shoulders sagged. My stomach tightened. Four weeks earlier, I had pricked my finger on a screw that stuck out from a bus shelter's corrugated metal. It was a busy Saturday afternoon and there had been lots of people around. Any one of them, I thought, could easily have injured themselves in the way I had. What if one had been HIV positive? They could have left infected blood on the screw, which then pierced my skin. That would put the virus into my bloodstream. I knew the official line was that transmission was impossible this way – the virus couldn't survive outside the body – but I also knew that, when pressed for long enough, those in the know would weaken the odds to virtually impossible. They couldn't be absolutely sure. In fact, several had admitted to me there was a theoretical risk. © 2014 Guardian News and Media Limited
Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 19447 - Posted: 04.05.2014
By SABRINA TAVERNISE. Federal health regulators approved a drug overdose treatment device on Thursday that experts say will provide a powerful lifesaving tool in the midst of a surging epidemic of prescription drug abuse. Similar to an EpiPen used to stop allergic reactions to bee stings, the easy-to-use injector — small enough to tuck into a pocket or a medicine cabinet — can be used by the relatives or friends of people who have overdosed. The hand-held device, called Evzio, delivers a single dose of naloxone, a medication that reverses the effects of an overdose, and will be used on those who have stopped breathing or lost consciousness from an opioid drug overdose. Naloxone is the standard treatment in such circumstances, but until now, has been available mostly in hospitals and other medical settings, when it is often used too late to save the patient. The decision to quickly approve the new treatment, which is expected to be available this summer, comes as deaths from opioids continue to mount, including an increase in those from heroin, which contributed to the death of the actor Philip Seymour Hoffman in February. Federal health officials, facing criticism for failing to slow the rising death toll, are under pressure to act, experts say. “This is a big deal, and I hope gets wide attention,” said Dr. Carl R. Sullivan III, director of the addictions program at West Virginia University. “It’s pretty simple: Having these things in the hands of people around drug addicts just makes sense because you’re going to prevent unnecessary mortality.” The scourge of drug abuse has battered states across the country, with deaths from overdoses now outstripping those from traffic crashes. Prescription drugs alone now account for more than half of all drug overdose deaths, and one major category of them, opioids, or painkillers, takes the lives of more Americans than heroin and cocaine combined. Deaths from opioids have quadrupled in 10 years to more than 16,500 in 2010, according to federal data. © 2014 The New York Times Company
Keyword: Drug Abuse; Pain & Touch
Link ID: 19446 - Posted: 04.05.2014
Walking backward may seem a simple task, but researchers don’t know how the mind controls this behavior. A study published online today in Science provides the first glimpse of the brain circuit responsible—at least in fruit flies. Geneticists created 3500 strains of the insects, each with a temperature-controlled switch that turned random networks of neurons on when the flies entered an incubator. One mutant batch of fruit flies started strolling in reverse when exposed to warmth; the team dubbed them “moonwalkers,” in honor of Michael Jackson’s famous dance. Two neurons were responsible for the behavior. One lived in the brain and extended its connections to the end of the ventral nerve cord—the fly’s version of a spine, which runs along its belly. The other neuron had the opposite orientation—it started at the bottom of the nerve cord and sent its messaging cables—or axons—into the brain. The neuron in the brain acted like a reverse gear in a car; when turned on, it triggered reverse walking. The researchers say this neuron is possibly a command center that responds to environmental cues, such as, “Hey! I see a wall in front of me.” The second neuron functioned as the brakes for forward motion, but it couldn’t compel the fly to moonwalk. It may serve as a fail-safe that reflexively prevents moving ahead, such as when the fly accidentally steps onto a very cold floor. Using the two neurons as a starting point, the team will trace their links to sensory neurons for touch, sight, and smell, which feed into and control the moonwalking network. No word yet on the neurons responsible for the Macarena. © 2014 American Association for the Advancement of Science
Keyword: Movement Disorders
Link ID: 19445 - Posted: 04.05.2014
By James Gallagher, health and science reporter, BBC News. The illegal party drug ketamine is an "exciting" and "dramatic" new treatment for depression, say doctors who have conducted the first trial in the UK. Some patients who have faced incurable depression for decades have had symptoms disappear within hours of taking low doses of the drug. The small trial on 28 people, reported in the Journal of Psychopharmacology, shows the benefits can last months. Experts said the findings opened up a whole new avenue of research. Depression is common and affects one in 10 people at some point in their lives. Antidepressants, such as Prozac, and behavioural therapies help some patients, but a significant proportion remain resistant to any form of treatment. A team at Oxford Health NHS Foundation Trust gave patients doses of ketamine over 40 minutes on up to six occasions. Eight showed improvements in reported levels of depression, with four of them improving so much they were no longer classed as depressed. Some responded within six hours of the first infusion of ketamine. Lead researcher Dr Rupert McShane said: "It really is dramatic for some people, it's the sort of thing really that makes it worth doing psychiatry, it's a really wonderful thing to see." He added: "[The patients] say 'ah this is how I used to think' and the relatives say 'we've got x back'." Dr McShane said this included patients who had lived with depression for 20 years. The duration of the effect is still a problem. Some relapse within days, while others have found they benefit for around three months and have since had additional doses of ketamine. There are also some serious side-effects including one case of the supply of blood to the brain being interrupted. Doctors say people should not try to self-medicate because of the serious risk to health outside of a hospital setting. BBC © 2014
Keyword: Depression; Drug Abuse
Link ID: 19444 - Posted: 04.03.2014
A high-resolution map of the human brain in utero is providing hints about the origins of brain disorders including schizophrenia and autism. The map shows where genes are turned on and off throughout the entire brain at about the midpoint of pregnancy, a time when critical structures are taking shape, researchers report Wednesday in the journal Nature. "It's a pretty big leap," says Ed Lein, an investigator at the Allen Institute for Brain Science in Seattle who played a central role in creating the map. "Basically, there was no information of this sort prior to this project." Having a map like this is important because many psychiatric and behavioral problems appear to begin before birth, even though they may not manifest until the teenage years or even the early 20s. The human brain is often called the most complex object in the universe. Yet its basic architecture is created in just nine months, when it grows from a single cell to more than 80 billion cells organized in a way that will eventually let us think and feel and remember. "We're talking about a remarkable process," a process controlled by our genes, Lein says. So he and a large team of researchers decided to use genetic techniques to create a map that would help reveal this process. Funding came from the 2009 federal stimulus package. The massive effort required tens of thousands of brain tissue samples so small that they had to be cut out with a laser. Researchers used brain tissue from aborted fetuses, which the Obama administration has authorized over the objections of abortion opponents. ©2014 NPR
Keyword: Brain imaging; Development of the Brain
Link ID: 19443 - Posted: 04.03.2014
He was known in his many appearances in the scientific literature as simply K.C., an amnesiac who was unable to form new memories. But to the people who knew him, and the scientists who studied him for decades, he was Kent Cochrane, or just Kent. Cochrane, who suffered a traumatic brain injury in a motorcycle accident when he was 30 years old, helped to rewrite the understanding of how the brain forms new memories and whether learning can occur without that capacity. "From a scientific point of view, we've really learned a lot [from him], not just about memory itself but how memory contributes to other abilities," said Shayna Rosenbaum, a cognitive neuropsychologist at York University who started working with Cochrane in 1998 when she was a graduate student. Cochrane was 62 when he died late last week. The exact cause of death is unknown, but his sister, Karen Casswell, said it is believed he had a heart attack or stroke. He died in his room at an assisted living facility where he lived and the family opted not to authorize an autopsy. Few in the general public would know about Cochrane, though some may have seen or read media reports on the man whose life was like that of the lead character of the 2000 movie Memento. But anyone who works on the science of human memory would know K.C. Casswell and her mother, Ruth Cochrane, said the family was proud of the contribution Kent Cochrane made to science. Casswell noted her eldest daughter was in a psychology class at university when the professor started to lecture about the man the scientific literature knows as K.C. © CBC 2014
Keyword: Learning & Memory
Link ID: 19442 - Posted: 04.03.2014
Dr Nicola Davis. The electronic nose is an instrument that attempts to mimic the human olfactory system. Humans and animals don't identify specific chemicals within odours; what they do is to recognise a smell based on a response pattern. You, as a human, will smell a strawberry and say "that's a strawberry". If you gave this to a traditional analytical piece of equipment, it might tell you what the 60-odd chemicals in the odour were - but that wouldn't tell you that it was a strawberry. How does it work? A traditional electronic nose has an array of chemical sensors, designed either to detect gases or vapours. These sensors are not tuned to a single chemical, but detect families of chemicals - [for example] alcohols. Each one of these sensors is different, so when they are presented to a complex odour formed of many chemicals, each sensor responds differently to that odour. This creates a pattern of sensor responses, which the machine can be taught [to recognise]. Can't we just use dogs? A dog is very, very sensitive. Special research teams work on training dogs to detect cancers as you would explosives. What we are trying to do with the electronic nose is create an artificial means of replicating what the dog does. Such machines have the advantage that they don't get tired, will work all day and you only need to feed them electricity. © 2014 Guardian News and Media Limited
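As a rough illustration of the pattern-recognition step described here, the following Python sketch trains a classifier on simulated sensor-array responses. The sensor count, response profiles and odour labels are invented for the example and are not taken from any real device.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    N_SENSORS = 8

    # Hypothetical characteristic responses of the eight-sensor array to two odours.
    PROFILES = {
        "strawberry": np.array([0.9, 0.2, 0.7, 0.1, 0.4, 0.8, 0.3, 0.6]),
        "coffee":     np.array([0.3, 0.8, 0.2, 0.9, 0.6, 0.1, 0.7, 0.4]),
    }

    def sniff(odour, noise=0.05):
        """Simulate one presentation: the array's pattern for the odour, plus noise."""
        return PROFILES[odour] + rng.normal(0.0, noise, N_SENSORS)

    # Train on labelled response patterns rather than on any single identified chemical.
    labels = ["strawberry", "coffee"] * 50
    X = np.array([sniff(label) for label in labels])
    clf = RandomForestClassifier(random_state=0).fit(X, labels)

    print(clf.predict([sniff("strawberry")]))   # expected output: ['strawberry']

The choice of classifier is incidental; the essential point, as the interview explains, is that recognition operates on the whole pattern of responses across the array rather than on any one chemical.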
Keyword: Chemical Senses (Smell & Taste); Robotics
Link ID: 19441 - Posted: 04.03.2014
For years, some biomedical researchers have worried that a push for more bench-to-bedside studies has meant less support for basic research. Now, the chief of one of the National Institutes of Health’s (NIH’s) largest institutes has added her voice—and hard data—to the discussion. Story Landis describes what she calls a “sharp decrease” in basic research at her institute, a trend she finds worrisome. In a blog post last week, Landis, director of the $1.6 billion National Institute of Neurological Disorders and Stroke (NINDS), says her staff started out asking why, in the mid-2000s, NINDS funding declined for R01s, the investigator-initiated grants that are the mainstay of most labs. After examining the aims and abstracts of grants funded between 1997 and 2012, her staff found that the portion of NINDS competing grant funding that went to basic research has declined (from 87% to 71%) while applied research rose (from 13% to 29%). To dig deeper, the staffers divided the grants into four categories—basic/basic; basic/disease-focused; applied/translational; and applied/clinical. Here, the decline in basic/basic research was “striking”: It fell from 52% to 27% of new and competing grants, while basic/disease-focused has been rising. The same trend emerged when the analysts looked only at investigator-initiated grants, which are proposals based on a researcher’s own ideas, not a solicitation by NINDS for proposals in a specific area. The shift could reflect changes in science and “a natural progression of the field,” Landis writes. Or it could mean researchers “falsely believe” that NINDS is not interested in basic studies and they have a better shot at being funded if they propose disease-focused or applied studies. The tight NIH budget and new programs focused on translational research could be fostering this belief, she writes. When her staff compared applications submitted in 2008 and 2011, they found support for a shift to disease-focused proposals: There was a “striking” 21% decrease in the amount of funding requested for basic studies, even though those grants had a better chance of being funded. © 2014 American Association for the Advancement of Science.
Keyword: Movement Disorders
Link ID: 19440 - Posted: 04.02.2014
Erika Check Hayden. Monkeys on a reduced-calorie diet live longer than those that can eat as much as they want, a new study suggests. The findings add to a thread of studies on how a restricted diet prolongs life in a range of species, but they complicate the debate over whether the research applies to animals closely related to humans. In the study, which has been running since 1989 at the Wisconsin National Primate Research Center in Madison, 38 rhesus macaques (Macaca mulatta) that were allowed to eat whatever they wanted were nearly twice as likely to die at any age as were 38 monkeys whose calorie intakes were cut by 30%. The same study reported in 2009 that calorie-restricted monkeys were less likely to die of age-related causes than control monkeys, but had similar overall mortality rates at all ages. “We set out to test the hypothesis: would calorie restriction delay ageing? And I think we've shown that it does,” says Rozalyn Anderson, a biochemist at the University of Wisconsin who led the study, which is published today in Nature Communications. She said it is not surprising that the 2009 paper did not find that the calorie-restricted monkeys lived longer, because at the time too few monkeys had died to prove the point. Eating a very low-calorie diet has been shown to prolong the lives of mice, leading to speculation that such a diet triggers a biochemical pathway that promotes survival. But what that pathway might be — and whether humans have it — has been a matter of hot debate. Eat to live: In 2012, a study at the US National Institute on Aging (NIA) in Bethesda, Maryland, cast doubt on the idea, reporting that monkeys on low-calorie diets did not live longer than those that ate more food. But Anderson says that the Wisconsin findings are good news. © 2014 Nature Publishing Group
Keyword: Obesity
Link ID: 19439 - Posted: 04.02.2014
Neandertals and modern Europeans had something in common: They were fatheads of the same ilk. A new genetic analysis reveals that our brawny cousins had a number of distinct genes involved in the buildup of certain types of fat in their brains and other tissues—a trait shared by today’s Europeans, but not Asians. Because two-thirds of our brains are built of fatty acids, or lipids, the differences in fat composition between Europeans and Asians might have functional consequences, perhaps in helping them adapt to colder climates or causing metabolic diseases. “This is the first time we have seen differences in lipid concentrations between populations,” says evolutionary biologist Philipp Khaitovich of the CAS-MPG Partner Institute for Computational Biology in Shanghai, China, and the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, lead author of the new study. “How our brains are built differently of lipids might be due to Neandertal DNA.” Ever since researchers at the Max Planck sequenced the genome of Neandertals, including a super high-quality genome of a Neandertal from the Altai Mountains of Siberia in December, researchers have been comparing Neandertal DNA with that of living people. Neandertals, who went extinct 30,000 years ago, interbred with modern humans at least once in the past 60,000 years, probably somewhere in the Middle East. Because the interbreeding happened after moderns left Africa, today’s Africans did not inherit any Neandertal DNA. But living Europeans and Asians have inherited a small amount—1% to 4% on average. So far, scientists have found that different populations of living humans have inherited the Neandertal version of genes that cause diabetes, lupus, and Crohn’s disease; alter immune function; and affect the function of the protein keratin in skin, nails, and hair. © 2014 American Association for the Advancement of Science.
Keyword: Evolution; Obesity
Link ID: 19438 - Posted: 04.02.2014
By NATALIE ANGIER. The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me ... He can go to hell...”), turning down gifts, homage, even the return of his stolen consort Briseis, just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too. Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked. “Spitefulness is such an intrinsically interesting subject, and it fits with so many people’s everyday experience, that I was surprised to see how little mention there was of it in the psychology literature,” said David K. Marcus, a psychologist at Washington State University. At the same time, he said, “I was thrilled to find something that people haven’t researched to exhaustion.” © 2014 The New York Times Company
Keyword: Emotions; Evolution
Link ID: 19436 - Posted: 04.01.2014