Most Recent Links



Links 10461 - 10480 of 29326

For years, some biomedical researchers have worried that a push for more bench-to-bedside studies has meant less support for basic research. Now, the chief of one of the National Institutes of Health’s (NIH’s) largest institutes has added her voice—and hard data—to the discussion. Story Landis describes what she calls a “sharp decrease” in basic research at her institute, a trend she finds worrisome. In a blog post last week, Landis, director of the $1.6 billion National Institute of Neurological Disorders and Stroke (NINDS), says her staff started out asking why NINDS funding for R01s, the investigator-initiated grants that are the mainstay of most labs, declined in the mid-2000s. After examining the aims and abstracts of grants funded between 1997 and 2012, her staff found that the portion of NINDS competing grant funding that went to basic research declined (from 87% to 71%) while the portion for applied research rose (from 13% to 29%). To dig deeper, the staffers divided the grants into four categories—basic/basic; basic/disease-focused; applied/translational; and applied/clinical. Here, the decline in basic/basic research was “striking”: It fell from 52% to 27% of new and competing grants, while basic/disease-focused research rose. The same trend emerged when the analysts looked only at investigator-initiated grants, which are proposals based on a researcher’s own ideas, not a solicitation by NINDS for proposals in a specific area. The shift could reflect changes in science and “a natural progression of the field,” Landis writes. Or it could mean researchers “falsely believe” that NINDS is not interested in basic studies and that they have a better shot at being funded if they propose disease-focused or applied studies. The tight NIH budget and new programs focused on translational research could be fostering this belief, she writes. When her staff compared applications submitted in 2008 and 2011, they found support for a shift to disease-focused proposals: There was a “striking” 21% decrease in the amount of funding requested for basic studies, even though those grants had a better chance of being funded. © 2014 American Association for the Advancement of Science.

Keyword: Movement Disorders
Link ID: 19440 - Posted: 04.02.2014

Erika Check Hayden Monkeys on a reduced-calorie diet live longer than those that can eat as much as they want, a new study suggests. The findings add to a thread of studies on how a restricted diet prolongs life in a range of species, but they complicate the debate over whether the research applies to animals closely related to humans. In the study, which has been running since 1989 at the Wisconsin National Primate Research Center in Madison, 38 rhesus macaques (Macaca mulatta) that were allowed to eat whatever they wanted were nearly twice as likely to die at any age as were 38 monkeys whose calorie intakes were cut by 30%. The same study reported in 2009 that calorie-restricted monkeys were less likely to die of age-related causes than control monkeys, but had similar overall mortality rates at all ages. “We set out to test the hypothesis: would calorie restriction delay ageing? And I think we've shown that it does,” says Rozalyn Anderson, a biochemist at the University of Wisconsin who led the study, which is published today in Nature Communications. She said it is not surprising that the 2009 paper did not find that the calorie-restricted monkeys lived longer, because at the time too few monkeys had died to prove the point. Eating a very low-calorie diet has been shown to prolong the lives of mice, leading to speculation that such a diet triggers a biochemical pathway that promotes survival. But what that pathway might be — and whether humans have it — has been a matter of hot debate. In 2012, a study at the US National Institute on Aging (NIA) in Bethesda, Maryland, cast doubt on the idea, reporting that monkeys on low-calorie diets did not live longer than those that ate more food. But Anderson says that the Wisconsin findings are good news. © 2014 Nature Publishing Group

Keyword: Obesity
Link ID: 19439 - Posted: 04.02.2014

Neandertals and modern Europeans had something in common: They were fatheads of the same ilk. A new genetic analysis reveals that our brawny cousins had a number of distinct genes involved in the buildup of certain types of fat in their brains and other tissues—a trait shared by today’s Europeans, but not Asians. Because two-thirds of our brains are built of fatty acids, or lipids, the differences in fat composition between Europeans and Asians might have functional consequences, perhaps in helping them adapt to colder climates or causing metabolic diseases. “This is the first time we have seen differences in lipid concentrations between populations,” says evolutionary biologist Philipp Khaitovich of the CAS-MPG Partner Institute for Computational Biology in Shanghai, China, and the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, lead author of the new study. “How our brains are built differently of lipids might be due to Neandertal DNA.” Ever since researchers at the Max Planck Institute sequenced the genome of Neandertals, including a super high-quality genome of a Neandertal from the Altai Mountains of Siberia in December, scientists have been comparing Neandertal DNA with that of living people. Neandertals, who went extinct 30,000 years ago, interbred with modern humans at least once in the past 60,000 years, probably somewhere in the Middle East. Because the interbreeding happened after moderns left Africa, today’s Africans did not inherit any Neandertal DNA. But living Europeans and Asians have inherited a small amount—1% to 4% on average. So far, scientists have found that different populations of living humans have inherited the Neandertal version of genes that cause diabetes, lupus, and Crohn’s disease; alter immune function; and affect the function of the protein keratin in skin, nails, and hair. © 2014 American Association for the Advancement of Science.

Keyword: Evolution; Obesity
Link ID: 19438 - Posted: 04.02.2014

By NATALIE ANGIER The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me ... He can go to hell...”), turning down gifts, homage, even the return of his stolen consort Briseis, just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too. Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked. “Spitefulness is such an intrinsically interesting subject, and it fits with so many people’s everyday experience, that I was surprised to see how little mention there was of it in the psychology literature,” said David K. Marcus, a psychologist at Washington State University. At the same time, he said, “I was thrilled to find something that people haven’t researched to exhaustion.” © 2014 The New York Times Company

Keyword: Emotions; Evolution
Link ID: 19436 - Posted: 04.01.2014

A new study has raised new questions about how MRI scanners work in the quest to understand the brain. The research, led by Professor Brian Trecox and a team of international researchers, used a brand new technique to assess fluctuations in the performance of brain scanners as they were being used during a series of basic experiments. The results are due to appear in the Journal of Knowledge in Neuroscience: General later today. “Most people think that we know a lot about how MRI scanners actually work. The truth is, we don’t,” says Trecox. “We’ve even been misleading the public about the name – we made up functional Magnetic Resonance Imaging in 1983 because it sounded scientific and technical. fMRI really stands for flashy, Magically Rendered Images. So we thought: why not put an MRI scanner in an MRI scanner, and figure out what’s going on inside?” To do this, Trecox and his team built a giant imaging machine – thought to be the world’s largest – using funds from a Kickstarter campaign and a local bake sale. They then took a series of scans of standard-sized MRI scanners while they were repeatedly switched on and off, in one of the largest and most robust neuroscience studies of its type. “We tested six different MRI scanners,” says Eric Salmon, a PhD student involved in the project. “We found activation in an area called insular cortex in four of the six machines when they were switched on,” he added. In humans, the insular cortex has previously been implicated in a wide range of functions, including consciousness and self-awareness. According to Trecox and his team, activation in this area has never been found in imaging machines before. While Salmon acknowledged that the results should be treated with caution – research assistants were found asleep in at least two of the machines – the results nevertheless provide a potentially huge step in our understanding of the tools we use to research the brain. © 2014 Guardian News and Media Limited

Keyword: Brain imaging
Link ID: 19435 - Posted: 04.01.2014

By Karen Kaplan There are lies, damn lies – and the lies that we tell for the sake of others when we are under the influence of oxytocin. Researchers found that after a squirt of the so-called love hormone, volunteers lied more readily about their results in a game in order to benefit their team. Compared with control subjects who were given a placebo, those on oxytocin told more extreme lies and told them with less hesitation, according to a study published Monday in Proceedings of the National Academy of Sciences. Oxytocin is a brain hormone that is probably best known for its role in helping mothers bond with their newborns. In recent years, scientists have been examining its role in monogamy and in strengthening trust and empathy in social groups. Sometimes, doing what’s good for the group requires lying. (Think of parents who fake their addresses to get their kids into a better school.) A pair of researchers from Ben-Gurion University of the Negev in Israel and the University of Amsterdam figured that oxytocin would play a role in this type of behavior, so they set up a series of experiments to test their hypothesis. The researchers designed a simple computer game that asked players to predict whether a virtual coin toss would wind up heads or tails. After seeing the outcome on a computer screen, players were asked to report whether their prediction was correct or not. In some cases, making the right prediction would earn a player’s team a small payment (the equivalent of about 40 cents). In other cases, a correct prediction would cost the team the same amount, and sometimes there was no payoff or cost. Los Angeles Times Copyright 2014
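
A detail worth making explicit: because each prediction is made privately, no single report can be verified, but an honest player should be right only about half the time, so lying shows up at the group level as reported accuracy well above 50%. The sketch below simulates that logic in Python; the lie probability and player counts are illustrative assumptions, not parameters from the study.

    import random

    def simulate_reports(n_players=100, n_tosses=10, lie_prob=0.3):
        """Fraction of coin-toss predictions self-reported as correct.

        Honest guesses are right ~50% of the time; a player who lies
        with probability lie_prob also claims credit for some misses.
        (lie_prob is an assumed value for illustration.)
        """
        total = n_players * n_tosses
        reported_correct = 0
        for _ in range(total):
            actually_correct = random.random() < 0.5          # chance level
            lied = (not actually_correct) and random.random() < lie_prob
            if actually_correct or lied:
                reported_correct += 1
        return reported_correct / total

    # Expected reported accuracy is 0.5 + 0.5 * lie_prob, so a group
    # average near 0.65 rather than 0.50 indicates widespread lying,
    # even though no individual report can be checked.
    print(simulate_reports())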

Keyword: Hormones & Behavior; Attention
Link ID: 19434 - Posted: 04.01.2014

by Bob Holmes People instinctively organise a new language according to a logical hierarchy, not simply by learning which words go together, as computer translation programs do. The finding may add further support to the notion that humans possess a "universal grammar", or innate capacity for language. The existence of a universal grammar has been in hot dispute among linguists ever since Noam Chomsky first proposed the idea half a century ago. If the theory is correct, this innate structure should leave some trace in the way people learn languages. To test the idea, Jennifer Culbertson, a linguist at George Mason University in Fairfax, Virginia, and her colleague David Adger of Queen Mary University of London, constructed an artificial "nanolanguage". They presented English-speaking volunteers with two-word phrases, such as "shoes blue" and "shoes two", which were supposed to belong to a new language somewhat like English. They then asked the volunteers to choose whether "shoes two blue" or "shoes blue two" would be the correct three-word phrase. In making this choice, the volunteers – who hadn't been exposed to any three-word phrases – would reveal their innate bias in language-learning. Would they rely on familiarity ("two" usually precedes "blue" in English), or would they follow a semantic hierarchy and put "blue" next to "shoe" (because it modifies the noun more tightly than "two", which merely counts how many)? © Copyright Reed Business Information Ltd.

Keyword: Language; Development of the Brain
Link ID: 19433 - Posted: 04.01.2014

By Hal Arkowitz and Scott O. Lilienfeld A commercial sponsored by Pfizer, the drug company that manufactures the antidepressant Zoloft, asserts, “While the cause [of depression] is unknown, depression may be related to an imbalance of natural chemicals between nerve cells in the brain. Prescription Zoloft works to correct this imbalance.” Using advertisements such as this one, pharmaceutical companies have widely promoted the idea that depression results from a chemical imbalance in the brain. The general idea is that a deficiency of certain neurotransmitters (chemical messengers) at synapses, or tiny gaps, between neurons interferes with the transmission of nerve impulses, causing or contributing to depression. One of these neurotransmitters, serotonin, has attracted the most attention, but many others, including norepinephrine and dopamine, have also been granted supporting roles in the story. Much of the general public seems to have accepted the chemical imbalance hypothesis uncritically. For example, in a 2007 survey of 262 undergraduates, psychologist Christopher M. France of Cleveland State University and his colleagues found that 84.7 percent of participants found it “likely” that chemical imbalances cause depression. In reality, however, depression cannot be boiled down to an excess or deficit of any particular chemical or even a suite of chemicals. “Chemical imbalance is sort of last-century thinking. It's much more complicated than that,” neuroscientist Joseph Coyle of Harvard Medical School was quoted as saying in a blog by National Public Radio's Alix Spiegel. © 2014 Scientific American

Keyword: Depression
Link ID: 19432 - Posted: 04.01.2014

by Aviva Rutkin Don't blame baby for trying to eat that Lego piece. Humans may have a brain circuit dedicated to grabbing stuff and putting it in our mouths, and it probably develops in the womb. Researchers and parents alike have long known that babies stick all manner of things in their mouths from very early on. Some fetuses even suck their thumbs. As putting something in the mouth seems advanced compared to the other, limited actions of newborns, Angela Sirigu of the Institute of Cognitive Sciences in Bron, France, and colleagues wondered whether the behaviour is encoded in the brain from birth. To investigate, they studied 26 people of different ages while they were undergoing brain surgery. The researchers found that they were able to make nine of the unconscious patients bring their hands up and open their mouths, just by stimulating a part of the brain we know is linked to those actions in non-human primates. Because this behaviour is encoded in the same region as in other primates, it may be there from birth or earlier, the researchers say. If it was learned, you would expect it to involve multiple brain areas, and those could vary between individuals. Newborn kangaroos are able to climb into their mother's pouch and baby wildebeests can run away from lions, but our babies appear helpless and have to learn most complex actions. The new work suggests that the way our brain develops is more like what happens in other animals than previously thought. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 19431 - Posted: 04.01.2014

by Meghan Rosen Human faces just got a lot more emotional. People can broadcast more than three times as many different feelings on their faces as scientists once suspected. For years, scientists have thought that people could convey only happiness, surprise, sadness, anger, fear and disgust. “I thought it was very odd to have only one positive emotion,” says cognitive scientist Aleix Martinez of Ohio State University in Columbus. So he and colleagues came up with 16 combined ones, such as “happily disgusted” and “happily surprised.” Then the researchers asked volunteers to imagine situations that would provoke these emotions, such as listening to a gross joke, or getting unexpected good news. When the team compared pictures of the volunteers making different faces and analyzed every eyebrow wrinkle, mouth stretch and tightened chin, “what we found was beyond belief,” Martinez says. For each compound emotion, almost everyone used the same facial muscles, the team reports March 31 in the Proceedings of the National Academy of Sciences. The team’s findings could one day help computer engineers improve facial recognition software and help scientists better understand emotion-perception disorders such as schizophrenia. Citation: S. Du, Y. Tao and A. M. Martinez. Compound facial expressions of emotion. Proceedings of the National Academy of Sciences. Published online March 30, 2014. doi: 10.1073/pnas.1322355111. © Society for Science & the Public 2000 - 2013

Keyword: Emotions
Link ID: 19430 - Posted: 04.01.2014

By SAM WANG A STUDY published last week found that the brains of autistic children show abnormalities that are likely to have arisen before birth, which is consistent with a large body of previous evidence. Yet most media coverage focuses on vaccines, which do not cause autism and are given after birth. How can we help people separate real risks from false rumors? Over the last few years, we’ve seen an explosion of studies linking autism to a wide variety of genetic and environmental factors. Putting these studies in perspective is an enormous challenge. A database search turns up more than 34,000 scientific publications mentioning autism since its first description in 1943; over half of them have appeared since 2008. As a statistically minded neuroscientist, I suggest a different approach that relies on a concept we are familiar with: relative odds. As a single common measuring stick to compare odds, I have chosen the “risk ratio,” a measure that allows the bigger picture to come into focus. For a variety of studies I asked the same question: How large is the increased risk for autism? My standard for comparison was the likelihood in the general population of autism spectrum disorder. Here’s an example. Start from the fact that the recorded rate of autism is now 1 in 68, according to a report released last week by the Centers for Disease Control and Prevention. If babies born in purple farmhouses have a rate of autism of 2 in 68, this doubling means that the purple farmhouse carries a risk ratio of 2. However, correlation is not causation, and there is no need to repaint that farmhouse just yet. © 2014 The New York Times Company
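
Since the piece leans on this arithmetic, a worked version may help: the risk ratio is simply the rate in the group of interest divided by the base rate. The short Python sketch below encodes the farmhouse example from the article; the function name is mine, added only for illustration.

    def risk_ratio(group_rate, base_rate):
        """Rate of an outcome in a group, relative to the base rate."""
        return group_rate / base_rate

    base = 1 / 68        # CDC's reported autism rate
    farmhouse = 2 / 68   # Wang's hypothetical purple-farmhouse rate
    print(risk_ratio(farmhouse, base))   # 2.0, i.e. a risk ratio of 2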

Keyword: Autism
Link ID: 19429 - Posted: 03.31.2014

by Catherine de Lange Why wait for the doctor to see you? A smart patch attached to your skin could diagnose health problems automatically – and even administer drugs. Monitoring movement disorders such as Parkinson's disease or epilepsy relies on video recordings of symptoms and personal surveys, says Dae-Hyeong Kim at Seoul National University in South Korea. And although using wearable devices to monitor the vital signs of patients is theoretically possible, the wearable pads, straps and wrist bands that can do this are often cumbersome and inflexible. To track the progression of symptoms and the response to medication more accurately would require devices that monitor cues from the body, store recorded data for pattern analysis and deliver therapeutic agents through the human skin in a controlled way, Kim says. So Kim and his team have developed an adhesive patch that is flexible and can be worn on the wrist like a second skin. The patch is 1 millimetre thick and made of a hydrocolloid dressing – a type of thin flexible bandage. Into it they embedded a layer of silicon nanomembranes, which are often used for flexible electronics and can pick up the bend and stretch of human skin, converting these movements into small electronic signals. The signals are stored as data in separate memory cells made from layers of gold nanoparticles. The device could be used to detect and treat tremors in people who have Parkinson's disease, or epileptic seizures, says Kim. If these movements are detected, small heaters in the patch trigger the release of drugs from silicon nanoparticles. The patch also contains temperature sensors to make sure the skin doesn't burn during the process. © Copyright Reed Business Information Ltd.

Keyword: Parkinsons; Robotics
Link ID: 19428 - Posted: 03.31.2014

By ABIGAIL ZUGER, M.D. One legend says it all began when a North African herder saw his goats eat some wild berries, then frolic with unusual verve. Another story cites a few small leaves blown off a nearby bush into the Chinese emperor’s mug of hot water. Either way, whether caffeine entered the life of man by coffee bean or tea leaf, happiness ensued. Happiness, that is, for all but the poor souls charged with saving us from our drugs, for no regulatory challenge trumps the one posed by caffeine, molecule of elegant enjoyment and increasing abuse, man’s best friend and occasional killer. As Murray Carpenter makes clear in his methodical review, our society’s metrics are no match for this substance’s nuances, whether among athletes, teenagers, experimental subjects or the average dependent Joe. Pure caffeine is a bitter white powder. In the body it blocks the effects of the molecule adenosine, a crucial brake on many physiologic processes. With just enough caffeine in the system, the body’s organs become a little more themselves: the brain a little brainier, the muscles a little springier, the blood vessels a little tighter, the digestion a little more efficient. With too much caffeine, all can accelerate into cardiac arrest. It takes only about 30 milligrams of caffeine (less than a cup of coffee or can of cola) for stimulative effects to be noticeable. A hundred milligrams a day will hook most people: They feel immensely unhappy without their daily fix, and the organs all whine in protest for a few days. It takes more than 10 grams to kill you — a dose impossible to achieve with traditional beverages alone. However, the new caffeine-rich energy shots make it alarmingly easy for party-minded people to achieve the zone between enough and much too much. © 2014 The New York Times Company

Keyword: Drug Abuse
Link ID: 19427 - Posted: 03.31.2014

By BRAYDEN KING and JERRY KIM THIS season Major League Baseball is allowing its officiating crews to use instant replay to review certain critical calls, including home runs, force plays and foul balls. But the calling of the strike zone — determining whether a pitch that is not swung at is a ball or a strike — will still be left completely to the discretion of the officials. This might seem an odd exception, since calling the strike zone may be the type of officiating decision most subject to human foible. In research soon to be published in the journal Management Science, we studied umpires’ strike-zone calls using pitch-location data compiled by the high-speed cameras introduced by Major League Baseball several years ago in an effort to measure, monitor and reward umpires’ accuracy. After analyzing more than 700,000 pitches thrown during the 2008 and 2009 seasons, we found that umpires frequently made errors behind the plate — about 14 percent of non-swinging pitches were called erroneously. Some of those errors occurred in fairly predictable ways. We found, for example, that umpires tended to favor the home team by expanding the strike zone, calling a strike when the pitch was actually a ball 13.3 percent of the time for home team pitchers versus 12.7 percent of the time for visitors. Other errors were more surprising. Contrary to the expectation (or hope) that umpires would be more accurate in important situations, we found that they were, in fact, more likely to make mistakes when the game was on the line. For example, our analyses suggest that umpires were 13 percent more likely to miss an actual strike in the bottom of the ninth inning of a tie game than in the top of the first inning, on the first pitch. © 2014 The New York Times Company
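
The computation behind these figures is simple to state: with a camera-derived “true” call attached to every taken pitch, the error rate is the share of pitches where the umpire's call disagrees with the camera, and it can be split by any situation of interest. Below is a toy Python sketch of that tabulation; the records are invented stand-ins, not the study's actual data of roughly 700,000 tracked pitches.

    # Each record: (umpire_call, camera_call, pitcher_is_home_team)
    pitches = [
        ("strike", "ball",   True),    # miscall favoring the home team
        ("ball",   "ball",   False),
        ("strike", "strike", True),
        ("ball",   "strike", False),   # missed strike
    ]

    def error_rate(records):
        """Fraction of taken pitches where the call disagrees with the camera."""
        errors = sum(1 for ump, cam, _ in records if ump != cam)
        return errors / len(records)

    print(error_rate(pitches))                           # overall miscall rate
    print(error_rate([p for p in pitches if p[2]]))      # home-team pitchers
    print(error_rate([p for p in pitches if not p[2]]))  # visiting pitchers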

Keyword: Attention
Link ID: 19426 - Posted: 03.31.2014

By SABRINA TAVERNISE In 1972, researchers in North Carolina started following two groups of babies from poor families. In the first group, the children were given full-time day care up to age 5 that included most of their daily meals, talking, games and other stimulating activities. The other group, aside from baby formula, got nothing. The scientists were testing whether the special treatment would lead to better cognitive abilities in the long run. Forty-two years later, the researchers found something that they had not expected to see: The group that got care was far healthier, with sharply lower rates of high blood pressure and obesity, and higher levels of so-called good cholesterol. The study, which was published in the journal Science on Thursday, is part of a growing body of scientific evidence that hardship in early childhood has lifelong health implications. But it goes further than outlining the problem, offering evidence that a particular policy might prevent it. “This tells us that adversity matters and it does affect adult health,” said James Heckman, a professor of economics at the University of Chicago who led the data analysis. “But it also shows us that we can do something about it, that poverty is not just a hopeless condition.” The findings come amid a political push by the Obama administration for government-funded preschool for 4-year-olds. But a growing number of experts, Professor Heckman among them, say they believe that more effective public programs would start far earlier — in infancy, for example, because that is when many of the skills needed to take control of one’s life and become a successful adult are acquired. © 2014 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 19425 - Posted: 03.29.2014

by Laura Sanders Ever-increasing numbers of autism diagnoses have parents worried about a skyrocketing epidemic, and this week’s news may only drive alarm higher. Perhaps it shouldn’t. In 2010, 1 in 68 (or 14.7 per 1,000) 8-year-olds had an autism spectrum disorder, the Centers for Disease Control and Prevention now estimates. That number is a substantial increase from 2008, which had an estimate of 1 in 88 (or 11.3 per 1,000). But the numbers might not reflect a spike in actual cases. Instead, the rise might be driven, at least in part, by an increase in diagnoses. The estimates are drawn from a collection of organizations that provide services to children with autism, including doctors, schools and social service agencies. As awareness builds and more people look for signs of autism, these numbers will keep going up. Regional spottiness suggests that better autism detection is feeding the increase. The autism rate in Alabama is just one in 175, while the rate in New Jersey is one in 45, the CDC reports. It would be surprising, and scientifically really important, if children in Alabama were truly much more protected from the disorder. Instead, differences in diagnosis rates are probably at play. If these alarmingly high numbers are driven by professionals and parents better spotting autism, that’s nothing to be alarmed at. On the contrary: This is good news. The earlier therapies begin, the better kids with autism do. That’s the idea behind CDC’s “Learn the Signs: Act Early” program to educate people about signs that something might be amiss with a child. So our best move is to find the kids who need help, and find them when they’re young. Most kids, including the ones in the new CDC survey, aren’t diagnosed with autism until about age 4 1/2. But whatever goes wrong happens long before then. © Society for Science & the Public 2000 - 2013.

Keyword: Autism
Link ID: 19424 - Posted: 03.29.2014

By Lenny Bernstein After 60 years of refusing, the people who run the Golden Gate Bridge are moving toward installing a suicide barrier, the New York Times reports. As soon as May, the Golden Gate Bridge, Highway and Transportation District is expected to approve construction of a steel mesh net 20 feet below the California landmark’s sidewalk. A record 46 people jumped to their deaths from the span in 2013, and another 118 were stopped before they could. According to the Times, they have tended to be younger than in the past. Experts have long known, and good research shows, that barriers are highly effective at halting suicides, the 10th-leading cause of death in the United States at 38,364 fatalities in 2010. This is true not just of bridges or other high places: locking up firearms and individually bubble-wrapping pills both limit suicides by those methods, said Jill Harkavy-Friedman, vice president of research for the American Foundation for Suicide Prevention. The key is the characteristics of a person on the verge of committing suicide, even someone who has been contemplating it for a while. Suicides are impulsive acts, and the people who commit them are not thinking clearly, have trouble solving problems, have difficulty shifting gears and weigh risks differently. If thwarted in that first, impulsive attempt, they often do not adjust and seek another way to take their lives, Harkavy-Friedman said. “In a suicidal crisis, it’s all about time,” she said. “They’re going to grab whatever is available. They don’t change gears if that is thwarted, because they have rigid thinking in that moment. They’re not thinking about dying. They’re thinking about ending the pain.” © 1996-2014 The Washington Post

Keyword: Depression
Link ID: 19423 - Posted: 03.29.2014

By SINDYA N. BHANOO Monogamy is rare in animals. Even among species that pair off, there is often philandering. But a new genetic analysis adds to the evidence that the South American primates called Azara’s owl monkeys are remarkably faithful to their partners. The study confirms what one of its authors, Eduardo Fernandez-Duque, an evolutionary anthropologist at the University of Pennsylvania who leads the Owl Monkey Project, had long suspected. For 18 years, he and other Penn researchers have been observing the Azara’s owl monkey in the Chaco region of Argentina. Not only have they never witnessed a philanderer, but they have also found that infant owl monkeys get an unusual amount of care from their fathers. “The male plays with the infant and the male shares food with the infant even more than the mother,” Dr. Fernandez-Duque said. “The males care because these are their offspring, and this has a direct benefit in terms of reproductive success.” In the new study, published in the Proceedings of the Royal Society B, the researchers performed genetic analysis on 35 offspring born to 17 owl monkey pairs and confirmed that the parents were monogamous for the mating season. The monkey is the first primate and only the fifth mammal for which monogamy has been verified through genetics. Because paternal care is also seen in other species of owl monkeys, the scientists suspect that they, too, are serially monogamous. © 2014 The New York Times Company

Keyword: Sexual Behavior; Evolution
Link ID: 19422 - Posted: 03.29.2014

Nicola Davis The moment when 40-year-old Joanne Milne, who has been deaf since birth, first hears sound is a heart-wrenching scene. Amateur footage showing her emotional reaction has taken social media by storm and touched viewers across the world, reinforcing the technological triumph of cochlear implants. It’s a story I have touched on before. Earlier this month I wrote about how cochlear implants changed the lives of the Campbells, whose children Alice and Oliver were born with the condition auditory neuropathy spectrum disorder (ANSD). Implants, together with auditory verbal therapy, have allowed them to embrace the hearing world. It was incredibly moving to glimpse the long and difficult journey this family had experienced, and the joy that hearing - a sense so many of us take for granted - can bring. Cochlear implants are not a ‘cure’ for deafness. They make use of electrodes to directly stimulate auditory nerve fibres in the cochlea of the inner ear, creating a sense of sound that is not the same as that which hearing people experience, but nevertheless allows users to perceive speech, develop language and often enjoy music. As an adult, Milne, who was born with the rare condition Usher syndrome, is unusual in receiving cochlear implants on both sides. Such bilateral implantation enables users to work out where sounds are coming from, enhances speech perception in bustling environments and means that should something go wrong with one device, the user isn’t cut off from the hearing world. © 2014 Guardian News

Keyword: Hearing; Robotics
Link ID: 19421 - Posted: 03.29.2014

By Helen Briggs BBC News When it comes to detecting lies, you should trust your instinct, research suggests. We are better at identifying liars when we rely on initial responses rather than thinking about it, say psychologists. Generally we are poor at spotting liars - managing only slightly better than flipping a coin. But our success rate rises when we harness the unconscious mind, according to a report in Psychological Science. "What interested us about the unconscious mind is that it just might really be the seat of where accurate lie detection lives," said Dr Leanne ten Brinke of the University of California, Berkeley. "So if our ability to detect lies is not conscious - we simply can't do this when we're thinking hard about it - then maybe it lives somewhere else, and so we thought one possible explanation was the unconscious mind." When trying to find out if someone is lying, most people rely on cues like someone averting their gaze or appearing nervous. However, research suggests this is not accurate - people perform at only about 50% accuracy in traditional lie detection tasks. Psychologists at the University of California were puzzled by this, as some primates, such as chimps, are able to detect deceit - and evolutionary theory supposes that it maximises survival and reproductive success. Dr Ten Brinke and colleagues devised experiments to test the ability of the unconscious mind to spot a liar, to see if they could do better than the conscious mind. BBC © 2014

Keyword: Emotions
Link ID: 19420 - Posted: 03.29.2014