Most Recent Links



Links 10181 - 10200 of 29322

Virginia Morell If we humans inhale oxytocin, the so-called “love hormone,” we become more trusting, cooperative, and generous. Scientists have shown that it’s the key chemical in the formation of bonds between many mammalian species and their offspring. But does oxytocin play the same role in social relationships that aren’t about reproduction? To find out, scientists in Japan sprayed either oxytocin or a saline solution into the nostrils of 16 pet dogs, all more than 1 year old. The canines then joined their owners, who were seated in another room and didn’t know which treatment their pooch had received. The owners were instructed to ignore any social response from their dogs. But those Fidos that inhaled the oxytocin made it tough for their masters not to break the rule. A statistical analysis showed these canines were more likely to sniff, lick, and paw at their people than were those given the saline solution. The amount of time that the oxytocin-enhanced dogs spent close to their owners, staring at their eyes, was also markedly higher, the scientists report online today in the Proceedings of the National Academy of Sciences. Getting a whiff of oxytocin also made the dogs friendlier toward their dog pals, as determined by the amount of time they spent in close proximity to their buddies. The study supports the idea, the scientists say, that oxytocin isn’t just produced in mammals during reproductive events. It’s also key to forming and maintaining close social relationships—even when those are with unrelated individuals or different species. © 2014 American Association for the Advancement of Science.

Keyword: Hormones & Behavior
Link ID: 19716 - Posted: 06.10.2014

Claudia M. Gold Tom Insel, director of the National Institute of Mental Health (NIMH), in his recent blog post "Are Children Overmedicated?" seems to suggest that perhaps more psychiatric medication is in order. Comparing mental illness in children to food allergies, he dismisses the "usual" explanations given for the increase in medication prescribing patterns. In his view, these explanations are: blaming psychiatrists who are too busy to provide therapy, parents who are too busy to provide a stable home environment, drug companies for marketing their products, and schools for lack of recess. By concluding that perhaps the explanation for the increase in prescribing of psychiatric medication to children is a greater number of children with serious psychiatric illness, Insel shows a lack of recognition of the complexity of the situation. When a recent New York Times article, which Insel references, reported on the rise in prescribing of psychiatric medication for toddlers diagnosed with ADHD, with a disproportionate number coming from families in poverty, one clinician remarked that if this is an attempt to medicate social and economic issues, then we have a huge problem. He was on to something. In conversations with pediatricians (the main prescribers of these medications) and child psychiatrists on the front lines, I find many in a reactive stance. When people feel overwhelmed, they go into survival mode, with their immediate aim just to get through the day. They find themselves prescribing medication because they have no other options.

Keyword: ADHD; Drug Abuse
Link ID: 19715 - Posted: 06.10.2014

by Ashley Yeager Being put under anesthesia as an infant may make it harder for a person to recall details or events as they grow older. Previous studies in animals had shown that anesthesia impairs parts of the brain that help with recollection, but it was not clear how this type of temporary loss of consciousness affected humans. Comparing the memory of 28 children ages 6 to 11 who had undergone anesthesia as infants with that of 28 similarly aged children who had not been put under suggests that the early treatment impairs recollection later in life, researchers report June 9 in Neuropsychopharmacology. The team reported similar results in a small study of rats and notes that early anesthesia did not appear to affect the children's familiarity with objects and events or their IQ. © Society for Science & the Public 2000 - 2013.

Keyword: Learning & Memory; Development of the Brain
Link ID: 19714 - Posted: 06.10.2014

Jane J. Lee Could've, should've, would've. Everyone has made the wrong choice at some point in life and suffered regret because of it. Now a new study shows we're not alone in our reaction to incorrect decisions. Rats too can feel regret. Regret is thinking about what you should have done, says David Redish, a neuroscientist at the University of Minnesota in Minneapolis. It differs from disappointment, which you feel when you don't get what you expected. And it affects how you make decisions in the future. (See "Hand Washing Wipes Away Regrets?") If you really want to study emotions or feelings like regret, says Redish, you can't just ask people how they feel. So when psychologists and economists study regret, they look for behavioral and neural manifestations of it. Using rats is one way to get down into the feeling's neural mechanics. Redish and colleague Adam Steiner, also at the University of Minnesota, found that rats expressed regret through both their behavior and their neural activity. Those signals, the researchers report today in the journal Nature Neuroscience, appeared specifically in situations the researchers set up to induce regret, producing distinctive patterns in both neural activity and behavior. When Redish and Steiner looked for neural activity, they focused on two areas known in people—and in some animals—to be involved in decision-making and the evaluation of expected outcomes: the orbitofrontal cortex and the ventral striatum. Brain scans have revealed that people with a damaged orbitofrontal cortex, for instance, don't express regret. To record nerve-cell activity, the researchers implanted electrodes in the brains of four rats—a typical sample size in this kind of experiment—then trained them to run a "choice" maze. © 1996-2014 National Geographic Society

Keyword: Attention; Emotions
Link ID: 19713 - Posted: 06.09.2014

By Chris Wodskou, CBC News For the past 25 years, people suffering from depression have been treated with antidepressant drugs like Zoloft, Prozac and Paxil — three of the world’s best-selling selective serotonin reuptake inhibitors, or SSRIs. But people are questioning whether these drugs are the appropriate treatment for depression, and whether they could even be causing harm. The drugs are designed to address a chemical imbalance in the brain and thereby relieve the symptoms of depression. In this case, it’s a shortage of serotonin that antidepressants work to correct. In fact, there are pharmaceutical treatments targeting chemical imbalances for just about every form of mental illness, from schizophrenia to ADHD, and a raft of anxiety disorders. Hundreds of millions of prescriptions are written for antipsychotic, antidepressant and anti-anxiety medications every year in the United States alone, producing billions of dollars in revenue for pharmaceutical companies. But what if the very premise behind these drugs is flawed? What if mental illnesses like depression aren’t really caused by chemical imbalances, and millions of the people prescribed those drugs derive no benefit from them? And what if those drugs could actually make their mental illness worse and more intractable over the long term? Investigative journalist Robert Whitaker argued that psychiatric drugs are a largely ineffective way of treating mental illness in his 2010 book Anatomy of an Epidemic: Magic Bullets, Psychiatric Drugs and the Astonishing Rise of Mental Illness in America. Whitaker maintains that the foundation of modern psychiatry, the chemical imbalance model, is scientifically unproven. © CBC 2014

Keyword: Depression
Link ID: 19712 - Posted: 06.09.2014

By JOHN COATES SIX years after the financial meltdown there is once again talk about market bubbles. Are stocks succumbing to exuberance? Is real estate? We thought we had exorcised these demons. It is therefore with something close to despair that we ask: What is it about risk taking that so eludes our understanding, and our control? Part of the problem is that we tend to view financial risk taking as a purely intellectual activity. But this view is incomplete. Risk is more than an intellectual puzzle — it is a profoundly physical experience, and it involves your body. Risk by its very nature threatens to hurt you, so when confronted by it your body and brain, under the influence of the stress response, unite as a single functioning unit. This occurs in athletes and soldiers, and it occurs as well in traders and people investing from home. The state of your body predicts your appetite for financial risk just as it predicts an athlete’s performance. If we understand how a person’s body influences risk taking, we can learn how to better manage risk takers. We can also recognize that mistakes governments have made have contributed to excessive risk taking. Consider the most important risk manager of them all — the Federal Reserve. Over the past 20 years, the Fed has pioneered a new technique of influencing Wall Street. Where before the Fed shrouded its activities in secrecy, it now informs the street in as clear terms as possible of what it intends to do with short-term interest rates, and when. Janet L. Yellen, the chairwoman of the Fed, declared this new transparency, called forward guidance, a revolution; Ben S. Bernanke, her predecessor, claimed it reduced uncertainty and calmed the markets. But does it really calm the markets? Or has eliminating uncertainty in policy spread complacency among the financial community and actually helped inflate market bubbles? 
We get a fascinating answer to these questions if we turn from economics and look into the biology of risk taking. © 2014 The New York Times Company

Keyword: Attention; Stress
Link ID: 19711 - Posted: 06.09.2014

By Jonathan Webb Science reporter, BBC News A new theory suggests that our male ancestors evolved beefy facial features as a defence against fist fights. The bones most commonly broken in human punch-ups also gained the most strength in early "hominin" evolution. They are also the bones that show most divergence between males and females. The paper, in the journal Biological Reviews, argues that the reinforcements evolved amid fighting over females and resources, suggesting that violence drove key evolutionary changes. For many years, this extra strength was seen as an adaptation to a tough diet including nuts, seeds and grasses. But more recent findings, examining the wear pattern and carbon isotopes in australopith teeth, have cast some doubt on this "feeding hypothesis". "In fact, [the australopith] boisei, the 'nutcracker man', was probably eating fruit," said Prof David Carrier, the new theory's lead author and an evolutionary biologist at the University of Utah. Masculine armour Instead of diet, Prof Carrier and his co-author, physician Dr Michael Morgan, propose that violent competition demanded the development of these facial fortifications: what they call the "protective buttressing hypothesis". In support of their proposal, Carrier and Morgan offer data from modern humans fighting. Several studies from hospital emergency wards, including one from the Bristol Royal Infirmary, show that faces are particularly vulnerable to violent injuries. BBC © 2014

Keyword: Aggression; Evolution
Link ID: 19710 - Posted: 06.09.2014

Jennifer Couzin-Frankel What if you could trick your body into thinking you were racing on a treadmill—and burning off calories at a rapid clip—while simply walking down the street? Changing our rate of energy expenditure is still far into the future, but work in mice explores how this might happen. Two teams of scientists suggest that activating immune cells in fat can convert the tissue from a type of fat that stores energy to one that burns it, opening up potential new therapies for obesity and diabetes. There are two types of fat in humans: white adipose tissue, which makes up nearly all the fat in adults, and brown adipose tissue, which is found in babies but disappears as they age. Brown fat protects against the cold (it’s also common in animals that hibernate), and researchers have found that mice exposed to cold show a temporary “browning” of some of their white fat. The same effect occurred in preliminary studies of people, where the browning—which creates a tissue known as beige fat—helps generate heat and burn calories. But cold is “the only stimulus we know that can increase beige fat mass or brown fat mass,” says Ajay Chawla, a physiologist at the University of California (UC), San Francisco. He wanted to better understand how cold caused this change in the tissue and whether there was a way to mimic cold and induce browning some other way. A few years ago, Chawla’s group had reported that cold exposure activated macrophages, a type of immune cell, in white adipose tissue. To further untangle what was going on, Chawla, his postdoc Yifu Qiu, and their colleagues used mice that lacked interleukin-4 (IL-4) and interleukin-13, proteins that help activate macrophages. When they exposed these mice to the cold, the animals developed far fewer beige fat cells than did normal animals, suggesting that macrophages were key to browning of white fat. © 2014 American Association for the Advancement of Science

Keyword: Obesity
Link ID: 19709 - Posted: 06.07.2014

Haroon Siddique The forehead and fingertips are the parts of the body most sensitive to pain, according to the first map scientists have created of how the ability to feel pain varies across the human body. It is hoped that the study, in which volunteers had pain inflicted on them without being touched, could help the estimated 10 million people in the UK who suffer from chronic pain by allowing physicians to use lasers to monitor nerve damage across the body. This would offer a quantitative way to monitor the progression or regression of a condition. Lead author Dr Flavia Mancini, of the UCL Institute of Cognitive Neuroscience, said: "Acuity for touch has been known for more than a century, and tested daily in neurology to assess the state of sensory nerves on the body. It is striking that until now nobody had done the same for pain." In the study, a pair of lasers was used to cause brief sensations of pinprick pain, without any touch, on various parts of the bodies of 26 blindfolded healthy volunteers, in order to define our ability to identify where it hurts, known as "spatial acuity". Sometimes only one laser would be activated, and sometimes both. At varying distances between the two beams, the participants were asked whether they felt one sting or two, and the researchers recorded the minimum separation at which people could accurately answer. © 2014 Guardian News and Media Limited

Keyword: Pain & Touch
Link ID: 19708 - Posted: 06.07.2014

by Laura Sanders Transplanted cells can flourish for over a decade in the brain of a person with Parkinson’s disease, scientists write in the June 26 Cell Reports. Finding that these cells have staying power may encourage clinicians to pursue stem cell transplants, a still-experimental way to counter the brain deterioration that comes with Parkinson’s. Penelope Hallett of Harvard University and McLean Hospital in Belmont, Mass., and colleagues studied postmortem brain tissue from five people with advanced Parkinson’s. The five had received stem cell transplants between four and 14 years earlier. In all five people’s samples, neurons that originated from the transplanted cells showed signs of good health and appeared capable of sending messages with the brain chemical dopamine, a neurotransmitter that Parkinson’s depletes. Results are mixed about whether these transplanted cells are a good way to ease Parkinson’s symptoms. Some patients have shown improvements after the new cells stitched themselves into the brain, while others didn’t benefit from them. The cells can also cause unwanted side effects such as involuntary movements. P. J. Hallett et al. Long-term health of dopaminergic neuron transplants in Parkinson’s disease patients. Cell Reports. Vol. 7, June 26, 2014. doi: 10.1016/j.celrep.2014.05.027. © Society for Science & the Public 2000 - 2013

Keyword: Parkinsons; Stem Cells
Link ID: 19707 - Posted: 06.07.2014

By C. CLAIBORNE RAY Q. Does the slit shape of a cat’s pupil confer any advantages over the more rounded pupils of other animals? A. “There are significant advantages,” said Dr. Richard E. Goldstein, chief medical officer of the Animal Medical Center in New York City. “A cat can quickly adjust to different lighting conditions, control the amount of light that reaches the eye and see in almost complete darkness,” he said. “Moreover, the slit shape protects the sensitive retina in daylight.” The slit-shaped pupil found in many nocturnal animals, including the domestic cat, presumably allows more effective control of how much light reaches the retina, in terms of both speed and completeness. “A cat has the capacity to alter the intensity of light falling on its retina 135-fold, compared to tenfold in a human, with a circular pupil,” Dr. Goldstein said. “A cat’s eye has a large cornea, which allows more light into the eye, and a slit pupil can dilate more than a round pupil, allowing more light to enter in dark conditions.” Cats have other visual advantages as well, Dr. Goldstein said. A third eyelid, between the regular eyelids and the cornea, protects the globe and also has a gland at the bottom that produces extra tears. The eyes’ location, facing forward in the front of the skull, gives cats a large area of binocular vision, providing depth perception and helping them to catch prey. © 2014 The New York Times Company

Keyword: Vision
Link ID: 19706 - Posted: 06.07.2014

by Moheb Costandi Rest easy after learning a new skill. Experiments in mice suggest that a good night's sleep helps us lay down memories by promoting the growth of new connections between brain cells. Neuroscientists believe that memory involves the modification of synapses, which connect brain cells, and numerous studies published over the past decade have shown that sleep enhances the consolidation of newly formed memories in people. But exactly how these observations were related was unclear. To find out, Wenbiao Gan of the Skirball Institute of Biomolecular Medicine at New York University Medical School and his colleagues trained 15 mice to run backwards or forwards on a rotating rod. They allowed some of them to fall asleep afterwards for 7 hours, while the rest were kept awake. The team monitored the activity and microscopic structure of the mice's motor cortex, the part of the brain that controls movement, through a small transparent "window" in their skulls. This allowed them to watch in real time how the brain responded to learning the different tasks. Sprouting spines They found that learning a new task led to the formation of new dendritic spines – tiny structures that project from the dendrites of nerve cells and help pass electric signals from one neuron to another – but only in the mice left to sleep. This happened during the non-rapid eye movement stage of sleep. Each task caused a different pattern of spines to sprout along the branches of the same motor cortex neurons. © Copyright Reed Business Information Ltd.

Keyword: Sleep; Learning & Memory
Link ID: 19705 - Posted: 06.06.2014

By Meeri Kim Many of us find ourselves swimming along in the tranquil sea of life when suddenly a crisis hits — a death in the family, the loss of a job, a bad breakup. Some power through and find calm waters again, while others drown in depression. Scientists continue to search for the underlying genes and neurobiology that dictate our reactions to stress. Now, a study using mice has found a switch-like mechanism between resilience and defeat in an area of the brain that plays an important role in regulating emotions and has been linked with mood and anxiety disorders. [Image caption: Neurons in the medial prefrontal cortex (green) become hyperactive in depressed mice. Bo Li/Cold Spring Harbor Laboratory] After artificially enhancing the activity of neurons in that part of the brain — the medial prefrontal cortex — mice that previously fought to avoid electric shocks started to act helpless. Rather than leaping for an open escape route, they sat in a corner taking the pain — presumably out of a belief that nothing they could do would change their circumstances. “This helpless behavior is quite similar to what clinicians see in depressed individuals — an inability to take action to avoid or correct a difficult situation,” said study author and neuroscientist Bo Li of the Cold Spring Harbor Laboratory in New York. The results were published online May 27 in the Journal of Neuroscience. © 1996-2014 The Washington Post

Keyword: Depression
Link ID: 19704 - Posted: 06.06.2014

By ANNA NORTH The “brain” is a powerful thing. Not the organ itself — though of course it’s powerful, too — but the word. Including it in explanations of human behavior might make those explanations sound more legitimate — and that might be a problem. Though neuroscientific examinations of everyday experiences have fallen out of favor somewhat recently, the word “brain” remains popular in media. Ben Lillie, the director of the science storytelling series The Story Collider, drew attention to the phenomenon last week on Twitter, mentioning in particular a recent Atlantic article: “Your Kid’s Brain Might Benefit From an Extra Year in Middle School.” In the piece, Jessica Lahey, a teacher and education writer, examined the benefits of letting kids repeat eighth grade. Mr. Lillie told Op-Talk the word “brain” could take the emphasis off middle-school students as people: The piece, he said, was “not ignoring the fact that the middle schooler (in this case) is a person, but somehow taking a quarter-step away by focusing on a thing we don’t really think of as human.” The New York Times isn’t immune to “brain”-speak — in her 2013 project “Brainlines,” the artist Julia Buntaine collected all Times headlines using the word “brain” since 1851. She told Op-Talk in an email that “the number of headlines about the brain increased exponentially since around the year 2000, where some years before there were none at all, after that there were at least 30, 40, 80 headlines.” Adding “brain” to a headline may make it sound more convincing — some research shows that talking about the brain has measurable effects on how people respond to scientific explanations. In a 2008 study, researchers found that adding phrases like “brain scans indicate” to explanations of psychological concepts like attention made those explanations more satisfying to nonexpert audiences. Perhaps disturbingly, the effect was greatest when the explanations were actually wrong. © 2014 The New York Times Company

Keyword: Miscellaneous
Link ID: 19703 - Posted: 06.06.2014

Neil Levy Can human beings still be held responsible in the age of neuroscience? Some people say no: they say once we understand how the brain processes information and thereby causes behaviour, there’s nothing left over for the person to do. This argument has not impressed philosophers, who say there doesn’t need to be anything left for the person to do in order to be responsible. People are not anything over and above the causal systems involved in information processing; we are our brains (plus some other, equally physical stuff). We are responsible if our information processing systems are suitably attuned to reasons, most philosophers think. There are big philosophical debates concerning what it takes to be suitably attuned to reasons, and whether this is really enough for responsibility. But I want to set those debates aside here. It’s more interesting to ask what we can learn from neuroscience about the nature of responsibility and about when we’re responsible. Even if neuroscience doesn’t tell us that no one is ever responsible, it might be able to tell us if particular people are responsible for particular actions. A worthy case study Consider a case like this: early one morning in 1987, a Canadian man named Ken Parks got up from the sofa where he had fallen asleep and drove to his parents-in-law’s house. There he stabbed them both before driving to the police station, where he told police he thought he had killed someone. He had: his mother-in-law died from her injuries. © 2010–2014, The Conversation Trust (UK)

Keyword: Consciousness; Emotions
Link ID: 19702 - Posted: 06.06.2014

By Sadie Dingfelder Want to become famous in the field of neuroscience? You could go the usual route, spending decades collecting advanced degrees, slaving away in science labs and publishing your results. Or you could simply fall victim to a freak accident. The stars of local science writer Sam Kean’s new book, “The Tale of the Dueling Neurosurgeons,” (which he’ll discuss Saturday at Politics and Prose) took the latter route. Be it challenging the wrong guy to a joust, spinning out on a motorcycle, or suffering from a stroke, these folks sustained brain injuries with bizarre and fascinating results. One man, for instance, lost the ability to identify different kinds of animals but had no trouble naming plants and objects. Another man lost his short-term memory. The result? A diary filled with entries like: “I am awake for the very first time.” “Now, I’m really awake.” “Now, I’m really, completely awake.” Unfortunate mishaps like these have advanced our understanding of how the gelatinous gray mass that (usually) stays hidden inside our skulls gives rise to thoughts, feelings and ideas, Kean says. “Traditionally, every major discovery in the history of neuroscience came about this way,” he says. “We had no other way of looking at the brain for centuries and centuries, because we didn’t have things like MRI machines.” Rather than covering the case studies textbook-style, Kean provides all the gory details. Consider Phineas Gage. You may remember from Psych 101 that Gage, a railroad worker, survived having a metal rod launched through his skull. You might not know, however, that one doctor “shaved Gage’s scalp and peeled off the dried blood and gelatinous brains. He then extracted skull fragments from the wound by sticking his fingers in from both ends, Chinese-finger-trap-style,” as Kean writes in his new book. © 1996-2014 The Washington Post

Keyword: Learning & Memory; Attention
Link ID: 19701 - Posted: 06.06.2014

By Ian Randall The human tongue may have a sixth sense—and no, it doesn’t have anything to do with seeing ghosts. Researchers have found that in addition to recognizing sweet, sour, salty, savory, and bitter tastes, our tongues can also pick up on carbohydrates, the nutrients that break down into sugar and form our main source of energy. Past studies have shown that some rodents can distinguish between sugars of different energy densities, while others can still tell carbohydrate and protein solutions apart even when their ability to taste sweetness is lost. A similar ability has been proposed in humans, with research showing that merely having carbohydrates in your mouth can improve physical performance. How this works, however, has been unclear. In the new study, to be published in Appetite, the researchers asked participants to squeeze a sensor held between their right index finger and thumb when shown a visual cue. At the same time, the participants’ tongues were rinsed with one of three different fluids. The first two were artificially sweetened—to identical tastes—but with only one containing carbohydrate; the third, a control, was neither sweet nor carb-loaded. When the carbohydrate solution was used, the researchers observed a 30% increase in activity for the brain areas that control movement and vision. This reaction, they propose, is caused by our mouths reporting that additional energy in the form of carbs is coming. The finding may explain both why diet products are often viewed as not being as satisfying as their real counterparts and why carbohydrate-loaded drinks seem to immediately perk up athletes—even before their bodies can convert the carbs to energy. Learning more about how this “carbohydrate sense” works could lead to the development of artificially sweetened foods, the researchers propose, “as hedonistically rewarding as the real thing.” © 2014 American Association for the Advancement of Science

Keyword: Chemical Senses (Smell & Taste)
Link ID: 19700 - Posted: 06.06.2014

By Jenny Graves The claim that homosexual men share a “gay gene” created a furor in the 1990s. But new research two decades on supports this claim – and adds another candidate gene. To an evolutionary geneticist, the idea that a person’s genetic makeup affects their mating preference is unsurprising. We see it in the animal world all the time. There are probably many genes that affect human sexual orientation. But rather than thinking of them as “gay genes,” perhaps we should consider them “male-loving genes.” They may be common because these variant genes, in a female, predispose her to mate earlier and more often and to have more children. Likewise, it would be surprising if there were not “female-loving genes” in lesbian women that, in a male, predispose him to mate earlier and have more children. We can detect genetic variants that produce differences between people by tracking traits in families that display differences. Patterns of inheritance reveal variants of genes (called “alleles”) that affect normal differences, such as hair color, or disease states, such as sickle cell anemia. Quantitative traits, such as height, are affected by many different genes, as well as environmental factors. It’s hard to use these techniques to detect genetic variants associated with male homosexuality partly because many gay men prefer not to be open about their sexuality. It is even harder because, as twin studies have shown, shared genes are only part of the story. Hormones, birth order and environment play roles, too.

Keyword: Sexual Behavior; Genes & Behavior
Link ID: 19699 - Posted: 06.06.2014

By Kelly Servick During the World Cup next week, there may be 1 minute during the opening ceremony when the boisterous stadium crowd in São Paulo falls silent: when a paraplegic young person wearing a brain-controlled, robotic exoskeleton attempts to rise from a wheelchair, walk several steps, and kick a soccer ball. The neuroscientist behind the planned event, Miguel Nicolelis, is familiar with the spotlight. His lab at Duke University in Durham, North Carolina, pioneered brain-computer interfaces, using surgically implanted electrodes to read neural signals that can control robotic arms. Symbolically, the project is a homecoming for Nicolelis. He has portrayed it as a testament to the scientific progress and potential of his native Brazil, where he founded and directs the International Institute of Neuroscience of Natal. The press has showered him with attention, and the Brazilian government chipped in nearly $15 million in support. But scientifically, the project is a departure. Nicolelis first intended the exoskeleton to read signals from implanted electrodes, but decided instead to use a noninvasive, EEG sensor cap. That drew skepticism from Nicolelis’s critics—and he has a few—that the system wouldn’t really be a scientific advance. Others have developed crude EEG-based exoskeletons, they note, and it will be impossible to tell from the demo how this system compares. A bigger concern is that the event could generate false hope for paralyzed patients and give the public a skewed impression of the field’s progress. © 2014 American Association for the Advancement of Science

Keyword: Robotics
Link ID: 19698 - Posted: 06.06.2014

By JAMES GORMAN The National Institutes of Health set an ambitious $4.5 billion price tag on its part of President Obama’s Brain Initiative on Thursday, stamping it as an effort on the scale of the Human Genome Project. The goals of the Brain Initiative were clearly grand when Mr. Obama announced it a year ago — nothing less than developing and applying new technology to crack the toughest unsolved puzzles of how the brains of humans and animals function. The hope is to lay a foundation for future advances in the medical treatment of brain disorders. But the initiative began with $110 million budgeted for 2014, shared by three major entities: the National Science Foundation; the Defense Advanced Research Projects Agency; and the N.I.H., which has a $40 million share. By calling for such a major commitment, to be spread over 12 years, the institutes answered concerns among neuroscientists about the initial level of funding. “This is a realistic amount of money,” said Dr. Eric R. Kandel, director of the Kavli Institute for Brain Science at Columbia University, who, like some other neuroscientists, had been skeptical of what could be accomplished with the funding committed when the initiative was announced about a year ago. Gerald Rubin, the executive director of the Janelia Farm Research Campus in Virginia, also found that this budget request allayed some of his concerns, but not all. “I am much more concerned about convincing Congress to fund the Brain Initiative at this level,” he said. © 2014 The New York Times Company

Keyword: Brain imaging
Link ID: 19697 - Posted: 06.06.2014