Most Recent Links
By BARBARA EHRENREICH My atheism is hard-core, rooted in family tradition rather than adolescent rebellion. According to family legend, one of my 19th-century ancestors, a dirt-poor Irish-American woman in Montana, expressed her disgust with the church by vehemently refusing last rites when she lay dying in childbirth. From then on, we were atheists and rationalists, a stance I perpetuated by opting, initially, for a career in science. How else to understand the world except as the interaction of tiny bits of matter and mathematically predictable forces? There were no gods or spirits, just our own minds pressing up against the unknown. But something happened when I was 17 that shook my safely rationalist worldview and left me with a lifelong puzzle. Years later, I learned that this sort of event is usually called a mystical experience, and I can see in retrospect that the circumstances had been propitious: Thanks to a severely underfunded and poorly planned skiing trip, I was sleep-deprived and probably hypoglycemic that morning in 1959 when I stepped out alone, walked into the streets of Lone Pine, Calif., and saw the world — the mountains, the sky, the low scattered buildings — suddenly flame into life. There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with “the All,” as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, too vast and violent to hold on to, too heartbreakingly beautiful to let go of. It seemed to me that whether you start as a twig or a gorgeous tapestry, you will be recruited into the flame and made indistinguishable from the rest of the blaze. I felt ecstatic and somehow completed, but also shattered. © 2014 The New York Times Company
Keyword: Schizophrenia; Emotions
Link ID: 19452 - Posted: 04.07.2014
By GRETCHEN REYNOLDS Age-related vision loss is common and devastating. But new research suggests that physical activity might protect our eyes as we age. There have been suggestions that exercise might reduce the risk of macular degeneration, which occurs when neurons in the central part of the retina deteriorate. The disease robs millions of older Americans of clear vision. A 2009 study of more than 40,000 middle-aged distance runners, for instance, found that those covering the most miles had the least likelihood of developing the disease. But the study did not compare runners to non-runners, limiting its usefulness. It also did not try to explain how exercise might affect the incidence of an eye disease. So, more recently, researchers at Emory University in Atlanta and the Atlanta Veterans Administration Medical Center in Decatur, Ga., took up that question for a study published last month in The Journal of Neuroscience. Their interest was motivated in part by animal research at the V.A. medical center. That work had determined that exercise increases the levels of substances known as growth factors in the animals’ bloodstream and brains. These growth factors, especially one called brain-derived neurotrophic factor, or B.D.N.F., are known to contribute to the health and well-being of neurons and consequently, it is thought, to improvements in brain health and cognition after regular exercise. But the brain is not the only body part to contain neurons, as the researchers behind the new study knew. The retina does as well, and the researchers wondered whether exercise might raise levels of B.D.N.F. there, too, potentially affecting retinal health and vision. © 2014 The New York Times Company
Keyword: Vision
Link ID: 19451 - Posted: 04.07.2014
by Clare Wilson A genetic tweak can make light work of some nervous disorders. Using flashes of light to stimulate modified neurons can restore movement to paralysed muscles. A study demonstrating this, carried out in mice, paves the way for using such "optogenetic" approaches to treat nerve disorders ranging from spinal cord injury to epilepsy and motor neuron disease. Optogenetics has been hailed as one of the most significant recent developments in neuroscience. It involves genetically modifying neurons so they produce a light-sensitive protein, which makes them "fire", sending an electrical signal, when exposed to light. So far optogenetics has mainly been used to explore how the brain works, but some groups are exploring using it as therapy. One stumbling block has been fears about irreversibly genetically manipulating the brain. In the latest study, a team led by Linda Greensmith of University College London altered mouse stem cells in the lab before transplanting them into nerves in the leg – this means they would be easier to remove if something went wrong. "It's a very exciting approach that has a lot of potential," says Ziv Williams of Harvard Medical School in Boston. Greensmith's team inserted an algal gene that codes for a light-responsive protein into mouse embryonic stem cells. They then added signalling molecules to make the stem cells develop into motor neurons, the cells that carry signals from the spinal cord to the rest of the body. They implanted these into the sciatic nerve – which runs from the spinal cord to the lower limbs – of mice whose original nerves had been cut. © Copyright Reed Business Information Ltd.
Keyword: Movement Disorders
Link ID: 19450 - Posted: 04.05.2014
By Deborah Serani Sometimes I work with children and adults who can’t put words to their feelings and thoughts. It’s not that they don’t want to – it’s more that they don’t know how. The clinical term for this experience is alexithymia, defined as the inability to recognize emotions and their subtleties and textures [1]. Alexithymia throws a monkey wrench into a person’s ability to know their own self-experience or understand the intricacies of what others feel and think. Here are a few examples of what those with alexithymia experience:
Difficulty identifying different types of feelings
Limited understanding of what causes feelings
Difficulty expressing feelings
Difficulty recognizing facial cues in others
Limited or rigid imagination
Constricted style of thinking
Hypersensitivity to physical sensations
Detached or tentative connection to others
Alexithymia was first mentioned as a psychological construct in 1976 and was viewed as a deficit in emotional awareness [2]. Research suggests that approximately 8% of males and 2% of females experience alexithymia, and that it can come in mild, moderate and severe intensities [3]. Studies also show that alexithymia has two dimensions – a cognitive dimension, where a child or adult struggles to identify, interpret and verbalize feelings (the “thinking” part of our emotional experience), and an affective dimension, where difficulties arise in reacting, expressing, feeling and imagining (the “experiencing” part of our emotional experience) [4]. © 2014 Scientific American
Keyword: Emotions; Language
Link ID: 19449 - Posted: 04.05.2014
By LISA SANDERS, M.D. On Thursday, we challenged Well readers to solve the mystery of a 23-year-old man with episodes of aggressive, manic behavior that couldn’t be controlled. Nearly 1,000 readers wrote in with their take on this terrifying case. More than 300 of you got the right class of disease, and 21 of you nailed the precise form of the disorder. Amazing! The correct diagnosis is … Variegate porphyria The first person with the correct answer was Francis Graziano, a 23-year-old recent graduate of the University of Michigan. His major in neuroscience really gave him a leg up on this case, he told me. He recalled a case he read of a young Vietnam veteran with symptoms of porphyria. He’s a surgical technician right now, waiting to hear where he’ll be going to medical school next year. Strong work, Dr.-to-be Graziano! The Diagnosis: The word porphyria comes from the ancient Greek word for purple, “porphyra,” because patients with this disease can have purplish-red urine, tears or saliva. The porphyrias are a group of rare genetic diseases that develop in patients born without the machinery to make certain essential body chemicals, including one of the most important parts of blood known as heme. This compound makes up the core of the blood component hemoglobin. (The presence of heme is why blood is red.) Patients who can’t make heme correctly end up with too much of its chemical precursors, known as porphyrins. The excess porphyrins injure tissues throughout the body, but especially in the nervous system. The disorder is characterized by frequent episodes of debilitating back or abdominal pain and is often accompanied by severe psychiatric symptoms. Patients with porphyria do not respond to most psychiatric medications. Indeed, many of these drugs make the symptoms of porphyria worse. © 2014 The New York Times Company
Keyword: Schizophrenia
Link ID: 19448 - Posted: 04.05.2014
David Adam The day the Brazilian racing driver Ayrton Senna died in a crash, I was stuck in the toilet of a Manchester swimming pool. The door was open, but my thoughts blocked the way out. It was May 1994. I was 22 and hungry. After swimming a few lengths of the pool, I had lifted myself from the water and headed for the locker rooms. Going down the steps, I had scraped the back of my heel on the sharp edge of the final step. It left a small graze through which blood bulged into a blob that hung from my broken skin. I transferred the drop to my finger and a second swelled to take its place. I pulled a paper towel from above the sink to press to my wet heel. The blood on my finger ran with the water as it dripped down my arm. My eyes followed the blood. And the anxiety, of course, rushed back, ahead even of the memory. My shoulders sagged. My stomach tightened. Four weeks earlier, I had pricked my finger on a screw that stuck out from a bus shelter's corrugated metal. It was a busy Saturday afternoon and there had been lots of people around. Any one of them, I thought, could easily have injured themselves in the way I had. What if one had been HIV positive? They could have left infected blood on the screw, which then pierced my skin. That would put the virus into my bloodstream. I knew the official line was that transmission was impossible this way – the virus couldn't survive outside the body – but I also knew that, when pressed for long enough, those in the know would weaken the odds to virtually impossible. They couldn't be absolutely sure. In fact, several had admitted to me there was a theoretical risk. © 2014 Guardian News and Media Limited
Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 19447 - Posted: 04.05.2014
By SABRINA TAVERNISE Federal health regulators approved a drug overdose treatment device on Thursday that experts say will provide a powerful lifesaving tool in the midst of a surging epidemic of prescription drug abuse. Similar to an EpiPen used to stop allergic reactions to bee stings, the easy-to-use injector — small enough to tuck into a pocket or a medicine cabinet — can be used by the relatives or friends of people who have overdosed. The hand-held device, called Evzio, delivers a single dose of naloxone, a medication that reverses the effects of an overdose, and will be used on those who have stopped breathing or lost consciousness from an opioid drug overdose. Naloxone is the standard treatment in such circumstances, but until now has been available mostly in hospitals and other medical settings, where it is often used too late to save the patient. The decision to quickly approve the new treatment, which is expected to be available this summer, comes as deaths from opioids continue to mount, including an increase in those from heroin, which contributed to the death of the actor Philip Seymour Hoffman in February. Federal health officials, facing criticism for failing to slow the rising death toll, are under pressure to act, experts say. “This is a big deal, and I hope gets wide attention,” said Dr. Carl R. Sullivan III, director of the addictions program at West Virginia University. “It’s pretty simple: Having these things in the hands of people around drug addicts just makes sense because you’re going to prevent unnecessary mortality.” The scourge of drug abuse has battered states across the country, with deaths from overdoses now outstripping those from traffic crashes. Prescription drugs alone now account for more than half of all drug overdose deaths, and one major category of them, opioids, or painkillers, takes the lives of more Americans than heroin and cocaine combined. Deaths from opioids have quadrupled in 10 years to more than 16,500 in 2010, according to federal data. © 2014 The New York Times Company
Keyword: Drug Abuse; Pain & Touch
Link ID: 19446 - Posted: 04.05.2014
Walking backward may seem a simple task, but researchers don't know how the mind controls this behavior. A study published online today in Science provides the first glimpse of the brain circuit responsible—at least in fruit flies. Geneticists created 3500 strains of the insects, each with a temperature-controlled switch that turned random networks of neurons on when the flies entered an incubator. One mutant batch of fruit flies started strolling in reverse when exposed to warmth; the team dubbed these flies "moonwalkers," in honor of Michael Jackson's famous dance. Two neurons were responsible for the behavior. One lived in the brain and extended its connections to the end of the ventral nerve cord—the fly's version of a spine, which runs along its belly. The other neuron had the opposite orientation—it started at the bottom of the nerve cord and sent its messaging cables—or axons—into the brain. The neuron in the brain acted like a reverse gear in a car; when turned on, it triggered reverse walking. The researchers say this neuron is possibly a command center that responds to environmental cues, such as, "Hey! I see a wall in front of me." The second neuron functioned as the brakes for forward motion, but it couldn't compel the fly to moonwalk. It may serve as a fail-safe that reflexively prevents moving ahead, such as when the fly accidentally steps onto a very cold floor. Using the two neurons as a starting point, the team will trace their links to sensory neurons for touch, sight, and smell, which feed into and control the moonwalking network. No word yet on the neurons responsible for the Macarena. © 2014 American Association for the Advancement of Science
Keyword: Movement Disorders
Link ID: 19445 - Posted: 04.05.2014
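The circuit logic described in the fly study above (one brain neuron acting like a reverse gear, a second neuron acting only as a brake on forward walking) can be restated as a toy control rule. The sketch below, in Python, is purely an illustration of that description, not the researchers' model; the function and variable names are invented.

```python
def fly_gait(reverse_command_active: bool, forward_brake_active: bool) -> str:
    """Toy restatement of the two-neuron circuit described in the article.

    The brain ("reverse gear") neuron triggers backward walking when active.
    The ascending ("brake") neuron can halt forward walking, but cannot by
    itself make the fly walk backward.
    """
    if reverse_command_active:
        return "walk backward"  # the "moonwalking" behavior
    if forward_brake_active:
        return "stop"           # forward motion suppressed, but no reversal
    return "walk forward"


if __name__ == "__main__":
    for reverse in (False, True):
        for brake in (False, True):
            print(f"reverse={reverse}, brake={brake} -> {fly_gait(reverse, brake)}")
```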
By James Gallagher Health and science reporter, BBC News The illegal party drug ketamine is an "exciting" and "dramatic" new treatment for depression, say doctors who have conducted the first trial in the UK. Some patients who have faced incurable depression for decades have had symptoms disappear within hours of taking low doses of the drug. The small trial on 28 people, reported in the Journal of Psychopharmacology, shows the benefits can last months. Experts said the findings opened up a whole new avenue of research. Depression is common and affects one in 10 people at some point in their lives. Antidepressants, such as Prozac, and behavioural therapies help some patients, but a significant proportion remain resistant to any form of treatment. A team at Oxford Health NHS Foundation Trust gave patients doses of ketamine over 40 minutes on up to six occasions. Eight showed improvements in reported levels of depression, with four of them improving so much they were no longer classed as depressed. Some responded within six hours of the first infusion of ketamine. Lead researcher Dr Rupert McShane said: "It really is dramatic for some people, it's the sort of thing really that makes it worth doing psychiatry, it's a really wonderful thing to see." He added: "[The patients] say 'ah this is how I used to think' and the relatives say 'we've got x back'." Dr McShane said this included patients who had lived with depression for 20 years. The duration of the effect is still a problem. Some relapse within days, while others have found they benefit for around three months and have since had additional doses of ketamine. There are also some serious side-effects, including one case of the supply of blood to the brain being interrupted. Doctors say people should not try to self-medicate because of the serious risk to health outside of a hospital setting. BBC © 2014
Keyword: Depression; Drug Abuse
Link ID: 19444 - Posted: 04.03.2014
A high-resolution map of the human brain in utero is providing hints about the origins of brain disorders including schizophrenia and autism. The map shows where genes are turned on and off throughout the entire brain at about the midpoint of pregnancy, a time when critical structures are taking shape, researchers reported Wednesday in the journal Nature. "It's a pretty big leap," says Ed Lein, an investigator at the Allen Institute for Brain Science in Seattle who played a central role in creating the map. "Basically, there was no information of this sort prior to this project." Having a map like this is important because many psychiatric and behavioral problems appear to begin before birth, "even though they may not manifest until teenage years or even the early 20s," says , director of the . The human brain is often called the most complex object in the universe. Yet its basic architecture is created in just nine months, when it grows from a single cell to more than 80 billion cells organized in a way that will eventually let us think and feel and remember. "We're talking about a remarkable process," a process controlled by our genes, Lein says. So he and a large team of researchers decided to use genetic techniques to create a map that would help reveal this process. Funding came from the 2009 federal stimulus package. The massive effort required tens of thousands of brain tissue samples so small that they had to be cut out with a laser. Researchers used brain tissue from aborted fetuses, which the Obama administration has authorized over the objections of abortion opponents. © 2014 NPR
Keyword: Brain imaging; Development of the Brain
Link ID: 19443 - Posted: 04.03.2014
He was known in his many appearances in the scientific literature as simply K.C., an amnesiac who was unable to form new memories. But to the people who knew him, and the scientists who studied him for decades, he was Kent Cochrane, or just Kent. Cochrane, who suffered a traumatic brain injury in a motorcycle accident when he was 30 years old, helped to rewrite the understanding of how the brain forms new memories and whether learning can occur without that capacity. "From a scientific point of view, we've really learned a lot [from him], not just about memory itself but how memory contributes to other abilities," said Shayna Rosenbaum, a cognitive neuropsychologist at York University who started working with Cochrane in 1998 when she was a graduate student. Cochrane was 62 when he died late last week. The exact cause of death is unknown, but his sister, Karen Casswell, said it is believed he had a heart attack or stroke. He died in his room at an assisted living facility where he lived and the family opted not to authorize an autopsy. Few in the general public would know about Cochrane, though some may have seen or read media reports on the man whose life was like that of the lead character of the 2000 movie Memento. But anyone who works on the science of human memory would know K.C. Casswell and her mother, Ruth Cochrane, said the family was proud of the contribution Kent Cochrane made to science. Casswell noted her eldest daughter was in a psychology class at university when the professor started to lecture about the man the scientific literature knows as K.C. © CBC 2014
Keyword: Learning & Memory
Link ID: 19442 - Posted: 04.03.2014
Dr Nicola Davis The electronic nose is an instrument that attempts to mimic the human olfactory system. Humans and animals don't identify specific chemicals within odours; what they do is to recognise a smell based on a response pattern. You, as a human, will smell a strawberry and say "that's a strawberry". If you gave this to a traditional analytical piece of equipment, it might tell you what the 60-odd chemicals in the odour were - but that wouldn't tell you that it was a strawberry.
How does it work?
A traditional electronic nose has an array of chemical sensors, designed either to detect gases or vapours. These sensors are not tuned to a single chemical, but detect families of chemicals - [for example] alcohols. Each one of these sensors is different, so when they are presented to a complex odour formed of many chemicals, each sensor responds differently to that odour. This creates a pattern of sensor responses, which the machine can be taught [to recognise].
Can't we just use dogs?
A dog is very, very sensitive. Special research teams work on training dogs to detect cancers as you would do explosives. What we are trying to do with the electronic nose is create an artificial means of replicating what the dog does. Such machines have the advantage that they don't get tired, will work all day and you only need to feed them electricity. © 2014 Guardian News and Media Limited
Keyword: Chemical Senses (Smell & Taste); Robotics
Link ID: 19441 - Posted: 04.03.2014
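The interview above turns on one idea: no single sensor identifies the odour, but the pattern of responses across the whole array does, and the machine is trained to match new patterns against known ones. Below is a minimal, hypothetical sketch of that matching step in Python, using made-up sensor readings and a simple nearest-fingerprint comparison; real instruments rely on more careful calibration and classification.

```python
import math

# Hypothetical training data: an averaged response pattern ("fingerprint")
# from a 4-sensor array for each known odour, in arbitrary units.
ODOUR_FINGERPRINTS = {
    "strawberry": [0.9, 0.2, 0.4, 0.1],
    "coffee":     [0.3, 0.8, 0.6, 0.2],
    "ethanol":    [0.1, 0.3, 0.2, 0.9],
}


def classify(sample: list[float]) -> str:
    """Return the known odour whose fingerprint is closest to the sample pattern."""
    def distance(a: list[float], b: list[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(ODOUR_FINGERPRINTS, key=lambda odour: distance(sample, ODOUR_FINGERPRINTS[odour]))


if __name__ == "__main__":
    # A new reading: no single sensor is decisive, but the overall pattern
    # most closely resembles the strawberry fingerprint.
    print(classify([0.85, 0.25, 0.35, 0.15]))  # -> strawberry
```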
For years, some biomedical researchers have worried that a push for more bench-to-bedside studies has meant less support for basic research. Now, the chief of one of the National Institutes of Health's (NIH's) largest institutes has added her voice—and hard data—to the discussion. Story Landis describes what she calls a "sharp decrease" in basic research at her institute, a trend she finds worrisome. In a blog post last week, Landis, director of the $1.6 billion National Institute of Neurological Disorders and Stroke (NINDS), says her staff started out asking why, in the mid-2000s, NINDS funding declined for R01s, the investigator-initiated grants that are the mainstay of most labs. After examining the aims and abstracts of grants funded between 1997 and 2012, her staff found that the portion of NINDS competing grant funding that went to basic research has declined (from 87% to 71%) while applied research rose (from 13% to 29%). To dig deeper, the staffers divided the grants into four categories—basic/basic; basic/disease-focused; applied/translational; and applied/clinical. Here, the decline in basic/basic research was "striking": It fell from 52% to 27% of new and competing grants, while basic/disease-focused has been rising. The same trend emerged when the analysts looked only at investigator-initiated grants, which are proposals based on a researcher's own ideas, not a solicitation by NINDS for proposals in a specific area. The shift could reflect changes in science and "a natural progression of the field," Landis writes. Or it could mean researchers "falsely believe" that NINDS is not interested in basic studies and they have a better shot at being funded if they propose disease-focused or applied studies. The tight NIH budget and new programs focused on translational research could be fostering this belief, she writes. When her staff compared applications submitted in 2008 and 2011, they found support for a shift to disease-focused proposals: There was a "striking" 21% decrease in the amount of funding requested for basic studies, even though those grants had a better chance of being funded. © 2014 American Association for the Advancement of Science.
Keyword: Movement Disorders
Link ID: 19440 - Posted: 04.02.2014
Erika Check Hayden Monkeys on a reduced-calorie diet live longer than those that can eat as much as they want, a new study suggests. The findings add to a thread of studies on how a restricted diet prolongs life in a range of species, but they complicate the debate over whether the research applies to animals closely related to humans. In the study, which has been running since 1989 at the Wisconsin National Primate Research Center in Madison, 38 rhesus macaques (Macaca mulatta) that were allowed to eat whatever they wanted were nearly twice as likely to die at any age as were 38 monkeys whose calorie intakes were cut by 30% [1]. The same study reported [2] in 2009 that calorie-restricted monkeys were less likely to die of age-related causes than control monkeys, but had similar overall mortality rates at all ages. "We set out to test the hypothesis: would calorie restriction delay ageing? And I think we've shown that it does," says Rozalyn Anderson, a biochemist at the University of Wisconsin who led the study, which is published today in Nature Communications. She said it is not surprising that the 2009 paper did not find that the calorie-restricted monkeys lived longer, because at the time too few monkeys had died to prove the point. Eating a very low-calorie diet has been shown [3] to prolong the lives of mice, leading to speculation that such a diet triggers a biochemical pathway that promotes survival. But what that pathway might be — and whether humans have it — has been a matter of hot debate.
Eat to live
In 2012, a study at the US National Institute on Aging (NIA) in Bethesda, Maryland, cast doubt on the idea, reporting [4] that monkeys on low-calorie diets did not live longer than those that ate more food. But Anderson says that the Wisconsin findings are good news. © 2014 Nature Publishing Group
Keyword: Obesity
Link ID: 19439 - Posted: 04.02.2014
Neandertals and modern Europeans had something in common: They were fatheads of the same ilk. A new genetic analysis reveals that our brawny cousins had a number of distinct genes involved in the buildup of certain types of fat in their brains and other tissues—a trait shared by today’s Europeans, but not Asians. Because two-thirds of our brains are built of fatty acids, or lipids, the differences in fat composition between Europeans and Asians might have functional consequences, perhaps in helping them adapt to colder climates or causing metabolic diseases. “This is the first time we have seen differences in lipid concentrations between populations,” says evolutionary biologist Philipp Khaitovich of the CAS-MPG Partner Institute for Computational Biology in Shanghai, China, and the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, lead author of the new study. “How our brains are built differently of lipids might be due to Neandertal DNA.” Ever since researchers at the Max Planck sequenced the genome of Neandertals, including a super high-quality genome of a Neandertal from the Altai Mountains of Siberia in December, researchers have been comparing Neandertal DNA with that of living people. Neandertals, who went extinct 30,000 years ago, interbred with modern humans at least once in the past 60,000 years, probably somewhere in the Middle East. Because the interbreeding happened after moderns left Africa, today’s Africans did not inherit any Neandertal DNA. But living Europeans and Asians have inherited a small amount—1% to 4% on average. So far, scientists have found that different populations of living humans have inherited the Neandertal version of genes that cause diabetes, lupus, and Crohn’s disease; alter immune function; and affect the function of the protein keratin in skin, nails, and hair. © 2014 American Association for the Advancement of Science.
Keyword: Evolution; Obesity
Link ID: 19438 - Posted: 04.02.2014
By NATALIE ANGIER The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me ... He can go to hell...”), turning down gifts, homage, even the return of his stolen consort Briseis just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too. Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked. “Spitefulness is such an intrinsically interesting subject, and it fits with so many people’s everyday experience, that I was surprised to see how little mention there was of it in the psychology literature,” said David K. Marcus, a psychologist at Washington State University. At the same time, he said, “I was thrilled to find something that people haven’t researched to exhaustion.” © 2014 The New York Times Company
Keyword: Emotions; Evolution
Link ID: 19436 - Posted: 04.01.2014
A new study has raised new questions about how MRI scanners work in the quest to understand the brain. The research, led by Professor Brian Trecox and a team of international researchers, used a brand new technique to assess fluctuations in the performance of brain scanners as they were being used during a series of basic experiments. The results are due to appear in the Journal of Knowledge in Neuroscience: General later today. “Most people think that we know a lot about how MRI scanners actually work. The truth is, we don’t,” says Trecox. “We’ve even been misleading the public about the name – we made up functional Magnetic Resonance Imaging in 1983 because it sounded scientific and technical. fMRI really stands for flashy, Magically Rendered Images. So we thought: why not put an MRI scanner in an MRI scanner, and figure out what’s going on inside?” To do this, Trecox and his team built a giant imaging machine – thought to be the world’s largest – using funds from a Kickstarter campaign and a local bake sale. They then took a series of scans of standard-sized MRI scanners while they were repeatedly switched on and off, in one of the largest and most robust neuroscience studies of its type. “We tested six different MRI scanners,” says Eric Salmon, a PhD student involved in the project. “We found activation in an area called insular cortex in four of the six machines when they were switched on,” he added. In humans, the insular cortex has previously been implicated in a wide range of functions, including consciousness and self-awareness. According to Trecox and his team, activation in this area has never been found in imaging machines before. While Salmon acknowledged that the results should be treated with caution – research assistants were found asleep in at least two of the machines – the results nevertheless provide a potentially huge step in our understanding of the tools we use to research the brain. © 2014 Guardian News and Media Limited
Keyword: Brain imaging
Link ID: 19435 - Posted: 04.01.2014
By Karen Kaplan There are lies, damn lies – and the lies that we tell for the sake of others when we are under the influence of oxytocin. Researchers found that after a squirt of the so-called love hormone, volunteers lied more readily about their results in a game in order to benefit their team. Compared with control subjects who were given a placebo, those on oxytocin told more extreme lies and told them with less hesitation, according to a study published Monday in Proceedings of the National Academy of Sciences. Oxytocin is a brain hormone that is probably best known for its role in helping mothers bond with their newborns. In recent years, scientists have been examining its role in monogamy and in strengthening trust and empathy in social groups. Sometimes, doing what’s good for the group requires lying. (Think of parents who fake their addresses to get their kids into a better school.) A pair of researchers from Ben-Gurion University of the Negev in Israel and the University of Amsterdam figured that oxytocin would play a role in this type of behavior, so they set up a series of experiments to test their hypothesis. The researchers designed a simple computer game that asked players to predict whether a virtual coin toss would wind up heads or tails. After seeing the outcome on a computer screen, players were asked to report whether their prediction was correct or not. In some cases, making the right prediction would earn a player’s team a small payment (the equivalent of about 40 cents). In other cases, a correct prediction would cost the team the same amount, and sometimes there was no payoff or cost. Los Angeles Times Copyright 2014
Keyword: Hormones & Behavior; Attention
Link ID: 19434 - Posted: 04.01.2014
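The coin-prediction task in the oxytocin study above works because genuine predictions can be right only about half the time, so self-reported accuracy well above 50% across many trials is the statistical signature of lying. The short simulation below illustrates that logic; the lying rates and trial counts are invented, and this is a sketch of the paradigm rather than the study's actual analysis.

```python
import random


def simulate_reported_accuracy(n_trials: int, lie_probability: float, seed: int = 0) -> float:
    """Simulate self-reported accuracy on a coin-prediction task.

    Each trial, the player's private prediction is correct with probability 0.5.
    When it is wrong, the player falsely reports "correct" with probability
    `lie_probability`. Returns the fraction of trials reported as correct.
    """
    rng = random.Random(seed)
    reported_correct = 0
    for _ in range(n_trials):
        truly_correct = rng.random() < 0.5
        if truly_correct or rng.random() < lie_probability:
            reported_correct += 1
    return reported_correct / n_trials


if __name__ == "__main__":
    # Honest players hover near 50%; group-serving liars drift well above it.
    print("honest:", simulate_reported_accuracy(10_000, lie_probability=0.0))
    print("liars: ", simulate_reported_accuracy(10_000, lie_probability=0.6))
```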
by Bob Holmes People instinctively organise a new language according to a logical hierarchy, not simply by learning which words go together, as computer translation programs do. The finding may add further support to the notion that humans possess a "universal grammar", or innate capacity for language. The existence of a universal grammar has been in hot dispute among linguists ever since Noam Chomsky first proposed the idea half a century ago. If the theory is correct, this innate structure should leave some trace in the way people learn languages. To test the idea, Jennifer Culbertson, a linguist at George Mason University in Fairfax, Virginia, and her colleague David Adger of Queen Mary University of London, constructed an artificial "nanolanguage". They presented English-speaking volunteers with two-word phrases, such as "shoes blue" and "shoes two", which were supposed to belong to a new language somewhat like English. They then asked the volunteers to choose whether "shoes two blue" or "shoes blue two" would be the correct three-word phrase. In making this choice, the volunteers – who hadn't been exposed to any three-word phrases – would reveal their innate bias in language-learning. Would they rely on familiarity ("two" usually precedes "blue" in English), or would they follow a semantic hierarchy and put "blue" next to "shoe" (because it modifies the noun more tightly than "two", which merely counts how many)? © Copyright Reed Business Information Ltd.
Keyword: Language; Development of the Brain
Link ID: 19433 - Posted: 04.01.2014
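The contrast the nanolanguage experiment above tests can be made concrete. A learner guided by surface familiarity would carry over the English-like order in which "two" precedes "blue", producing "shoes two blue"; a learner guided by the semantic hierarchy would keep the adjective next to the noun it modifies, producing "shoes blue two". The toy comparison below illustrates the two predictions; the bigram counts are invented stand-ins for an English speaker's experience, not data from the study.

```python
# Invented counts standing in for an English speaker's experience of word order:
# "two blue (shoes)" is far more familiar than "blue two (shoes)".
BIGRAM_COUNTS = {("two", "blue"): 120, ("blue", "two"): 3}


def familiarity_prediction(noun: str, adjective: str, numeral: str) -> str:
    """Order the modifiers by whichever bigram is more familiar (surface statistics)."""
    if BIGRAM_COUNTS.get((numeral, adjective), 0) >= BIGRAM_COUNTS.get((adjective, numeral), 0):
        return f"{noun} {numeral} {adjective}"  # e.g. "shoes two blue"
    return f"{noun} {adjective} {numeral}"


def hierarchy_prediction(noun: str, adjective: str, numeral: str) -> str:
    """Keep the adjective adjacent to the noun it modifies most tightly."""
    return f"{noun} {adjective} {numeral}"      # e.g. "shoes blue two"


if __name__ == "__main__":
    print("familiarity:", familiarity_prediction("shoes", "blue", "two"))
    print("hierarchy:  ", hierarchy_prediction("shoes", "blue", "two"))
```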
By Hal Arkowitz and Scott O. Lilienfeld A commercial sponsored by Pfizer, the drug company that manufactures the antidepressant Zoloft, asserts, “While the cause [of depression] is unknown, depression may be related to an imbalance of natural chemicals between nerve cells in the brain. Prescription Zoloft works to correct this imbalance.” Using advertisements such as this one, pharmaceutical companies have widely promoted the idea that depression results from a chemical imbalance in the brain. The general idea is that a deficiency of certain neurotransmitters (chemical messengers) at synapses, or tiny gaps, between neurons interferes with the transmission of nerve impulses, causing or contributing to depression. One of these neurotransmitters, serotonin, has attracted the most attention, but many others, including norepinephrine and dopamine, have also been granted supporting roles in the story. Much of the general public seems to have accepted the chemical imbalance hypothesis uncritically. For example, in a 2007 survey of 262 undergraduates, psychologist Christopher M. France of Cleveland State University and his colleagues found that 84.7 percent of participants found it “likely” that chemical imbalances cause depression. In reality, however, depression cannot be boiled down to an excess or deficit of any particular chemical or even a suite of chemicals. “Chemical imbalance is sort of last-century thinking. It's much more complicated than that,” neuroscientist Joseph Coyle of Harvard Medical School was quoted as saying in a blog by National Public Radio's Alix Spiegel. © 2014 Scientific American
Keyword: Depression
Link ID: 19432 - Posted: 04.01.2014