Most Recent Links




Meredith Wadman

Ron Kalil, a neuroscientist at the University of Wisconsin–Madison, didn’t expect to see his son among the 28,500 attendees at the meeting of the Society for Neuroscience in New Orleans last October. And he wondered why Tom Kalil, deputy director for policy at the White House’s Office of Science and Technology Policy (OSTP), was accompanied by Miyoung Chun, vice-president of science programmes at the Kavli Foundation in Oxnard, California. Tom Kalil told his father that the Kavli Foundation had wanted his help in bringing nanoscientists together behind an ambitious idea. Ron Kalil says he thought: “Why are you talking about it at a neuroscience meeting?” He understands now. These two people, neither of them a working scientist, had been quietly pushing into existence the Brain Activity Map (BAM), the largest and most ambitious effort in fundamental biology since the Human Genome Project — and one that would need advances in both nanoscience and neuroscience to achieve its goals. This is the kind of science — big and bold — that politicians like. President Barack Obama praised brain mapping in his State of the Union address on 12 February. Soon after, Francis Collins, director of the US National Institutes of Health (NIH) in Bethesda, Maryland, which will be the lead agency on the project, talked up the idea in a television appearance. The Obama administration is expected to provide more details about the initiative this month, possibly in conjunction with the release of the federal 2014 budget request. But already, some scientists are wondering whether the project, a concept less than two years old and still evolving, can win new funding from Congress, or whether it would crowd out projects pitched by individual scientists. “Creative science is bottom-up, not top-down,” says Cori Bargmann, a neurobiologist at the Rockefeller University in New York. “Are we talking about central planning inside the Beltway?” © 2013 Nature Publishing Group

Keyword: Brain imaging
Link ID: 17875 - Posted: 03.07.2013

By Scicurious I heard the rumblings on Twitter, and then on the blogs. It was telepathy. No, it wasn’t telepathy, but it was close. It was like the Borg. No it wasn’t. It was a mind meld! Ok, maybe. So what was it? It was one rat learning to do something, while electrodes recorded his every move. In the meantime, on another continent, another rat received the signals into his own brain…and changed his behavior. Telepathy? No. A good solid proof of concept? I’m not sure. An interesting idea? Absolutely. So I wanted to look at this paper in depth. We know already that some other experts weren’t really thrilled with the results. But I’m going to look at WHY, and what a more convincing experiment might look like. So what actually happened here? Each experiment involved two sets of rats. First, you have your “encoder rats”. These rats were water-deprived (not terribly, just thirsty), and trained to press a lever for a water reward (water deprivation is one training technique for lever pressing, and is one of the fastest. But you can also food-deprive and train for food or just train the animal to something tasty, like Crisco or sweetened milk). The rats were trained until they were 95% accurate at the task. They were then implanted with electrodes in the motor cortex that recorded the firing of the neurons as the rats pressed the left or right lever. © 2013 Scientific American

Keyword: Robotics
Link ID: 17874 - Posted: 03.07.2013

Surgically implanting pacemaker-like devices into the brains of people with severe anorexia might help improve their symptoms, a small Canadian study suggests. Anorexia affects an estimated 15,000 to 20,000 people in Canada, mainly young women who face a high risk of premature death. The mortality rate is between six and 11 per cent. About 60 to 70 per cent of people with anorexia recover fully with traditional treatments, said Dr. Blake Woodside, medical director of the eating disorders program at Toronto General Hospital. But in Wednesday's online issue of the medical journal The Lancet, Woodside and his co-authors describe using deep brain stimulation to treat six women with severe anorexia that did not respond to treatment. The treatment involves surgery to implant the electrical stimulators. It's considered minimally invasive and the stimulation can be turned off. In the pilot study, the average age of the women at diagnosis was 20 and they ranged in age from 24 to 57 when the surgery was performed. Five had a history of repeated hospital admissions for eating disorders. While the study was meant to test the safety of the procedure, not its effectiveness, Woodside's team found three of the six patients achieved and maintained a body mass index greater than their historical level. © CBC 2013

Keyword: Anorexia & Bulimia
Link ID: 17873 - Posted: 03.07.2013

By Meghan Rosen Zombies aren’t the only things that feast on brains. Immune cells called microglia gorge on neural stem cells in developing rat and monkey brains, researchers report in the March 6 Journal of Neuroscience. Chewing up neuron-spawning stem cells could help control brain size by pruning away excess growth. Scientists have previously linked abnormal human brain size to autism and schizophrenia. “It shows microglia are very important in the developing brain,” says neuroscientist Joseph Mathew Antony of the University of Toronto, who was not involved in the research. Scientists have long known that in adult brains, microglia hunt for injured cells as well as pathogens. “They mop up all the dead and dying cells,” Antony says. And when the scavengers find a dangerous intruder, they pounce. “These guys are relentless,” says study coauthor Stephen Noctor, of the University of California, Davis MIND Institute in Sacramento. “They seek and destroy bacteria — it’s really quite amazing.” Microglia also lurk in embryonic brains, but the immune cells’ role there is less well understood. Previous studies had found microglia near neural stem cells — tiny factories that pump out new neurons. When Noctor’s team examined slices of embryonic human, monkey and rodent brains, he was struck by just how many microglia crowded around the stem cells and how closely the two cell types touched. © Society for Science & the Public 2000 - 2013

Keyword: Glia; Neuroimmunology
Link ID: 17872 - Posted: 03.07.2013

By Helen Shen When does a monkey turn down a free treat? When it is offered by a selfish person, apparently. Given the choice between accepting goodies from helpful, neutral or unhelpful people, capuchin monkeys (Cebus apella) tend to avoid individuals who refuse aid to others, according to a study published today in Nature Communications. “Humans can build up an impression about somebody just based on what we see,” says author James Anderson, a comparative psychologist at the University of Stirling, UK. The capuchin results suggest that this skill “probably extends to other species”, he says. Anderson chose to study capuchins because of their highly social and cooperative instincts. Monkeys in the study watched as a person either agreed or refused to help another person to open a jar containing a toy. Afterwards, both people offered a food pellet to the animal. The monkey was allowed to accept food from only one. When help was given, the capuchins showed little preference between the person requesting help and the one providing aid. But when help was denied, the seven monkeys tended to accept food less often from the unhelpful person than from the requester. To try to understand the monkeys’ motivations, Anderson and his team tested different scenarios. The animals showed no bias against people who failed to help because they were busy opening their own jar. But they tended to avoid people who were available to help but did not do so. © 2013 Scientific American

Keyword: Emotions; Aggression
Link ID: 17871 - Posted: 03.07.2013

by Elizabeth Norton The prospect of undergoing surgery while not fully "under" may sound like the stuff of horror movies. But one patient in a thousand remembers moments of awareness while under general anesthesia, physicians estimate. The memories are sometimes neutral images or sounds of the operating room, but occasionally patients report being fully aware of pain, terror, and immobility. Though surgeons scrupulously monitor vital signs such as pulse and blood pressure, anesthesiologists have no clear signal of whether the patient is conscious. But a new study finds that the brain may produce an early-warning signal that consciousness is returning—one that's detectable by electroencephalography (EEG), the recording of neural activity via electrodes on the skull. "We've known since the 1930s that brain activity changes dramatically with increasing doses of anesthetic," says the study's corresponding author, anesthesiologist Patrick Purdon of Massachusetts General Hospital in Boston. "But monitoring a patient's brain with EEG has never become routine practice." Beginning in the 1990s, some anesthesiologists began using an approach called the bispectral (BIS) index, in which readings from a single electrode are connected to a device that calculates and displays a single number indicating where the patient's brain activity falls on a scale of 100 (fully conscious) to zero (a "flatline" EEG). Anything between 40 and 60 is considered the target range for unconsciousness. But this index and other similar ones are only indirect measurements, Purdon explains. In 2011, a team led by anesthesiologist Michael Avidan at the Washington University School of Medicine in St. Louis, Missouri, found that monitoring with the BIS index was slightly less successful at preventing awareness during surgery than the non-brain-based method of measuring exhaled anesthesia in the patient's breath. 
Of the 2861 patients monitored with the BIS index, seven had memories of the surgery, whereas only two of 2852 patients whose breath was analyzed remembered anything. © 2010 American Association for the Advancement of Science.

Keyword: Sleep; Consciousness
Link ID: 17870 - Posted: 03.05.2013

By David Brown. Hey, you, yawning in your cubicle at 2 in the afternoon. Your genes feel it, too. A new study, paid for by the U.S. Air Force but relevant for anyone with a small child, a large prostate or a lot on the mind, is helping illuminate what’s happening at the genetic level when we don’t get enough sleep. It turns out that chronic sleep deprivation — in this experiment, less than six hours a night for a week — changes the activity of about 700 genes, which is roughly 3 percent of all we carry. About one-third of the affected genes are ramped up when we go with insufficient sleep night after night. The other two-thirds are partially suppressed. Hundreds of “circadian genes” whose activity rises and falls each day abruptly lose their rhythm. Among the genes disturbed by sleep deprivation are ones involved in metabolism, immunity, inflammation, hormone response, the expression of other genes and the organization of material called chromatin on chromosomes. These changes may help explain how inadequate sleep alters attention and thinking and raises the risk for illnesses such as diabetes and coronary heart disease. “The findings will identify some of the pathways linking insufficient sleep and negative health outcomes,” said Derk-Jan Dijk, a physiologist at the University of Surrey in England, who led the study. “But how these things ultimately lead to obesity or diabetes is an unanswered question at this moment in time.” © 1996-2013 The Washington Post

Keyword: Sleep; Genes & Behavior
Link ID: 17869 - Posted: 03.05.2013

Steve Connor

Physical exhaustion can occur when the brain – as well as the muscles – grows tired, according to a study that sheds fresh light on the role played by the mind in determining endurance levels. Scientists have found that a key neurotransmitter in the brain, which controls signalling between nerve cells, can determine whether someone feels exhausted following physical exercise or after taking anti-depressant drugs such as Prozac. Although levels of serotonin rise during exercise, which provides a psychological boost and “feel-good” factor, it can also result in a widespread central fatigue that ultimately leads to someone feeling exhausted and unable to carry on, scientists found. Researchers led by Professor Jean-Francois Perrier of the University of Copenhagen found that while serotonin helps to keep people going during the early stage of vigorous exercise, a build-up of the neurotransmitter in the brain can have the opposite effect by causing “central fatigue” of the nervous system even when the muscles are still able to carry on. “We can now see it is actually a surplus of serotonin that triggers a braking mechanism in the brain. In other words, serotonin functions as an accelerator but also as a brake when the strain becomes excessive,” said Professor Perrier, whose study is published in the Proceedings of the National Academy of Sciences. © independent.co.uk

Keyword: Sleep
Link ID: 17868 - Posted: 03.05.2013

By George Johnson In the week since I wrote about Oliver Sacks and the idiot savant twins, I’ve been catching up with Season 2 of “Touch,” the TV series about an autistic boy named Jake who has an inexplicable ability to commune with a secret world of numbers — a buried skein of mathematics in which the Golden Mean, the Fibonacci sequence, the genetic code, and the Kabbalah are all mysteriously connected. Jungian synchronicity, quantum entanglement, chaos theory — all turn out to be manifestations of an underlying order in which everything that perplexes us ultimately makes sense. It is the dream of both mystics and scientists, and I had wondered shortly after the show first began how the conceit was going to be sustained through more than a few episodes. The connecting thread has turned out to be a conspiracy by a shadowy corporation called AsterCorp — as secretive and powerful as Massive Dynamic, purveyors of the mind-enhancing medicine Cortexiphan in “Fringe” — to kidnap Jake and others like him in their attempt to control the world. Or the universe. It is too soon to tell. Dr. Sacks’s twins, with their power to see, hear, smell — somehow sense within minutes if a number was prime — would also have been on AsterCorp’s wish list. Something keeps pulling me back to Sacks’s story. That is how enchanting a writer he is. (His memoir, Uncle Tungsten, is my favorite of his books.) There are plenty of accounts in the psychiatric literature of amazing human calculators and mnemonists. Sacks describes some famous cases in his essay. But what he thought he saw in the twins went far beyond that. Somehow, as Sacks described it, they could recognize that a number is prime in the way that one might recognize a face. Something on the surface of 3334401341 told them it was prime while 3334401343 was not.

Keyword: Attention
Link ID: 17867 - Posted: 03.05.2013

By GINA KOLATA The psychiatric illnesses seem very different — schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Yet they share several genetic glitches that can nudge the brain along a path to mental illness, researchers report. Which disease, if any, develops is thought to depend on other genetic or environmental factors. Their study, published online Wednesday in the Lancet, was based on an examination of genetic data from more than 60,000 people worldwide. Its authors say it is the largest genetic study yet of psychiatric disorders. The findings strengthen an emerging view of mental illness that aims to make diagnoses based on the genetic aberrations underlying diseases instead of on the disease symptoms. Two of the aberrations discovered in the new study were in genes used in a major signaling system in the brain, giving clues to processes that might go awry and suggestions of how to treat the diseases. “What we identified here is probably just the tip of an iceberg,” said Dr. Jordan Smoller, lead author of the paper and a professor of psychiatry at Harvard Medical School and Massachusetts General Hospital. “As these studies grow we expect to find additional genes that might overlap.” The new study does not mean that the genetics of psychiatric disorders are simple. Researchers say there seem to be hundreds of genes involved and the gene variations discovered in the new study confer only a small risk of psychiatric disease. Steven McCarroll, director of genetics for the Stanley Center for Psychiatric Research at the Broad Institute of Harvard and M.I.T., said it was significant that the researchers had found common genetic factors that pointed to a specific signaling system. © 2013 The New York Times Company

Keyword: Schizophrenia; Genes & Behavior
Link ID: 17866 - Posted: 03.04.2013

by Sheila M. Eldred Picture someone with attention deficit hyperactivity disorder, or ADHD, and you probably conjure up an image of an elementary school-age boy. But an analysis of data from the first large, population-based study to follow kids through to adulthood shows that the neurobehavioral disorder rarely goes away with age. Indeed, as ADHD patients make the transition to adulthood, the issues they face often multiply: they are more likely to have other psychiatric disorders and even commit suicide, reports a new study published online today in Pediatrics. NEWS: ADHD Linked to Missing Genes In fact, researchers found that only 37.5 percent of the adults who had been diagnosed with the disorder as a child were free of other psychiatric disorders, including alcohol and drug dependence, in their late 20s. Very few of the children with ADHD were still being treated as adults -- although neuropsychiatric interviews confirmed that 29 percent still had it. “I think there has been a view that ADHD is a childhood disorder, and it’s only relatively recently that people have been trained to detect it in adults,” said Nathan Blum, a developmental-behavioral pediatrician at Children’s Hospital in Philadelphia, who was not involved in the study. Among the adults who’d had ADHD as a child, 57 percent had at least one other psychiatric disorder, compared with 35 percent of the controls. Just under 2 percent had died; of the seven deaths, three were suicides. Of the controls, less than 1 percent had died. Of those 37 deaths, five were from suicide. And 2.7 percent were incarcerated at the time of recruitment for the study. © 2013 Discovery Communications, LLC.

Keyword: ADHD
Link ID: 17865 - Posted: 03.04.2013

by Emily Underwood No single cause has yet been discovered for schizophrenia, the devastating neuropsychiatric syndrome characterized by hallucinations, disordered thoughts, and other cognitive and emotional problems, typically beginning in early adulthood. Although schizophrenia runs in families, in many cases no genetic risk is apparent, leading many researchers to look for environmental explanations. Now, research in mice provides support for a long-held hypothesis: that the syndrome, and other neurological disorders, can emerge when multiple environmental insults such as prenatal infection and adolescent trauma combine. Environmental stressors such as infection and abuse were long ago shown to be risk factors for schizophrenia. Large studies have found, for example, that children whose mothers were infected with influenza during the last months of pregnancy have roughly twice the risk of developing the syndrome compared with the general population. That doesn't explain why a few people who are exposed to an infection in the womb go on to develop schizophrenia while most don't, however, says Urs Meyer, a behavioral neurobiologist at the Swiss Federal Institute of Technology in Zurich and co-author of the study reported online today in Science. One long-held hypothesis, he says, is that early infection creates a latent vulnerability to schizophrenia that is only "unmasked" by later insults, such as physical injury or psychological trauma. Such stressors are thought to be particularly damaging during critical periods of brain development such as early puberty, he says. Although the "multiple-hit" hypothesis has been prominent in the literature for some time, it is difficult to test the idea with human epidemiology studies, he says. "You need huge, huge data sets to see anything." © 2010 American Association for the Advancement of Science.

Keyword: Schizophrenia; Stress
Link ID: 17864 - Posted: 03.02.2013

By Daisy Yuhas It's news chocolate lovers have been craving: raw cocoa may be packed with brain-boosting compounds. Researchers at the University of L'Aquila in Italy, with scientists from Mars, Inc., and their colleagues published findings last September that suggest cognitive function in the elderly is improved by ingesting high levels of natural compounds found in cocoa called flavanols. The study included 90 individuals with mild cognitive impairment, a precursor to Alzheimer's disease. Subjects who drank a cocoa beverage containing either moderate or high levels of flavanols daily for eight weeks demonstrated greater cognitive function than those who consumed low levels of flavanols on three separate tests that measured factors that included verbal fluency, visual searching and attention. Exactly how cocoa causes these changes is still unknown, but emerging research points to one flavanol in particular: (-)-epicatechin, pronounced “minus epicatechin.” Its name signifies its structure, differentiating it from other catechins, organic compounds highly abundant in cocoa and present in apples, wine and tea. The graph below shows how (-)-epicatechin fits into the world of brain-altering food molecules. Other studies suggest that the compound supports increased circulation and the growth of blood vessels, which could explain improvements in cognition, because better blood flow would bring the brain more oxygen and improve its function. Animal research has already demonstrated how pure (-)-epicatechin enhances memory. Findings published last October in the Journal of Experimental Biology note that snails can remember a trained task—such as holding their breath in deoxygenated water—for more than a day when given (-)-epicatechin but for less than three hours without the flavanol. Salk Institute neuroscientist Fred Gage and his colleagues found previously that (-)-epicatechin improves spatial memory and increases vasculature in mice. 
“It's amazing that a single dietary change could have such profound effects on behavior,” Gage says. If further research confirms the compound's cognitive effects, flavanol supplements—or raw cocoa beans—could be just what the doctor ordered. © 2013 Scientific American

Keyword: Attention; Obesity
Link ID: 17863 - Posted: 03.02.2013

But critics are sceptical about the predicted organic computer.

Ed Yong

The brains of two rats on different continents have been made to act in tandem. When the first, in Brazil, uses its whiskers to choose between two stimuli, an implant records its brain activity and signals to a similar device in the brain of a rat in the United States. The US rat then usually makes the same choice on the same task. Miguel Nicolelis, a neuroscientist at Duke University in Durham, North Carolina, says that this system allows one rat to use the senses of another, incorporating information from its far-away partner into its own representation of the world. “It’s not telepathy. It’s not the Borg,” he says. “But we created a new central nervous system made of two brains.” Nicolelis says that the work, published today in Scientific Reports, is the first step towards constructing an organic computer that uses networks of linked animal brains to solve tasks. But other scientists who work on neural implants are sceptical. Lee Miller, a physiologist at Northwestern University in Evanston, Illinois, says that Nicolelis’s team has made many important contributions to neural interfaces, but the current paper could be mistaken for a “poor Hollywood science-fiction script”. He adds, “It is not clear to what end the effort is really being made.” In earlier work, Nicolelis’s team developed implants that can send and receive signals from the brain, allowing monkeys to control robotic or virtual arms and get a sense of touch in return. This time, Nicolelis wanted to see whether he could use these implants to couple the brains of two separate animals. © 2013 Nature Publishing Group

Keyword: Robotics
Link ID: 17862 - Posted: 03.02.2013

by Lizzie Wade With its complex interweaving of symbols, structure, and meaning, human language stands apart from other forms of animal communication. But where did it come from? A new paper suggests that researchers look to bird songs and monkey calls to understand how human language might have evolved from simpler, preexisting abilities. One reason that human language is so unique is that it has two layers, says Shigeru Miyagawa, a linguist at the Massachusetts Institute of Technology (MIT) in Cambridge. First, there are the words we use, which Miyagawa calls the lexical structure. "Mango," "Amanda," and "eat" are all components of the lexical structure. The rules governing how we put those words together make up the second layer, which Miyagawa calls the expression structure. Take these three sentences: "Amanda eats the mango," "Eat the mango, Amanda," and "Did Amanda eat the mango?" Their lexical structure—the words they use—is essentially identical. What gives the sentences different meanings is the variation in their expression structure, or the different ways those words fit together. The more Miyagawa studied the distinction between lexical structure and expression structure, "the more I started to think, 'Gee, these two systems are really fundamentally different,' " he says. "They almost seem like two different systems that just happen to be put together," perhaps through evolution. One preliminary test of his hypothesis, Miyagawa knew, would be to show that the two systems exist separately in nature. So he started studying the many ways that animals communicate, looking for examples of lexical or expressive structures. © 2010 American Association for the Advancement of Science.

Keyword: Language; Evolution
Link ID: 17861 - Posted: 03.02.2013

By Tina Hesman Saey If someone shouts “look behind you,” tadpoles in Michael Levin’s laboratory may be ready. The tadpoles can see out of eyes growing from their tails, even though the organs aren’t directly wired to the animals’ brains, Levin and Douglas Blackiston, both of Tufts University in Medford, Mass., report online February 27 in the Journal of Experimental Biology. Levin and Blackiston’s findings may help scientists better understand how the brain and body communicate, including in humans, and could be important for regenerative medicine or designing prosthetic devices to replace missing body parts, says Günther Zupanc, a neuroscientist at Northeastern University in Boston. Researchers have transplanted frog eyes to other body parts for decades, but until now, no one had shown that those oddly placed eyes (called “ectopic” eyes) actually worked. Ectopic eyes on tadpoles’ tails allow the animals to distinguish blue light from red light, the Tufts team found. Levin wanted to know whether the brain is hardwired to get visual information only from eyes in the head, or whether the brain could use data coming from elsewhere. To find out, he and Blackiston started with African clawed frog tadpoles (Xenopus laevis) and removed the normal eyes. They then transplanted cells that would grow into eyes onto the animals’ tails. The experiment seemed like a natural to test how well the brain can adapt, Levin says. “There’s no way the tadpole’s brain is expecting an eye on its tail.” Expected or not, some of the tadpoles managed to detect red and blue light from their tail eyes. The researchers placed tadpoles with transplanted eyes in chambers in which half of the chamber was illuminated in blue light and the other half in red light. A mild electric shock zapped the tadpole when it was in one half of the dish so that the animal learned to associate the color with the shock. 
The researchers periodically switched the colors in the chamber so that the tadpoles didn’t learn that staying still would save them. © Society for Science & the Public 2000 - 2013

Keyword: Vision; Evolution
Link ID: 17860 - Posted: 03.02.2013

Daphne Bavelier & Richard J. Davidson

Video games are associated with a variety of negative outcomes, such as obesity, aggressiveness, antisocial behaviour and, in extreme cases, addiction. At the same time, evidence is mounting that playing games can have beneficial effects on the brain. After spending an hour a day, 5 days a week for 8–10 weeks spotting snipers and evading opponents in shooter games such as Call of Duty or Unreal Tournament, young adults saw more small visual details in the middle of clutter and more accurately distinguished between various shades of grey. After 10 hours stretched over 2 weeks spent chasing bad guys in mazes and labyrinths, players were better able to rotate an image mentally, an improvement that was still present six months later and could be useful for activities as varied as navigation, research chemistry and architectural design. After guiding small rodents to a safe exit amid obstacles during a version of the game Lemmings that was designed to encourage positive behaviour, players were more likely in simulated scenarios to help another person after a mishap or to intervene when someone was being harassed. Because gaming is clearly here to stay, some scientists are asking how to channel people's love of screen time towards positive effects on the brain and behaviour by designing video games specifically intended to train particular aspects of behaviour and brain function. One game, for example, aims to treat depression by introducing cognitive behavioural therapy while users fight off negative thoughts in a fantasy world. In Re-mission, young cancer patients blast cancer cells and fight infections and the side effects of therapy — all to encourage them to stick with treatment (see www.re-mission.net). © 2013 Nature Publishing Group

Keyword: Learning & Memory
Link ID: 17859 - Posted: 03.02.2013

By Bruce Bower Children with dyslexia may read better after playing action video games that stress mayhem, not literacy, a contested study suggests. Playing fast-paced Wii video games for 12 hours over two weeks markedly increased the reading speed of 7- to 13-year-old kids with dyslexia, with no loss of reading accuracy, says a team led by psychologist Andrea Facoetti of the University of Padua, Italy. Reading gains lasted at least two months after the video game sessions. The gains matched or exceeded previously reported effects of reading-focused programs for dyslexia, the researchers report online February 28 in Current Biology. “These results are clear enough to say that action video games are able to improve reading abilities in children with dyslexia,” Facoetti says. Although the new study includes only 20 children with dyslexia, its results build on earlier evidence that many poor readers have difficulty focusing on items within arrays, Facoetti holds. By strengthening the ability to monitor central and peripheral objects in chaotic scenes, he says, action video games give kids with dyslexia a badly needed tool for tracking successive letters in written words. But evidence for Facoetti’s conclusions is shaky, asserts psychologist Nicola Brunswick of Middlesex University in London. The researchers tested word reading ability two months later but failed to test reading comprehension, she says. What’s more, they did so with a mere six of 10 kids who played the action video games. © Society for Science & the Public 2000 - 2013

Keyword: Dyslexia; Development of the Brain
Link ID: 17858 - Posted: 03.02.2013

By Kali Tal A few weeks ago an article in the Scientific American Twitter stream caught my eye. EMDR (Eye Movement Desensitization and Reprocessing) once again debuted as a “promising new treatment” for PTSD. EMDR, which has been repeatedly called “promising” over the last two decades, works only about as well for PTSD as other psychological treatment modalities with which it competes, primarily cognitive behavioral therapy (CBT) and exposure therapy. These so-called trauma-focused treatments (TFTs) all garner similar results. TFTs show large effects in clinical trials, with two important caveats: 1) the enthusiasm of their various advocates biases the study results towards the treatment the researchers prefer; and 2) they are effective for a significant number of carefully selected PTSD patients. The sad truth, however, is that current short-term treatments are not the solution for most patients with PTSD. Trial criteria often exclude those with comorbid disorders, multiple traumas, complex PTSD, and suicidal ideation, among others. Even when they are included, comorbid patients drop out of treatment studies at a much higher rate than those with simple PTSD, a problem that has implications for clinical practice. The large majority of those with PTSD also have other psychological disorders (commonly, substance abuse, depression, and anxiety disorders), and many of these patients have complex PTSD, which is both harder to treat and more prone to relapse (see Fig. 1). Those who suffer from both PTSD and substance abuse (64%–84% of veterans, for example) often perceive the disorders as “functionally correlated.” Similarly, depression and PTSD are mutually reinforcing; each compounds the symptoms of the other. Both substance abuse and depression are notoriously difficult to treat, and harder still to treat when comorbid with PTSD. 
Multiple studies document the long-term failure of PTSD treatment for veterans, but there are fewer studies on the effectiveness of therapies for comorbid PTSD in civilian populations. Existing studies challenge the assumption that PTSD treatments effective for simple PTSD are also effective for combined PTSD and substance abuse, or PTSD and depression. © 2013 Scientific American

Keyword: Stress
Link ID: 17857 - Posted: 02.27.2013

by Tim Wall Some feathered crooners may advertise their size to females by hitting the low notes. Ornithologists at the Max Planck Institute found that only bigger-bodied birds belt out the bass. The physical size of some birds may put a limit on the frequency of the birds’ songs, according to a study published in PLOS ONE. Since only larger males hit lower notes, females may be able to use deeper voices as a reliable measure of a male’s size. Size matters to some songbird species, with females preferring larger males, so vocal limitations could affect some birds’ love lives. The songs of purple-crowned fairy-wrens, Malurus coronatus coronatus, span a range of notes. However, the study found that in some songs, larger body size was related to lower-pitched singing ability. Further study will be needed to establish a relationship among body size, singing frequency and sexual success in fairy-wrens. The authors suggested that body size may be just one of many characteristics advertised by fairy-wrens’ songs. They also noted that low-frequency singing ability may have resulted from good health as the male fairy-wrens grew up. Better health may have allowed better development of the singing structures in the birds’ anatomies, and the same healthy conditions could also have resulted in larger size. So size and singing would be correlated, but not causally related. © 2013 Discovery Communications, LLC.

Keyword: Sexual Behavior; Animal Communication
Link ID: 17856 - Posted: 02.27.2013