Most Recent Links
by Helen Thomson "I've been in a crowded elevator with mirrors all around, and a woman will move and I'll go to get out the way and then realise: 'oh that woman is me'." Heather Sellers has prosopagnosia, more commonly known as face blindness. "I can't remember any image of the human face. It's simply not special to me," she says. "I don't process them like I do a car or a dog. It's not a visual problem, it's a perception problem." Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn't diagnosed until she was in her 30s. "I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy." The condition is estimated to affect around 2.5 per cent of the population, and it's common for those who have it not to realise that anything is wrong. "In many ways it's a subtle disorder," says Heather. "It's easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it's socially devastating." As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn't initially recognise her mother. "But I didn't know that I wasn't recognising her." © Copyright Reed Business Information Ltd
Keyword: Attention
Link ID: 18119 - Posted: 05.04.2013
By TARA PARKER-POPE Suicide rates among middle-aged Americans have risen sharply in the past decade, prompting concern that a generation of baby boomers who have faced years of economic worry and easy access to prescription painkillers may be particularly vulnerable to self-inflicted harm. More people now die of suicide than in car accidents, according to the Centers for Disease Control and Prevention, which published the findings in Friday’s issue of its Morbidity and Mortality Weekly Report. In 2010 there were 33,687 deaths from motor vehicle crashes and 38,364 suicides. Suicide has typically been viewed as a problem of teenagers and the elderly, and the surge in suicide rates among middle-aged Americans is surprising. From 1999 to 2010, the suicide rate among Americans ages 35 to 64 rose by nearly 30 percent, to 17.6 deaths per 100,000 people, up from 13.7. Although suicide rates are growing among both middle-aged men and women, far more men take their own lives. The suicide rate for middle-aged men was 27.3 deaths per 100,000, while for women it was 8.1 deaths per 100,000. The most pronounced increases were seen among men in their 50s, a group in which suicide rates jumped by nearly 50 percent, to about 30 per 100,000. For women, the largest increase was seen in those ages 60 to 64, among whom rates increased by nearly 60 percent, to 7.0 per 100,000. Suicide rates can be difficult to interpret because of variations in the way local officials report causes of death. But C.D.C. and academic researchers said they were confident that the data documented an actual increase in deaths by suicide and not a statistical anomaly. While reporting of suicides is not always consistent around the country, the current numbers are, if anything, too low. © 2013 The New York Times Company
Keyword: Depression; Emotions
Link ID: 18118 - Posted: 05.04.2013
by Andy Coghlan and Sara Reardon The world's biggest mental health research institute is abandoning the new version of psychiatry's "bible" – the Diagnostic and Statistical Manual of Mental Disorders, questioning its validity and stating that "patients with mental disorders deserve better". This bombshell comes just weeks before the publication of the fifth revision of the manual, called DSM-5. On 29 April, Thomas Insel, director of the US National Institute of Mental Health (NIMH), advocated a major shift away from categorising diseases such as bipolar disorder and schizophrenia according to a person's symptoms. Instead, Insel wants mental disorders to be diagnosed more objectively using genetics, brain scans that show abnormal patterns of activity and cognitive testing. This would mean abandoning the manual published by the American Psychiatric Association that has been the mainstay of psychiatric research for 60 years. The DSM has been embroiled in controversy for a number of years. Critics have said that it has outlasted its usefulness, has turned complaints that are not truly illnesses into medical conditions, and has been unduly influenced by pharmaceutical companies looking for new markets for their drugs. There have also been complaints that widened definitions of several disorders have led to over-diagnosis of conditions such as bipolar disorder and attention deficit hyperactivity disorder. Now, Insel has said in a blog post published by the NIMH that he wants a complete shift to diagnoses based on science not symptoms. © Copyright Reed Business Information Ltd.
Keyword: Depression; Schizophrenia
Link ID: 18117 - Posted: 05.04.2013
The short answer is no. But your question gets to the heart of an important problem that we have in this country: that all medications are approved by the Food and Drug Administration on the basis of relatively short-term studies, even though many are used long-term for medical and psychiatric disorders that are chronic, if not lifelong. The F.D.A. approves antidepressants like selective serotonin re-uptake inhibitors, or S.S.R.I.’s, if the drug beats a placebo in two randomized clinical trials that typically last 4 to 12 weeks and involve a few hundred patients. Longer-term maintenance studies, usually lasting one to two years, indicate that S.S.R.I.’s do not cause any serious harm, though they have plenty of side effects, like weight gain and sexual dysfunction. Once a drug hits the market, we have only a voluntary system of reporting adverse effects in the United States; there are no systematic long-term studies of any drug lasting 10 or more years. Still, S.S.R.I.’s have been used since the late 1980s and given to more than 40 million Americans, so it’s reasonable to say that if these drugs caused any significant toxic effects, we would have seen many such reports. Instead, we have some anecdotal reports claiming a wide range of S.S.R.I.-related toxicity, but one cannot know from these reports whether the symptoms are related to S.S.R.I. use or to medical illnesses that happen to develop over time in people taking these drugs. Copyright 2013 The New York Times Company
Keyword: Depression
Link ID: 18116 - Posted: 05.04.2013
National Institutes of Health researchers used the popular anti-wrinkle agent Botox to discover a new and important role for a group of molecules that nerve cells use to quickly send messages. This novel role for the molecules, called SNAREs, may be a missing piece that scientists have been searching for to fully understand how brain cells communicate under normal and disease conditions. "The results were very surprising,” said Ling-Gang Wu, Ph.D., a scientist at NIH’s National Institute of Neurological Disorders and Stroke. “Like many scientists we thought SNAREs were only involved in fusion." Every day almost 100 billion nerve cells throughout the body send thousands of messages through nearly 100 trillion communication points called synapses. Cell-to-cell communication at synapses controls thoughts, movements, and senses and could provide therapeutic targets for a number of neurological disorders, including epilepsy. Nerve cells use chemicals, called neurotransmitters, to rapidly send messages at synapses. Like pellets inside shotgun shells, neurotransmitters are stored inside spherical membranes, called synaptic vesicles. Messages are sent when a carrier shell fuses with the nerve cell’s own shell, called the plasma membrane, and releases the neurotransmitter “pellets” into the synapse. SNAREs (soluble N-ethylmaleimide-sensitive factor attachment protein receptor) are three proteins known to be critical for fusion between carrier shells and nerve cell membranes during neurotransmitter release.
Keyword: Muscles; Emotions
Link ID: 18115 - Posted: 05.04.2013
Symmetry study deemed a fraud
Eugenie Samuel Reich
Few researchers have tried harder than Robert Trivers to retract one of their own papers. In 2005, Trivers, an evolutionary biologist at Rutgers University in New Brunswick, New Jersey, published an attention-grabbing finding: Jamaican teenagers with a high degree of body symmetry were more likely to be rated ‘good dancers’ by their peers than were those with less symmetrical bodies. The study, which suggested that dancing is a signal for sexual selection in humans, was featured on the cover of this journal (W. M. Brown et al. Nature 438, 1148–1150; 2005). But two years later, Trivers began to suspect that the study data had been faked by one of his co-authors, William Brown, a postdoctoral researcher at the time. In seeking a retraction, Trivers self-published The Anatomy of a Fraud, a small book detailing what he saw as evidence of data fabrication. Later, Trivers had a verbal altercation over the matter with a close colleague and was temporarily banned from campus. An investigation of the case, completed by Rutgers and released publicly last month, now seems to validate Trivers’ allegations. Brown disputes the university’s finding, but it could help to clear the controversy that has clouded Trivers’ reputation as the author of several pioneering papers in the 1970s. For example, Trivers advanced an influential theory of ‘reciprocal altruism’, in which people behave unselfishly and hope that they will later be rewarded for their good deeds. He also analysed human sexuality in terms of the investments that mothers and fathers each make in child-rearing. © 2013 Nature Publishing Group
Keyword: Sexual Behavior; Evolution
Link ID: 18114 - Posted: 05.04.2013
By Bruce Bower Human ancestors living in East Africa 2 million years ago weren’t a steak-and-potatoes crowd. But they had a serious hankering for gazelle meat and antelope brains, fossils discovered in Kenya indicate. Three sets of butchered animal bones unearthed at Kenya’s Kanjera South site provide the earliest evidence of both long-term hunting and targeted scavenging by a member of the human evolutionary family, anthropologist Joseph Ferraro of Baylor University in Waco, Texas, and his colleagues conclude. An early member of the Homo genus, perhaps Homo erectus, hunted small animals and scavenged predators’ leftovers of larger creatures, researchers report April 25 in PLOS ONE. Along with hunting relatively small game such as gazelles, these hominids scavenged the heads of antelope and wildebeests, apparently to add a side of fatty, nutrient-rich brain tissue to their diets, the scientists say. Those dietary pursuits could have provided the extra energy Homo erectus needed to support large bodies, expanded brains and extensive travel across the landscape, Ferraro says. A few East African sites dating to as early as 3.4 million years ago had previously produced small numbers of animal bones bearing butchery marks made by stone tools. Scientists think those bones indicate occasional meat eating (SN: 9/11/10, p. 8). Now Kanjera South has yielded several thousand complete and partial animal bones, representing at least 81 individual animals. A known reversal of Earth’s magnetic field preserved in an excavated soil layer allowed Ferraro’s team to determine the age of the finds, which accumulated over a few thousand years at most. © Society for Science & the Public 2000 - 2013
Keyword: Evolution; Obesity
Link ID: 18113 - Posted: 05.04.2013
by Lizzie Wade If you were a rat living in a completely virtual world like in the movie The Matrix, could you tell? Maybe not, but scientists studying your brain might be able to. Today, researchers report that certain cells in rat brains work differently when the animals are in virtual reality than when they are in the real world. The neurons in question are known as place cells, which fire in response to specific physical locations in the outside world and reside in the hippocampus, the part of the brain responsible for spatial navigation and memory. As you walk out of your house every day, the same place cell fires each time you reach the shrub that's two steps away from your door. It fires again when you reach the same place on your way back home, even though you are traveling in the opposite direction. Scientists have long suspected that these place cells help the brain generate a map of the world around us. But how do the place cells know when to fire in the first place? Previous research showed that the cells rely on three different kinds of information. First, they analyze "visual cues," or what you see when you look around. Then, there are what researchers call "self-motion cues." These cues come from how your body moves in space and are the reason you can still find your way around a room with the lights out. The final type of information is the "proximal cues," which encompass everything else about the environment you're in. The smell of a bakery on your way to work, the sounds of a street jammed with traffic, and the springy texture of grass in a park are all proximal cues. © 2010 American Association for the Advancement of Science.
Keyword: Attention; Robotics
Link ID: 18112 - Posted: 05.04.2013
By ADRIAN RAINE The scientific study of crime got its start on a cold, gray November morning in 1871, on the east coast of Italy. Cesare Lombroso, a psychiatrist and prison doctor at an asylum for the criminally insane, was performing a routine autopsy on an infamous Calabrian brigand named Giuseppe Villella. Lombroso found an unusual indentation at the base of Villella's skull. From this singular observation, he would go on to become the founding father of modern criminology. Lombroso's controversial theory had two key points: that crime originated in large measure from deformities of the brain and that criminals were an evolutionary throwback to more primitive species. Criminals, he believed, could be identified on the basis of physical characteristics, such as a large jaw and a sloping forehead. Based on his measurements of such traits, Lombroso created an evolutionary hierarchy, with Northern Italians and Jews at the top and Southern Italians (like Villella), along with Bolivians and Peruvians, at the bottom. These beliefs, based partly on pseudoscientific phrenological theories about the shape and size of the human head, flourished throughout Europe in the late 19th and early 20th centuries. Lombroso was Jewish and a celebrated intellectual in his day, but the theory he spawned turned out to be socially and scientifically disastrous, not least by encouraging early-20th-century ideas about which human beings were and were not fit to reproduce—or to live at all. ©2013 Dow Jones & Company, Inc.
Keyword: Aggression
Link ID: 18111 - Posted: 05.04.2013
by Paul Gabrielsen An insect's compound eye is an engineering marvel: high resolution, wide field of view, and incredible sensitivity to motion, all in a compact package. Now, a new digital camera provides the best-ever imitation of a bug's vision, using new optical materials and techniques. This technology could someday give patrolling surveillance drones the same exquisite vision as a dragonfly on the hunt. Human eyes and conventional cameras work about the same way. Light enters a single curved lens and resolves into an image on a retina or photosensitive chip. But a bug's eyes are covered with many individual lenses, each connected to light-detecting cells and an optic nerve. These units, called ommatidia, are essentially self-contained minieyes. Ants have a few hundred. Praying mantises have tens of thousands. The semicircular eyes sometimes take up most of an insect's head. While biologists continue to study compound eyes, materials scientists such as John Rogers try to mimic elements of their design. Many previous attempts to make compound eyes focused light from multiple lenses onto a flat chip, such as the charge-coupled device chips in digital cameras. While flat silicon chips have worked well for digital photography, in biology, "you never see that design," Rogers says. He thinks that a curved system of detectors better imitates biological eyes. In 2008, his lab created a camera designed like a mammal eye, with a concave electronic "retina" at the back. The curved surface enabled a wider field of view without the distortion typical of a wide-angle camera lens. Rogers then turned his attention to the compound eye. © 2010 American Association for the Advancement of Science.
Keyword: Vision
Link ID: 18110 - Posted: 05.02.2013
Alla Katsnelson People who use a ‘brain-workout’ program for just 10 hours have a mental edge over their peers even a year later, researchers report today in PLoS ONE. The search for a regimen of mental callisthenics to stave off age-related cognitive decline is a booming area of research — and a multimillion-dollar business. But critics argue that even though such computer programs can improve performance on specific mental tasks, there is scant proof that they have broader cognitive benefits. For the study, adults aged 50 and older played a computer game designed to boost the speed at which players process visual stimuli. Processing speed is thought to be “the first domino that falls in cognitive decline”, says Fredric Wolinsky, a public-health researcher at the University of Iowa in Iowa City, who led the research. The game was developed by academic researchers but is now sold under the name Double Decision by Posit Science, based in San Francisco, California. (Posit did not fund the study.) Players are timed on how fast they click on an image in the centre of the screen and on others that appear around the periphery. The program ratchets up the difficulty as a player’s performance improves. Participants played the training game for 10 hours on site, some with an extra 4-hour ‘booster’ session later, or for 10 hours at home. A control group worked on computerized crossword puzzles for 10 hours on site. Researchers measured the mental agility of all 621 subjects before the brain training began, and again one year later, using eight well-established tests of cognitive performance. © 2013 Nature Publishing Group
Keyword: Learning & Memory; Alzheimers
Link ID: 18109 - Posted: 05.02.2013
Chris Palmer
[Figure: NF-κB activation in neurons in the hypothalamus increases with age, while the total number of neurons and the total number of all cell types in the hypothalamus remain relatively steady across age groups.]
The area of the brain that controls growth, reproduction and metabolism also kick-starts ageing, according to a study published today in Nature. The finding could lead to new treatments for age-related illnesses, helping people to live longer. Dongsheng Cai, a physiologist at Albert Einstein College of Medicine in New York, and his colleagues tracked the activity of NF-κB — a molecule that controls DNA transcription and is involved in inflammation and the body's response to stress — in the brains of mice. They found that the molecule becomes more active in the brain area called the hypothalamus as a mouse grows older. Further tests suggested that NF-κB activity helps to determine when mice display signs of ageing. Animals lived longer than normal when they were injected with a substance that inhibited the activity of NF-κB in immune cells called microglia in the hypothalamus. Mice that received a substance to stimulate the activity of NF-κB died earlier. “We have provided scientific evidence for the concept that systemic ageing is influenced by a particular tissue in the body,” says Cai. © 2013 Nature Publishing Group
Keyword: Development of the Brain; Alzheimers
Link ID: 18108 - Posted: 05.02.2013
by Sara Reardon People with epilepsy have to learn to cope with the unpredictable nature of seizures – but that could soon be a thing of the past. A new brain implant can warn of seizures minutes before they strike, enabling the wearer to get out of situations that could present a safety risk. Epileptic seizures are triggered by erratic brain activity. The seizures last for seconds or minutes, and their unpredictability makes them hazardous and disruptive for people with epilepsy, says Mark Cook of the University of Melbourne in Australia. Like earthquakes, "you can't stop them, but if you knew when one was going to happen, you could prepare", he says. With funding from NeuroVista, a medical device company in Seattle, Cook and his colleagues have developed a brain implant to do just that. The device consists of a small patch of electrodes that measure brain wave activity.
Warning light
Over time, the device's software learns which patterns of brainwave activity indicate that a seizure is about to happen. When it detects such a pattern, the implant then transmits a signal through a wire to a receiver implanted under the wearer's collarbone. This unit alerts the wearer by wirelessly activating a handheld gadget with coloured lights – a red warning light, for example, signals that a seizure is imminent. © Copyright Reed Business Information Ltd.
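The article does not disclose how NeuroVista's software actually learns these patterns. Purely as an illustration of the idea, the Python sketch below grades incoming brain-wave windows against learned "baseline" and "pre-seizure" signatures; the sampling rate, spectral features, nearest-centroid rule, score thresholds and the white/blue light levels are all assumptions (only the red warning light is described above):

```python
# Toy sketch of a seizure-advisory classifier. NOT NeuroVista's algorithm;
# every detail here is an assumption made for illustration.
import numpy as np

SAMPLE_RATE = 256  # Hz; assumed, not from the article

def band_powers(window: np.ndarray) -> np.ndarray:
    """Log power of one EEG window in a few standard frequency bands."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE)
    bands = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 70)]
    return np.log([spectrum[(freqs >= lo) & (freqs < hi)].sum() + 1e-12
                   for lo, hi in bands])

class SeizureAdvisor:
    """Learn centroids of baseline vs pre-seizure windows, then grade
    new windows as blue (low), white (elevated) or red (imminent)."""

    def fit(self, windows, labels):
        X = np.array([band_powers(w) for w in windows])
        y = np.asarray(labels)
        self.pre_centroid = X[y == 1].mean(axis=0)   # 1 = recorded just before a seizure
        self.base_centroid = X[y == 0].mean(axis=0)  # 0 = ordinary activity
        return self

    def advise(self, window) -> str:
        x = band_powers(window)
        d_pre = np.linalg.norm(x - self.pre_centroid)
        d_base = np.linalg.norm(x - self.base_centroid)
        score = d_base / (d_base + d_pre)  # approaches 1 near the pre-seizure centroid
        if score > 0.7:
            return "red"    # warning: seizure likely within minutes
        if score > 0.5:
            return "white"  # elevated risk
        return "blue"       # low risk
```

A real advisory system would be trained per patient on months of recordings and validated far more carefully than this toy rule suggests.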
Keyword: Epilepsy; Robotics
Link ID: 18107 - Posted: 05.02.2013
Distinct patterns of brain activity are linked to greater rates of relapse among alcohol-dependent patients in early recovery, a study has found. The research, supported by the National Institutes of Health, may give clues about which people in recovery from alcoholism are most likely to return to drinking. "Reducing the high rate of relapse among people treated for alcohol dependence is a fundamental research issue," said Kenneth R. Warren, Ph.D., acting director of the National Institute on Alcohol Abuse and Alcoholism (NIAAA), part of NIH. "Improving our understanding of the neural mechanisms that underlie relapse will help us identify susceptible individuals and could inform the development of other prevention strategies." Using brain scans, researchers found that people in recovery from alcoholism who showed hyperactivity in areas of the prefrontal cortex during a relaxing scenario were eight times as likely to relapse as those showing normal brain patterns or healthy controls. The prefrontal cortex plays a role in regulating emotion, the ability to suppress urges, and decision-making. Chronic drinking may damage regions involved in self-control, affecting the ability to regulate cravings and resist relapse. Findings from the study, which was funded by NIAAA, appear online at the JAMA Psychiatry website.
Keyword: Drug Abuse; Brain imaging
Link ID: 18106 - Posted: 05.02.2013
By Scott O. Lilienfeld and Hal Arkowitz A German children's book from 1845 by Heinrich Hoffmann featured “Fidgety Philip,” a boy who was so restless he would writhe and tilt wildly in his chair at the dinner table. Once, using the tablecloth as an anchor, he dragged all the dishes onto the floor. Yet it was not until 1902 that a British pediatrician, George Frederic Still, described what we now recognize as attention-deficit hyperactivity disorder (ADHD). Since Still's day, the disorder has gone by a host of names, including organic drivenness, hyperkinetic syndrome, attention-deficit disorder and now ADHD. Despite this lengthy history, the diagnosis and treatment of ADHD in today's children could hardly be more controversial. On his television show in 2004, Phil McGraw (“Dr. Phil”) opined that ADHD is “so overdiagnosed,” and a survey in 2005 by psychologists Jill Norvilitis of the University at Buffalo, S.U.N.Y., and Ping Fang of Capital Normal University in Beijing revealed that in the U.S., 82 percent of teachers and 68 percent of undergraduates agreed that “ADHD is overdiagnosed today.” According to many critics, such overdiagnosis raises the specter of medicalizing largely normal behavior and relying too heavily on pills rather than skills—such as teaching children better ways of coping with stress. Yet although data point to at least some overdiagnosis, at least in boys, the extent of this problem is unclear. In fact, the evidence, with notable exceptions, appears to be stronger for the undertreatment than overtreatment of ADHD. © 2013 Scientific American
Keyword: ADHD; Drug Abuse
Link ID: 18105 - Posted: 05.02.2013
Alison Abbott Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack. A paper published in PLoS ONE last week (ref. 1) reports that nine different experiments failed to replicate this example of ‘intelligence priming’, first described in 1998 (ref. 2) by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks. David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he “stands by the general effect” and blames the failure to replicate on “poor experiments”. An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). “It’s about more than just replicating results from one paper,” says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of ‘unconscious-thought theory’. © 2013 Nature Publishing Group
Keyword: Attention; Consciousness
Link ID: 18104 - Posted: 05.01.2013
By Breanna Draxler The ruse is common in spy movies—an attractive female saunters in at a critical moment and seduces the otherwise infallible protagonist, duping him into giving up the goods. It works in Hollywood and it works in real life, too. Men tend to say yes to attractive women without really scrutinizing whether or not they are trustworthy. But scientists have shown, for the first time, that a drug may be able to overcome this “honey trap,” and help men make more rational decisions. Nearly 100 men participated in the study; half were given minocycline, an antibiotic normally used to treat acne, and half were given a placebo. After four days of this drug regimen, participants played a computerized one-on-one trust game with eight different women, based only on pictures of the female players. In each round, the male player was given $13 and shown a picture of one of the female players. The male player would choose how much money he wanted to keep and how much he wanted to give to the female player. The amount given away was then tripled, and the female player would decide whether to split the money with the man or keep it all for herself. Unbeknownst to the men, however, the women kept the money every time. The researchers also asked the men to evaluate the photos of the females to determine how trustworthy and attractive they appeared, on a scale of 0 to 10.
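To make the round's payoffs concrete, here is a minimal sketch of the trust game as the article describes it: a $13 endowment, transfers tripled, and the partner choosing to split or keep. The Python framing and the function name are illustrative assumptions, not the study's code:

```python
# Hypothetical replay of one round of the trust game described above.
# The $13 endowment and the tripling of transfers come from the article;
# everything else (names, defaults) is invented for illustration.
def trust_round(sent: float, endowment: float = 13.0,
                multiplier: float = 3.0, partner_shares: bool = False):
    """Return (sender_payoff, partner_payoff) for one round."""
    assert 0 <= sent <= endowment, "can only send what you were given"
    pot = sent * multiplier          # the transferred amount is tripled
    kept = endowment - sent
    if partner_shares:               # partner splits the pot evenly
        return kept + pot / 2, pot / 2
    return kept, pot                 # partner keeps the whole pot

# In the study the partner kept the money every time, so a trusting
# transfer of $10 left the sender with $3 and the partner with $30:
print(trust_round(10.0))  # -> (3.0, 30.0)
```

Seen this way, every dollar withheld is insurance against an untrustworthy partner, which is exactly the scrutiny the researchers were testing.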
Keyword: Emotions; Sexual Behavior
Link ID: 18103 - Posted: 05.01.2013
By ALAN SCHWARZ FRESNO, Calif. — Lisa Beach endured two months of testing and paperwork before the student health office at her college approved a diagnosis of attention deficit hyperactivity disorder. Then, to get a prescription for Vyvanse, a standard treatment for A.D.H.D., she had to sign a formal contract — promising to submit to drug testing, to see a mental health professional every month and to not share the pills. “As much as it stunk, it’s nice to know, ‘O.K., this is legit,’ ” said Ms. Beach, a senior at California State University, Fresno. The rigorous process, she added, has deterred some peers from using the student health office to obtain A.D.H.D. medications, stimulants long abused on college campuses. “I tell them it takes a couple months,” Ms. Beach said, “and they’re like, ‘Oh, never mind.’ ” Fresno State is one of dozens of colleges tightening the rules on the diagnosis of A.D.H.D. and the subsequent prescription of amphetamine-based medications like Vyvanse and Adderall. Some schools are reconsidering how their student health offices handle A.D.H.D., and even if they should at all. Various studies have estimated that as many as 35 percent of college students illicitly take these stimulants to provide jolts of focus and drive during finals and other periods of heavy stress. Many do not know that it is a federal crime to possess the pills without a prescription and that abuse can lead to anxiety, depression and, occasionally, psychosis. Although few experts dispute that stimulant medications can be safe and successful treatments for many people with a proper A.D.H.D. diagnosis, the growing concern about overuse has led some universities, as one student health director put it, “to get out of the A.D.H.D. business.” © 2013 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 18102 - Posted: 05.01.2013
By Ferris Jabr This month the American Psychiatric Association (APA) will publish the fifth edition of its guidebook for clinicians, the Diagnostic and Statistical Manual of Mental Disorders, or DSM-5. Researchers around the world have eagerly anticipated the new manual, which, in typical fashion, took around 14 years to revise. The DSM describes the symptoms of more than 300 officially recognized mental illnesses—depression, bipolar disorder, schizophrenia and others—helping counselors, psychiatrists and general care practitioners diagnose their patients. Yet it has a fundamental flaw: it says nothing about the biological underpinnings of mental disorders. In the past, that shortcoming reflected the science. For most of the DSM's history, investigators have not had a detailed understanding of what causes mental illness. That excuse is no longer valid. Neuroscientists now understand some of the ways that brain circuits for memory, emotion and attention malfunction in various mental disorders. Since 2009 clinical psychologist Bruce Cuthbert and his team at the National Institute of Mental Health have been constructing a classification system based on recent research, which is revealing how the structure and activity of a mentally ill brain differs from that of a healthy one. The new framework will not replace the DSM, which is too important to discard, Cuthbert says. Rather he and his colleagues hope that future versions of the guide will incorporate information about the biology of mental illness to better distinguish one disorder from another. Cuthbert, whose project may receive additional funding from the Obama administration's planned Brain Activity Map initiative, is encouraging researchers to study basic cognitive and biological processes implicated in many types of mental illness. Some scientists might explore how and why the neural circuits that detect threats and store fearful memories sometimes behave in unusual ways after traumatic events—the kinds of changes that are partially responsible for post-traumatic stress disorder. Others may investigate the neurobiology of hallucinations, disruptions in circadian rhythms, or precisely how drug addiction rewires the brain. © 2013 Scientific American
Keyword: Schizophrenia; Depression
Link ID: 18101 - Posted: 05.01.2013
By NICHOLAS BAKALAR A large new study confirms that sticking to the Mediterranean diet — fish, poultry, vegetables and fruit, with minimal dairy foods and meat — may be good for the brain. Researchers prospectively followed 17,478 mentally healthy men and women 45 and older, gathering data on diet from food questionnaires, and testing mental function with a well-validated six-item screening tool. The researchers ranked each participant's adherence to the Mediterranean diet on a 10-point scale, dividing the group into low adherence and high adherence. The study was published April 30 in the journal Neurology. During a four-year follow-up, 1,248 people became cognitively impaired. But those with high adherence to the diet were 19 percent less likely to be among them. This association persisted even after controlling for almost two dozen demographic, environmental and vascular risk factors, and held true for both African-Americans and whites. The study included 2,913 people with Type 2 diabetes, but for them adherence to the diet had no effect on the likelihood of becoming impaired. The lead author, Dr. Georgios Tsivgoulis, an assistant professor of neurology at the University of Athens, said that this is the largest study of its kind. The Mediterranean diet, he added, “has many benefits — cardiovascular, cancer risk, anti-inflammatory, central nervous system. We’re on the tip of the iceberg, and trying to understand what is below.” Copyright 2013 The New York Times Company
Keyword: Obesity; Alzheimers
Link ID: 18100 - Posted: 05.01.2013