Most Recent Links
By Joshua Hartshorne The setting: a nursery. A baby speaks directly to the camera: “Look at this. I’m a free man. I go anywhere I want now.” He describes his stock-buying activities, but then his phone interrupts. “Relentless! Hang on a second.” He answers his phone. “Hey girl, can I hit you back?” This E*Trade commercial is only the latest proof of what comedians have known for years: few things are as funny as a baby who talks like an adult. This comedic law obscures an important question: Why don’t young children express themselves articulately? And why is the idea of a toddler speaking in perfect sentences so hilarious? Many people assume children learn to talk by copying what they hear. In other words, babies listen to the words adults use and the situations in which they use them and imitate accordingly. Behaviorism, the scientific approach that dominated American cognitive science for the first half of the 20th century, made exactly this argument. This “copycat” theory can’t explain why toddlers aren’t as loquacious as adults, however. After all, when was the last time you heard literate adults express themselves in one-word sentences (“bottle,” “doggie”) or in short phrases such as, “Mommy open box.” Of course, showing that a copycat theory of language acquisition can’t explain these strange patterns in child speech is easy. Actually explaining one-word sentences is much harder. Over the past half-century, scientists have settled on two reasonable possibilities. © 1996-2009 Scientific American Inc.
Keyword: Language; Development of the Brain
Link ID: 12512 - Posted: 06.24.2010
By BENEDICT CAREY They’re considered a release, a psychological tonic, and to many a glimpse of something deeper: the heart’s own sign language, emotional perspiration from the well of common humanity. Tears lubricate love songs and love, weddings and funerals, public rituals and private pain, and perhaps no scientific study can capture their many meanings. “I cry when I’m happy, I cry when I’m sad, I may cry when I’m sharing something that’s of great significance to me,” said Nancy Reiley, 62, who works at a women’s shelter in Tampa, Fla., “and for some reason I sometimes will cry when I’m in a public speaking situation. “It has nothing to do with feeling sad or vulnerable. There’s no reason I can think of why it happens, but it does.” Now, some researchers say that the common psychological wisdom about crying — crying as a healthy catharsis — is incomplete and misleading. Having a “good cry” can and usually does allow people to recover some mental balance after a loss. But not always and not for everyone, argues a review article in the current issue of the journal Current Directions in Psychological Science. Placing such high expectations on a tearful breakdown most likely sets some people up for emotional confusion afterward. Copyright 2009 The New York Times Company
Keyword: Emotions
Link ID: 12511 - Posted: 06.24.2010
By Bruce Bower Dreams don’t just bubble up at night and then evaporate like morning dew once the sun rises. What you dream shapes what you think about your upcoming plans and your closest confidants, especially if nighttime reveries fit with what’s already convenient to believe, a new report finds. In an effort to understand whether people take their dreams seriously, Carey Morewedge of Carnegie Mellon University in Pittsburgh and Michael Norton of Harvard University surveyed 149 college students attending universities in India, South Korea or the United States about theories of dream function. People across cultures often assume that dreams contain hidden truths, much as Sigmund Freud posited more than a century ago, Morewedge and Norton report in the February Journal of Personality and Social Psychology. In fact, many individuals consider dreams to provide more meaningful information regarding daily affairs than comparable waking thoughts do, the two psychologists conclude. Ideas that dreams come from the brain’s random output or are essential for daily problem-solving or for weeding out the routine clutter in one’s mind appeal to a minority of people, the scientists say. In a series of experiments, the researchers also probed interpretations of various real and imagined dreams in a national sample of 270 people surveyed online, 656 commuters and pedestrians interviewed in Boston and Cambridge, Mass., and 60 college students. © Society for Science & the Public 2000 - 2009
Keyword: Sleep
Link ID: 12510 - Posted: 06.24.2010
By Judy Foreman A troubled, gun-wielding 23-year-old student at Virginia Polytechnic Institute goes on a campus rampage, killing 32 people and eventually himself. An MIT student commits suicide by ingesting cyanide, and another dies in a fire after an overdose. Such highly publicized occurrences underscore the sense of personal angst on today's college campuses. But contrary to popular belief, the stress young people experience has nothing to do with meeting the demands of higher education. It comes simply with being a newly minted adult. Whether in college or not, almost half of this country's 19-to-25-year-olds meet standard criteria for at least one psychiatric disorder, although some of the disorders, such as phobias, are relatively mild, according to a government-funded survey of more than 5,000 young adults, published in December in the Archives of General Psychiatry. The study, done at Columbia University and called the National Epidemiologic Study on Alcohol and Related Conditions, found more alcohol use disorders among college students, while their noncollege peers were more likely to have a drug use disorder. But, beyond that, misery is largely an equal-opportunity affliction: Across the social spectrum, young people in America are depressed. They're anxious. They regularly break one another's hearts. And, all too often, they don't get the help they need as they face life's questions: © 2009 NY Times Co.
Keyword: Development of the Brain; Emotions
Link ID: 12509 - Posted: 06.24.2010
by Peter Aldhous GASTRIC surgery is a last resort for people who are dangerously obese. But there may soon be a gentler option in the shape of a removable device inserted into the gut through the mouth. The EndoBarrier, developed by GI Dynamics of Lexington, Massachusetts, is an impermeable sleeve that lines the first 60 centimetres of the small intestine. In animal experiments and preliminary human trials, it reduces weight and rapidly brings type II diabetes under control. Given the rising tide of obesity across the developed world, new treatments are a matter of priority. In the US alone, more than 15 million adults meet the criteria for gastric surgery because they have a body mass index of more than 40, or a BMI of 35 plus a complication such as diabetes. While the operations do cause dramatic and sustained weight loss, their high cost and concerns about the risk of dying on the operating table mean only a fraction of those who might benefit go on to have the surgery. According to the American Society for Metabolic and Bariatric Surgery, around 220,000 people in the US had gastric surgery for weight loss in 2008. GI Dynamics is not the only company working on alternatives (see "Wired for weight loss"), but its approach is appealing for its simplicity and low cost. The device, enclosed in a capsule, is inserted via the mouth using an endoscope. Once in place below the base of the stomach, the capsule releases a small ball that, with the help of a catheter, pulls a flexible sleeve made of the slippery polymer PTFE through the intestine. The ball is jettisoned and the sleeve fixed in place by releasing a spiked attachment made from the shape-memory metal alloy nitinol. © Copyright Reed Business Information Ltd.
Keyword: Obesity
Link ID: 12508 - Posted: 06.24.2010
By Carolyn Y. Johnson The humble honeybee can count - at least, sort of. New research shows that honeybees can tell the difference between different numbers of objects, up to four. That means the buzzing insects join the ranks of the numerically competent - pigeons, raccoons, dolphins, monkeys, songbirds, and salamanders, all of which have been shown to have some form of numerical ability. The researchers are quick to note their results do not suggest that honeybees can actually put numbers in order or do math. But the research does show bees can see features in visual patterns. Even if Pythagoras isn't at work in the hive, the work shines a light on an insect whose intellect is outsized for its buzzing body and minuscule brain. "Is it surprising? No. Is it amazing? Absolutely," said Kim Flottum, president of the Eastern Apicultural Society and editor of Bee Culture, who was not involved in the study. "Bees are just incredibly interesting . . . they just keep showing us things they can do that we didn't think anything as small as a bee should be able to do." To test bees' ability to recognize different numbers, researchers in Australia and Germany first trained bees to recognize one pattern of dots. The bees flew into a tunnel whose opening was marked with that pattern, then continued until they came to a fork in the tunnel, with the ability to go right or left. One choice was marked with the same dot pattern as the tunnel opening, the other with a different pattern. The bees that chose the matching pattern received a reward of sugar water. © 2009 NY Times Co.
Keyword: Miscellaneous
Link ID: 12507 - Posted: 06.24.2010
Reading a book triggers an active response in a person's brain, replicating the activity described in the story, a study by Washington University researchers in St. Louis, Mo., indicates. A brain-imaging study at Washington University tracked brain activity as participants read sections of a story. What scientists discovered was that parts of the brain associated with certain activities described in the story would light up as the person read those sections. For instance, if a character pulled a light cord in the story, the frontal lobe region, which controls grasping motions, would increase in activity. "There has been good evidence for a while that mental simulation — imagination — can improve performance in sport and other skilled behaviours. This study suggests that readers do mental simulation when they comprehend a story," Jeffrey Zacks, director of the university's dynamic cognition laboratory, told the Guardian newspaper. Zacks is also co-author of the study, soon to be published in the journal Psychological Science. The study's lead author is Nicole Speer. © CBC 2009
Keyword: Brain imaging; Language
Link ID: 12506 - Posted: 06.24.2010
By Carey Goldberg Marcie Lipsitt found that all she had to do was page Dr. Joseph Biederman or another doctor on his pediatric psychiatry team with that message, and she would get a call back within two minutes to help her through her son Andrew's latest violent crisis. Lynn Tesher credits the Biederman team at Massachusetts General Hospital with giving her several more years with her daughter, Ariel. Tormented from early childhood by rages and emotional pain, Ariel jumped to her death from her Manhattan balcony at age 20, but without the care from Biederman's team, Tesher is convinced, "she would not have lived past 13." For these mothers and some other parents, conflict-of-interest allegations against Biederman, one of the nation's leading child psychiatrists, are not just baffling, but personally upsetting. They see the doctor, a member of the Harvard Medical School faculty, being investigated by a powerful member of Congress and portrayed in the media as an apparent example of how drug company money taints medicine. To some, what is really under attack is child psychiatry, the research that aims to improve how it is practiced, and the terribly ill children who need help. US Senator Charles E. Grassley, Republican of Iowa, has accused Biederman of failing to tell Harvard until last March about most of the more than $1.5 million that the pharmaceutical industry paid him in consulting and speaking fees between 2000 and 2007. Biederman also created a research center with funding from Johnson & Johnson, a company that sells a popular antipsychotic drug. © 2009 NY Times Co.
Keyword: Development of the Brain; Schizophrenia
Link ID: 12505 - Posted: 06.24.2010
By Coco Ballantyne A Charlotte, N.C., man was charged with first-degree murder of a 79-year-old woman whom police said he scared to death. The Associated Press reports that 20-year-old Larry Whitfield, in an attempt to elude cops after a botched bank robbery, broke into and hid out in the home of Mary Parnell. Police say he didn't touch Parnell but that she died after suffering a heart attack that was triggered by terror. Can the fugitive be held responsible for the woman's death? Prosecutors said that he can under the state's so-called felony murder rule, which allows someone to be charged with murder if he or she causes another person's death while committing or fleeing from a felony crime such as robbery—even if it's unintentional. But, medically speaking, can someone actually be frightened to death? We asked Martin A. Samuels, chairman of the neurology department at Brigham and Women's Hospital in Boston. Is it possible to literally be scared to death? Absolutely, no question about it. Really? How does that happen? The body has a natural protective mechanism called the fight-or-flight response, which was originally described by Walter Cannon [chairman of Harvard University's physiology department from 1906 to 1942]. If, in the wild, an animal is faced with a life-threatening situation, the autonomic (involuntary) nervous system responds by increasing heart rate, increasing blood flow to the muscles, dilating the pupils, and slowing digestion, among other things. All of this increases the chances of succeeding in a fight or running away from, say, an aggressive jaguar. This process certainly would be of help to primitive humans, but the problem, of course, is that in the modern world there is very limited advantage of the fight-or-flight response. There is a downside to revving up your nervous system like this. © 1996-2009 Scientific American Inc.
Keyword: Emotions; Stress
Link ID: 12504 - Posted: 06.24.2010
By Nathan Seppa Chronically elevated blood levels of the simple sugar glucose may contribute to poor cognitive function in elderly people with diabetes, a study in the February Diabetes Care suggests. But whether these levels add to a person’s risk of developing dementia is unclear, the study authors say. People with diabetes face a risk of old-age dementia that’s roughly 50 percent greater than those without diabetes, past studies have shown. Research has also hinted that surges in blood sugar might account for some of that added risk. Many previous studies have tested for elevated blood glucose by obtaining a snapshot blood sample taken after a person has fasted for a day. In the new study, Tali Cukierman-Yaffe, an endocrinologist at Tel-Aviv University and McMaster University in Hamilton, Canada, teamed with an international group of colleagues to assess blood glucose levels in nearly 3,000 diabetes patients by measuring A1c, shorthand for HbA1c or glycosylated hemoglobin. Since sugar in the blood sticks to the hemoglobin protein in red blood cells, the A1c test reveals an average sugar level over two or three months. In addition to collecting these blood glucose readings, the scientists also asked each volunteer to take a 30-minute battery of four standardized tests designed to assess memory, visual motor speed, capacity for learning and managing multiple tasks. © Society for Science & the Public 2000 - 2009
Keyword: Alzheimers; Obesity
Link ID: 12503 - Posted: 06.24.2010
By Benjamin Lester Inspirational followers may be just as important as stellar leaders, at least in fish. A new study finds that timid three-spined sticklebacks can inspire greater daring in their bold counterparts. The findings illustrate that leadership may be as much a product of social context as of individual temperament. Over the past several years, researchers have worked to understand how complex group behaviors arise from simple decisions by individuals. For instance, an ant trail might form on one tree branch instead of another because the first few ants randomly picked that branch and later ants followed their scent. But according to evolutionary biologist Andrea Manica and his colleagues at the University of Cambridge in the United Kingdom, much of the work has focused on situations in which all individuals are genetically very similar, such as groups of social insects. Less well understood, says Manica, is how the greater individual differences in vertebrates' personalities can influence group behavior. In these situations, certain individuals often become group leaders. Previous studies have identified boldness--the amount of time an individual is willing to stay exposed in order to forage for food--as a trait of leaders in groups of sticklebacks (Gasterosteus aculeatus). To understand how boldness could translate into leadership, the Cambridge team set up aquaria in which one side was a "safe area" with deep water and plastic plants and the other side was a "risky," exposed area designed to make the fish feel vulnerable to being eaten by birds. The team placed one stickleback in each aquarium half, separated them with an opaque divider, and trained the fish to expect food only in the exposed area: To eat, the fish had to take risks. The scientists then observed each fish's behavior and assigned it a score on a boldness scale. They then randomly re-paired the fish, using the boldness scores to classify each fish as either "bold" or "shy," relative to its new partner. This time they inserted clear and opaque dividers into the tanks. © 2009 American Association for the Advancement of Science.
Keyword: Animal Communication; Emotions
Link ID: 12502 - Posted: 06.24.2010
By Jesse Bering I wish I could say that I decided to come out of the closet in my early twenties for more admirable reasons—such as for love or the principle of the thing. But the truth is that passing for a straight person had become more of a hassle than I figured it was worth. Since the third grade, I’d spent too many valuable cognitive resources concocting deceptive schemes to cover up the fact that I was gay. In fact, my earliest conscious tactic to hide my homosexuality involved being outlandishly homophobic. When I was eight years old, I figured that if I used the word “fag” a lot and on every possible occasion expressed my repugnance for gay people, others would obviously think I was straight. But, although it sounded good in theory, I wasn’t very hostile by temperament and I had trouble channeling my fictitious outrage into convincing practice. I may have failed as a homophobe, but unfortunately, many people succeed. And it turns out we may have something in common—many young, homophobic males may secretly harbor homosexual desires (whether they are consciously trying to deceive the world about them as I was or not even aware they exist). One of the most important lines of work in this area dates back to a 1996 article published in The Journal of Abnormal Psychology. In this empirical paper, researchers Henry Adams, Lester Wright, Jr., and Bethany Lohr from the University of Georgia report evidence that homophobic young males may secretly have gay urges. © 1996-2009 Scientific American Inc
Keyword: Sexual Behavior; Aggression
Link ID: 12501 - Posted: 06.24.2010
By Martin Enserink Serotonin, the brain chemical involved in depression, anger, and a variety of other human behaviors, turns out to have another surprising role: It transforms desert locusts from solitary, innocuous bugs into swarming, voracious pests that can ravage orchards and fields in a matter of hours. The findings, published in tomorrow's issue of Science, could point the way to new locust-control methods that don't rely on insecticides. Most of the time, the desert locust (Schistocerca gregaria) is a bland, greenish insect that lives an inconspicuous life, shunning other members of its species and flying only by night. But when their densities reach a certain threshold, locusts become gregarious: They seek out one another's company, start reproducing explosively, and eventually form massive swarms that can move thousands of kilometers beyond their usual habitats and create havoc of biblical proportions. The behavior changes are accompanied by a complete physical makeover, taking several generations, during which the insects first turn pink and eventually black and bright yellow. A team of researchers based at three universities in the United Kingdom and Australia had previously discovered that the change from solitary to gregarious starts when locusts see and smell one another, or when their hind legs touch one another, a stimulus researchers can imitate in the lab by gently tickling them. In a 2004 paper, the group also showed that levels of 13 brain chemicals differ between insects in the two stages (Science, 10 December 2004, p. 1881). Now, the researchers have singled out serotonin as "the first domino to fall, the one that sets the entire process in motion," says lead author Michael Anstey of the University of Oxford in the U.K. © 2009 American Association for the Advancement of Science
Keyword: Aggression
Link ID: 12500 - Posted: 06.24.2010
by Linda Geddes For the first time, some of the disability associated with the early stages of multiple sclerosis appears to have been reversed. The treatment works by resetting patients' immune systems using their own stem cells. While randomised clinical trials are still needed to confirm the findings, they offer new hope to people in the early stages of the disease who don't respond to drug treatment. Multiple sclerosis is an autoimmune disease in which the fatty myelin sheath that wraps around nerve cells and speeds up their rate of transmission comes under attack from the body's own defences. Clean slate Richard Burt of Northwestern University Feinberg School of Medicine in Chicago and his colleagues had previously tried using stem cells to reverse this process in patients with advanced stages of the disease, with little success. "If you wait until there's neuro-degeneration, you're trying to close the barn door after the horse has already escaped," says Burt. What you really want to do is stop the autoimmune attack before it causes nerve-cell damage, he adds. In the latest trial, his team recruited 12 women and 11 men in the early relapsing-remitting stage of MS, who had not responded to treatment with the drug, interferon beta, after six months. © Copyright Reed Business Information Ltd.
Keyword: Multiple Sclerosis; Stem Cells
Link ID: 12499 - Posted: 06.24.2010
By Bruce Bower Good parenting provides a potent buffer against some youngsters’ genetic predisposition to use alcohol, cigarettes and marijuana by age 14, a new study finds. Uninvolved, unsupportive parenting heralds a spike in consumption of these substances among genetically vulnerable teens, reports a team led by psychologist Gene Brody of the University of Georgia in Athens. Brody and his colleagues conducted what to their knowledge is the first long-term examination of how parenting practices combine with a child’s genetic makeup to either prompt or prevent early drug use. Their results, based on a three-year study of rural, black youths from working poor families, a population that Brody’s Center for Family Research works with regularly, appear in the February Journal of Consulting and Clinical Psychology. “Our study emphasizes that there are protective processes in children’s lives, such as effective parenting, that shield them from a genetic risk for early substance use,” Brody says. His team focused on variations in the serotonin transporter gene, or 5HTT. This gene assists in regulating transmission of serotonin, a chemical messenger in the brain. Many people carry two copies of a long version of 5HTT. But approximately 40 percent of people inherit either one or two copies of a short version of the gene. Having at least one short version lessens serotonin transmission, relative to two long versions. © Society for Science & the Public 2000 - 2009
Keyword: Drug Abuse; Sexual Behavior
Link ID: 12498 - Posted: 06.24.2010
Brain cells called astrocytes help to cause the urge to sleep that comes with prolonged wakefulness, according to a study in mice, funded by the National Institutes of Health. The cells release adenosine, a chemical known to have sleep-inducing effects that are inhibited by caffeine. "Millions of Americans suffer from disorders that prevent a full night's sleep, and others—from pilots to combat soldiers – have jobs where sleepiness is a hazard. This research could lead to better drugs for inducing sleep when it is needed, and for staving off sleep when it is dangerous," says Merrill Mitler, Ph.D., a program director with the NIH's National Institute of Neurological Disorders and Stroke (NINDS). The study appears Jan. 29, 2009 in Neuron, and was funded by NINDS, the National Institute of Mental Health (NIMH) and the National Institute on Aging (NIA), all part of NIH. It is the result of a collaboration among Michael Halassa, M.D., and Philip Haydon, Ph.D., at Tufts University School of Medicine in Boston and Marcos Frank, Ph.D., and Ted Abel, Ph.D. at the University of Pennsylvania School of Medicine in Philadelphia. Although the exact purpose of sleep is unknown, everyone seems to need it, and some research suggests that it strengthens memories by adjusting the connections between neurons. As the waking hours tick by, all animals experience an increasing urge to sleep, known as sleep pressure. If sleep is delayed, a deep, long sleep usually follows as the body's means of compensating. Prior studies pointed to adenosine as a trigger for sleep pressure. The chemical accumulates in the brain during waking hours, eventually helping to stimulate the unique patterns of brain activity that occur during sleep.
LONDON - Doctors have long assumed that most antidepressants are interchangeable. But according to a new study, Zoloft and Cipralex work slightly better than 10 other popular drugs, and should be psychiatrists’ first choice for patients with moderate to severe depression. Previous research found few differences between antidepressants. A U.S. government study in 2006 concluded that patients with major depression did equally well on different drugs. But in a paper published online Thursday in the Lancet medical journal, two antidepressants came out on top, though only marginally. International doctors examined more than 100 previous studies on a dozen antidepressants, which included nearly 26,000 patients from 1991 to 2007. They found that Zoloft, developed by Pfizer Inc., and Cipralex, developed by Forest Laboratories in the U.S. and Danish drugmaker H. Lundbeck A/S in Europe, were the best options when considering benefits, side effects and cost. In contrast, Pfizer’s Edronax was the least effective. The other drugs tested were Celexa, Cymbalta, Efexor, Ixel, Luvox, Prozac, Seroxat, Remeron, and Zyban. “The bottom line is that there is a rational hierarchy when prescribing antidepressants,” said Dr. Andrea Cipriani, the study’s lead author, of the University of Verona in Italy. © 2009 The Associated Press.
Keyword: Schizophrenia
Link ID: 12496 - Posted: 06.24.2010
By Christof Koch At the heart of science are judicious observations and measurements. This reality presupposes that something can be measured. But how can consciousness—the notoriously ineffable and ethereal stuff that can’t even be rigorously defined—be measured? Recent progress makes me optimistic. Consider a problem of great clinical, ethical and legal relevance, that of inferring the presence of consciousness in severely brain-damaged patients. Often the victims of traffic accidents, cardiac arrests or drug overdoses, such patients have periods when they are awake, and they may spontaneously open their eyes. On occasion, their head turns in response to a loud noise, or their eyes might briefly track an object, but never for long. They might grind their teeth, swallow or smile, but such activities occur sporadically, not on command. These fragmentary acts appear reflexlike, generated by an intact brain stem. As many as 25,000 such “vegetative” patients in hospices and nursing homes hover for years in this limbo, at a steep emotional and financial cost. The extent of the damage and the persistent absence of purposeful behavior usually leave little doubt that consciousness has fled the body for good. Terri Schiavo was such a case, alive but unconscious for 15 years before her court-ordered death in 2005 in Florida. Even worse, though, is the possibility that some of these patients may experience some remnants of consciousness, unable to communicate their feelings of discomfort or pain, agonizing thoughts or poignant memories to the outside world. Until recently, nothing could be done to diagnose when an awake mind was entombed inside a damaged brain. © 1996-2009 Scientific American Inc.
Keyword: Miscellaneous
Link ID: 12495 - Posted: 06.24.2010
By Johannes Haushofer and Ernst Fehr Imagine you are serving on a jury: the defendant is charged with murder, but he also suffers from a brain tumor that causes erratic behavior. Is he to be held responsible for the crime? Now imagine you are the judge: What should the defendant’s sentence be? Does the tumor count as a mitigating circumstance? The assignment of responsibility and the choice of an appropriate punishment lie at the heart of our justice system. At the same time, these are cognitive processes like many others—reasoning, remembering, decision-making—and as such must originate in the brain. These two facts lead to the intriguing question: How does the brain enable judges, juries, and you and me to perform these tasks? What are the neural mechanisms that let you decide whether someone is guilty or innocent? A recent study, published in the December 2008 issue of the journal Neuron by Joshua Buckholtz and his colleagues at Vanderbilt University, tackles exactly this question. Until recently, such topics would have been out of the reach of cognitive neuroscience for lack of methods; today, functional magnetic resonance imaging (fMRI) allows researchers to watch the brain “in action” as normal human participants make decisions about responsibility and punishment. In the new study, Buckholtz and colleagues asked participants to read vignettes describing hypothetical crimes that a fictitious agent, “John,” commits against another person. The stories were divided into three conditions: in the first, the “responsibility” (R) condition, the perpetrator was fully responsible for the negative consequences of his action against the victim; for instance, John might have intentionally pushed his fiancée’s lover off a cliff. In the “diminished responsibility” (DR) condition, mitigating circumstances were present that reduced John’s responsibility; imagine that John committed the same crime, but suffered from a brain tumor. © 1996-2009 Scientific American Inc.
Keyword: Emotions
Link ID: 12494 - Posted: 06.24.2010
By Rachel Zelkowitz Did Grandma seem forgetful at the holiday parties last month? It could be time to put her on a diet. Sharply reducing calories improves memory in older adults, according to one of the first studies of dietary restriction and cognitive function in humans. Research on the benefits of an extremely low-calorie diet stretches back to the 1930s, when scientists found that rats lived up to twice as long when they nibbled less than control animals. Since then, some studies with rodents and nonhuman primates have shown that this spare diet, known as calorie restriction, improves some markers of diabetes and heart disease, such as blood glucose and triglyceride levels, and possibly prevents neurological declines similar to those seen with Alzheimer's disease and Parkinson's disease. In humans, however, the results have been mixed. Subjects on low-calorie diets generally have lower blood pressure and blood sugar levels than their chow-happy counterparts. But these studies were small, and none was designed to test how calorie restriction might affect cognitive performance. To fill that void, neurologist Agnes Flöel and her colleagues at the University of Muenster in Germany recruited 50 healthy elderly subjects. The average volunteer was 60 years old and overweight, with a body mass index of 28. The researchers randomly assigned the volunteers to one of three groups. Twenty people were instructed to reduce their daily calorie intake by 30%, while still eating a balanced diet of nutrient-rich carbohydrates, fats, and lean proteins. Another 20 were told to keep their caloric intake the same but increase their consumption of unsaturated fatty acids, such as those found in salmon or olive oil. (Previous studies have linked a diet rich in these fats to improved cognition.) The remaining 10 volunteers did not change their diets. © 2009 American Association for the Advancement of Science.
Keyword: Alzheimers; Learning & Memory
Link ID: 12493 - Posted: 06.24.2010

