Most Recent Links
By WALECIA KONRAD DR. ELIZABETH WALTON, a 43-year-old internist in Atlanta and the mother of twin 4-year-old boys, has a common, if sometimes embarrassing, health problem. She snores — loudly. And she has tried to fix it with a variety of things, including a machine that blows air down her throat and an oral appliance that looks something like a mouthguard worn by a hockey player. The appliance works, and Dr. Walton is finally sleeping more easily. (So is her partner.) And because she was told she had obstructive sleep apnea, a more serious disorder than simple snoring, her treatments have been mostly covered by insurance. Still, she estimates she has spent hundreds of dollars in deductibles, co-payments and fees. Dr. Walton would have preferred not to go through so much expensive trial and error: “Unfortunately, it’s the nature of this condition.” Almost half of the adult population snores at least occasionally. Snoring occurs when air flows past relaxed tissues in the throat, causing them to vibrate. Nasal congestion can also contribute to the racket. “We laugh and joke about snoring,” said Dr. Nancy A. Collop, president-elect of the American Academy of Sleep Medicine, “but it can be pretty annoying and disruptive to couples.” What’s more, while ordinary snoring in itself does not present health problems, it may be a sign of sleep apnea, as it was in Dr. Walton’s case. Patients suffering from sleep apnea have airways that are so obstructed they stop or nearly stop breathing during sleep. Copyright 2010 The New York Times Company
Keyword: Sleep
Link ID: 14764 - Posted: 12.11.2010
By Bruce Bower Youth is wasted on the young, but not so for face memory. In an unexpected discovery, people remember unfamiliar faces best between ages 30 and 34, scientists report in an upcoming issue of Cognition. Many researchers think word skills, memory and other mental functions crest in the early 20s, as the brain attains full maturity. Consistent with that assumption, memory for names and for upside-down faces — a task that requires recognition of general visual patterns — hits a high point at ages 23 to 24, says a team led by psychology graduate student Laura Germine of Harvard University. But in an unanticipated twist, face learning takes about a decade longer to be the best it can be, the researchers find in online experiments conducted with 44,680 volunteers, ages 10 to 70. “Specialized face-processing in the brain may require an extended period of visual tuning during early adulthood to help individuals learn and recognize lots of different faces,” Germine says. Although researchers have not previously looked for late-developing face memory, the new findings fit with evidence that a brain structure critical for face recognition — the fusiform gyrus — undergoes reorganization at least through young adulthood, comments psychologist Isabel Gauthier of Vanderbilt University in Nashville. Gauthier hypothesizes that this brain area underlies all sorts of visual expertise, with face recognition as its most prominent achievement (SN: 7/7/01, p. 10). © Society for Science & the Public 2000 - 2010
Keyword: Learning & Memory; Development of the Brain
Link ID: 14763 - Posted: 12.11.2010
By Laura Sanders A menacing substance builds up in the brains of people with Alzheimer’s disease not because they make too much of it, but rather because they can’t get rid of it, a study appearing online December 9 in Science suggests. Understanding how the substance, called amyloid-beta, lodges in the brain is likely to yield clues about how Alzheimer’s disease inflicts its devastating damage. There’s no clear consensus on the ultimate cause of Alzheimer’s, but many scientists think A-beta is at the heart of the disease. The protein is thought to interfere with cells in the brain, scrambling its normal operations. In some rare forms of Alzheimer’s disease, genetic mutations ramp up the production of A-beta, creating an imbalance that floods the brain with the protein. But the cause of the accumulation is murkier for the most common form of Alzheimer’s disease. Although the new study is preliminary and has limitations, it suggests that A-beta clearance is the problem. The research “takes a long-held hypothesis and finally supports it with data,” says psychiatrist Bill Klunk of the University of Pittsburgh School of Medicine. In the study, researchers led by Randall Bateman of the Washington University School of Medicine in St. Louis designed a method to track the flux of A-beta in people living with the disease. The amino acid leucine was labeled with carbon-13, which is scarce in the body, and infused into 12 healthy volunteers and 12 volunteers with Alzheimer’s disease. © Society for Science & the Public 2000 - 2010
Keyword: Alzheimers
Link ID: 14762 - Posted: 12.11.2010
by Andy Coghlan STEM cells from the human brain that were transplanted into the brains of newborn rats have matured and are able to function just like native rat cells. The breakthrough demonstrates the potential for people with brain damage, caused by epilepsy or Parkinson's for example, to use their own brain stem cells as a treatment. The key finding was that the adult stem cells had the ability to turn into all types of brain tissue in the rats. This includes the neocortex, which deals with higher processing, and the hippocampus, involved in memory and spatial awareness. "We're showing the most dramatic integration of human adult neurons into rat brains," says Steven Roper of the University of Florida in Gainesville, who carried out the work. Roper extracted the adult stem cells from tissue he had taken from a teenage girl's brain as part of standard epilepsy surgery. He and his colleague Dennis Steindler multiplied the cells in the lab, then genetically engineered them so that they would glow green under ultraviolet light. Next, they injected groups of the cells into the brains of newborn rats. Three weeks later, they examined the rats' brains and found green cells throughout. "The cells matured into neurons appropriate for each part of the brain they reached," says Roper. © Copyright Reed Business Information Ltd.
Keyword: Stem Cells; Regeneration
Link ID: 14761 - Posted: 12.11.2010
by Carlin Flora At the 2008 Beijing Olympics, the star American gymnast Alicia Sacramone was expected to grab gold. But just as she approached the balance beam, an official pulled her aside. Watching at home on TV, Sian Beilock, an associate professor of psychology at the University of Chicago, cringed. An expert on “choking”—or falling apart under pressure—Beilock knew that allowing an athlete even a second to think about what she’s about to do can be disastrous. Indeed, after getting the all-clear and flipping backward onto the beam, Sacramone teetered, then crashed to the floor, costing her team the all-around title. Anyone who has flubbed a presentation or bombed an easy test knows the heartbreak of the choke. Beilock, who has Ph.D.s in psychology and kinesiology, the study of movement and physical activity, has found that test takers, speech givers, musicians, and top athletes fail in similar ways. (Her lab notably features both stacks of math tests and a putting green.) Choking happens when we let anxious thoughts distract us or when we start trying to consciously control motor skills best left on autopilot. Case in point: When Beilock asked golfers to think about their elbows before taking a shot, they performed worse than usual. Another trigger that can engage the choking mechanism is too much audience support. Home teams experience a significant disadvantage during playoffs or championship games because all the love amps up the pressure—and pressure spurs anxious thoughts and a misguided impulse to take conscious control of well-oiled automatic processes. © 2010, Kalmbach Publishing Co.
Keyword: Attention
Link ID: 14760 - Posted: 12.11.2010
by Alva Noë Inside each of us there is a thing that thinks and feels and wants and decides. Each of us is that thing. This is the traditional view of mind, the view that has dominated establishment research into cognition and consciousness for the last 500 years. Contemporary scientists — neuroscientists as well as other cognitive scientists — by and large take this basic schema for granted no less than Descartes did. Of course, today’s thinkers believe that thing inside us, which is the self we are, is a bit of our flesh (the brain). Descartes, for his part, could not conceive of how mere meat could produce mind, so he supposed that mind was an immaterial something. But this difference, it turns out, and as I argue in Out of Our Heads, is merely technical. Despite having learned so much about the anatomy and physiology of the human brain in the last century, we don’t actually have a better account of how consciousness and cognition arise in the brain than we do of how they might arise out of immaterial soul-stuff. This last claim is not controversial, not really. But then why are we so certain, as a scientific and as a popular culture, that the secrets to our nature lie inside us, in the brain? Answer: We can’t imagine an alternative to this “you are your brain” idea that does not end up giving up on science. Either you are your brain, or you are a mystery. But this is mistaken. Copyright 2010 NPR
Keyword: Attention
Link ID: 14759 - Posted: 12.11.2010
by Greg Miller Before dipping your hand into that bowl of M&Ms at the holiday party, think about what you're about to do. A lot. A new study finds that people who imagine themselves consuming many pieces of candy eat less of the real thing when given the chance. The finding, say experts, could lead to the development of better weight-loss strategies. Picturing a delicious food—like a juicy steak or an ice cream sundae—generally whets the appetite. But what about visualizing yourself eating the entire sundae, spoonful by spoonful? There's reason to think that might have the opposite effect, says Carey Morewedge, an experimental psychologist at Carnegie Mellon University in Pittsburgh, Pennsylvania. Researchers have found that repeated exposure to a particular food—as in taking bite after bite of it—decreases the desire to consume more. This process, which psychologists call habituation, dampens appetite independently of physiological signals like rising blood sugar levels or an expanding stomach. But no one had looked to see whether merely imagining eating has the same effect. To investigate, Morewedge and colleagues Young Eun Huh and Joachim Vosgerau fed M&Ms and cheese cubes to 51 undergraduate students. In one experiment, the participants first imagined performing 33 repetitive motions: Half of them imagined eating 30 M&Ms and inserting three quarters into the slot of a laundry machine. The other half envisioned eating three M&Ms and inserting 30 quarters. Then everyone was allowed to eat their fill from a bowl of M&Ms. Those who'd envisioned eating more candy ate about three M&Ms on average (or about 2.2 grams), whereas the others ate about five M&Ms (or about 4.2 grams), the researchers report in the 10 December issue of Science. © 2010 American Association for the Advancement of Science.
Keyword: Obesity
Link ID: 14758 - Posted: 12.11.2010
Posted by Mo SUZANNE Corkin is a professor of behavioural neuroscience at the Massachusetts Institute of Technology who worked with the famous amnesic patient H.M. for more than 45 years. I interviewed her at the annual meeting of the Society for Neuroscience in San Diego last month, for this article I wrote for The Dana Foundation. We talked about her work with H.M., and about the project to examine his brain now that he has died, which was partly funded by Dana. The transcript of our conversation is below.

How long did you work with H.M.? Did he ever know who you were? What was he like?

I started working with H.M. in 1962 when I was a graduate student [with Brenda Milner], and I'm still working with him. In a way he did remember me - he didn't think I was a stranger, he always thought I was his friend from high school. But he never knew who I really was. He was always very polite, and those who knew him say exactly the same thing. I've talked to a few of his classmates, and they all said he was very quiet and kept himself to himself. He may have been like that because of his epilepsy. Perhaps he was afraid of having a seizure and embarrassing himself. But his father was also a quiet person, so maybe he had part of this in his genes.

It wouldn't be an exaggeration to say that he has contributed more to our understanding of memory than anyone else.

Yes, more than any other patient, and he's probably contributed more than all of the scientists who've studied him put together. I'd tell him that from time to time - I'd say "You know, you're really famous for doing all these tests." He'd act sheepish and smile, but then of course he'd forget what I said. © 2006-2010 ScienceBlogs LLC
Keyword: Learning & Memory
Link ID: 14757 - Posted: 12.11.2010
By Jennifer Viegas Human embryos resemble those of many other species because all animals carry very ancient genes. These genes, which date back to the origin of cells, are expressed during a middle phase of embryonic development, according to two separate papers published in this week's Nature. The findings help to explain why our embryos have a tail when they are a few weeks old and why human embryos retain other characteristics, such as fur-like hair and fish embryo similarities, seen in the developmental stages of other species. "On average, the similarities will be even stronger for more closely related species," Diethard Tautz told Discovery News. "However, it is indeed true that even fish and human embryos go through a phase that looks very comparable, while they are rather different before and after this," added Tautz, who co-authored one of the papers and serves as managing director of the Max Planck Institute for Evolutionary Biology. He and colleague Tomislav Domazet-Loso tackled the "ontogeny recapitulates phylogeny" puzzle. This expression means that a more advanced organism, like humans, will resemble less advanced species during its developmental stages. © 2010 Discovery Communications
Keyword: Development of the Brain; Genes & Behavior
Link ID: 14756 - Posted: 12.09.2010
by Debora MacKenzie It's been called the "warrior gene" – a mutation that seems to make people more aggressive. Now researchers report that people with this gene may not be aggressive, just better at spotting their own interests. Previous research has found that people with MAOA-L, a variant of a gene that controls signalling chemicals in the brain, can be more aggressive. But there is enormous controversy about this, as the gene's effects seem to vary with people's backgrounds. Cary Frydman and colleagues at the California Institute of Technology in Pasadena have now found that people with MAOA-L "just make better choices", says Frydman. "This isn't the same as aggression." Variants of the gene MAOA produce less or more of an enzyme that degrades several signalling chemicals, known as neurotransmitters. People with MAOA-L, which results in less of the enzyme, sometimes show more aggression or impulsivity – but not always. To try to dissect these differences, Frydman gave 83 male volunteers 140 hypothetical choices. With 3 minutes for each choice, the men had to decide whether they preferred a sure thing, say being given $2, or a risky option, for example a 50:50 chance of gaining $10 or losing $5. © Copyright Reed Business Information Ltd.
Keyword: Aggression; Genes & Behavior
Link ID: 14755 - Posted: 12.09.2010
by Cassandra Willyard Rods and cones hog all the credit for allowing us to see. But these light-sensitive neurons get some help from a much rarer kind of cell, according to a new study. If these unheralded cells are as important as the authors suspect, studying them may open the door to new therapies for some forms of blindness. Scientists have known of the existence of these nerve cells, called melanopsin-containing retinal ganglion cells (mRGCs), since 2000. Research over the past decade has shown that they play an important role in reflexive responses to light, such as pupil constriction and regulation of the body's sleep-wake cycle. But they did not appear to be involved in vision. In July, however, researchers reported in the journal Neuron that the stringy extensions, or axons, of mRGCs extend into parts of the mouse brain involved in conscious vision, not just the parts of the brain that control unconscious responses to light. The latest study confirms that finding and suggests that mRGCs enable mice to sense the brightness of their surroundings. In the new work, researchers tagged the mRGCs with a blue protein to see where the cells occur in the mouse eye. When they tracked the cells' axons from the eye into the brain, they saw that many of them terminated in the lateral geniculate nucleus (LGN), the first relay station in the brain for visual information. © 2010 American Association for the Advancement of Science.
Keyword: Vision
Link ID: 14754 - Posted: 12.09.2010
Alison Abbott Animal activists last summer set fire to the alpine holiday home of Daniel Vasella, then chief executive of pharmaceutical giant Novartis of Basel, Switzerland, in one of relatively few violent attacks on scientists working with animals in German-speaking countries. But in the past few years these scientists have been feeling the pressure in other ways — from animal activists who have attempted to publicly shame them or have sent threatening e-mails, and from legislation that increasingly restricts the use of animals in basic research. Now, in a bid to reverse that trend, more than 50 top scientists working in Germany and Switzerland have launched an education offensive. Meeting in Basel on 29 November, they drafted and signed a declaration pledging to be more open about their research, and to engage in more public dialogue. "The public tends to have false perceptions about animal research, such as thinking they can always be replaced by alternative methods like cell culture," says Stefan Treue, director of the German Primate Center in Göttingen. Treue co-chaired the Basel meeting, called 'Research at a Crossroads', with molecular biologist Michael Hengartner, dean of science at the University of Zurich, Switzerland. Outreach activities, such as inviting the public into universities to talk to scientists about animal research, "will be helpful to both sides". © 2010 Nature Publishing Group
Keyword: Animal Rights
Link ID: 14753 - Posted: 12.07.2010
By BENEDICT CAREY He did two crossword puzzles a day, sometimes more, working through the list of clues in strict order, as if to remember where he was. And, perhaps, what he was doing. Henry Gustav Molaison — known through most of his life only as H.M., to protect his privacy — became the most studied patient in the history of brain science after 1953, when an experimental brain operation left him, at age 27, unable to form new memories. Up until his death in a nursing home in 2008, Mr. Molaison cooperated in hundreds of studies, helping scientists identify and describe the brain structures critical to acquiring new information. He performed memory tests; he filled out questionnaires; he sat for brain scans and performed countless research tasks, each time as if for the first time. In between it all he did puzzles, books upon books of them, a habit he’d picked up as a teenager. Near the end of his life he kept a crossword book and pen with him always, in a basket attached to his walker. His solving opened a window on the brain, and demonstrated puzzles’ power, and their limitations, in stretching a damaged mind. “For someone with this profound amnesia, the question was: Why, of all the pastimes out there, would he find crosswords so reassuring?” said Dr. Brian Skotko, a clinical fellow in genetics at Children’s Hospital Boston. “Well, in a world that was buzzing by and not always so easy to understand, I think finding solutions gave him great satisfaction. He had those puzzle books nearby morning, afternoon and night, and he turned to them if nothing else was going on. It was his go-to activity.” Copyright 2010 The New York Times Company
Keyword: Learning & Memory
Link ID: 14752 - Posted: 12.07.2010
AUTISM is a puzzling phenomenon. In its pure form it is an inability to understand the emotional responses of others that is seen in people of otherwise normal—sometimes above normal—intelligence. However, it is often associated with other problems, and can also appear in mild and severe forms. This variability has led many people to think of it as a spectrum of symptoms rather than a single, clear-cut syndrome. And that variability makes it hard to work out what causes it. There is evidence of genetic influence, but no clear pattern of inheritance. The thought that the underlying cause may be hereditary, though, is one reason for disbelieving the hypothesis, which gained traction a few years ago but is now discredited, that measles vaccinations cause autism. One suggestion that does pop up from time to time is that the process which leads to autism involves faulty mitochondria. The mitochondria are a cell’s powerpacks. They disassemble sugar molecules and turn the energy thus liberated into a form that biochemical machinery can use. Mitochondrial faults could be caused by broken genes, by environmental effects, or by a combination of the two. Nerve cells have a huge demand for energy, so a failure of the mitochondria would certainly affect them. The question is, could it cause autism? To try to find out Cecilia Giulivi of the University of California, Davis, and her colleagues studied the mitochondria of ten children, aged between two and five years, who had been diagnosed with autism. They have just published their results in the Journal of the American Medical Association. © The Economist Newspaper Limited 2010.
Keyword: Autism
Link ID: 14751 - Posted: 12.07.2010
By Laura Hambleton Emergency room nurse Sheila DeRiso stood at the front of the high school auditorium and looked out on her audience of 50 teenagers and parents. From a black nylon bag she pulled out a long, plastic tube, then a stomach pump, a speculum, a catheter and an adult diaper. The group tittered. All these items are used in the ER every day to treat binge-drinking teens, she told them. Then she yanked out a white body bag and unfolded it: "And this is if you don't make it." The audience was dead silent. Binge drinking, or consuming many drinks fairly quickly, has been a hallmark of college life. But students in high school and even middle school are also engaging in it, according to DeRiso, local police officials and experts. In one 2005 study of 5,300 middle school students, about 8 percent of seventh-graders and 17 percent of eighth-graders said they had tried binge drinking during that year. The results of these bouts of excessive drinking show up in ERs across the region. Most weekend nights - and especially around holidays - young people arrive at the ER injured in car crashes, sick with alcohol poisoning, unconscious or barely conscious after binge drinking or engaging in the newest trend of blackout drinking, or drinking to the point of intentionally passing out. "I am a nurse," DeRiso told her audience at Magruder High School in Rockville. "Alcohol affects your thinking, your coordination and judgment. Alcohol affects your brain." © 2010 The Washington Post Company
Keyword: Drug Abuse; Development of the Brain
Link ID: 14750 - Posted: 12.07.2010
By Sandra G. Boodman As the all-too-familiar number flashed on his cellphone shortly before 9 p.m., Dan Landrigan reflexively braced himself for bad news. The caller was one of the doctors treating his wife, Donna, who had been in a coma for four months. "She sounded pretty choked up," Landrigan recalled. "I think we've found out what's making your wife sick," the specialist at the University of Rochester's Strong Memorial Hospital told him, as a wave of relief flooded his body. "I was completely shocked," said the telecommunications executive, now 37. "My hope for so long was that this was the phone call I was going to get." Doctors at three Upstate New York hospitals had been stymied by Donna Landrigan, whose case was unlike any they had seen. The previously healthy 35-year-old mother of three had initially become so psychotic she had to be tied to her hospital bed to keep her from hurting herself or attacking others. A few weeks later she had been placed in a medically induced coma to protect her from the continuous seizures wracking her brain, spasms that could have killed her. Every promising lead had seemed to turn into a dead end, and the dangers of prolonged coma, including severe brain damage, were mounting. Things looked so hopeless that doctors had begun discussing whether to suggest terminating life support. © 2010 The Washington Post Company
Keyword: Epilepsy; Hormones & Behavior
Link ID: 14749 - Posted: 12.07.2010
by Jennifer Viegas You may not hear them go "Ouch," but fish feel pain just the same, according to a new book by Penn State professor Victoria Braithwaite. In her book "Do Fish Feel Pain?" (Oxford University Press, 2010), Braithwaite presents her case that fish, like most other organisms, are capable of experiencing pain and that humans can cause fish to suffer. Here at Discovery News we've covered similar research that concluded lobsters, crabs and other shellfish feel pain too. For me, it would be a surprise if they didn't, but scientists have been struggling for ways of proving the obvious here. I think Braithwaite does a good job of summarizing the latest findings. Braithwaite found that fish have the same kinds of specialized nerve fibers that mammals and birds use to detect noxious stimuli, tissue damage and pain. She also explored whether fish are sentient beings and whether an organism must possess "awareness" to experience pain. "We now know that fish actually are cognitively more competent than we thought before -- some species of fish have very sophisticated forms of cognition," she said in a press release. "In our experiments we showed that if we hurt fish, they react, and then if we give them pain relief, they change their behavior, strongly indicating that they feel pain." She was initially drawn to the issue after reading about fish-farming concerns. © 2010 Discovery Communications, LLC.
Keyword: Pain & Touch; Evolution
Link ID: 14748 - Posted: 12.07.2010
by Robert Adler They were technologically savvy, creative and cultured. So maybe it's time we accepted that Neanderthals were people just like us EVER since the first fossils of a brawny, low-browed, chimp-chested hominin were unearthed in Germany in 1856, Neanderthals have stirred both fascination and disdain. German pathologist Rudolf Virchow decreed that the bones belonged to a wounded Cossack whose brow ridges reflected years of pain-driven frowns. French palaeontologist Marcellin Boule recognised the fossils as ancient, but ignored signs that the specimen he studied suffered from arthritis. It was he who reconstructed the bent-kneed, shambling brute that still lurks in the back of most people's minds. Irish geologist William King found the creature so ape-like that he considered putting it into a new genus. In the end he merely relegated it to a separate species, Homo neanderthalensis. Since then, hundreds of Neanderthal sites have been excavated. These show that Neanderthals occupied much of modern-day Eurasia, from the British Isles to Siberia, and from the Red Sea to the North Sea. Here they survived 200,000 years or more of climatic chaos before eventually disappearing around 30,000 years ago. The long-held view that Neanderthals were inferior to Homo sapiens is changing as, one by one, capabilities thought unique to us have been linked to them. What's more, the two species clearly crossed paths, and the publication of the Neanderthal genome earlier this year shows that they interbred. We share over 99 per cent of our genes with Neanderthals, and after splitting from a common ancestor almost 500,000 years ago anatomically modern humans met and mated with Neanderthals, most likely in the Middle East around 45,000 years ago. © Copyright Reed Business Information Ltd.
Keyword: Evolution
Link ID: 14747 - Posted: 12.07.2010
Scientists have identified a way of prompting nervous system repair in multiple sclerosis (MS). Studies on rats by Cambridge and Edinburgh University researchers identified how to help stem cells in the brain regenerate the myelin sheath needed to protect nerve fibres. MS charities said the "exciting" Nature Neuroscience work offered hope of restoring physical functions. But they cautioned it would be some years before treatments were developed. MS is caused by a defect in the body's immune system, which turns in on itself, and attacks the fatty myelin sheath. It is thought to affect around 100,000 people in the UK. Around 85% have the relapsing/remitting form of the condition, in which "flare-ups", which cause disability, are followed by a recovery of a level of the lost physical function. In this form of MS, there does appear to be some natural myelin repair. However, around 10% of people are diagnosed with a progressive form of MS, where the decline continues without any periods of remission. BBC © MMX
Keyword: Multiple Sclerosis; Stem Cells
Link ID: 14746 - Posted: 12.06.2010
By Christine Gorman When the National Institutes of Health convened a panel of independent experts this past April on how to prevent Alzheimer’s disease, the conclusions were pretty grim. The panel determined that “no evidence of even moderate scientific quality” links anything—from herbal or nutritional supplements to prescription medications to social, economic or environmental conditions—with the slightest decrease in the risk of developing Alzheimer’s. Furthermore, the committee argued, there is little credible evidence that you can do anything to delay the kinds of memory problems that are often associated with aging. The researchers’ conclusions made headlines around the world and struck a blow at the many purveyors of “brain boosters,” “memory enhancers” and “cognitive-training software” that advertise their wares on the Web and on television. One of the panel experts later told reporters in a conference call that the group wanted to “dissuade folks from spending extraordinary amounts of money on stuff that doesn’t work.” But did the panel overstate its case? Some memory and cognition researchers privately grumbled that the conclusions were too negative—particularly with respect to the potential benefits of not smoking, treating high blood pressure and engaging in physical activity. In late September the British Journal of Sports Medicine published a few of these criticisms. As a longtime science journalist, I suspected that this is the kind of instructive controversy—with top-level people taking opposing positions—that often occurs at the leading edge of research. As I spoke with various researchers, I realized that the disagreements signaled newly emerging views of how the brain ages. Investigators are exploring whether they need to look beyond the brain to the heart to understand what happens to nerve cells over the course of decades. In the process, they are uncovering new roles for the cardiovascular system, including ones that go beyond supplying the brain with plenty of oxygen-rich blood. The findings could suggest useful avenues for delaying dementia or less severe memory problems. © 2010 Scientific American,
Keyword: Alzheimers
Link ID: 14745 - Posted: 12.06.2010

