Most Recent Links
By Emily Anthes Three weeks after beginning his freshman year of college, 18-year-old David started behaving strangely. He made an impromptu trip from his New Hampshire school to his home in Revere, arriving at 3 a.m. His mother immediately noticed that something was wrong. “David was acting weird,” says Norma, 51, who asked that she and her son be identified by first names only. “He was spacing out, he was very disheveled, saying things that weren’t making sense at all. He cried a lot. He was listening to one CD on repeat. I kept asking what went through his mind, but he wouldn’t answer.” Doctors determined that David could be in the early stages of a psychotic episode and referred him to Massachusetts General Hospital. There, the First-Episode and Early Psychosis Program is designed to prevent small psychotic episodes from turning into big problems, such as schizophrenia. It’s tricky to identify the warning signs of mental health problems — there’s no blood test, for instance, that can signal coming distress. But experts are increasingly watchful for children and teens who are displaying subtle signs that their brains might be in trouble. Traditionally, attention has focused on chronic disease. But “once people have had five hospitalizations, the train has sort of left the station,” says Dr. Oliver Freudenreich, a psychiatrist who directs the MGH program. “Catching the illness as early as possible means that you probably have an illness that is not as severe, [for which] interventions work better.” © 2010 NY Times Co
Keyword: Schizophrenia
Link ID: 14422 - Posted: 09.06.2010
By DUFF WILSON OPELOUSAS, La. — At 18 months, Kyle Warren started taking a daily antipsychotic drug on the orders of a pediatrician trying to quell the boy’s severe temper tantrums. Thus began a troubled toddler’s journey from one doctor to another, from one diagnosis to another, involving even more drugs. Autism, bipolar disorder, hyperactivity, insomnia, oppositional defiant disorder. The boy’s daily pill regimen multiplied: the antipsychotic Risperdal, the antidepressant Prozac, two sleeping medicines and one for attention-deficit disorder. All by the time he was 3. He was sedated, drooling and overweight from the side effects of the antipsychotic medicine. Although his mother, Brandy Warren, had been at her “wit’s end” when she resorted to the drug treatment, she began to worry about Kyle’s altered personality. “All I had was a medicated little boy,” Ms. Warren said. “I didn’t have my son. It’s like, you’d look into his eyes and you would just see blankness.” Today, 6-year-old Kyle is in his fourth week of first grade, scoring high marks on his first tests. He is rambunctious and much thinner. Weaned off the drugs through a program affiliated with Tulane University that is aimed at helping low-income families whose children have mental health problems, Kyle now laughs easily and teases his family. Ms. Warren and Kyle’s new doctors point to his remarkable progress — and a more common childhood diagnosis of attention-deficit hyperactivity disorder — as proof that he should never have been prescribed such powerful drugs in the first place. Copyright 2010 The New York Times Company
Keyword: Schizophrenia; Development of the Brain
Link ID: 14421 - Posted: 09.03.2010
By Jeremy Laurance, Health Editor If you want to tell whether your baby is in pain, looking at its face may not be enough, researchers have found. Generations of mothers have depended on their baby's facial expressions to tell them what they are feeling. But a study has found that giving a baby a spoonful of sugar before an injection or blood test may alter its expression without lessening its pain. The finding casts doubt on whether we can really know what a baby is feeling from observing its responses – and on the decade-old practice of using sugar as a pain reliever for infants. Until the 1950s, doctors thought babies did not suffer pain because their consciousness was not sufficiently developed. The normal pain responses – grimacing and crying – were dismissed as reflexes. Babies subjected to surgery were given anaesthetics to put them to sleep but not analgesic drugs for the pain, as children and adults were. In the 1970s, a definitive study showed babies did benefit from analgesia. But as it is difficult to test drugs on babies, few are available. Giving a teaspoonful of sugar solution to babies was thought to relieve pain based on the way it reduced grimacing and crying after a painful procedure. It is believed to stimulate the production of "endogenous opiates" – the body's own natural pain-relieving drugs – and has become standard practice before blood tests and similar procedures. Some doctors maintain the evidence is now so strong that it may be unethical not to use it. ©independent.co.uk
Keyword: Emotions; Pain & Touch
Link ID: 14420 - Posted: 09.03.2010
Erin Allday, Chronicle Staff Writer We've all seen these people: the boss who blows her top when a meeting runs five minutes late, the man in the coffee shop who screams and rants when his latte isn't made with soy milk, the maniac driver who honks at every car in stop-and-go traffic. Maybe some of us actually are those people. Aside from being annoying, and sometimes even threatening, angry people aren't doing themselves any favors. A growing body of research suggests they may be setting themselves up for everything from heart disease and irritable bowel syndrome to headaches and maybe just the common cold. The latest research - a study of 5,600 Italians, published this month in the journal of the American Heart Association - found that individuals who are cynical, manipulative, arrogant or short-tempered have thicker carotid arteries, which means they're more vulnerable to heart attacks and strokes. What's doing the damage is stress and how angry people react to it - or overreact to it, mental health experts said. "It's sort of like idling the car too high on the traffic light - you're going to be racing your engine when you don't need to," said Dr. David Spiegel, associate chairman of psychiatry and behavioral sciences at Stanford University School of Medicine. "There are times when it's right to get angry. But if your characteristic response is anger, it's really a failure to deal with stress." © 2010 Hearst Communications Inc
Keyword: Emotions; Stress
Link ID: 14419 - Posted: 09.03.2010
By GINA KOLATA In a year when news about Alzheimer’s disease seems to whipsaw between encouraging and disheartening, a new discovery by an 84-year-old scientist has illuminated a new direction. The scientist, Paul Greengard, who was awarded a Nobel Prize in 2000 for his work on signaling in brain cells, still works in his Rockefeller University laboratory in New York City seven days a week, walking there from his apartment two blocks away, taking his aging Bernese mountain dog, Alpha. He got interested in Alzheimer’s about 25 years ago when his wife’s father developed it, and his research is now supported by a philanthropic foundation that was started solely to allow him to study the disease. It was mostly these funds and federal government grants that allowed him to find a new protein that is needed to make beta amyloid, which makes up the telltale plaque that builds up in the brains of people with Alzheimer’s. The finding, to be published Thursday in the journal Nature, reveals a new potential drug target that, according to the prevailing hypothesis of the genesis of Alzheimer’s, could slow or halt the devastating effects of this now untreatable disease. The work involves laboratory experiments and studies with mice — it is far from ready for the doctor’s office. But researchers, still reeling from the announcement two weeks ago by Eli Lilly that its experimental drug turned out to make Alzheimer’s worse, not better, were encouraged. Copyright 2010 The New York Times Company
Keyword: Alzheimers
Link ID: 14418 - Posted: 09.03.2010
By Tina Hesman Saey Eating may rejuvenate a tired body, but new research in fruit flies suggests that fasting actually helps ward off the ravages of sleep deprivation. Starving sleep-deprived fruit flies sheltered the insects from sleepiness and fended off learning and memory difficulties associated with grogginess, researchers report August 31 in PLoS Biology. Starvation may slow down the buildup of sleep-inducing substances that accumulate while an animal is awake, says Paul Shaw, a neuroscientist at Washington University School of Medicine in St. Louis who led the work. The new study suggests that a rise in lipids, a type of fat, during wakefulness makes fruit flies sluggish. Learning how lipids induce sleepiness may eventually help develop new sleep remedies and shed new light on how sleep evolved. The findings herald “a big change for the field” of sleep research, says Robert Greene, a neurobiologist at the University of Texas Southwestern Medical Center at Dallas. “It emphasizes the importance of metabolism and its interaction with sleep.” Scientists are still debating why animals and people sleep (SN: 10/24/09, p. 16). To learn what happens during sleep, most researchers compare sleep-deprived animals with animals that have been allowed to sleep normally. In the new study, Shaw’s team took a different approach. The researchers wanted to see if there was a difference between fruit flies that have been kept up all night by bumping them awake whenever they tried to sleep and fruit flies that stay awake longer than normal because they are starving. © Society for Science & the Public 2000 - 2010
Keyword: Sleep
Link ID: 14417 - Posted: 09.03.2010
by Kristen Minogue Most people don't want their parents meddling in their sex lives. But for one species of ape, having mom nearby can actually increase the odds of hooking up with an eligible mate. Bonobos—our closest living relatives, along with chimpanzees—aren't puritanical. Sex for these apes is a public, accepted form of social currency. They use it to acquire food from others, defuse conflicts, and ingratiate themselves with their superiors. But bonobos also live under a rigid social hierarchy. An ape retains its rank even when a community splits up into smaller groups to forage for food, which the primates do frequently. Normally with bonobos, the highest-ranking male in the group also mates the most, typically with the nubile females. But male bonobos also stick close to their mothers, sometimes spending as much as 90% of their time in their company. Because the mother-son bond is so strong, biologist Martin Surbeck of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and colleagues wondered whether having a mother close by could upset the mating hierarchy. For almost 2.5 years, the researchers observed a community of more than 30 wild bonobos in the Democratic Republic of the Congo's Salonga National Park, keeping a close eye on the adult and adolescent males—nine in all. When the apes split into small groups with fertile females but no moms, the highest-ranking male had about 40% of the intercourse with the females, the team will report online tomorrow in the Proceedings of the Royal Society B. But when the males' mothers were also present, the top male managed only about 25% of the matings, leaving more for the subordinate males. Mom's presence didn't change the hierarchy, but it did level the playing field somewhat for the apes further down, says Surbeck. "The mother's like a social passport." © 2010 American Association for the Advancement of Science.
Keyword: Sexual Behavior; Evolution
Link ID: 14416 - Posted: 09.03.2010
By Katherine Harmon Marine worms might seem like lowly, slow-witted creatures, but new gene mapping shows that we might share an ancient brainy ancestor with them. Human cognition is largely rooted in the cerebral cortex, the part of the brain that enables consciousness, language and other higher-level functions. We share the basic evolutionary underpinnings of our big brains with other vertebrates, which have a structure known as the pallium. Although lacking palliums, many invertebrates, such as insects, spiders and some worms, instead have what are known as mushroom bodies—sections of the brain so called because their shape resembles mushrooms. Mushroom bodies and vertebrate palliums are both responsible for some sensory integration and memory, and they have "long been noted and interpreted as convergent acquisitions," noted a team of researchers in a new study, published online September 2 in Cell. In other words, the thinking has been that these two kinds of brains evolved from independent paths. The team, however, has proposed instead that these two brain structures do share a single common ancestor, one that likely lived some 600 million years ago. The group based their conclusions on new gene expression maps—"molecular fingerprints"—gathered from the mushroom bodies in developing marine ragworms (Platynereis dumerilii) that could be compared with gene expression patterns of developing vertebrate palliums. © 2010 Scientific American
Keyword: Evolution
Link ID: 14415 - Posted: 09.03.2010
By Bruce Bower Mental exercise lets seniors outrun Alzheimer’s disease — for a while. Then the race takes a tragic turn for the sharp-minded, a new study finds, as declines in memory and other thinking skills kick into high gear. After age 65, regular participation in mentally stimulating activities, including doing crossword puzzles and reading, delays intellectual decay caused by Alzheimer’s disease, say neuropsychologist Robert Wilson of Rush University Medical Center in Chicago and his colleagues. But when this debilitating condition finally breaks through the defenses of a mentally fortified brain, it rapidly makes up for lost time, the scientists report in a paper published online September 1 in Neurology. “The benefit of delaying initial signs of cognitive decline by keeping mentally active may come at the cost of more rapid dementia progression later on,” Wilson says. His team also found that mental stimulation slows cognitive declines typically experienced by seniors with healthy brains but offers no protection against the onset of memory and thinking problems that fall short of Alzheimer’s disease. Several recent studies have pointed to a delayed but sharp drop in thinking skills among mentally active people who develop Alzheimer’s disease, remarks neuropsychologist Yaakov Stern of Columbia University College of Physicians and Surgeons in New York City. Unlike the new report, though, those studies did not compare mentally active adults who developed Alzheimer’s disease with those who remained healthy or lost some mental function. © Society for Science & the Public 2000 - 2010
Keyword: Alzheimers; Learning & Memory
Link ID: 14414 - Posted: 09.03.2010
by Catherine de Lange FROM the relaxing effects of cannabis to the highs of LSD and ecstasy, illegal drugs are not generally associated with the lab bench. Now, for the first time in decades, that is starting to change. For almost 40 years, mainstream research has shied away from investigating the therapeutic benefits of drugs whose recreational use is prohibited by law. But a better understanding of how these drugs work in animal studies, and the advancement of brain-imaging techniques, has sparked a swathe of new research. What's more, clinical trials of MDMA (ecstasy), LSD and other psychoactive drugs are starting to yield some positive results. This could lead to a call for governments to take a new approach to the funding and regulation of research into the potential benefits of such chemicals. LSD was developed in the 1940s (see "The highs and lows of LSD") but by the 1970s it and many other drugs became classed as schedule 1 in many countries - described as "abuse" drugs with no accepted medical use. "Research on psychedelics was severely restricted and interest in the therapeutic use of these drugs faded," says Franz Vollenweider of the neuropsychopharmacology and brain-imaging unit at the Zurich University Hospital of Psychiatry, Switzerland. The classification of LSD as schedule 1 was a mistake born of "ignorance and taboo", says Amanda Feilding, director of the Beckley Foundation, a charitable trust that promotes investigation into consciousness and its modulation, based in Oxford, UK. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 14413 - Posted: 09.03.2010
By MATTHEW PERRONE WASHINGTON — Andrew White returned from a nine-month tour in Iraq beset with signs of post-traumatic stress disorder: insomnia, nightmares, constant restlessness. Doctors tried to ease his symptoms using three psychiatric drugs, including a potent anti-psychotic called Seroquel. Thousands of soldiers suffering from PTSD have received the same medication over the last nine years, helping to make Seroquel one of the Veterans Affairs Department's top drug expenditures and the No. 5 best-selling drug in the nation. Several soldiers and veterans have died while taking the pills, raising concerns among some military families that the government is not being up front about the drug's risks. They want Congress to investigate. In White's case, the nightmares persisted. So doctors recommended progressively larger doses of Seroquel. At one point, the 23-year-old Marine corporal was prescribed more than 1,600 milligrams per day — more than double the maximum dose recommended for schizophrenia patients. A short time later, White died in his sleep. "He was told if he had trouble sleeping he could take another (Seroquel) pill," said his father, Stan White, a retired high school principal. An investigation by the Veterans Affairs Department concluded that White died from a rare drug interaction. He was also taking an antidepressant and an anti-anxiety pill, as well as a painkiller for which he did not have a prescription. Inspectors concluded he received the "standard of care" for his condition. Copyright 2010 The Associated Press.
Keyword: Schizophrenia; Sleep
Link ID: 14412 - Posted: 08.31.2010
Smoking marijuana does help relieve a certain amount of pain, a small but well-designed Canadian study has found. People who suffer chronic neuropathic or nerve pain from damage or dysfunction of the nervous system have few treatment options with varying degrees of effectiveness and side-effects. Neuropathic pain is caused by damage to nerves that don't repair, which can make the skin sensitive to a light touch. Cannabis pills have been shown to help treat some types of pain but the effects and risks from smoked cannabis were unclear. To find out more, Dr. Mark Ware, an assistant professor in family medicine and anesthesia at Montreal's McGill University, and his colleagues conducted a randomized controlled trial — the gold standard of medical research — of inhaled cannabis in 21 adults with chronic neuropathic pain. Investigators used three different strengths of the active drug — THC levels of 2.5 per cent, six per cent and 9.4 per cent, as well as a zero per cent placebo. "We found that 25 mg herbal cannabis with 9.4 per cent THC, administered as a single smoked inhalation three times daily for five days, significantly reduces average pain intensity compared with a zero per cent THC cannabis placebo in adult subjects with chronic post traumatic/post surgical neuropathic pain," the study's authors concluded in Monday's online issue of the Canadian Medical Association Journal. Study participants inhaled the 25-milligram dose through a pipe for five days and then took no marijuana for nine days. Then they rotated through the other doses of THC. © CBC 2010
Keyword: Pain & Touch; Drug Abuse
Link ID: 14411 - Posted: 08.31.2010
By RICHARD A. FRIEDMAN, M.D. Of all the things that people do, few are as puzzling to psychiatrists as compulsive drug use. Sure, all drugs of abuse feel good — at least initially. But for most people, the euphoria doesn’t last. A patient of mine is all too typical. “I know this will sound strange,” he said, as I recall, “but cocaine doesn’t get me high any more and still I can’t stop.” When he first started using the drug, in his early 30s, my patient would go for days on a binge, hardly eating or drinking. The high was better than anything, even sex. Within several months, though, he had lost the euphoria — followed by his job. Only when his wife threatened to leave him did he finally seek treatment. When I met him, he told me that he would lose everything if he could not stop using cocaine. Well, I asked, what did he like about this drug, if it cost him so much and no longer made him feel good? He stared at me blankly. He had no clue. Neither did most psychiatrists, until recently. We understand the initial allure of recreational drugs pretty well. Whether it is cocaine, alcohol, opiates, you name it, drugs rapidly activate the brain’s reward system — a primitive neural circuit buried beneath the cortex — and release dopamine. This neurotransmitter, which is central to pleasure and desire, sends a message to the brain: This is an important experience that is worth remembering. Copyright 2010 The New York Times Company
Keyword: Drug Abuse
Link ID: 14410 - Posted: 08.31.2010
By TARA PARKER-POPE For kids around the country it’s back-to-school time. But for many of them, it’s also the return of headache season. Doctors say frequent headaches and migraines are among the most common childhood health complaints, yet the problem gets surprisingly little attention from the medical community. Many pediatricians and parents view migraines as an adult condition. And because many children complain of headaches more often during the school year than the summer, parents often think a child is exaggerating symptoms to get out of schoolwork. Often the real issue, say doctors, is that changes in a child’s sleep schedule, including getting up early for school and staying up late to study, as well as skipping breakfast, not drinking enough water and weather changes can all trigger migraines when the school year starts. “In many areas people just don’t think kids can get migraines,” says Dr. Andrew Hershey, professor of pediatrics and neurology and director of the headache center at Cincinnati Children’s Hospital Medical Center. “But kids shouldn’t be missing activities and having trouble at school because they’re having headaches. If it happens, it shouldn’t be ignored.” Migraine is an inherited neurological condition characterized by severe, often disabling headache pain. During a migraine attack, a number of changes occur throughout the brain causing dilation of blood vessels; severe pain; increased sensitivity to lights, sounds and smells; nausea and vomiting; and other symptoms. It’s estimated that about 10 percent of young children and up to 28 percent of older teenagers suffer from migraines. (Hormonal changes during puberty can also be a trigger.) Copyright 2010 The New York Times Company
Keyword: Pain & Touch; Development of the Brain
Link ID: 14409 - Posted: 08.31.2010
By CARL ZIMMER Why are worker ants sterile? Why do birds sometimes help their parents raise more chicks, instead of having chicks of their own? Why do bacteria explode with toxins to kill rival colonies? In 1964, the British biologist William Hamilton published a landmark paper to answer these kinds of questions. Sometimes, he argued, helping your relatives can spread your genes faster than having children of your own. For the past 46 years, biologists have used Dr. Hamilton’s theory to make sense of how animal societies evolve. They’ve even applied it to the evolution of our own species. But in the latest issue of the journal Nature, a team of prominent evolutionary biologists at Harvard try to demolish the theory. The scientists argue that studies on animals since Dr. Hamilton’s day have failed to support it. The scientists write that a close look at the underlying math reveals that Dr. Hamilton’s theory is superfluous. “It’s precisely like an ancient epicycle in the solar system,” said Martin Nowak, a co-author of the paper with Edward O. Wilson and Corina Tarnita. “The world is much simpler without it.” Other biologists are sharply divided about the paper. Some praise it for challenging a concept that has outlived its usefulness. But others dismiss it as fundamentally wrong. “Things are just bouncing around right now like a box full of Ping-Pong balls,” said James Hunt, a biologist at North Carolina State University. Copyright 2010 The New York Times Company
Keyword: Evolution; Sexual Behavior
Link ID: 14408 - Posted: 08.31.2010
By Steve Connor, Science Editor A revolutionary way of screening the entire human genome for the genetic signposts of disease has produced its latest success – the first inherited link to common migraine and a possible reason for extreme headaches. The technique, which scans all 23 pairs of human chromosomes in a single sweep, has found the first genetic risk factor that predisposes someone to the common form of migraine, which affects one in six women and one in 12 men. The discovery has immediately led to a new possible cause of migraine by alerting scientists to DNA defects involved in the build-up of a substance in the nerves of sufferers that could be the trigger for their migraines. Scientists believe the findings could lead both to a better understanding as well as new treatments for the chronic and debilitating condition which is estimated to be one of the most costly brain-related disorders in society, causing countless lost working days. Scanning the entire blueprint of human DNA by genome-wide association studies (GWAS) has had a profound effect on the understanding of a range of other medical conditions over the past few years, from heart disease and obesity to bipolar disorder and testicular cancer. The study of migraine, published in the journal Nature Genetics, was an archetypal example of the new approach of medical genetics using the GWAS technique. Scientists analysed the genomes of some 5,000 people with migraine and compared their DNA to that of unaffected people to see if there were any significant differences that could be linked statistically to the condition. ©independent.co.uk
Keyword: Pain & Touch; Genes & Behavior
Link ID: 14407 - Posted: 08.30.2010
Lauran Neergaard, Associated Press Scientists have created a new kind of artificial cornea, inserting a sliver of collagen into the eye that coaxes corneal cells to regrow and restore vision. It worked in a first-stage study of 10 patients in Sweden, researchers reported Wednesday. While larger studies are needed, it's a step toward developing an alternative to standard cornea transplants, which aren't available in much of the world because of a shortage of donated corneas. "We're trying to regenerate the cornea from within," said Dr. May Griffith, senior scientist at the Ottawa Hospital Research Institute in Canada and a professor of regenerative medicine at Linkoping University in Sweden. Vision depends on a healthy cornea, the filmlike covering of the eye's surface that helps it focus light. Corneas are fragile and easily harmed by injury or infection. About 42,000 people in the United States receive transplanted corneas every year. While that's considered an adequate supply in this country, donated corneas aren't available in many countries for the estimated 10 million people worldwide with corneal blindness. Transplants also bring risk of rejection. The new work, published in the journal Science Translational Medicine, is a bioartificial cornea - an attempt to use the same natural substances that make up a real cornea to induce healing. © 2010 Hearst Communications Inc.
Keyword: Vision
Link ID: 14406 - Posted: 08.30.2010
By GUY DEUTSCHER Seventy years ago, in 1940, a popular science magazine published a short article that set in motion one of the trendiest intellectual fads of the 20th century. At first glance, there seemed little about the article to augur its subsequent celebrity. Neither the title, “Science and Linguistics,” nor the magazine, M.I.T.’s Technology Review, was most people’s idea of glamour. And the author, a chemical engineer who worked for an insurance company and moonlighted as an anthropology lecturer at Yale University, was an unlikely candidate for international superstardom. And yet Benjamin Lee Whorf let loose an alluring idea about language’s power over the mind, and his stirring prose seduced a whole generation into believing that our mother tongue restricts what we are able to think. In particular, Whorf announced, Native American languages impose on their speakers a picture of reality that is totally different from ours, so their speakers would simply not be able to understand some of our most basic concepts, like the flow of time or the distinction between objects (like “stone”) and actions (like “fall”). For decades, Whorf’s theory dazzled both academics and the general public alike. In his shadow, others made a whole range of imaginative claims about the supposed power of language, from the assertion that Native American languages instill in their speakers an intuitive understanding of Einstein’s concept of time as a fourth dimension to the theory that the nature of the Jewish religion was determined by the tense system of ancient Hebrew. Eventually, Whorf’s theory crash-landed on hard facts and solid common sense, when it transpired that there had never actually been any evidence to support his fantastic claims. The reaction was so severe that for decades, any attempts to explore the influence of the mother tongue on our thoughts were relegated to the loony fringes of disrepute. 
But in the last few years, new research has revealed that when we learn our mother tongue, we do after all acquire certain habits of thought that shape our experience in significant and often surprising ways. Copyright 2010 The New York Times Company
Keyword: Language
Link ID: 14405 - Posted: 08.30.2010
By GINA KOLATA BETHESDA, Md. — The scene was a kind of science court. On trial was the question “Can anything — running on a treadmill, eating more spinach, learning Arabic — prevent Alzheimer’s disease or delay its progression?” To try to answer that question, the National Institutes of Health sponsored the court, appointing a jury of 15 medical scientists with no vested interests in Alzheimer’s research. They would hear the evidence and reach a judgment on what the data showed. For a day and a half last spring, researchers presented their cases, describing studies and explaining what they had hoped to show. The jury also heard from scientists from Duke University who had been commissioned to look at the body of evidence — hundreds of research papers — and weigh it. And the jury members had read the papers themselves, preparing for this day. The studies included research on nearly everything proposed to prevent the disease: exercise, mental stimulation, healthy diet, social engagement, nutritional supplements, anti-inflammatory drugs or those that lower cholesterol or blood pressure, even the idea that people who marry or stay trim might be saved from dementia. And they included research on traits that might hasten Alzheimer’s onset, like not having much of an education or being a loner. It is an issue that has taken on intense importance because scientists recently reported compelling evidence that two types of tests, PET scans of Alzheimer’s plaque in the brain and tests of spinal fluid, can find signs of the disease years before people have symptoms. That gives rise to the question: What, if anything, can people do to prevent it? Copyright 2010 The New York Times Company
Keyword: Alzheimers
Link ID: 14404 - Posted: 08.30.2010
Jerome Burne Sixty years ago, the philosopher Gilbert Ryle published his famous attack on Cartesian dualism, The Concept of Mind, which claimed to find a logical flaw in the popular notion that mental life has a parallel but separate existence from the physical body. Among other effects it provided sophisticated support for the psychological behaviourists, then in the ascendant, who asserted that since we could not objectively observe mental activity it was not really a fit subject for scientific investigation. Nowhere was the notion of banning mental states taken up more enthusiastically than by the emerging discipline of neuropsychiatry. If consciousness and all its manifestations were "merely" the firing of neurons and the release of chemicals in the brain, what need was there to focus on mental states? Once the physical brain was right, the rest would follow. It was an approach that has spawned a vast pharmaceutical industry to treat any pathological psychological state – anxiety, shyness, depression, psychosis – with a variety of pills. The underlying promise is that scientifically adjusting the levels of various brain chemicals will bring relief and a return to normality. The biggest-selling class of these drugs are the anti-depressant SSRIs – brands include Prozac, Seroxat and Lustral. A recent report revealed that they were the most widely prescribed drugs in America, with an estimated global market value of over $20 billion. However, as is set out calmly and clearly in Irving Kirsch’s The Emperor’s New Drugs, it would seem that the whole golden edifice is based on a lie. © Times Newspapers Ltd 2010
Keyword: Depression
Link ID: 14403 - Posted: 08.30.2010