Most Recent Links
By Cassandra Willyard If you're constantly starting new diets, then breaking them, you may have more in common with a drug addict than you know. A new study suggests that yo-yo dieters experience the same stressful pangs of withdrawal when they go on a diet that addicts experience when they go cold turkey. The idea that bad food can be addictive is not new. But previous studies have tended to focus on the positive reinforcement side of the equation--for example, the pleasurable "rush" you get from eating chocolate cake. "This is just part of the story," says Pietro Cottone, a neuroscientist at Boston University and a co-author of the new study. The brain also has a negative reinforcement system that causes anxiety and stress during withdrawal. Rather than doing drugs for the rush, he says, addicts do drugs to relieve the stress associated with withdrawal. Dieters often follow the same pattern of abstinence and relapse as drug addicts, so Cottone and his colleagues wanted to see whether the same brain circuitry might be involved. The researchers gave one group of rats unlimited access to regular rat food for 5 days, followed by 2 days of sugary, chocolate-flavored rat chow. ("They like it a lot," says Cottone.) The team repeated this cycle for 7 weeks and compared the rats' food intake and behavior with that of a control group of rats that had access only to standard chow. The control rats ate roughly the same amount of food every day, but the rats in the experimental group did not: When the junk food arrived, they pigged out. By the fifth week, the experimental rats were eating roughly 20% more food when they had access to chocolate chow than rats in the control group ate. And when it was replaced with normal food, they ate less normal food, approximately 30% less by week 5. As the study progressed, the effect became stronger. What's more, the rats going through chocolate-chow withdrawal spent less time in the exposed parts of a specially designed maze, a measure of increased anxiety. When the chocolate chow was returned, the anxiety disappeared. © 2009 American Association for the Advancement of Science
Keyword: Drug Abuse; Obesity
Link ID: 13447 - Posted: 06.24.2010
Making sense of Groucho's words and Harpo's pantomimes in an old Marx Brothers movie takes place in the same regions of your brain, says new research funded by the National Institute on Deafness and Other Communication Disorders (NIDCD), one of the National Institutes of Health. In a study published in this week's Early Edition of Proceedings of the National Academy of Sciences (PNAS), researchers have shown that the brain regions that have long been recognized as centers in which spoken or written words are decoded are also important in interpreting wordless gestures. The findings suggest that these brain regions may play a much broader role in the interpretation of symbols than researchers have thought and, for this reason, could be the evolutionary starting point from which language originated. "In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child's language skills based on the repertoire of his or her gestures during those early months," said James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. "These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills." Scientists have known that sign language is largely processed in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus, or Broca's area, in the front left side of the brain, and the posterior temporal region, commonly referred to as Wernicke's area, toward the back left side of the brain. It isn't surprising that signed and spoken language activate the same brain regions, because sign language operates in the same way as spoken language does — with its own vocabulary and rules of grammar.
Keyword: Language; Evolution
Link ID: 13446 - Posted: 06.24.2010
By Mary Bates The discovery of mirror neurons in the brains of macaques about ten years ago sent shockwaves through the neuroscience community. Mirror neurons are cells that fire both when a monkey performs a certain task and when it observes another individual performing that same task. With the identification of networks of similarly-behaving cells in humans, there was much speculation over the role such neurons might play in phenomena such as imitation, language acquisition, observational learning, empathy, and theory of mind. Several research groups have observed the activity of mirror neuron networks indirectly in humans through the use of functional magnetic resonance imaging (fMRI). This technology allows scientists to correlate changes in blood flow in specific brain areas to particular behaviors or mental operations. Experiments using fMRI have demonstrated that there is more activation in the human mirror system when people observe movements with which they are familiar; for instance, experienced dancers had larger mirror network activations when they viewed steps from their own repertoire compared to moves from a different style of dance. Studies of the human mirror system have also revealed that it can be activated by the sounds of actions alone, in the absence of any visual cues. While evidence along these lines suggests that hearing can activate mirror neurons as well as vision, it is not clear if aurally-presented stimuli evoke visual imagery that then recruits the mirror system. These studies did not address whether a functional visual system was a necessary prerequisite for the development of the mirror system. © 1996-2009 Scientific American Inc.
Keyword: Vision
Link ID: 13445 - Posted: 06.24.2010
Marcus Munafò and Jonathan Flint. During the second world war, the physicist Enrico Fermi asked General Leslie Groves of the US Army how many generals might be called "great" and why. Groves replied that any general who won five major battles in a row might be called great, and that about three in every hundred would qualify. Fermi countered that if opposing forces are roughly equal, the odds are one in two that a general will win one battle, one in four that he will win two battles in a row, one in eight for three battles, one in 16 for four battles, and one in 32 for five battles in a row. "So you are right, General, about three in a hundred. Mathematical probability, not genius."1 There's an analogue of Fermi's "great general": the "great scientific discovery", or at least, as a case study, "the great genetic scientific discovery" as reported in the press. The discovery of genes for a certain behaviour, for schizophrenia, for happiness, always gets good press coverage, usually based on publication in a respected scientific journal such as Science or Nature. The research paper will include a statistic: the probability that the finding could have occurred by chance. The probability will have been sufficiently low that a reviewer for the journal was impressed and therefore recommended publication. Typically this probability or "P-value" will be less than 0.05, or 5%, which means the odds are less than one in 20 that the observed genetic correlation could have occurred by chance. © Guardian News and Media Limited 2009
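The arithmetic behind both halves of this argument is easy to check for yourself. The short sketch below is an illustration added to this summary, not part of the original article; the number of candidate genes tested is an assumed figure chosen only for the example. It reproduces Fermi's 1-in-32 calculation and then shows how a P < 0.05 threshold, applied across many genetic variants that in truth have no effect, still yields a steady trickle of chance "discoveries".

```python
import random

# Fermi's general: the chance of winning five evenly matched battles in a row.
p_five_in_a_row = 0.5 ** 5
print(f"P(5 wins in a row) = {p_five_in_a_row:.4f}")   # 0.03125, about 3 in 100

# The genetic analogue: test many variants against a trait none of them
# actually influences. Under the null hypothesis each P-value is uniform,
# so roughly 1 test in 20 will fall below the 0.05 threshold by chance.
random.seed(1)
n_variants = 200       # assumed number of candidate genes tested (illustrative)
threshold = 0.05
false_hits = sum(1 for _ in range(n_variants) if random.random() < threshold)
print(f"Chance 'discoveries' among {n_variants} unrelated variants: {false_hits}")
```

Run repeatedly, the second count hovers around 10 even though every association is spurious, which is the pattern the authors argue lies behind many headline gene-for-X findings.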
Keyword: Genes & Behavior; Depression
Link ID: 13444 - Posted: 06.24.2010
Being obese as a teenager may be linked with an increased risk of multiple sclerosis as an adult, researchers say. A 40-year study of 238,000 women found those who were obese at 18 had twice the risk of developing MS compared to women who were slimmer at that age. Yet body size during childhood or adulthood was not found to be associated with MS risk, the US researchers report in Neurology. But an MS charity warned more research was needed to confirm the findings. Researchers from Harvard School of Public Health used data from nurses taking part in a large study on diet, lifestyle factors and health. Over the course of the study, 593 women were diagnosed with MS, a condition caused by the loss of nerve fibres and their protective myelin sheath in the brain and spinal cord, which causes neurological damage. The researchers compared the risk of the disease with body mass index (BMI) - weight in kilograms divided by the square of height in metres - at age 18. Participants were also asked to describe their body size using a series of diagrams at the ages of 5, 10 and 20. The study showed that those with an "obese" BMI of 30 or larger at age 18 had more than twice the risk of developing MS. There was also a smaller increased risk in those who were classed as overweight. The results were the same after accounting for smoking status and physical activity level. When comparing the risk of MS with self-reported body shape, the researchers found no association between childhood obesity and the future chances of developing the disease. They also found no risk associated with adult obesity. But women who had a larger body size at 20 years of age also had almost twice the risk of MS compared to women who reported a thinner body size. (C)BBC
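For readers unfamiliar with the measure, the sketch below works through the BMI cut-off the study used. It is an illustration added to this summary, not from the BBC report, and the height and weight figures are invented for the example.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by the square of height in metres."""
    return weight_kg / height_m ** 2

# Hypothetical 18-year-old: 88 kg at 1.70 m tall.
value = bmi(88, 1.70)
print(f"BMI = {value:.1f}")                      # about 30.4
print("obese (BMI >= 30)" if value >= 30 else "below the study's obesity cut-off")
```

A value of 30 or above places that hypothetical teenager in the group the study found to carry roughly double the MS risk.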
Keyword: Multiple Sclerosis; Obesity
Link ID: 13443 - Posted: 11.09.2009
By Victoria Gill A study in mice has hinted at the impact that early life trauma and stress can have on genes, and how they can result in behavioural problems. Scientists described the long-term effects of stress on baby mice in the journal Nature Neuroscience. Stressed mice produced hormones that "changed" their genes, affecting their behaviour throughout their lives. This work could provide clues to how stress and trauma in early life can lead to later problems. The study was led by Christopher Murgatroyd, a scientist from the Max Planck Institute of Psychiatry in Munich, Germany. He told BBC News that this study went into "molecular detail" - showing exactly how stressful experiences in early life could "programme" long-term behaviour. To do this, the researchers had to cause stress to newborn mouse pups and monitor how their experiences affected them throughout their lives. "We separated the pups from their mothers for three hours each day for ten days," Dr Murgatroyd explained. "It was a very mild stress and the animals were not affected at a nutritional level, but they would [have felt] abandoned." The team found that mice that had been "abandoned" during their early lives were then less able to cope with stressful situations throughout their lives. The stressed mice also had poorer memories. Dr Murgatroyd explained that these effects were caused by "epigenetic changes", where the early stressful experience chemically modified the DNA of some of the animals' genes without altering its underlying sequence. (C)BBC
Keyword: Development of the Brain; Stress
Link ID: 13442 - Posted: 11.09.2009
By Virginia Morell Miss Piggy, the famed porcine muppet, knew a thing or two about mirrors. In fact, she was seldom without one. She may have been vain, but she was also one smart pig, given that researchers regard the ability to use a mirror as evidence of complex cognition. Now, it turns out, Miss Piggy isn't the only clever porker. Real pigs also understand the value of their reflection, according to new research, putting them in an elite group of animals. A team of animal welfare scientists at the University of Cambridge in the United Kingdom placed eight domesticated pigs (Sus scrofa), two at a time, in a pen with a mirror for 5 hours. Because pigs are social, they prefer having a companion in a pen; plus they could also observe each other's actions and movements in the mirror. At first, the pigs studied their reflected images and movements; some grunted at their image, and one banged the mirror so hard with its nose, it broke the glass. "They initially interpret the image as another pig," says lead author and animal welfare scientist Donald Broom. That's a classic error that most species never get beyond. But soon, the pigs showed their smarts. During their 5-hour sessions, they learned to correctly assess the mirror's properties--to understand the relationship "between their own movements and their image in the mirror," including the surrounding environment, says Broom. © 2009 American Association for the Advancement of Science.
Keyword: Intelligence; Evolution
Link ID: 13441 - Posted: 06.24.2010
by Nora Schultz The decline was rapid. I got my first pair of glasses aged 9, and by my mid-teens could no longer read the title on the cover of New Scientist at arm's length. With my mum's eyes just as bad, I always assumed that I'd inherited my short-sightedness from her and that I could do little to stop my vision from becoming a little blurrier each year. Around the same time, however, rates of short-sightedness, or myopia, were rising to epidemic proportions around the world. Today, in some of the worst-affected countries such as Singapore, Hong Kong and Taiwan, around 80 per cent of young adults are myopic, compared to only 25 per cent a few decades back. Rates are lower in western countries - between 30 and 50 per cent - but myopia seems to be rising steadily here too. What could be causing this mysterious epidemic? It is clear that genetics alone can't explain the condition, and the long-standing theory that reading was to blame has failed to play out in subsequent studies. Large-scale epidemiological surveys ensued, which have pinned down the specific aspects of modern lifestyles that cause children's eyesight to deteriorate. With just a few simple measures, it now looks like we could easily prevent future generations from descending into my blurry world. While the causes have been elusive, the anatomy of myopia has been well understood for decades. In the normal eye, the lens focuses light squarely on the retina, which records the image and sends it to the brain. We myopes, however, have eyeballs that are elongated, increasing the distance between the light-sensitive retina at the back of the eye and the lens at the front. The result is that light from distant objects is focused in front of the retina, so a blurred image is transmitted to the brain. © Copyright Reed Business Information Ltd
Keyword: Vision
Link ID: 13440 - Posted: 06.24.2010
By John Tierney Chronic pain affects more than 70 million Americans, which makes it more widespread than heart disease, cancer and diabetes combined. It costs the economy more than $100 billion per year. So why don’t more doctors and researchers take it seriously? That is the challenge raised by a new report from the Mayday Fund, a nonprofit group that studies pain treatment. The report, which has been endorsed by an array of medical groups, advocates a revolution in the training of doctors, the financing of research and the education of law-enforcement officials. “The fact is that people aren’t getting competent and cost-effective treatment for chronic pain,” said Dr. Russell Portenoy, one of the co-chairmen of the panel that prepared the report. Dr. Portenoy, the chairman of the department of pain medicine and palliative care at Beth Israel Medical Center, was one of the pain experts who supported William Hurwitz, the Virginia doctor who was imprisoned for prescribing opioid painkillers to patients who resold them. (Dr. Hurwitz’s sentence was reduced after a retrial in which Dr. Portenoy and other experts testified on his behalf.) At a news conference Wednesday, Dr. Portenoy and the other co-chairman of the Mayday panel, Dr. Lonnie Zeltzer of the University of California, Los Angeles, said patients’ needs had to be better balanced against the concerns of law-enforcement officials, whose prosecutions of Dr. Hurwitz and other doctors have made physicians reluctant to prescribe opioids. Dr. Zeltzer said doctors were especially reluctant to prescribe such painkillers to young people, and she cited the example of a teenager who had been incapacitated for six months until finding a doctor willing to prescribe opioids. Copyright 2009 The New York Times Company
Keyword: Pain & Touch
Link ID: 13439 - Posted: 06.24.2010
By David Derbyshire Scientists have shown that 'odour memories' get 'etched' onto the brain. From the sudden whiff of school cabbage to the pungent smell of hospital disinfectant, nothing transports people back to their childhood more than an unexpected smell. Now scientists think they have discovered how scents from the past make such a lasting impression. Using brain scans, they have shown that new 'odour memories' - such as the association of a perfume with a person - really do get 'etched' onto the brain. The 'signature' of the memory is different from other types of memories, they found. Dr Yaara Yeshurun, who led the study at the Weizmann Institute of Science in Israel, said early smells had a 'privileged' status in our memories. Scientists have long known that smells are one of the best ways to evoke the past. Past studies have shown that memories triggered by smells are more vivid and more emotional than those triggered by sounds, pictures or words. The new study, reported in the journal Current Biology, tried to mimic the creation of childhood memories of smells in 16 adult volunteers. All the tests were carried out while the volunteers were inside a functional Magnetic Resonance Imaging (fMRI) scanner which monitored brain activity. Published by Associated Newspapers Ltd
Keyword: Learning & Memory; Chemical Senses (Smell & Taste)
Link ID: 13438 - Posted: 11.07.2009
By Bruce Bower Only days after birth, babies have a bawl with language. Newborn babies cry in melodic patterns that they have heard in adults’ conversations — even while in the womb, say medical anthropologist Kathleen Wermke of the University of Würzburg in Germany, and her colleagues. By 2 to 5 days of age, infants’ cries bear the tuneful signature of their parents’ native tongue, a sign that language learning has already commenced, the researchers report in a paper published online November 5 in Current Biology. Fluent speakers use melodic patterns and pitch shifts to imbue words and phrases with emotional meaning. Changes in pitch and rhythm, for example, can indicate anger. During the last few months of fetal life, babies can hear what their mothers or other nearby adults are saying, providing exposure to melodies peculiar to a specific language, Wermke says. Newborns then re-create those familiar patterns in at least some of their cries, she proposes. “Our data support the idea that human infants’ crying is important for seeding language development,” Wermke says. “Melody lies at the roots of both the development of spoken language and music.” Newborns’ facility for imitating the underlying makeup of adult speech gets incorporated into babbling later in infancy, Wermke proposes. Earlier research has shown that, from age 3 months on, infants can reproduce vowel sounds demonstrated by adults. © Society for Science & the Public 2000 - 2009
Keyword: Language; Development of the Brain
Link ID: 13437 - Posted: 06.24.2010
By Jocelyn Kaiser Researchers have used a modified AIDS virus to halt a devastating brain disease in two young boys. The treatment, in which the virus delivered a therapeutic gene, marks the first time gene therapy has been successfully used against X-linked adrenoleukodystrophy (ALD)--a disorder that is always fatal if untreated. With this proof of principle, scientists hope versions of the AIDS virus engineered to carry different genes can now be applied to a variety of other diseases. ALD is caused by a defect in an X chromosome gene that produces a protein called ALD. Cells need this transporter protein to break down certain fats; without it, the fats build up and damage the myelin sheathing that protects nerves. In X-linked ALD, which strikes mainly boys, patients develop neurological symptoms such as seizures and loss of vision around age 6 to 8, and within months they become paralyzed, deaf, and eventually die. In the 1980s, the parents of a boy with ALD developed a mixture of fatty acids they called Lorenzo's oil that may have delayed the disease in their son (and inspired a 1992 movie). But the only widely accepted way to stave off ALD is a bone marrow transplant, which is risky--20% to 30% of patients die or have serious complications--and works best if the donor marrow comes from a sibling. In search of an alternative, pediatrician Patrick Aubourg of INSERM in Paris, the French biomedical research agency, and the University Paris-Descartes, along with collaborators in France and Germany, tried gene therapy on two 7-year-olds with ALD who couldn't be matched with a bone marrow donor. They removed blood cells from each boy and treated the cells with a so-called lentiviral vector, a modified HIV virus carrying the gene for the transporter protein they lacked. The virus could not replicate, but it stitched the gene into the DNA of the blood cells. © 2009 American Association for the Advancement of Science
Keyword: Genes & Behavior; Development of the Brain
Link ID: 13436 - Posted: 06.24.2010
By Rachel Ehrenberg The safe answer to how a lantern shark turns its luminescence on and off is: “Any way it wants.” Now researchers have looked into the belly of the beast and found that three hormones act as on-off switches for these glow-in-the-dark sharks. It is the first discovery of hormones controlling bioluminescence in animals, the scientists report in the Nov. 15 Journal of Experimental Biology. Belgian researchers identified melatonin, prolactin and alpha-MSH, three hormones known to control sharkskin coloration, as key players in setting sharks aglow. In all animals investigated up to this point, luminescence is triggered by nerve cells. Finding a parallel pathway to bioluminescence — one that’s controlled by hormones, not nerves — strongly supports the notion that light-emitting powers have evolved multiple times in animals, comments marine scientist Jim Gelsleichter of the University of North Florida in Jacksonville, who was not involved in the research. The light-emitting cells in some sharks aren’t connected to prominent nerve cells, and the slow onset of their glow hinted that something other than nerves was involved. Exposing patches of skin from lantern sharks to hormones and to nerve signaling molecules confirmed that hormones turn on the sharks’ bluish glow. Melatonin, which in humans is an important hormone for sleep regulation, induced a slow, long-lasting glow in the skin patches that persisted for several hours, researchers show. © Society for Science & the Public 2000 - 2009
Keyword: Hormones & Behavior; Animal Communication
Link ID: 13435 - Posted: 06.24.2010
by Maia Szalavitz Many millions have been made in Hollywood by lampooning the acute effects of marijuana on memory—but Israeli researchers suggest that they might one day be harnessed to prevent or treat post-traumatic stress disorder (PTSD). And today's election results bringing medical marijuana dispensaries to yet another state suggest that day might be sooner than ever. A new study—published in the Journal of Neuroscience—found that a synthetic drug that acts like one of the active components in marijuana (THC) can prevent stress-induced enhancement of fear memories in rats. PTSD is basically a syndrome in which fear-filled memories intrude on daily life and sleep—so preventing stress from strengthening memories of fear could potentially prevent or treat it. In the study, the rats were trained to fear a dark region of a cage where they received electric shocks. Though rats normally prefer dark places, they learned to stay in the light and avoid the now-scary dark area. When researchers stopped giving shocks in the dark region, rats slowly learned that it was safe again and began to return to it. The researchers measured how long this took. During the next experiment on a new group of rats, the experience was made more stressful. Now, rats were placed on an elevated grid after receiving the shock. Rats-- and most other animals, including many humans--tend to avoid walking over elevated grids if they can, and find being forced to do so distressing. As expected, the researchers found that it took longer for these rats to learn that the dark region was safe again. © 2009 Time Inc.
Keyword: Stress; Learning & Memory
Link ID: 13434 - Posted: 06.24.2010
Cristen Conger, HowStuffWorks.com -- The first detailed anatomical atlas of a living wildlife species has been constructed by researchers. Mapping the California sea lion's (Zalophus californianus) brain with a combination of magnetic resonance imaging (MRI) and volumetric measuring, scientists want to better understand how toxins in the water are causing neurological damage among marine mammal populations. Eric Montie, a postdoctoral researcher at the University of South Florida, spearheaded the study, which was published in The Anatomical Record in October. The brain atlas is a first step toward determining whether exposure to manmade chemicals, such as DDT and polychlorinated biphenyls (PCBs), increases California sea lions' susceptibility to life-threatening brain damage from domoic acid, a neurotoxin naturally produced by certain types of algae. Past studies have concluded that domoic acid, which accumulates in the sea lion's system from ingesting prey that feed on algae, causes the mammal's hippocampus to shrink. Research has also linked domoic acid to acute and chronic epilepsy and seizures in sea lions. But exactly how that neurotoxin-induced brain damage progresses is still unclear. "We don't know enough about the endocrinology and neurobiology of these animals," Montie told Discovery News. "That's why you start with baby steps like an atlas." © 2009 Discovery Communications, LLC.
Keyword: Neurotoxins; Development of the Brain
Link ID: 13433 - Posted: 06.24.2010
Female fiddler crabs have sex with their male neighbours in exchange for protection against wandering male intruders, say Australian researchers. A team led by Patricia Backwell of the Australian National University report their argument in the Royal Society journal Biology Letters. Both male and female fiddler crabs shelter in burrows, which they both must defend from intruders. But while males have an extremely large claw that can be used as a weapon, female crabs have just two small feeding claws. So how do female crabs defend their territory? To answer this question Backwell and colleagues built on previous work showing that under certain circumstances, males will help protect a neighbouring male from an intruder. Such "defensive coalitions" are rare in the animal kingdom and have so far only been demonstrated in two species of fiddler crab and a type of bird called a rock pipit. Protecting a neighbour can be risky, leading to injury, loss of a claw and even death, and a male also risks having his own unattended burrow invaded while off protecting a neighbour. However, team member and behavioural ecologist Michael Jennions says it's a case of better the enemy you know. © CBC 2009
Keyword: Sexual Behavior; Evolution
Link ID: 13432 - Posted: 06.24.2010
By Lindsey Konkel and Environmental Health News Seeking healthful foods, Americans are eating more soy than ever. But recent research with animals shows that consuming large amounts could have harmful effects on female fertility and reproductive development. Soy is ubiquitous in the American diet. Over a quarter of all infant formula sold is made with it, and the U.S. Food and Drug Administration promotes it in foods to reduce the risk of heart disease. School lunch programs across the country are even adding soy to hamburger patties. Many of soy’s health benefits have been linked to isoflavones—plant compounds that mimic estrogen. But animal studies suggest that eating large amounts of those estrogenic compounds might reduce fertility in women, trigger premature puberty and disrupt development of fetuses and children. Although most studies looking at the hormone-disrupting properties of genistein, the main isoflavone in soy, have been conducted in rodents, many scientists believe the findings may be relevant to humans as well. “We know that too much genistein is not a good thing for a developing mouse; it may not be a good thing for a developing child,” said Retha Newbold, a developmental biologist at the National Institute of Environmental Health Sciences. More definitive answers, she said, may lie ahead in future long-term human studies. © 1996-2009 Scientific American Inc.
Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 13431 - Posted: 06.24.2010
Juggling and other physically complex activities may hold some promise for brain regeneration among those who have suffered stroke or are coping with other neurological diseases where the pathways that connect how people think with how they move their bodies begin to break down. Researchers at Britain's University of Oxford used diffusion imaging, a new type of magnetic resonance imaging (MRI), to compare the physical structure of white matter -- the nerve fibers that connect parts of the brain -- in a control group of 24 men and women to a second group of 24 who had practiced juggling for 30 minutes a day for six weeks. The MRIs showed an increase in white matter among the jugglers, regardless of their skill level. And the increase persisted even after the juggling sessions ended. Scientists have long known that gray matter, where the brain processes information, increases when people tackle new tasks or have new experiences. But this was the first time researchers have shown that white matter can increase, too, according to Heidi Johansen-Berg, an Oxford neuroscientist and lead author of the study, which was published in the journal Nature Neuroscience. In an e-mail she said, "Gray matter consists of neurons, which can be thought of as computation units, processing and integrating the incoming information. White matter, on the other hand, is composed of the connections between different areas." Johansen-Berg said the MRIs only revealed that the white matter area of the brain had changed. "The findings may have a clinical relevance in the future -- but that would be a long way down the line. There are a number of brain diseases, such as multiple sclerosis, that result in degeneration of pathways. Our results show that, in healthy adults, those pathways can change positively as a result of training," she said. "Future therapies might try to use training regimes or drugs to enhance such positive changes in disease, to counteract degeneration." © 2009 The Washington Post Company
Keyword: Learning & Memory
Link ID: 13430 - Posted: 06.24.2010
by Michael Bond IS GEORGE W. BUSH stupid? It's a question that occupied a good many minds of all political persuasions during his turbulent eight-year presidency. The strict answer is no. Bush's IQ score is estimated to be above 120, which suggests an intelligence in the top 10 per cent of the population. But this, surely, does not tell the whole story. Even those sympathetic to the former president have acknowledged that as a thinker and decision-maker he is not all there. Even his loyal speechwriter David Frum called him glib, incurious and "as a result ill-informed". The political pundit and former Republican congressman Joe Scarborough accused him of lacking intellectual depth, claiming that compared with other US presidents whose intellect had been questioned, Bush junior was "in a league by himself". Bush himself has described his thinking style as "not very analytical". How can someone with a high IQ have these kinds of intellectual deficiencies? Put another way, how can a "smart" person act foolishly? Keith Stanovich, professor of human development and applied psychology at the University of Toronto, Canada, has grappled with this apparent incongruity for 15 years. He says it applies to more people than you might think. To Stanovich, however, there is nothing incongruous about it. IQ tests are very good at measuring certain mental faculties, he says, including logic, abstract reasoning, learning ability and working-memory capacity - how much information you can hold in mind. But the tests fall down when it comes to measuring those abilities crucial to making good judgements in real-life situations. That's because they are unable to assess things such as a person's ability to critically weigh up information, or whether an individual can override the intuitive cognitive biases that can lead us astray. © Copyright Reed Business Information Ltd
Keyword: Intelligence
Link ID: 13429 - Posted: 06.24.2010
By PAM BELLUCK By the time Corey Haas was 7, the retinal disease he was born with had already stolen much of his vision. “He always clung to me or my wife,” said Corey’s father, Ethan Haas. The boy relied on a cane and adults to guide him, and, unable to see blackboard writing, sat in back with a teacher’s aide, large-type computer screen and materials in Braille. Legally blind, Corey was expected eventually to lose all sight. Then, 13 months ago, after his eighth birthday, he underwent an experimental gene therapy procedure, receiving an injection in his left eye. His vision in that eye improved quickly. Now 9, Corey plays Little League baseball, drives go-carts, navigates wooded trails near his home in Hadley, N.Y., and reads the blackboard in class. “It’s gotten, like, really better,” he said. Experts in vision problems say that while it is unclear how many visually impaired people gene therapy could help, they consider the research promising for some types of blinding diseases, and an achievement for gene therapy, which has had many setbacks. The study, reported in the journal Lancet, involved five children and seven adults, from Belgium, Italy and the United States, with a type of Leber’s congenital amaurosis, a rare but serious congenital retinal disease. Copyright 2009 The New York Times Company
Keyword: Vision; Genes & Behavior
Link ID: 13428 - Posted: 06.24.2010



