Most Recent Links
by Roger Highfield, Richard Wiseman and Rob Jenkins THE history of science could have been so different. When Charles Darwin applied to be the "energetic young man" that Robert Fitzroy, the Beagle's captain, sought as his gentleman companion, he was almost let down by a woeful shortcoming that was as plain as the nose on his face. Fitzroy believed in physiognomy - the idea that you can tell a person's character from their appearance. As Darwin's daughter Henrietta later recalled, Fitzroy had "made up his mind that no man with such a nose could have energy". Fortunately, the rest of Darwin's visage compensated for his sluggardly proboscis: "His brow saved him." The idea that a person's character can be glimpsed in their face dates back to the ancient Greeks. It was most famously popularised in the late 18th century by the Swiss poet Johann Lavater, whose ideas became a talking point in intellectual circles. In Darwin's day, they were more or less taken as given. It was only after the subject became associated with phrenology, which fell into disrepute in the late 19th century, that physiognomy was written off as pseudoscience. Now the field is undergoing something of a revival. Researchers around the world are re-evaluating what we see in a face, investigating whether it can give us a glimpse of someone's personality or even help to shape their destiny. What is emerging is a "new physiognomy" which is more subtle but no less fascinating than its old incarnation. © Copyright Reed Business Information Ltd
Keyword: Miscellaneous
Link ID: 12552 - Posted: 06.24.2010
By Tina Hesman Saey Although it may not be as dramatic as the Big Bang birthing the universe, an explosion of DNA duplication in the common ancestor of humans, chimpanzees and gorillas may be responsible for many of the differences among the species, a new study suggests. The big blowup happened 8 million to 12 million years ago, but its effects are still apparent today. Human and great ape genes are notoriously similar, with few differences in the genetic letters that make up the instruction manual for building each of the primates. But gorillas, orangutans, chimpanzees and humans are obviously different. A new analysis of the entire genomes of humans and their ape cousins, published in the Feb. 11 Nature, suggests the differences may have roots in DNA duplications. Researchers led by Evan Eichler, a Howard Hughes Medical Institute investigator at the University of Washington in Seattle, compared the genomes of macaques, orangutans, gorillas, chimpanzees, bonobos and humans. The scientists found that chunks of the genomes had been copied and rearranged, sometimes multiple times, within each of the lineages. After orangutans branched off the primate family tree, duplication rates accelerated dramatically in the common ancestor of gorillas, chimpanzees and humans. The burst continued in the common ancestor of humans and chimps, but then slowed again. At the same time that duplication rates were heating up, other types of mutation — such as single-letter changes in the genetic sequence — slowed down. © Society for Science & the Public 2000 - 2009
Keyword: Evolution
Link ID: 12551 - Posted: 06.24.2010
by Graham Lawton THEY called it the second summer of love. Twenty years ago, young people all over the world donned T-shirts emblazoned with smiley faces and danced all night, fuelled by a molecule called MDMA. Most of these clubbers have since given up ecstasy and are sliding into middle age. The question is, has ecstasy given up on them? Enough time has finally elapsed to start asking if ecstasy damages health in the long term. According to the biggest review ever undertaken, it causes slight memory difficulties and mild depression, but these rarely translate into problems in the real world. While smaller studies show that some individuals have bigger problems, including weakened immunity and larger memory deficits, so far, for most people, ecstasy seems to be nowhere near as harmful over time as you may have been led to believe. The review was carried out by the UK Advisory Council on the Misuse of Drugs (ACMD), an independent body that advises the UK government on drug policy. As New Scientist went to press the final report had still to be published, but the committee was expected to recommend downgrading MDMA from a class A drug to a class B, putting it on a par with cannabis in terms of harmfulness. Nobody is arguing that taking ecstasy is risk-free: its short-term effects are fairly uncontroversial. MDMA is toxic, though not powerfully so - an average person would need to take around 20 or 30 tablets to reach a lethal dose. And for a small fraction of people, even small amounts of ecstasy can kill. For example, around half a million people take ecstasy every year in England and Wales, and 30 die from the acute effects, mostly overheating or water intoxication. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 12550 - Posted: 06.24.2010
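A quick back-of-envelope check of the acute-risk figures quoted above. This is illustrative only: both inputs are the article's rough estimates for England and Wales, and the calculation ignores dose, frequency of use and any under-reporting.

```python
# Acute-risk arithmetic from the figures in the article above.
# Both inputs are the article's rough estimates, not precise data.
users_per_year = 500_000   # ecstasy users per year in England and Wales
deaths_per_year = 30       # acute deaths (mostly overheating or water intoxication)

risk = deaths_per_year / users_per_year
print(f"annual acute-death risk: {risk:.5f}, roughly 1 in {users_per_year // deaths_per_year:,}")
# -> annual acute-death risk: 0.00006, roughly 1 in 16,666
```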
By Lucas Laursen Before the reels on a slot machine stop spinning, a gambler's brain is already anticipating the potential rewards. And although two bananas on the pay line with a third just barely visible won't pay a gambler any more than three random fruits, such near misses have the well-documented, if irrational, effect of enticing gamblers to try again. The reason, according to a new study, is that these near misses activate the same reward signals in the brain as a win. Slot machine makers capitalize on the near-miss effect. Researchers have found that they program their games to tease players with near misses about 30% of the time--a number previous studies have found optimal for getting gamblers to keep coming back. Games that offer gamblers the option to stop one reel on a favored icon--say a cherry--until the others spin to a stop also tend to inflate gamblers' self-confidence. Taking these factors into account, a team led by Luke Clark of the University of Cambridge in the United Kingdom developed a simplified slot machine game on a computer. The group asked 15 volunteers--mostly male and with an average age of 26--to play the game while it used functional magnetic resonance imaging (fMRI) to scan their brain activity. The subjects could choose when to stop one of two spinning reels on the screen, each decorated with six icons, including a banana, a pair of boots, and an elephant. They then watched while the second reel stopped spinning. When the icons matched, the volunteers won £0.50 in real money, but when the icon they chose landed just above or below the pay line, they earned no money, and the researchers logged the event as a near miss. In other trials, the computer chose which icon stopped in the first reel. © 2009 American Association for the Advancement of Science
Keyword: Emotions; Drug Abuse
Link ID: 12549 - Posted: 06.24.2010
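A rough way to see why near misses dominate in a layout like the study's: with six icons per reel, a uniformly random stop matches the chosen icon 1/6 of the time and lands one position above or below it 2/6 of the time, about a third, in the same range as the ~30% the article says commercial machines aim for. A minimal sketch of that geometry follows; the six-icon count and the near-miss definition come from the article, while the uniform stop and wraparound adjacency are my assumptions.

```python
import random

ICONS = 6            # six icons per reel, as described in the study
TRIALS = 100_000

wins = near_misses = 0
chosen = 0           # position of the player's favored icon on the fixed reel
for _ in range(TRIALS):
    stop = random.randrange(ICONS)          # where the second reel stops
    if stop == chosen:
        wins += 1                           # icons match on the pay line
    elif stop in ((chosen - 1) % ICONS, (chosen + 1) % ICONS):
        near_misses += 1                    # just above or below the pay line

print(f"win rate:       {wins / TRIALS:.3f}")        # ~1/6 = 0.167
print(f"near-miss rate: {near_misses / TRIALS:.3f}")  # ~2/6 = 0.333
```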
By Penelope Green There are no dark corners in Madison, Wisconsin, a university town that sparkles with endowment and research dollars—more than $900 million last year—as well as just plain Midwestern niceness. The grants are well earned: It was at the University of Wisconsin–Madison that the first bone marrow transplant was performed and the first synthetic gene was created. It was here that human stem cells were isolated and cultured in a lab for the first time. And for more than a decade, one of the campus's most productive hit makers has been the Laboratory for Affective Neuroscience, run by a 56-year-old neuroscientist and professor of psychology and psychiatry named Richard J. Davidson, PhD, who has been systematically uncovering the architecture of emotion. Davidson, whose youthful appearance and wide-open smile give him more than a passing resemblance to Jerry Seinfeld, has been studying the brain structures behind not just anxiety, depression, and addiction but also happiness, resilience, and, most recently, compassion. Using brain imaging technologies, in particular a device called a functional magnetic resonance imaging (fMRI) machine, a sort of Hubble telescope for the brain, Davidson and his researchers have observed the areas associated with various emotions and how their function changes as an individual moves through them. His "brain maps" have revealed the neural terrain of so-called normal adults and children, as well as those suffering from mood disorders and autism. Davidson has also studied a now rather famous group of subjects: Tibetan monks with years of Buddhist meditation under their gleaming pates. © 2009 Harpo Productions, Inc.
Keyword: Emotions
Link ID: 12548 - Posted: 06.24.2010
By Pat Wingert and Barbara Kantrowitz When Dr. Lauren Streicher, an assistant professor of obstetrics and gynecology at Northwestern University's medical school in Chicago, got a call from "The Oprah Winfrey Show" inviting her to discuss menopausal hormones with actress Suzanne Somers, she figured she'd better read Somers's best-selling books on the subject. As Streicher worked her way through the first chapter, she started underlining every sentence she felt was inaccurate. "But pretty soon, I had to stop," Streicher says, "because I was underlining almost everything." The taping of the show, which aired Jan. 29, proved equally disconcerting. Somers, a self-styled hormone and anti-aging expert whose controversial books promise midlife women that they will feel young and sexy if they take unregulated hormone therapy (HT) in much higher doses and for much longer time periods than most experts recommend, was literally given center stage. She was seated next to Winfrey, the newly proclaimed convert to the so-called bio-identical hormones promoted by the 62-year-old Somers. (Bio-identical generally refers to products that are chemically identical to hormones produced by a woman's body.) While Winfrey, 55, encouraged "every woman" to read Somers's book, the guests with actual medical degrees were relegated to seats in the audience, where they had to sit quietly unless called upon. Interspersed were taped segments of Somers smearing her arms with hormone cream, standing on her head and lining up the 40 dietary supplements she takes with her morning smoothie. The whole setup seemed to give the drugs that Somers uses the same enthusiastic endorsement that turns everything Winfrey promotes into a blockbuster. The resulting spectacle disappointed many doctors who thought Winfrey had higher standards for the quality of medical information she dispensed—or, at least, more of a commitment to balance. Some said they were particularly upset because doctors had complained to Winfrey's production company about what they saw as misinformation disseminated during the show she did on hormone therapy two weeks before that featured Dr. Phil McGraw's wife, Robin. © 2009 Newsweek, Inc.
Keyword: Hormones & Behavior
Link ID: 12547 - Posted: 06.24.2010
By JANE E. BRODY How sweet it is! The American diet, that is. While the current recommendation is a maximum intake of eight teaspoons of sugars a day, one 12-ounce can of regular soda (or a 20-ounce bottle of VitaminWater) delivers eight or nine teaspoons. That means you are at or over the limit before you’ve eaten a single cookie or container of fruit-flavored yogurt, or even some commercial tomato soups or salad dressings with added sugars. The result is an average daily intake of more than 20 teaspoons of sweet calories. Although much fuss has been raised about high-fructose corn syrup, when it comes to calories and weight gain, it makes no difference if the sweetener was derived from corn, sugar cane, beets or fruit juice concentrate. All contain a combination of fructose and glucose and, gram for gram, supply the same number of calories. All contribute to the excessive caloric intake that has resulted in an epidemic of obesity among Americans in the last 25 years. Dr. George Bray, a specialist in obesity and metabolism at the Pennington Biomedical Research Center of Louisiana State University, has calculated that “the current epidemic of obesity could be explained by the consumption of an extra 20-ounce soft drink each day.” Among the most recent substances to take a turn as dietary villain is high-fructose corn syrup, a relatively cheap and reliable sweetener criticized, among other reasons, for being “artificial.” Copyright 2009 The New York Times Company
Keyword: Obesity
Link ID: 12546 - Posted: 06.24.2010
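The article's arithmetic is easy to make explicit. A sketch of the numbers follows; the sugar contents and conversion factors below are standard nutrition approximations I've assumed, not figures from the article.

```python
# Back-of-envelope version of the article's sugar arithmetic. All constants
# are common approximations (assumed), not values given in the article.
G_SUGAR_PER_TSP = 4.2      # grams of sugar per teaspoon
KCAL_PER_G_SUGAR = 4       # calories per gram of carbohydrate
KCAL_PER_LB = 3500         # crude rule of thumb for body-weight change

can_12oz_g = 39            # typical sugar in a 12-oz regular soda
print(f"12-oz can: ~{can_12oz_g / G_SUGAR_PER_TSP:.0f} tsp of sugar")   # ~9 tsp

# Dr. Bray's point: one extra 20-oz soft drink per day.
soda_20oz_g = 65           # typical sugar in a 20-oz soda
daily_kcal = soda_20oz_g * KCAL_PER_G_SUGAR            # ~260 kcal/day
yearly_lb = daily_kcal * 365 / KCAL_PER_LB             # ~27 lb/year if unburned
print(f"extra 20-oz soda daily: ~{daily_kcal} kcal/day, ~{yearly_lb:.0f} lb/year")
```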
By Coco Ballantyne Giving Alzheimer's patients a battery of cognitive tests may help predict whether it's safe for them (and us) to get behind the wheel, according to a new study. "We found that tests that involved visual perception and visual memory were particularly important in preventing driving errors," says Jeffrey Dawson, a biostatistician at the University of Iowa College of Public Health in Iowa City and lead author of the study published in Neurology. Dawson hopes the findings will pave the way for the creation of a test that physicians could give to people diagnosed with Alzheimer's to determine if it's safe for them to be on the road. Alzheimer's disease, the most common form of dementia, affects an estimated five million people in the U.S., according to the Alzheimer's Association, a nonprofit based in Chicago, Illinois. The disease appears to be caused by protein plaques and tangles that accumulate in the brain, damaging cells and chipping away at the victim's cognition and motor skills. Dawson compared the driving ability of 40 men and women ages 51 to 89 with early Alzheimer's to 115 drivers ages 42 to 89 with no signs of dementia. (The average age of the Alzheimer's group, 75, was six years older than that of the other group, but the discrepancy was accounted for in the final statistical analysis, Dawson notes). The researchers gave all drivers a series of tests designed to measure cognitive, visual, and motor skills. Dawson says the Alzheimer's patients did worse in virtually all of them. © 1996-2009 Scientific American Inc.
Keyword: Alzheimers
Link ID: 12545 - Posted: 06.24.2010
By Solmaz Barazesh Mothers with no previous history of mental illness face the greatest risk for postpartum psychosis during the first month after childbirth, a new study suggests. Postpartum depression is a common problem for many women in the days following delivery. But about one in 1,000 new mothers develops postpartum psychosis, a serious mental illness involving delusional thoughts, hallucinations and the inability to distinguish between reality and imagination. The new study found that first-time mothers who suffer postpartum psychosis faced the highest risk in the first month after delivery, and that the problem can strike women who had no previous history of mental illness. In the study, published online February 9 in PLoS Medicine, epidemiologist Unnur Valdimarsdóttir of the Karolinska Institute in Stockholm and colleagues used hospital records to track first-time mothers during the 90 days following childbirth. Of the almost 750,000 women in the study, 892 developed postpartum psychosis, with most cases reported within a month of childbirth. The rapid reduction in hormone levels after childbirth could trigger the psychosis in some women, the authors suggest. Earlier studies show that schizophrenic women face the greatest risk of psychosis when hormone levels are low. Trauma associated with the pregnancy and birth itself could also contribute to postpartum psychosis, Valdimarsdóttir says. © Society for Science & the Public 2000 - 2009
Keyword: Schizophrenia; Hormones & Behavior
Link ID: 12544 - Posted: 06.24.2010
By Moheb Costandi In the 1920s the behavioral psychologist Karl Lashley conducted a now famous series of experiments in an attempt to identify the part of the brain in which memories are stored. He trained rats to find their way through a maze, then made lesions in different parts of the cerebral cortex in an attempt to erase what he called the "engram," or the original memory trace. Lashley failed to find the engram—his experimental animals were still able to find their way through the maze, no matter where he put lesions on their brains. He therefore concluded that memories are not stored in any single area of the brain, but are instead distributed throughout it. Subsequent work on amnesics—most notably the studies of the recently deceased patient known only as H.M. carried out by Brenda Milner—implicated a part of the brain called the hippocampus as being crucial for memory formation. More recently, it was established that the frontal cortex is also involved; current thinking holds that new memories are encoded in the hippocampus and then eventually transferred to the frontal lobes for long-term storage. A new study, led by Christine Smith and Larry Squire at the University of California at San Diego, now provides evidence that the age of a memory determines the extent to which we are dependent on the frontal cortex and hippocampus for recalling it. In other words, the location of a recollection in the brain varies based on how old that recollection is. Smith and Squire assessed the brain activity associated with the recollection of old and new memories. They recruited 15 healthy male participants, and used functional magnetic resonance imaging (fMRI) to scan their brains while they answered 160 questions about news events that took place at different periods of time during the past 30 years. The study sounds simple, but the design of the experiments was actually somewhat complex, because the researchers had to overcome a number of confounding variables. © 1996-2009 Scientific American Inc
Keyword: Learning & Memory
Link ID: 12543 - Posted: 06.24.2010
By MATTHEW PERRONE WASHINGTON -- Two drugmakers spent hundreds of millions of dollars last year to raise awareness of a murky illness, helping boost sales of pills recently approved as treatments and drowning out unresolved questions, including whether it's a real disease at all. Key components of the industry-funded buzz over the pain-and-fatigue ailment fibromyalgia are grants (more than $6 million donated by drugmakers Eli Lilly and Pfizer in the first three quarters of 2008) to nonprofit groups for medical conferences and educational campaigns, an Associated Press analysis found. That's more than they gave for more accepted ailments such as diabetes and Alzheimer's. Among grants tied to specific diseases, fibromyalgia ranked third for each company, behind only cancer and AIDS for Pfizer and cancer and depression for Lilly. Fibromyalgia draws skepticism for several reasons. The cause is unknown. There are no tests to confirm a diagnosis. Many patients also fit the criteria for chronic fatigue syndrome and other pain ailments. Experts don't doubt the patients are in pain. They differ on what to call it and how to treat it. Many doctors and patients say the drugmakers are educating the medical establishment about a misunderstood illness, much as they did with depression in the 1980s. Those with fibromyalgia have often had to fight perceptions that they are hypochondriacs, or even faking their pain. © 2009 The Associated Press
Keyword: Stress
Link ID: 12542 - Posted: 06.24.2010
by Andy Coghlan Injections of a natural growth factor into the brains of mice, rats and monkeys offer hope of preventing or reversing the earliest impacts of Alzheimer's disease on memory. The benefits arose even in animals whose brains contained the hallmark plaques that clog up the brains of patients. By delivering brain-derived neurotrophic factor (BDNF) directly into the entorhinal cortex and hippocampus, the parts of the brain where memories are formed then consolidated, the researchers successfully tackled damage exactly where Alzheimer's strikes first. "We're administering BDNF directly to the degenerating neurons in memory systems of the cortex, and preventing their death," says Mark Tuszynski of the University of California at San Diego. The substance, which naturally supports brain cells throughout life, also amplified the numbers of connections, or synapses, between neurons. "Our most compelling evidence was the observation that brain cell death was prevented, and that connections between neurons rose in density by about 25%," says Tuszynski. Improvements on this scale happened in all the animals, including mice with a version of human Alzheimer's disease, elderly rats and monkeys with natural degeneration, plus rats and monkeys given brain lesions similar to those seen in Alzheimer's. © Copyright Reed Business Information Ltd
Keyword: Alzheimers; Trophic Factors
Link ID: 12541 - Posted: 06.24.2010
Frequent or long-term marijuana use may raise a man's risk of testicular cancer, American research suggests. The study of 369 men, published in the journal Cancer, found being a regular marijuana user doubled the risk compared to those who never smoked it. The results suggest that it may be linked to the most aggressive form of the cancer. A spokesman for Cancer Research UK said that no previous studies had found a link between marijuana and the disease. Testicular cancer is one of the most common cancers in younger men, with approximately 2,000 new cases each year in the UK. Incidence in Europe and North America is far higher than in some other parts of the world, and has been rising steadily for no apparent reason. Known risk factors for the cancer include previous injuries to the testicles, a family history of the disease, or suffering from undescended testicles as a young child. The study from scientists at the Fred Hutchinson Cancer Research Center in Seattle is the first to look specifically at marijuana use in relation to the disease. They studied 369 men aged 18 to 44, who had been diagnosed with testicular cancer, and quizzed them about marijuana use. Their replies were compared to those from almost 1,000 apparently healthy control subjects. Even after adjusting the figures to take account of the other known risk factors, marijuana use remained a clear risk factor for testicular cancer. Just being a marijuana smoker seemed to carry a 70% extra risk, while those who smoked it regularly, or had smoked from an early age, had twice the risk compared to those who had never smoked it. A connection was made to nonseminoma, a fast-growing form of testicular cancer which accounts for approximately 40% of all cases, and tends to strike younger. © BBC
Keyword: Drug Abuse
Link ID: 12540 - Posted: 02.09.2009
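The "70% extra risk" and "twice the risk" figures above are the kind of odds ratios a case-control study like this produces. A sketch of the computation follows, using invented counts for illustration; the article gives only the resulting ratios, not the underlying table.

```python
# Odds ratio from a 2x2 case-control table. The counts below are invented
# for illustration only; the study's actual numbers are not in the article.
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """(cases' odds of exposure) divided by (controls' odds of exposure)."""
    return (exposed_cases / unexposed_cases) / (exposed_controls / unexposed_controls)

# hypothetical: 120 of 369 cancer cases ever smoked marijuana,
# versus 220 of 1,000 healthy controls
print(f"odds ratio: {odds_ratio(120, 249, 220, 780):.2f}")  # ~1.71, i.e. ~70% extra risk
```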
TAKE Ritalin for fun and you run the risk of addiction. That's if the drug causes the same chemical and structural changes in human brains as it does in mice. Ritalin is prescribed to children with hyperactivity disorders, but many American teenagers also take it without a prescription to boost academic performance, or for pleasure. When Yong Kim and his colleagues at the Rockefeller University in New York gave mice the drug for a fortnight, a greater number of dendritic spines formed on neurons in the nucleus accumbens, a brain region stimulated by all addictive drugs (Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.0813179106). "These changes in neuronal structure and brain chemistry are known to be associated with the process of drug addiction," warns Kim. The finding is backed up by previous studies that found signs of addiction in recreational users. In contrast, hyperactive children prescribed the drug don't usually show signs of addiction. © Copyright Reed Business Information Ltd
Keyword: ADHD; Drug Abuse
Link ID: 12539 - Posted: 06.24.2010
Scientists have perfected a highly sensitive test to detect vCJD-causing proteins on surgical instruments. The test, which picks up the presence of prions on metal surfaces quickly and accurately, could help show whether decontamination processes are working. Although there is no recorded case of a patient developing vCJD after surgery, experts say it is possible. The test has been developed by the Medical Research Council Prion Unit at University College London. Details are featured in Proceedings of the National Academy of Sciences. As well as causing vCJD, prions are also responsible for a disease called kuru in humans, BSE in cattle and scrapie in sheep. They are known to be able to survive conventional hospital sterilisation methods. Professor John Collinge, director of the MRC Prion Unit, said: "The presence of prions in blood and body tissues beyond the brain make many surgical and dental procedures a potential risk factor for transmission of prion diseases. Research has found that prions can withstand many sterilisation techniques, are very sticky and, when attached to a metal surface like a surgical instrument, are even more resistant to both chemical and heat treatments." The new test is much faster, and 100 times more sensitive than the existing test, which involves injecting samples of suspect tissue into the brain of a mouse or hamster, and waiting for the animal to develop symptoms of disease. It also makes it possible to test many samples at once at a relatively low cost. The new test uses steel wires to enhance the sensitivity of a standard cell-based prion detection test called SCEPA (scrapie cell endpoint assay). The prions present even in a very dilute sample bind tightly to the surface of the steel wires. The wires are then covered with special cells that are very susceptible to prion infection. © BBC
Keyword: Prions
Link ID: 12538 - Posted: 02.07.2009
Alison Abbott Researchers in Sweden have revealed a surprising change in brain biochemistry that occurs during the training of working memory, a buffer that stores information for the few seconds required to solve problems or even to understand what we are reading. The discovery may have implications for understanding disorders in which working memory is deficient — such as schizophrenia and attention deficit hyperactivity disorder (ADHD). Working memory depends on the transmission of signals in certain parts of the brain by the chemical dopamine and one of its receptors, the D1 receptor, particularly in the parietal and frontal regions of the cortex. The efficiency of working memory drops off as people age. But the 'use-it-or-lose-it' adage holds true — working memory can be improved through training. Torkel Klingberg, a neurologist at the Karolinska Institute in Stockholm, and his colleagues studied what happened to D1 receptors in the brains of healthy young men during such training. In particular, the researchers wanted to see whether the density of the receptors changes, because when dopamine is plentiful, dopamine receptors 'downregulate' — that is, move from the nerve-cell membrane to the inside of the cell, where they cannot be activated. This is a normal 'tune-down' mechanism to avoid overstimulation. © 2009 Nature Publishing Group
Keyword: Schizophrenia
Link ID: 12537 - Posted: 06.24.2010
By NICHOLAS BAKALAR Lengthy television viewing in adolescence may raise the risk for depression in young adulthood, according to a new report. The study, published in the February issue of The Archives of General Psychiatry, found a rising risk of depressive symptoms with increasing hours spent watching television. There was no association of depression with exposure to computer games, videocassettes or radio. Researchers used data from a larger analysis of 4,142 adolescents who were not depressed at the start of the study. After seven years of follow-up, more than 7 percent had symptoms of depression. But while about 6 percent of those who watched under three hours a day were depressed, more than 17 percent of those who watched more than nine hours a day had depressive symptoms. The association was stronger in boys than in girls, and it held after adjusting for age, race, socioeconomic status and educational level. “We really don’t know what it was specifically about TV exposure that was associated with depression, whether it was a particular kind of programming or some contextual factor such as watching alone or with other people,” said Dr. Brian Primack, the lead author and an assistant professor of medicine at the University of Pittsburgh. “Therefore, I would be uneasy to make any blanket recommendations based on this one study.” Copyright 2009 The New York Times Company
Keyword: Depression; Development of the Brain
Link ID: 12536 - Posted: 06.24.2010
By Melinda Wenner Suicide rates in the U.S. have increased for the first time in a decade, according to a report published in October by the Johns Hopkins Bloomberg School of Public Health. But what leads a person to commit suicide? Three new studies suggest that the neurological changes in the brain of a suicide victim differ markedly from those in other brains and that these changes develop over the course of a lifetime. The most common pathway to suicide is through depression, which afflicts two thirds of all people who kill themselves. In October researchers in Canada found that the depressed who commit suicide have an abnormal distribution of receptors for the chemical GABA, one of the most abundant neurotransmitters in the brain. GABA's role is to inhibit neuron activity. "If you think about the gas pedal and brakes on a car, GABA is the brakes," explains co-author Michael Poulter, a neuroscientist at the Robarts Research Institute at the University of Western Ontario. Poulter and his colleagues found that one of the thousands of types of receptors for GABA is underrepresented in the frontopolar cortex of people with major depressive disorder who have committed suicide as compared with nondepressed people who died of other causes. The frontopolar cortex is involved in higher-order thinking, such as decision making. The scientists do not yet know how this abnormality leads to the type of major depression that makes someone suicidal, but "anything that disturbs that system would be predicted to have some sort of important outcome," Poulter says. © 1996-2009 Scientific American Inc.
Keyword: Depression; Aggression
Link ID: 12535 - Posted: 06.24.2010
LA JOLLA, CA—"Remember when…?" is how many a wistful trip down memory lane begins. But just how the brain keeps tabs on what happened and when is still a matter of speculation. A computational model developed by scientists at the Salk Institute for Biological Studies now suggests that newborn brain cells—generated by the thousands each day—add a time-related code, which is unique to memories formed around the same time. "By labeling contemporary events as similar, new neurons allow us to recall events from a certain period," speculates Fred H. Gage, Ph.D., a professor in the Laboratory for Genetics, who led the study published in the Jan. 29, 2009, issue of the journal Neuron. Unlike the kind of time stamp found on digital photographs, however, the neuronal time code only provides relative time. Ironically, Gage and his team had not set out to explain how the brain stores temporal information. Instead they were interested in why adult brains continually spawn new brain cells in the dentate gyrus, the entryway to the hippocampus. The hippocampus, a small seahorse-shaped area of the brain, distributes memory to appropriate storage sections in the brain after readying the information for efficient recall. "At least one percent of all cells in the dentate gyrus are immature at any given time," explains lead author Brad Aimone, a graduate student in the Computational Neuroscience Program at the University of California, San Diego. "Intuitively we feel that those new brain cells have to be good for something, but nobody really knows what it is." © 2009 Eureka! Science News
Keyword: Learning & Memory; Neurogenesis
Link ID: 12534 - Posted: 06.24.2010
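The "time stamp" idea lends itself to a toy illustration (this is my sketch of the general principle, not the Salk model itself; all cell counts and names are invented): if every memory recruits a random set of mature cells plus whatever cohort of immature neurons happens to exist that week, then memories formed in the same window share that cohort, and their codes overlap more than memories from different windows.

```python
import random

# Toy sketch of the time-stamp idea: each memory recruits random mature
# cells plus the entire cohort of immature neurons present at encoding.
# All parameters are invented for illustration.
MATURE_CELLS = 1000        # pool of mature dentate gyrus cells
COHORT_SIZE = 30           # immature neurons alive during a given epoch
CELLS_PER_MEMORY = 50      # mature cells recruited per memory

def encode(epoch):
    mature = set(random.sample(range(MATURE_CELLS), CELLS_PER_MEMORY))
    # immature cells are broadly excitable, so the whole current cohort joins
    cohort = {f"young-{epoch}-{i}" for i in range(COHORT_SIZE)}
    return mature | cohort

def similarity(a, b):
    return len(a & b) / len(a | b)   # Jaccard overlap of the two codes

m1, m2 = encode(epoch=0), encode(epoch=0)   # formed in the same window
m3 = encode(epoch=5)                        # formed months later
print(f"same epoch:      {similarity(m1, m2):.2f}")   # ~0.25
print(f"different epoch: {similarity(m1, m3):.2f}")   # ~0.02
```

Any downstream readout that judges relatedness by overlap would then group same-epoch events together, which is the relative (not absolute) time code the researchers describe.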
By Charles Q. Choi Teens are notoriously self-conscious. Now brain-imaging experiments are revealing how this adolescent predilection might be the result of changes in brain anatomy linked with the self, and the findings may hint at how the sense of self develops in the brain. One way we build a sense of self is by reflecting on how others perceive us, a concept psychologists have dubbed “the looking-glass self.” To see how teenagers reacted to what other people thought of them, researchers asked adolescent girls ages 10 to 18 to imagine a variety of scenarios involving onlookers that were designed to evoke social emotions such as guilt or embarrassment—for example, “You were quietly picking your nose, but your friend saw you.” Cognitive neuroscientist Sarah-Jayne Blakemore of University College London and her colleagues found that when compared with scenarios describing basic emotions that did not involve the opinions of others, such as fear and disgust, girls who thought about onlookers’ opinions engaged a brain region known as the dorsal medial prefrontal cortex (MPFC) more during social emotional scenarios than adult women did. This area is one of the last regions to develop before adulthood, and it is known to activate in adults when they think about themselves, about other people and even about the personality traits of animals. It makes evolutionary sense for teenagers to be highly concerned about what others think, Blakemore suggests. Adolescence requires becoming more independent because one’s parents might not be around much longer. © 1996-2009 Scientific American Inc.
Keyword: Development of the Brain
Link ID: 12533 - Posted: 06.24.2010