Most Recent Links



Links 23341 - 23360 of 29369

SAN JUAN, Puerto Rico -- The inability to identify the smell of lemons, lilac, leather and seven other odors predicts which patients with minimal to mild cognitive impairment (MMCI) will develop Alzheimer's disease, according to a study presented today at the American College of Neuropsychopharmacology (ACNP) annual meeting. For patients with MMCI, the odor identification test was found to be a strong predictor of Alzheimer's disease during follow-up, and compared favorably with reduction in brain volumes on MRI scans and memory test performance as potential predictors. "Early diagnosis of Alzheimer's disease is critical for patients and their families to receive the most beneficial treatment and medications," says lead researcher D.P. Devanand, MD, Professor of Clinical Psychiatry and Neurology at Columbia University and Co-Director of the Memory Disorders Center at the New York State Psychiatric Institute. "While currently there is no cure for the disease, early diagnosis and treatment can help patients and their families to better plan their lives." Smell identification test results from Alzheimer's disease patients, MMCI patients and healthy elderly subjects were analyzed to select an optimal subset of fragrances that distinguished Alzheimer's patients and MMCI patients who developed the disease from healthy subjects and MMCI patients who did not develop Alzheimer's.

Keyword: Alzheimers; Chemical Senses (Smell & Taste)
Link ID: 6577 - Posted: 12.14.2004

The drug methylphenidate (brand name Ritalin) increased activity in the brains of children with attention-deficit/hyperactivity disorder (ADHD) as well as those with a reading disorder, researchers at Yale report in the American Journal of Psychiatry. "During a test of divided attention, Ritalin increased activation in the basal ganglia, a structure of the brain involved in cognition and behavior," said first author Keith Shafritz, former graduate student in the interdepartmental Neuroscience Program at Yale and now a research associate at Duke University Medical Center. "We saw this activation in children with ADHD and those with reading disorder." The study used functional magnetic resonance imaging (fMRI) to analyze the effect of the drug on brain function. Researchers found that adolescents with ADHD or reading disorder who were on placebo (not medicated) had less activation of the basal ganglia than a group of healthy participants. When the same participants received Ritalin, the drug normalized the activation, which relates to the amount of blood flow to a specific brain region in response to a cognitive task. ADHD is characterized by inattention, but previous neuroimaging studies have mostly examined the brain dysfunction associated with impulsivity. "This is one of few studies that used a test for attention rather than a cognitive test for impulsivity," said Shafritz. "It is also the first study using fMRI to find that the attention circuitry in the brain is directly affected by ADHD."

Keyword: ADHD; Dyslexia
Link ID: 6576 - Posted: 12.14.2004

Helen Pearson By transforming the features of Margaret Thatcher into those of Marilyn Monroe, researchers have revealed hints about how our brains put a name to a face. Neuroscientists already know that certain spots in the brain play a vital role in recognising a familiar face, even as it changes with age or a new hairstyle. But they have not been clear precisely what each area does. Using mugshots of celebrities, Pia Rotshtein at University College London and her colleagues have shown that there are at least three separate areas for processing and recognising faces. One processes the physical features of the face, one decides whether or not the face is known, and a third retrieves information about that person, such as their name. Rotshtein's team used a computer to create a series of images in which the countenance of film star Marilyn Monroe gradually morphed into that of former British prime minister Margaret Thatcher, or in which the face of James Bond actor Pierce Brosnan transformed into that of current prime minister Tony Blair. Although the physical features gradually change from one face into another, the researchers showed that subjects looking at the images tend to "suddenly flip" from seeing Marilyn to seeing Maggie, explains team member Jon Driver. ©2004 Nature Publishing Group

Keyword: Vision
Link ID: 6575 - Posted: 06.24.2010

By Rowan Hooper In 1995, the Supreme Court of Georgia heard a lawyer make a novel argument. He had read a study describing violent behavior shared by several generations of men in a Dutch family. Scientists had identified a mutated gene shared by all the violent men, and that's what got the lawyer's brain ticking. The accused, argued the lawyer, might carry a gene -- like the men in the Dutch family -- that predisposed him to violence. (The lawyer's client was on trial for murder.) Therefore, went the argument, the accused did not have free will, was innocent of the murder and should be acquitted. The defense, an attempt at legal trickery remarkable even for a lawyer, failed. However, scientific discoveries, particularly advances in neuroscience, are nevertheless having profound consequences for legal procedure. For example, the insanity plea in the United States currently requires that the accused does not know, because of mental illness, that he did wrong. The insanity plea derives from the M'Naghten rule, a case from English law. In 1843, a man named Daniel M'Naghten attempted to assassinate the British prime minister; at his trial, he was found to be insane and the trial was abandoned. From that point on, lawyers saw the power of mounting an insanity defense, and many such claims were made. © Copyright 2004, Lycos, Inc.

Keyword: Aggression
Link ID: 6574 - Posted: 06.24.2010

The brain goes through three separate stages to decide if it recognises a face, scientists claim. A team from University College London says the first assesses a face's physical aspects. The second decides if it is known or unknown. If it is a recognisable face, the third part puts a name to it. The researchers say their study, published in Nature Neuroscience, could help those people with dementia who lose their ability to recognise faces. The researchers say analysing how we respond to the stages of "morphing" a recognisable figure such as Margaret Thatcher into Marilyn Monroe gives clues as to how we process the facial features we see. Their study found the brain tries to pin a single identity on a face, even if it looks like a mix of two people. A face that was 60% Marilyn Monroe and 40% Margaret Thatcher will be identified as an older version of Marilyn Monroe. But an image which is 40% Marilyn and 60% Maggie will be seen as the "sexier" side of Margaret Thatcher, say the researchers. In the study, volunteers were shown morphed faces and asked to identify each one. The researchers then used fMRI (functional Magnetic Resonance Imaging) scans to monitor brain activity. The inferior occipital gyri at the back of the brain were found to be particularly sensitive to slight physical changes, such as wrinkles, in the faces. (C)BBC
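At its simplest, the morphing described above is a per-pixel weighted average of two aligned images, and the "single identity" finding amounts to a categorical threshold on the blend weight. A minimal sketch with NumPy, using random arrays as stand-ins for aligned grayscale face photos (real face morphs also warp geometry between facial landmarks, not just blend pixels):

```python
import numpy as np

def morph(face_a, face_b, weight_a):
    """Per-pixel weighted blend: weight_a of face_a, the rest of face_b."""
    return weight_a * face_a + (1.0 - weight_a) * face_b

def perceived_identity(weight_a, name_a, name_b):
    """Categorical perception: identity snaps to whichever face dominates."""
    return name_a if weight_a > 0.5 else name_b

rng = np.random.default_rng(0)
marilyn = rng.random((64, 64))  # stand-in for an aligned face image
maggie = rng.random((64, 64))

blend_60_40 = morph(marilyn, maggie, 0.6)  # 60% Marilyn, 40% Maggie

print(perceived_identity(0.6, "Marilyn", "Maggie"))  # Marilyn
print(perceived_identity(0.4, "Marilyn", "Maggie"))  # Maggie
```

The point of the threshold function is the study's claim: perception does not report a graded in-between but flips to one identity or the other.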

Keyword: Miscellaneous
Link ID: 6573 - Posted: 12.13.2004

Scientists at Bangor University have discovered a patient who appears to possess a "sixth sense" that allows him to recognise sad faces. The 52-year-old "patient X" suffered two strokes which damaged the brain areas which process visual signals. Although the patient cannot see, researchers found that he was able to identify angry or happy human faces. Scans showed that when the man looked at faces with emotion, another part of his brain, the amygdala, was activated. The small almond-shaped structure is known to respond to non-verbal signs of anger, defensiveness, avoidance and fear. The results of the study are published in the online edition of Nature Neuroscience on Sunday. Dr Alan Pegna of the school of psychology at the University of Wales, Bangor, led the research team with colleagues in north Wales and Geneva University Hospital. They found that when "patient X" was shown images of shapes like circles and squares, he was only able to make wild guesses about what they were. Nor was he able to identify the sex of "deadpan" male and female faces with any degree of success, or tell the difference between "normal" and jumbled faces. But when the patient was asked to identify angry or happy human faces, he did so with an accuracy of 59% - significantly higher than would be expected by chance. (C)BBC

Keyword: Vision; Emotions
Link ID: 6572 - Posted: 12.13.2004

UCSF scientists have found that the brains of rats can be trained to learn an alternate way of processing changes in the loudness of sound. The discovery, they say, has potential for the treatment of hearing loss, autism, and other sensory disabilities in humans. It also gives clues, they say, about the process of learning and the way we perceive the world. "We addressed a very fundamental question," says Daniel B. Polley, PhD, lead author of the study. "When we notice a sound getting louder, what happens in our brain so that we know it's getting louder?" Polley is a postdoctoral research fellow in the laboratory of senior author Michael M. Merzenich, PhD, co-director of the Coleman Memorial Laboratory in the UCSF Keck Center for Integrative Neuroscience and UCSF professor of otolaryngology. The study was published recently in Proceedings of the National Academy of Sciences (November 16, 2004). "This is a very old idea," Polley notes. "How to relate the bigness of a stimulus to the bigness of its internal representation in the brain." Over the centuries, philosophers and scientists have put together a picture of how our brains model the world through the mechanism of our senses. Physical stimuli such as light, sound, and touch are converted by our sensory organs -- eyes, ears, and skin -- into electrical signals, which are processed by neurons in different areas of the brain. As those neurons fire, we see, hear, and feel. When the light or sound changes in intensity, our neurons fire faster or slower in direct ratio to the change. That ratio varies depending on the sense involved, but is constant for each sense: the louder a sound, the faster the neurons in the auditory cortex fire.
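The fixed-ratio rate code described in the last sentence can be sketched as a toy linear model. All numbers below are invented for illustration; the study does not give these values:

```python
def firing_rate(intensity, baseline=5.0, gain=1.5):
    """Toy linear rate code: firing rate (Hz) rises in direct ratio to
    stimulus intensity. The gain (the fixed 'ratio' for this sense) and
    baseline are made-up illustrative numbers."""
    return baseline + gain * intensity

# A fixed gain means equal steps in loudness produce equal steps in rate.
step_small = firing_rate(30) - firing_rate(20)  # 15.0
step_large = firing_rate(40) - firing_rate(20)  # 30.0
print(step_small, step_large)
```

What the study found interesting is precisely that this mapping is not fixed: the rats' auditory cortex could be trained to adopt an alternate one.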

Keyword: Hearing
Link ID: 6571 - Posted: 12.13.2004

James Owen in London for National Geographic News Anyone who has watched crows, jays, ravens and other members of the corvid family will know they're anything but "birdbrained." For instance, jays will sit on ant nests, allowing the angry insects to douse them with formic acid, a natural pesticide which helps rid the birds of parasites. Urban-living carrion crows have learned to use road traffic for cracking tough nuts. They do this at traffic light crossings, waiting patiently with human pedestrians for a red light before retrieving their prize. Yet corvids may be even cleverer than we think. A new study suggests their cognitive abilities are a match for primates such as chimpanzees and gorillas. Furthermore, crows may provide clues to understanding human intelligence. Published tomorrow in the journal Science, the study is co-authored by Nathan Emery and Nicola Clayton, from the departments of animal behavior and experimental psychology at Cambridge University, England. They say that, while having very different brain structures, both crows and primates use a combination of mental tools, including imagination and the anticipation of possible future events, to solve similar problems. They base their argument on existing studies. Emery and Clayton write, "These studies have found that some corvids are not only superior in intelligence to birds of other avian species (perhaps with the exception of some parrots), but also rival many nonhuman primates." © 2004 National Geographic Society.

Keyword: Evolution; Intelligence
Link ID: 6570 - Posted: 06.24.2010

San Juan, Puerto Rico – A new study conducted in rats by the National Institutes of Health (NIH) and McLean Hospital/Harvard Medical School suggests that the misdiagnosis of attention-deficit hyperactivity disorder (ADHD) combined with prescription drug use in children may lead to a higher risk of developing depressive symptoms in adulthood. This work, released at the annual American College of Neuropsychopharmacology (ACNP) conference in Puerto Rico, is among the first to examine the effects of early Ritalin exposure in rats on behavior and brain function during the later periods of life. "Attention-deficit hyperactivity disorder can be a serious medical problem for children and their parents," says lead researcher William Carlezon, Ph.D., director of McLean Hospital's Behavioral Genetics Laboratory and associate professor of psychiatry at Harvard Medical School. "While Ritalin is an effective medication that improves the quality of life for many children with ADHD, accurately diagnosing and identifying the correct treatment regimen for the disorder is essential, especially when considering health effects that can last through adulthood." Ritalin (a brand name for methylphenidate) is a medication prescribed for children with ADHD, a condition that consists of a persistent pattern of abnormally high levels of activity, impulsivity, and/or inattention. Usually diagnosed in children of preschool or elementary school age, ADHD has been estimated to affect 3 to 12 percent of children and is twice as common among boys. Children with ADHD are also likely to have other disorders, such as a learning disability, oppositional defiant disorder, conduct disorder, depression, or anxiety.

Keyword: ADHD
Link ID: 6569 - Posted: 06.24.2010

The language network of the brain seemed simpler in the past. One brain area was recognized to be critical for the production of language, another for its comprehension. A dense bundle of nerve fibers connected the two. But there have always been naysayers who pointed to evidence that failed to fit this tidy picture. Now a study employing a powerful variant of magnetic resonance imaging (MRI) confirms these suspicions. The study will be published December 13, 2004 in the online edition of Annals of Neurology (http://www.interscience.wiley.com/journal/ana). "We were surprised that the two classical language areas were densely connected to a third area, whose presence had already been suspected but whose connections with the classical network were unknown," said lead author Marco Catani, M.D., of the Institute of Psychiatry at King's College London. The authors dubbed this language area "Geschwind's territory" in honor of the American neurologist Norman Geschwind who championed its linguistic significance decades ago. Language is generated and understood in the cortex, the outermost covering of the brain. Paul Broca and Carl Wernicke, 19th Century neurologists, noted that damage to specific cortical areas, which came to bear their names, produced primarily language production or language processing disorders, but not both. A large bundle of nerve fibers was found to connect Broca's and Wernicke's areas, and damage to this pathway also produced language disorders, or aphasias.

Keyword: Language
Link ID: 6568 - Posted: 12.13.2004

If anyone doubted that drug use can damage the brain, new studies using brain scans show methamphetamine abusers' brains have damage similar to dementia, as well as considerable brain inflammation. This ScienCentral news video has more. The poetry in Lee L.'s voice as he describes his great love is hypnotic. "I loved Crystal," he gushes. Lee's not speaking of a person, but methamphetamine, known by its street name, Crystal Meth. "It gave me a sense of power. It made me feel hungry. It made me feel sexual. It made me feel virile. It was like all of the switches in my body and in my brain felt like they finally got turned on." Lee—a 42-year-old composer who asked that his last name not be used in keeping with his involvement in the twelve-step program, New York Crystal Meth Anonymous—hunted down the drug as the days dragged between runs, even though he knew it was doing considerable bodily damage. "The physical body collapses a little every time, certainly in my case, every time that I used," he recalls. "The reward that it got was it hit upon a pleasure center in the brain." © ScienCentral, 2000- 2004.

Keyword: Drug Abuse
Link ID: 6567 - Posted: 06.24.2010

Bruce Bower Here's a discovery worth toasting: Chemical analyses of pottery fragments from a prehistoric village in northern China indicate that people living there between 8,000 and 9,000 years ago concocted a fermented, winelike drink from rice, honey, and fruit. That's the oldest known evidence of an intoxicating beverage, says archaeological chemist Patrick E. McGovern of the University of Pennsylvania Museum of Archaeology and Anthropology in Philadelphia. He led the international team that scrutinized the ancient pottery. Until now, the earliest chemical evidence of wine came from Iranian jars from about 7,400 years ago. Middle Eastern beer-brewing sites date to roughly 5,000 years ago (SN: 10/2/04, p. 216: http://www.sciencenews.org/articles/20041002/bob8.asp). The new results are the latest hints that modern civilizations developed in parallel in eastern Asia and the Middle East, starting around 10,000 years ago, according to McGovern. "The domestication of plants, construction of complex villages, and production of fermented drinks began at the same time in both regions," he says. Copyright ©2004 Science Service.

Keyword: Drug Abuse
Link ID: 6566 - Posted: 06.24.2010

Roxanne Khamsi Is appreciation of music a uniquely human trait, or does any animal with decent hearing prefer pleasant combinations of notes? Cognitive scientists have discovered that tamarin monkeys have no taste for the consonant tones that mostly make up music, suggesting that musicality may be restricted to humans. Consonant tones are combinations of sound waves whose wavelengths are simple multiples of each other. The sounds overlap to create a smooth waveform that is pleasing to our ears. But dissonant sounds are produced when the wavelengths are very slightly different, so the two waves come in and out of phase, creating an unpleasant, jarring noise. For years, scientists have sought to explain why we prefer consonant sounds to dissonant ones. One theory is that our dislike of dissonance is related to the sensation of 'beats' that occur when the notes interfere. Previous research has shown that macaque monkeys and songbirds can tell the difference between consonant and dissonant sounds. But the question of whether or not animals actually prefer consonant tones had remained unanswered until now. ©2004 Nature Publishing Group
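The beating the article describes falls directly out of summing two sine waves of nearby frequencies: the pair drifts in and out of phase, and the loudness pulses at the difference frequency. A sketch with NumPy (tone frequencies chosen arbitrarily for illustration):

```python
import numpy as np

sample_rate = 8000  # samples per second
t = np.arange(0, 1.0, 1.0 / sample_rate)  # one second of time points

# Dissonant pair: two tones only 4 Hz apart slide in and out of phase,
# so the summed waveform pulses ("beats") 4 times per second.
f1, f2 = 440.0, 444.0
dissonant = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
beat_frequency = abs(f2 - f1)  # 4.0 Hz

# Consonant pair: an octave (a simple 2:1 ratio) stays locked in step,
# giving a smooth periodic waveform with no slow beat envelope.
consonant = np.sin(2 * np.pi * 440.0 * t) + np.sin(2 * np.pi * 880.0 * t)

print(beat_frequency)  # 4.0
```

Halfway through a beat cycle the two tones are in antiphase and cancel almost completely, which is the momentary near-silence heard between beats.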

Keyword: Hearing; Evolution
Link ID: 6565 - Posted: 06.24.2010

Cell replacement therapy offers a novel and powerful medical technology. Maya Sieber-Blum, Ph.D., of the Medical College of Wisconsin, Milos Grim, M.D., Ph.D., of Charles University Prague, and their collaborators recently discovered a type of embryonic stem cell, called a neural crest stem cell, that persists into adulthood in hair follicles. The discovery – reported recently in Developmental Dynamics, a journal of the American Association of Anatomists published by John Wiley & Sons, Inc. – may in many instances provide a non-controversial substitute for embryonic stem cells. Embryonic stem cells are unique, because they can differentiate into any cell type of the body. Their use, however, raises ethical concerns because embryos are being destroyed in the process. In contrast, neural crest stem cells from adults have several advantages: similar to embryonic stem cells, they have the innate ability to differentiate into many diverse cell types; they are easily accessible in the skin of adults; and the patient's own neural crest stem cells could be used for cell therapy. The latter avoids both rejection of the implant and graft-versus-host disease. Studies in the mouse showed that neural crest stem cells from adult hair follicles are able to differentiate into neurons, nerve supporting cells, cartilage/bone cells, smooth muscle cells, and pigment cells. Preliminary data indicate that equivalent stem cells reside in human hair follicles.

Keyword: Stem Cells
Link ID: 6564 - Posted: 06.24.2010

By JANE GROSS RADELL, N.J. - When Mark Plage, 15, forgets to padlock the door of his bedroom, his 13-year-old autistic brother, Derek, barges in and leaves the place a shambles. When Mark tries to toss a football with Derek, the boy turns his back and walks away. Mark's mother, by her own admission, used to scream at him for the smallest thing, unable to contain her frustration with Derek. Mark often wished she would come to his ice hockey games with his father. But Debi Plage had to stay home with her disabled son. Mark recounts these experiences without reproach and with insight well beyond his years. When Derek "messes something up," Mark said, "I just fix it." As for his brother's inability to play, he said, "I know that it's not that he won't do it, but that he can't." His mother's rages were "harder to deal with," Mark said, but "after a while I realized she wasn't really yelling at me." He can even brush aside her occasional threats to leave home and never come back. "I knew in the back of my mind she'd never do it," Mark said. "She was just saying stuff because she was really upset." Copyright 2004 The New York Times Company

Keyword: Autism
Link ID: 6563 - Posted: 12.10.2004

DURHAM, N.C. – A newly discovered genetic defect might represent an important risk factor for major depression, a condition which affects 20 million people in the U.S., according to Duke University Medical Center researchers. The mutation in the gene -- whose protein product plays a primary role in synthesizing the brain chemical serotonin -- could lead to the first diagnostic test for genetic predisposition to depression, the team said. "Abnormalities in brain levels of serotonin have been widely suspected as a key contributor to major depression and other neuropsychiatric disorders," said James B. Duke professor Marc Caron, Ph.D., a researcher in the department of cell biology, the Duke Institute for Genome Sciences and Policy and senior author of the study. "Our findings provide a novel molecular mechanism underlying dysfunction in serotonin neurotransmission in some patients with depression." The genetic defect is the first genetic variant of functional consequence in the production of serotonin identified in any psychiatric disorder, the researchers said. Patients with depression who carry the abnormal gene also show resistance to treatment with selective serotonin reuptake inhibitors (SSRIs), a class of drugs that includes paroxetine (PaxilTM), sertraline (ZoloftTM), and fluoxetine (ProzacTM), the team found. In addition to its diagnostic use, the genetic marker might therefore also aid in identifying, in advance, those patients who will likely fail to respond well to SSRI therapy.

Keyword: Depression; Genes & Behavior
Link ID: 6562 - Posted: 06.24.2010

ANN ARBOR, Mich.---Men are more likely to want to marry women who are their assistants at work rather than their colleagues or bosses, a University of Michigan study finds. The study, published in the current issue of Evolution and Human Behavior, highlights the importance of relational dominance in mate selection and discusses the evolutionary utility of male concerns about mating with dominant females. "These findings provide empirical support for the widespread belief that powerful women are at a disadvantage in the marriage market because men may prefer to marry less accomplished women," said Stephanie Brown, lead author of the study and a social psychologist at the U-M Institute for Social Research (ISR). For the study, supported in part by a grant from the National Institute of Mental Health, Brown and co-author Brian Lewis from UCLA tested 120 male and 208 female undergraduates by asking them to rate their attraction and desire to affiliate with a man and a woman they were said to know from work.

Keyword: Sexual Behavior; Evolution
Link ID: 6561 - Posted: 12.10.2004

Michael Hopkin Capuchins in the dry forests of northeastern Brazil have an unusual approach to food: they have been caught using tools to dig up tubers, a feat previously only seen in humans. "They're using their minds, not just brute force," claims Phyllis Lee of the University of Cambridge, UK, who reports the discovery with her colleague Antonio Moura in this week's Science. Although many primates, particularly chimpanzees and orang-utans, are thought to be good at reasoning things out for themselves, digging for food has never been seen before, in the wild or in captivity. Several species are known to use 'tools', such as the birds of prey that dash their hard-shelled prey on to rocks to crack them open. But the latest case of tool use differs from many of these examples because it may be based on an understanding of cause and effect. ©2004 Nature Publishing Group

Keyword: Evolution
Link ID: 6560 - Posted: 06.24.2010

Roxanne Khamsi Studies showing that magnetic stimulation of the brain induces spiritual experiences are being queried by researchers who cannot reproduce key results. If the traditional theory is wrong, scientists will be left struggling to explain how such thoughts and sensations are generated. In the past, scientists have claimed that religious or out-of-body experiences result from excessive bursts of electrical activity in the brain. In the 1980s, Michael Persinger, a neuroscientist at the Laurentian University in Ontario, Canada, began exploring this idea through a series of experiments. Participants wore helmets that targeted their temporal lobes with weak magnetic fields, of roughly the same strength as those generated by a computer monitor. Persinger found that this caused 80% of the people he tested to feel an unexplained presence in the room. Persinger suggested that magnetism causes bursts of electrical activity in the temporal lobes of the brain, and he linked this to the spiritual experiences. A group of Swedish researchers has now repeated the work, but they say their study involves one crucial difference. They ensured that neither the participants nor the experimenters interacting with them had any idea who was being exposed to the magnetic fields, a 'double-blind' protocol. ©2004 Nature Publishing Group

Keyword: Miscellaneous
Link ID: 6559 - Posted: 06.24.2010

More research has been published linking smoking to health risks - with a study suggesting the habit affects IQ. Researchers from the Universities of Aberdeen and Edinburgh looked at how the cognitive abilities of smokers and non-smokers changed over time. They found smokers performed significantly worse in five separate tests. The research, part of the Scottish Mental Health Survey, is published in New Scientist magazine. Around 465 people were tested on their mental abilities in 1947 when they were aged 11. They were then tested a second time between 2000 and 2002, when they reached the age of 64. On this occasion they underwent tests to evaluate their non-verbal reasoning, memory and learning, how quickly they processed information, decisions about how to act in particular circumstances and construction tasks. Current or former smokers were found to perform less well in the tests even after factors such as childhood IQ, education, occupation and alcohol consumption were taken into account. The effect appeared to be stronger in current smokers according to the study, which was also published in the journal Addictive Behaviors. The researchers suggest a "small but significant" negative effect of 4% linked to the combined effects of smoking and impaired lung function - itself linked to smoking. It has been suggested in previous studies that there could be a link between impaired lung function and a negative effect on the thinking processes, but it is not clear what the mechanism for that might be. (C)BBC

Keyword: Drug Abuse; Intelligence
Link ID: 6558 - Posted: 12.09.2004