Most Recent Links


by Catherine de Lange I TRY to forget about potential onlookers as I crawl around a central London park, blindfolded and on all fours. With a bit of luck, the casual passer-by might not notice the blindfold and think I'm just looking for a contact lens. In fact, I'm putting my sense of smell to the test, and attempting to emulate the sensory skills of a sniffer dog. Just as a beagle can swiftly hunt down a pheasant using only its nasal organ, I am using mine to follow a 10-metre trail of cinnamon oil. Such a challenge might sound doomed to failure. After all, dog noses are renowned for their sensitivity to smells, while human noses are poor by comparison. Yet that might be a misconception. According to a spate of recent studies, our noses are in fact exquisitely sensitive instruments that guide our everyday life to a surprising extent. Subtle smells can change your mood, behaviour and the choices you make, often without you even realising it. Our own scents, meanwhile, flag up emotional states such as fear or sadness to those around us. The big mystery is why we aren't aware of our nasal activity for more of the time. Noses have certainly never been at the forefront of sensory research, and were pushed aside until recently in favour of the seemingly more vital senses of vision and hearing. "There has been a lot of prejudice that people are not that influenced by olfactory stimuli, especially compared to other mammals," says Lilianne Mujica-Parodi, who studies the neurobiology of human stress at Stony Brook University in New York. © Copyright Reed Business Information Ltd.

Keyword: Chemical Senses (Smell & Taste); Emotions
Link ID: 15821 - Posted: 09.20.2011

by Lisa Grossman The key to pleasant music may be that it pleases our neurons. A new model suggests that harmonious musical intervals trigger a rhythmically consistent firing pattern in certain auditory neurons, and that sweet sounds carry more information than harsh ones. Since the time of the ancient Greeks, we have known that two tones whose frequencies are related by a simple ratio like 2:1 (an octave) or 3:2 (a perfect fifth) produce the most pleasing, or consonant, musical intervals. This effect doesn't depend on musical training – infants and even monkeys can hear the difference. But it was unclear whether consonant chords are easier on the ears because of the way the sound waves combine in the air, or the way our brains convert them to electrical impulses. A new mathematical model presents a strong case for the brain. "We have found that the reason for this difference is somewhere at the level of neurons," says Yuriy Ushakov at the N. I. Lobachevsky State University of Nizhniy Novgorod in Russia. Ushakov and colleagues considered a simple mathematical model of the way sound travels from the ear to the brain. In their model, two sensory neurons react to different tones. Each sends an electrical signal to a third neuron, called an interneuron, which sends a final signal to the brain. The model's interneuron fires when it receives input from either or both sensory neurons. © Copyright Reed Business Information Ltd.
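The ratio argument above can be illustrated with a toy calculation (not the authors' actual model; the function name, frequencies, and rounding precision here are illustrative assumptions): merge two periodic spike trains, one per simulated sensory neuron, and compare how regular the combined train is for a consonant versus a dissonant frequency ratio.

```python
def merged_spike_intervals(f1, f2, duration=1.0):
    """Merge two periodic spike trains at frequencies f1 and f2 (in Hz)
    and return the distinct inter-spike intervals of the combined train."""
    spikes = sorted(
        {round(n / f1, 6) for n in range(1, int(duration * f1) + 1)} |
        {round(n / f2, 6) for n in range(1, int(duration * f2) + 1)}
    )
    return sorted({round(b - a, 6) for a, b in zip(spikes, spikes[1:])})

# A perfect fifth (3:2) versus a tritone (45:32) above a 200 Hz tone
fifth = merged_spike_intervals(300.0, 200.0)
tritone = merged_spike_intervals(281.25, 200.0)

# The consonant pair yields far fewer distinct intervals, i.e. a more
# rhythmically consistent combined firing pattern
print(len(fifth), len(tritone))
```

On this sketch, the 3:2 merge repeats over a short period, so only a couple of distinct inter-spike intervals ever appear, while the tritone's combined pattern takes far longer to repeat and produces many different intervals, echoing the "rhythmically consistent firing" idea in the study.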

Keyword: Hearing; Emotions
Link ID: 15820 - Posted: 09.20.2011

by Marianne English Though research has shown that women are more likely than men to remember the emotional details of an event, there may be another dividing factor when it comes to memory: birth control. Scientists know people's hormones shape how their memories form. For instance, our fight-or-flight hormones influence how the brain encodes a specific memory, with traumatic events making more of an impact than everyday activities. A portion of the brain called the amygdala works on the receiving end of these hormones and is thought to play a central role in making and storing new memories. Hormonal birth control delivers synthetic forms of estrogen and progesterone that suppress the body's natural hormone cycle and prevent ovulation, but it's unclear whether these hormones affect how a person recalls an event. In one study, researchers looked at whether women taking oral contraceptives remembered events from an experiment differently than women with normal menstrual cycles not on birth control. Seventy-two female subjects were recruited for the study, half on the pill and half not. Each group watched variations of a slide show story that involved a young boy being hit by a car. Before and throughout the slide show, researchers collected saliva samples to measure alpha-amylase -- a chemical that signifies a drop or rise in the fight-or-flight hormone norepinephrine, which increases a person's heart rate during emergencies or stressful situations. © 2011 Discovery Communications, LLC.

Keyword: Hormones & Behavior; Learning & Memory
Link ID: 15819 - Posted: 09.20.2011

By Janet Raloff In obese people, even when the brain knows the body isn’t hungry, it responds to food as if it were, new brain-scan data show. That means that when obese people try to shed weight, they may find themselves on the losing side of a battle with neural centers that unconsciously encourage them to eat. For instance, in normal-weight people a neural reward system that reinforces positive feelings associated with food turns off when levels of the blood sugar glucose return to normal after a meal — a signal that the body’s need for calories has been sated. But in obese people, that reward center in the central brain turns on at the sight of high-calorie food even when their blood sugar levels are normal. The new findings show that “the regulatory role of glucose was missing in the obese,” says Elissa Epel of the University of California, San Francisco, an obesity researcher not involved with the new study. She says the data might “explain the drive to eat that some obese people feel despite how much they’ve eaten.” For the study, nine lean and five obese adult volunteers viewed pictures of foods such as ice cream, french fries, cauliflower or a salad while undergoing brain scans. Throughout the procedure, researchers asked the recruits to rate their hunger and how much they wanted a particular item. Volunteers arrived for their brain scans several hours after eating, and the researchers used insulin pumps to establish volunteers’ blood sugar levels at either normal background values (roughly 90 milligrams per deciliter), or at the “mild” end of low (around 70 milligrams per deciliter). That low value can occur briefly in some people during the day, especially in people with diabetes or metabolic conditions that precede diabetes, notes endocrinologist Robert Sherwin of Yale University, coauthor of the new study. © Society for Science & the Public 2000 - 2011

Keyword: Obesity
Link ID: 15818 - Posted: 09.20.2011

By RONI CARYN RABIN “Can I draw something for you — what should I draw?” Lonni Sue Johnson asked, but she didn’t wait for an answer. She drew a squiggly line that became a curly halo of hair around the cheerful face of a seated man stretching one leg upward, balancing a large bird on his foot. Within minutes, she had added a cat wearing a necklace, stars and a tiny, grinning airplane. “I like this part, because you want people to be happy,” she said, beaming. “Every sheet of paper is a treat.” Ms. Johnson, 61, is an artist and illustrator whose playful, bright-hued and often complex work has appeared in a wide array of publications, from the cover of The New Yorker to children’s books to murder mysteries to The New York Times — even a physics textbook. All that changed in December 2007, when she was stricken with viral encephalitis, a life-threatening disease that did severe damage to parts of her brain — including the hippocampus, where new memories are formed. She survived, but remembered little about her life before the illness. Yet she is still able to make art, though it is simpler and more childlike than her professional work. Her case is rare, experts say, because few accomplished artists continue to create after sustaining severe brain damage. © 2011 The New York Times Company

Keyword: Learning & Memory
Link ID: 15817 - Posted: 09.20.2011

By ERIK OLSEN OFF THE BAHAMAS — In a remote patch of turquoise sea, Denise L. Herzing splashes into the water with a pod of 15 Atlantic spotted dolphins. For the next 45 minutes, she engages the curious creatures in a game of keep-away, using a piece of Sargassum seaweed like a dog’s chew toy. Dr. Herzing is no tourist cavorting with marine mammals. As the world’s leading authority on the species, she has been studying the dolphins for 25 years as part of the Wild Dolphin Project, the longest-running underwater study of its kind. “I’m kind of an old-school naturalist,” she said. “I really believe in immersing yourself in the environment of the animal.” Immerse herself she has. Based in Jupiter, Fla., she has tracked three generations of dolphins in this area. She knows every animal by name, along with individual personalities and life histories. She has captured much of their lives on video, which she is using to build a growing database. And next year Dr. Herzing plans to begin a new phase of her research, something she says has been a lifetime goal: real-time two-way communication, in which dolphins take the initiative to interact with humans. Up to now, dolphins have shown themselves to be adept at responding to human prompts, with food as a reward for performing a task. “It’s rare that we ask dolphins to seek something from us,” Dr. Herzing said. © 2011 The New York Times Company

Keyword: Animal Communication; Evolution
Link ID: 15816 - Posted: 09.20.2011

People with schizophrenia are six times more likely to develop epilepsy, says a study which finds a strong relationship between the two diseases. Writing in Epilepsia, researchers in Taiwan say this could be due to genetic, neurobiological or environmental factors. The study followed around 16,000 patients with epilepsy and schizophrenia between 1999 and 2008. An epilepsy expert says it is an interesting and convincing study. The study used data from the Taiwan National Health Insurance database and was led by researchers from the China Medical University Hospital in Taichung. They identified 5,195 patients with schizophrenia and 11,527 patients with epilepsy who were diagnosed during the nine-year period. These groups of patients were compared to groups of the same sex and age who did not have either epilepsy or schizophrenia. The findings show that the incidence of epilepsy was 6.99 per 1,000 person-years in the schizophrenia patient group compared to 1.19 in the non-schizophrenia group. The incidence of schizophrenia was 3.53 per 1,000 person-years for patients with epilepsy compared to 0.46 in the non-epilepsy group. Previous studies had suggested an elevated prevalence of psychosis among epilepsy patients. BBC © 2011
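As a quick check of the arithmetic behind the "six times more likely" figure, the rate ratios follow directly from the incidence rates quoted in the study (a sketch; the variable names and grouping labels are mine, not the study's):

```python
# Incidence rates quoted in the study, per 1,000 person-years
epilepsy_incidence = {"schizophrenia group": 6.99, "comparison group": 1.19}
schizophrenia_incidence = {"epilepsy group": 3.53, "comparison group": 0.46}

# Rate ratio: how many times higher the incidence is in the exposed group
epilepsy_rr = (epilepsy_incidence["schizophrenia group"]
               / epilepsy_incidence["comparison group"])
schizophrenia_rr = (schizophrenia_incidence["epilepsy group"]
                    / schizophrenia_incidence["comparison group"])

print(round(epilepsy_rr, 1))       # 5.9, consistent with "six times more likely"
print(round(schizophrenia_rr, 1))  # 7.7
```

Note that the reverse association (schizophrenia among epilepsy patients) implied by these raw rates is, if anything, stronger than the headline figure, which concerns epilepsy among schizophrenia patients.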

Keyword: Schizophrenia; Epilepsy
Link ID: 15815 - Posted: 09.19.2011

By AMY HARMON MONTCLAIR, N.J. — For weeks, Justin Canha, a high school student with autism, a love of cartoons and a gift for drawing, had rehearsed for the job interview at a local animation studio. As planned, he arrived that morning with a portfolio of his comic strips and charcoal sketches, some of which were sold through a Chelsea gallery. Kate Stanton-Paule, the teacher who had set up the meeting, accompanied him. But his first words upon entering the office were, like most things involving Justin, not in the script. “Hello, everybody,” he announced, loud enough to be heard behind the company president’s door. “This is going to be my new job, and you are going to be my new friends.” As the employees exchanged nervous glances that morning in January 2010, Ms. Stanton-Paule, the coordinator of a new kind of “transition to adulthood” program for special education students at Montclair High School, wondered if they were all in over their heads. Justin, who barely spoke until he was 10, falls roughly in the middle of the spectrum of social impairments that characterize autism, which affects nearly one in 100 American children. He talks to himself in public, has had occasional angry outbursts, avoids eye contact and rarely deviates from his favorite subject, animation. His unabashed expression of emotion and quirky sense of humor endear him to teachers, therapists and relatives. Yet at 20, he had never made a true friend. © 2011 The New York Times Company

Keyword: Autism
Link ID: 15814 - Posted: 09.19.2011

by Carl Zimmer For 100 million people around the globe who suffer from macular degeneration and other diseases of the retina, life is a steady march from light into darkness. The intricate layers of neurons at the backs of their eyes gradually degrade and lose the ability to snatch photons and translate them into electric signals that are sent to the brain. Vision steadily blurs or narrows, and for some, the world fades to black. Until recently some types of retinal degeneration seemed as inevitable as the wrinkling of skin or the graying of hair—only far more terrifying and debilitating. But recent studies offer hope that eventually the darkness may be lifted. Some scientists are trying to inject signaling molecules into the eye to stimulate light-collecting photoreceptor cells to regrow. Others want to deliver working copies of broken genes into retinal cells, restoring their function. And a number of researchers are taking a fundamentally different, technology-driven approach to fighting blindness. They seek not to fix biology but to replace it, by plugging cameras into people’s eyes. Scientists have been trying to build visual prostheses since the 1970s. This past spring the effort reached a crucial milestone, when European regulators approved the first commercially available bionic eye. The Argus II, a device made by Second Sight, a company in California, includes a video camera housed in a special pair of glasses. It wirelessly transmits signals from the camera to a 6 pixel by 10 pixel grid of electrodes attached to the back of a subject’s eye. The electrodes stimulate the neurons in the retina, which send secondary signals down the optic nerve to the brain. © 2011, Kalmbach Publishing Co.

Keyword: Vision; Robotics
Link ID: 15813 - Posted: 09.17.2011

by Ferris Jabr Slimy and often sluggish they may be, but some molluscs deserve credit for their brains – which, it now appears, they managed to evolve independently, four times. The mollusc family includes the most intelligent invertebrates on the planet: octopuses, squid and cuttlefish. Now, the latest and most sophisticated genetic analysis of their evolutionary history overturns our previous understanding of how they got so brainy. The new findings expand a growing body of evidence that in very different groups of animals – molluscs and mammals, for instance – central nervous systems evolved not once, but several times, in parallel. Kevin Kocot of Auburn University, Alabama, and his colleagues are responsible for the new evolutionary history of the mollusc family, which includes 100,000 living species in eight lineages. They analysed genetic sequences common to all molluscs and looked for differences that have accumulated over time: the more a shared sequence differs between two species, the less related they are. The findings, which rely on advanced statistical analyses, fundamentally rearrange branches on the mollusc family tree. In the traditional tree, snails and slugs (gastropods) are most closely related to octopuses, squid, cuttlefish and nautiluses (cephalopods), which appears to make sense in terms of their nervous systems: both groups have highly centralised nervous systems compared with other molluscs and invertebrates. Snails and slugs have clusters of ganglia – bundles of nerve cells – which, in many species, are fused into a single organ; cephalopods have highly developed central nervous systems that enable them to navigate a maze, use tools, mimic other species, learn from each other and solve complex problems. © Copyright Reed Business Information Ltd.

Keyword: Evolution; Intelligence
Link ID: 15812 - Posted: 09.17.2011

Sandrine Ceurstemont, editor, New Scientist TV It looks like a house falling apart at the seams. But although the construction seems to have hinges that open and close, it actually has no bending or moving parts. Keep watching the video and its true form is revealed. The brain trick, reproduced in this video by illusion enthusiast Rex Young, was originally created by magician Jerry Andrus. We've previously shown you two similar illusions that exploit the same effect - can you remember what they were? The first person to post the two correct hyperlinks in the Comments section below will win a New Scientist goodie bag. You can also make your own collapsing house illusion by using this template. © Copyright Reed Business Information Ltd.

Keyword: Vision
Link ID: 15811 - Posted: 09.17.2011

by Michael Marshall People will do almost anything if they think it will help them cheat death. The futurist Ray Kurzweil has utterly transformed his lifestyle in a bid to live until 2050, by when he thinks technology will allow his consciousness to be uploaded into a computer, making him immortal. His anti-ageing regimen is based on established research that has identified ways to slow the process. Cutting your intake of calories and getting plenty of exercise both seem to help. One of Kurzweil's ploys is to get lots of sleep too. In this, he is unwittingly emulating the Djungarian hamster. These rodents use short hibernatory naps to reverse the ageing process. Djungarian hamsters suffer from a Chaucerian degree of uncertainty over how to spell their name. Because the word has been transliterated from Mongolian, they can be called "Djungarian", "Dzungarian" or "Dzhungarian", not to mention "Siberian" and "Russian winter white dwarf". Popular as pets, they're only distantly related to the golden hamsters most commonly kept. Each Djungarian hamster lives alone in an underground burrow. When conditions are good it seeks out seeds and insects, which it brings back to the nest in its cheek pouches. It only meets other hamsters to mate. © Copyright Reed Business Information Ltd.

Keyword: Biological Rhythms
Link ID: 15810 - Posted: 09.17.2011

Mo Costandi Research showing that action video games have a beneficial effect on cognitive function is seriously flawed, according to a review published this week in Frontiers in Psychology. Numerous studies published over the past decade have found that training on fast-paced video games such as Medal of Honor and Grand Theft Auto that require a wide focus and quick responses has broad 'transfer effects' that enhance other cognitive functions, such as visual attention. Some of the studies have been highly cited and widely publicized: one, by cognitive scientists Daphne Bavelier and Shawn Green of the University of Rochester in New York, published in Nature in 2003, has been cited more than 650 times, and was widely reported by the media as showing that video games boost visual skills. But, say the authors of the review, that paper and the vast majority of other such studies contain basic methodological flaws and do not meet the gold standard of a properly conducted clinical trial. "Our main focus was recent work specifically examining the effects of modern action games on college-aged participants," says Walter Boot, a psychologist at Florida State University in Tallahassee, and lead author of the review. "To our knowledge, we've captured all of these papers in our review, and all of the literature suffers from the limitations we discuss." © 2011 Nature Publishing Group,

Keyword: Learning & Memory; Intelligence
Link ID: 15809 - Posted: 09.17.2011

By Bruce Bower For an instant identity crisis, just peruse some photographs of a stranger’s face. In many instances, people view different mug shots of an unfamiliar person as entirely different individuals, say psychologist Rob Jenkins of the University of Glasgow, Scotland, and his colleagues. Yet photos of a celebrity or other recognizable person retain a uniform identity despite changes in lighting, facial expression and other factors across images, the scientists report in a paper published online September 3 in Cognition. To better understand issues such as eyewitness memory, and with an eye on creating reliable facial-recognition software, psychologists, vision researchers and computer scientists are studying how people recognize faces of individuals they’ve just seen and faces of those they’ve encountered over many years. These studies typically examine whether volunteers recognize an image of a person’s face and distinguish it from individual shots of other faces. Variability in photos of the same face has gone largely unexplored, but the issue could pose problems, researchers say. “A complete theory of face recognition should explain not only how we tell people apart, but also how we tell people together,” Jenkins’ team concludes. A strong tendency to see different people in different images of the same face raises questions about whether passports and other photo IDs provide reliable proof of identity, the researchers contend. © Society for Science & the Public 2000 - 2011

Keyword: Attention
Link ID: 15808 - Posted: 09.17.2011

By PAGAN KENNEDY “Fingers!” Gerwin Schalk sputtered, waving his hands around in the air. “Fingers are made to pick up a hammer.” He prodded the table, mimicking the way we poke at computer keyboards. “It’s totally ridiculous,” he said. I was visiting Schalk, a 40-year-old computer engineer, at his bunkerlike office in the Wads­worth Center, a public-health lab outside Albany that handles many of New York State’s rabies tests. It so happens that his lab is also pioneering a new way to control our computers — with thoughts instead of fingers. Schalk studies people at the Albany Medical Center who have become, not by choice, some of the world’s first cyborgs. One volunteer was a young man in his 20s who suffers from a severe form of epilepsy. He had been outfitted with a temporary device, a postcard-­size patch of electrodes that sits on the brain’s cortex, known as an electrocorticographic (ECoG) implant. Surgeons use these implants to home in on the damaged tissue that causes seizures. Schalk took advantage of the implant to see if the patient could control the actions in a video game called Galaga using only his thoughts. In the videotape of this experiment, you see a young man wearing a turban of bandages with wires running from his head to a computer in a cart. “Pew, pew,” the ship on the computer screen whines, as it decimates buglike creatures. The patient flicks the spaceship back and forth by imagining that he is moving his tongue. This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command. © 2011 The New York Times Company

Keyword: Robotics
Link ID: 15807 - Posted: 09.17.2011

by Bob Holmes THE two men in the hospital ward had both hit their heads in car accidents, but that was where the similarities ended. One would spend weeks unconscious in critical care, near to death. The other had only a mild concussion; he never lost consciousness, but somehow didn't feel quite right. Yet months later their roles were reversed. "The one with the severe injury is almost back to normal function," says Douglas Smith, director of the Center for Brain Injury Repair at the University of Pennsylvania in Philadelphia, "and the one with concussion can't go back to work, probably ever." Smith's two patients illustrate one of the frustrating paradoxes of head injuries: even seemingly mild impacts can have devastating long-term consequences. And we have no way of predicting who will fully recover and who will have lingering problems. Concussion, or mild traumatic brain injury as doctors call it, has long been seen as a benign and temporary affliction. But over the past decade there has been growing realisation that longer-term symptoms can affect between 10 and 15 per cent of those diagnosed with it. These range from fuzzy thinking and memory lapses to, for the most unfortunate, serious neurological conditions such as premature Alzheimer's disease. In fact, concussion is thought to be the single biggest environmental cause of Alzheimer's. Even a mild impact can double the risk of developing early dementia, according to a massive study of older military veterans in the US, which was presented at the Alzheimer's Association International Conference in July (bit.ly/pDhlHJ). In that, the risk jumped from 7 to 15 per cent. © Copyright Reed Business Information Ltd.

Keyword: Brain Injury/Concussion
Link ID: 15806 - Posted: 09.15.2011

by Celeste Biever HOW intelligent are you? I'd like to think I know how smart I am, but the test in front of me is making me reconsider. On my computer screen, a puzzling row of boxes appears: some contain odd-looking symbols, while others are empty. I click on one of the boxes. A red sign indicates I made an error. Dammit. I concentrate, and try again. Yes, a green reward! Despite this small success, I am finding it tough to make sense of what's going on: this is unlike any exam I've ever done. Perhaps it's not surprising that it feels unfamiliar - it's not your average IQ test. I am taking part in the early stages of an effort to develop the first "universal" intelligence test. While traditional IQ and psychometric tests are designed to home in on differences between people, a universal test would rank humans, robots, chimps and perhaps even aliens on a single scale - using a mathematically derived definition of intelligence, rather than one tainted by human bias. What's the point? The idea for a universal test has emerged from the study of artificial intelligence and a desire for better ways to measure it. Next year, the most famous test for gauging the smarts of machines will be widely celebrated on the 100th anniversary of the birth of Alan Turing, its creator. The Turing test is, however, flawed. To pass it, a machine has to fool a human judge into believing he or she is conversing with another person. But exactly how much smarter are you than the cleverest robot? The test cannot tell you. It also cannot measure intelligence greater than a human's. Machines are getting smarter - possibly smarter than us, soon - so we need a much better way to gauge just how clever they are. © Copyright Reed Business Information Ltd.

Keyword: Intelligence; Evolution
Link ID: 15805 - Posted: 09.15.2011

By Laura Sanders Researchers have found a new way to make a temporary chink in the brain’s armor, opening the door for treatments to get in. The results of a rodent study, published September 14 in the Journal of Neuroscience, may highlight a method to sneak therapies for diseases such as Alzheimer’s, HIV and cancer past the blood-brain barrier. “It’s a very interesting development,” says Celia Brosnan of the Albert Einstein College of Medicine in the Bronx, N.Y. “It has considerable potential, but obviously there’s a lot that needs to be followed up on.” Many potentially useful compounds exist to treat neurological diseases, notes study coauthor Margaret Bynoe of Cornell University College of Veterinary Medicine in Ithaca, N.Y. “The problem is getting them into the brain,” she says, where walls of specialized cells line blood vessels and keep harmful substances out. Since this barrier also impedes drugs, researchers have been eager to find a safe and effective way to get past it. In the new study, Bynoe and colleagues targeted molecules that sit on the surface of blood-brain barrier cells. Using a compound called NECA to activate these molecules, called adenosine receptors, in mice caused the barrier to open for up to 18 hours, the team found. After this activation, bulky sugar molecules introduced by the researchers, along with giant antibodies engineered to scoop up the Alzheimer’s-related protein amyloid-beta, crossed from the blood into the brain. © Society for Science & the Public 2000 - 2011

Keyword: Drug Abuse
Link ID: 15804 - Posted: 09.15.2011

By JAMES GORMAN Laughter is regularly promoted as a source of health and well being, but it has been hard to pin down exactly why laughing until it hurts feels so good. The answer, reports Robin Dunbar, an evolutionary psychologist at Oxford, is not the intellectual pleasure of cerebral humor, but the physical act of laughing. The simple muscular exertions involved in producing the familiar ha, ha, ha, he said, trigger an increase in endorphins, the brain chemicals known for their feel-good effect. His results build on a long history of scientific attempts to understand a deceptively simple and universal behavior. “Laughter is very weird stuff, actually,” Dr. Dunbar said. “That’s why we got interested in it.” And the findings fit well with a growing sense that laughter contributes to group bonding and may have been important in the evolution of highly social humans. Social laughter, Dr. Dunbar suggests, relaxed and contagious, is “grooming at a distance,” an activity that fosters closeness in a group the way one-on-one grooming, patting and delousing promote and maintain bonds between individual primates of all sorts. In five sets of studies in the laboratory and one field study at comedy performances, Dr. Dunbar and colleagues tested resistance to pain both before and after bouts of social laughter. The pain came from a freezing wine sleeve slipped over a forearm, an ever-tightening blood pressure cuff or an excruciating ski exercise. The findings, published in the Proceedings of the Royal Society B: Biological Sciences, eliminated the possibility that the pain resistance measured was the result of a general sense of well being rather than actual laughter. And, Dr. Dunbar said, they also provided a partial answer to the ageless conundrum of whether we laugh because we feel giddy or feel giddy because we laugh. © 2011 The New York Times Company

Keyword: Pain & Touch; Emotions
Link ID: 15803 - Posted: 09.15.2011

By Nicholas Luther A study by a team of UC Berkeley neuroscientists has uncovered new cerebral mechanisms behind tinnitus, a currently incurable condition that produces a constant ringing or buzzing sound in the ear in the absence of other noise. The study, published Sept. 6 in the journal Proceedings of the National Academy of Sciences, provides an alternative hypothesis to the prevailing view that tinnitus is associated with neurons in the region of the cortex that is affected by hearing loss. The research, which was conducted by the UC Berkeley Helen Wills Neuroscience Institute, shows that the higher-frequency neurons in the area of the cortex affected by hearing loss are responsible for the high-pitched noise characteristic of tinnitus. According to Shaowen Bao, co-author of the study and adjunct assistant professor of neuroscience at UC Berkeley, this study is the first to attribute tinnitus to the sensory-deprived region — which is affected by hearing loss — of the cortex. “Researchers have come up with the prevailing theory based on numerous studies and findings,” he said. “However, these findings are somewhat inconsistent. We hope we can get a more coherent idea of what’s happening.” According to the American Tinnitus Association, tinnitus affects more than 50 million Americans with varying degrees of severity ranging from barely noticeable to debilitating. Although scientists have yet to identify a cure for the condition, popular treatment options include masking the tinnitus noise with music, which provides temporary relief, and training the neurons in the auditory cortex to enhance their response to lost frequencies, which helps to gradually reconnect the ear to the sensory-deprived neurons. © Copyright 2011 The Daily Californian

Keyword: Hearing
Link ID: 15802 - Posted: 09.15.2011