Most Recent Links




by Jacob Aron The mystery of how our brains perceive sound has deepened, now that musicians have smashed a limit on sound perception imposed by a famous algorithm. On the upside, this means it should be possible to improve on today's gold-standard methods for audio processing. Devised over 200 years ago, the Fourier transform is a mathematical process that splits a sound wave into its individual frequencies. It is the most common method for digitising analogue signals, and some had thought that brains make use of the same algorithm when turning the cacophony of noise around us into individual sounds and voices. To investigate, Jacob Oppenheim and Marcelo Magnasco of Rockefeller University in New York turned to the Gabor limit, a part of the Fourier transform's mathematics that makes the determination of pitch and timing a trade-off. Rather like the uncertainty principle of quantum mechanics, the Gabor limit states that you can't accurately determine a sound's frequency and its duration at the same time. The pair reasoned that if people's hearing obeyed the Gabor limit, this would be a sign that they were using the Fourier transform. But when 12 musicians, some instrumentalists, some conductors, took a series of tests, such as judging slight changes in the pitch and duration of sounds at the same time, they beat the limit by up to a factor of 13. © Copyright Reed Business Information Ltd.
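For reference (the article states the trade-off only qualitatively), the Gabor limit is conventionally written as a lower bound on the product of a signal's timing uncertainty and its frequency uncertainty:

\[ \Delta t \,\Delta f \;\ge\; \frac{1}{4\pi} \]

Beating the limit by up to a factor of 13 means the musicians resolved combinations of timing and pitch whose uncertainty product fell well below this bound, something a mechanism based strictly on the Fourier transform should not permit.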

Keyword: Hearing
Link ID: 17775 - Posted: 02.09.2013

By Breanna Draxler [Image caption: Brain differences between the 23 participants were quantified at each surface vertex; values below the global mean are shown in cool colors, values above it in warm colors. Image courtesy of Sophia Mueller et al.] Every person thinks and acts a little differently than the other 7 billion on the planet. Scientists now say that variations in brain connections account for much of this individuality, and they’ve narrowed it down to a few specific regions of the brain. This might help us better understand the evolution of the human brain as well as its development in individuals. Each human brain has a unique connectome—the network of neural pathways that ties all of its parts together. Like a fingerprint, every person’s connectome is unique. To find out where these individual connectomes differed the most, researchers used an MRI scanning technique to take cross-sectional pictures of 23 people’s brains at rest. Researchers found very little variation in the areas of the participants’ brains responsible for basic senses and motor skills. It’s a pretty straight shot from the finger to the part of the brain that registers touch, for example, or from the eye to the vision center. Thus we apparently all sense the world in more or less the same way. The real variety arose in the parts of the brain associated with personality, like the frontoparietal cortex. This multipurpose area in the brain curates sensory data into complex thoughts, feelings or actions and allows us to interpret the things we sense (e.g., we recognize a red, round object as an apple). Because there are many ways to get from sensation to reaction, and many different ways to react to what we sense, each individual’s brain blazes its own paths.

Keyword: Brain imaging; Emotions
Link ID: 17774 - Posted: 02.09.2013

By Sam McNerney and Txchnologist Why do humans see colors? For years the leading hypothesis was that color vision evolved to help us spot nutritious fruits and vegetation in the forest. But in 2006, evolutionary neurobiologist Mark Changizi and colleagues proposed that color vision evolved to perceive oxygenation and hemoglobin variations in skin in order to detect social cues, emotions and the states of our friends or enemies. Just think about the reddening and whitening of the face called blushing and blanching. These signal distinct physiological states that would be impossible to detect without color vision. A few years ago Changizi left Rensselaer Polytechnic Institute, where he was a professor, to co-found 2AI Labs with Dr. Tim Barber. Their Boise, Idaho-based research institute, funded via technology spin-offs from their work, aims to solve foundational problems in cognitive science and artificial intelligence. The move allowed Changizi to continue his academic work with more intellectual freedom and less reliance on grants. Last summer the team at 2AI developed three pairs of glasses called O2Amps based on Changizi’s color vision theory. By visually enhancing oxygenated blood and blood pooling, the lenses amplify the social cues that allow users to perceive emotions more clearly. The eyewear is being used for a number of innovative applications. The first is medical. The lenses enhance the vasculature beneath the skin, helping nurses identify veins; they also amplify trauma and bruising that might be invisible to the naked eye. Many hospitals are putting the O2Amps through trials and seeing positive results. The eyewear is also potentially useful for police and security officers (imagine if a TSA agent could more easily perceive nervousness) as well as poker players. © 2013 Scientific American

Keyword: Vision
Link ID: 17773 - Posted: 02.06.2013

By SARAH LYALL IPSWICH, England — Who knows what the worst moment was for Paul Mason — there were so many awful milestones, as he grew fatter and fatter — but a good bet might be when he became too vast to leave his room. To get him to the hospital for a hernia operation, the local fire department had to knock down a wall and extricate him with a forklift. That was nearly a decade ago, when Mr. Mason weighed about 980 pounds, and the spectacle made him the object of fascinated horror, a freak-show exhibit. The British news media, which likes a superlative, appointed him “the world’s fattest man.” Now the narrative has shifted to one of redemption and second chances. Since a gastric bypass operation in 2010, Mr. Mason, 52 years old and 6-foot-4, has lost nearly two-thirds of his body weight, putting him at about 336 pounds — still obese, but within the realm of plausibility. He is talking about starting a jewelry business. “My meals are a lot different now than they used to be,” Mr. Mason said during a recent interview in his one-story apartment in a cheerful public housing complex here. For one thing, he no longer eats around the clock. “Food is a necessity, but now I don’t let it control my life anymore,” he said. But the road to a new life is uphill and paved with sharp objects. When he answered the door, Mr. Mason did not walk; he glided in an electric wheelchair. And though Mr. Mason looks perfectly normal from the chest up, horrible vestiges of his past stick to him, literally, in the form of a huge mass of loose skin choking him like a straitjacket. Folds and folds of it encircle his torso and sit on his lap, like an unwanted package someone has set there; more folds encase his legs. All told, he reckons, the excess weighs more than 100 pounds. © 2013 The New York Times Company

Keyword: Obesity
Link ID: 17772 - Posted: 02.06.2013

By Samuel McNerney How much does environment influence intelligence? Several years ago University of Virginia professor Eric Turkheimer demonstrated that growing up in an impoverished and chaotic household suppresses I.Q. – without nurture, innate advantages vanish. What about genes? They matter too. After decades of research, most psychologists agree that somewhere between 50% and 80% of intelligence is genetic. After all, numerous studies demonstrate that identical twins raised apart have remarkably similar I.Q.’s. A 2008 paper out of the University of Michigan turned all of this on its head. The researchers, led by Susanne M. Jaeggi and Martin Buschkuehl, now at the University of Maryland, found that participants who engaged in short sessions of “cognitive training” that targeted working memory with a simple but difficult game known as the n-back task boosted a core feature of general intelligence called fluid intelligence. Crystallized intelligence improves with age and experience. Fluid intelligence, in contrast, is the capacity to gain insights, solve new problems and perceive new patterns in new situations, independent of previous knowledge. For decades researchers believed that fluid intelligence was immutable during adulthood because it was largely determined by genetics. The implication of the 2008 study suggested otherwise: with some cognitive training, people could improve fluid intelligence and, therefore, become smarter. This brings me to a brand-new paper recently published in the journal Neuroscience by Oshin Vartanian, a DRDC Toronto researcher and adjunct assistant professor of psychology at the University of Toronto-Scarborough. In the study, Vartanian and his team asked whether working memory training improved performance on a test of divergent thinking known as the Alternate Uses Task. Psychological research demonstrates that divergent thinking “loads” on working memory, meaning that when people engage in a divergent thinking task, their working memory capacity is accessed accordingly. © 2013 Scientific American

Keyword: Intelligence; Learning & Memory
Link ID: 17771 - Posted: 02.06.2013

By Gareth Cook Just about every dog owner is convinced their dog is a genius. For a long time, scientists did not take their pronouncements particularly seriously, but new research suggests that canines are indeed quite bright, and in some ways unique. Brian Hare, an associate professor in the Department of Evolutionary Anthropology and the Center for Cognitive Neuroscience at Duke University, is one of the leading figures in the quest to understand what dogs know. The founder of the Duke Canine Cognition Center, Hare has now written a book, “The Genius of Dogs,” with his wife, the journalist Vanessa Woods. Hare answered questions from Mind Matters editor Gareth Cook. Cook: What is the biggest misconception people have about the dog mind? Hare: That there are “smart” dogs and “dumb” dogs. There’s still this throwback to a uni-dimensional version of intelligence, as though there is only one type of intelligence that you either have more or less of. In reality there are different types of intelligence. Different dogs are good at different things. Unfortunately, the very clever strategies some dogs are using are not apparent without playing a cognitive game. This means people can often underestimate the intelligence of their best friend. The pug drooling on your shoe may not look like the brightest bulb in the box, but she comes from a long line of successful dogs and is a member of the most successful mammal species on the planet besides us. Rest assured – she is a genius. © 2013 Scientific American

Keyword: Intelligence; Evolution
Link ID: 17770 - Posted: 02.06.2013

By PAM BELLUCK People with mental illness are 70 percent more likely to smoke cigarettes than people without mental illness, two federal health agencies reported Tuesday. New data from the Centers for Disease Control and Prevention and the Substance Abuse and Mental Health Services Administration show that one of every three adults with mental illness smokes, compared with one in five adults without mental illness. Adults with mental illness smoke about a third of all the cigarettes in the United States, and they smoke more cigarettes per month and are significantly less likely to quit than people without mental illness, the report said. There are nearly 46 million adults with mental illness in the United States, about a fifth of the adult population. “Many people with mental illness are at greater risk of dying early from smoking than of dying from their mental health conditions,” said Dr. Thomas R. Frieden, director of the Centers for Disease Control, during a press briefing. The report is based on information from the National Survey on Drug Use and Health, which interviewed 138,000 adults in their homes from 2009 to 2011. People were asked 14 questions to assess psychological distress and disability, and were deemed to have mental illness if their responses indicated they had a mental, behavioral or emotional disorder in the past 12 months. Those with substance abuse or developmental disorders were not counted as having mental illness. The report did not include patients in psychiatric hospitals or individuals serving in the military. © 2013 The New York Times Company
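For reference, the report's two framings of the gap are consistent: a smoking rate of one in three versus one in five works out to

\[ \frac{1/3}{1/5} \;=\; \frac{5}{3} \;\approx\; 1.7, \]

that is, a rate roughly 70 percent higher among adults with mental illness.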

Keyword: Schizophrenia; Drug Abuse
Link ID: 17769 - Posted: 02.06.2013

By Tina Hesman Saey The common mole may be homely but its nose is a wonder to behold. The eastern American mole, also known as the common mole, tracks down an earthworm treat by recognizing the slightly different odor cues entering each nostril, neurobiologist Kenneth Catania of Vanderbilt University in Nashville reports online February 5 in Nature Communications. The finding suggests that even though mole nostrils are separated by a fraction of a centimeter, each gets its own scent information that can guide an animal’s actions. “It’s an elegant demonstration of what many people suspected,” says Peter Brunjes, a neuroscientist at the University of Virginia. Previous experiments with people and rats had reached contradictory conclusions regarding whether smell, like sight and hearing, is a bilateral sense. Catania never expected the common mole, Scalopus aquaticus, to have uncommon abilities. “I’ve described it as the unlucky, stupid cousin of the star-nosed mole,” he says. Star-nosed moles, Condylura cristata, have an incredible sense of touch in their tentacled schnozzes and are among the world’s fastest foragers. But compared with other mole species, the eastern American mole has a poor sense of touch. The animals also can’t see. Catania turned to common moles because he thought they would have a hard time finding food and could be tested against star-nosed moles in future experiments. But when he placed a common mole in a semicircular arena with a chopped up bit of earthworm as bait, he says, “it would wiggle its nose around and go in a beeline toward the food.” © Society for Science & the Public 2000 - 2013

Keyword: Chemical Senses (Smell & Taste)
Link ID: 17768 - Posted: 02.06.2013

By Nathan Seppa The link between obesity and vitamin D deficiency appears to be a one-way street. A large study of the genetics underpinning both conditions finds that obesity may drive down vitamin D levels, but a predisposition to the vitamin deficiency doesn’t lead to obesity. The findings also suggest that boosting vitamin D levels won’t reverse obesity. An association between the two has been observed for years, but determining cause and effect has been difficult. “I find this very plausible and a correct interpretation of the data,” says Robert Heaney, an endocrinologist at Creighton University in Omaha, Neb. “I think it’s worth reporting.” In the new study, researchers tapped into a huge international database, accessing the genetic profiles of more than 42,000 people. The scientists noted whether a person harbored any of 12 genetic variants associated with being overweight. Not surprisingly, people with these variants were more likely to be obese than those without them. People with these obesity-associated gene variants were also apt to have low vitamin D levels, Elina Hyppönen, an epidemiologist and nutritionist at University College London, and colleagues report online February 5 in PLOS Medicine. When the researchers tested for four genetic variants linked to low vitamin D levels, they found that people with the variants were not necessarily prone to obesity. The researchers checked both findings against a separate database of people and got similar results. © Society for Science & the Public 2000 - 2013

Keyword: Obesity
Link ID: 17767 - Posted: 02.06.2013

by Virginia Morell The male Eurasian jay is an accommodating fellow. When his mate has been feasting steadily on mealworm larvae, he realizes that she'd now prefer to dine on wax moth larvae, which he feeds her himself. The finding adds to a small but growing number of studies showing that some animals have something like the human ability to understand what others are thinking. "It's great for a first test of this ability in birds," says Thomas Bugnyar, a cognitive biologist at the University of Vienna in Austria who was not involved in the work. Scientists still debate whether even our closest ape relatives can attribute an unseen mental desire to another; some continue to argue that this is a peculiarly human talent. "But some of us think that some aspects of this ability should be found here and there in different species," Bugnyar says, "and so it is good to have this jay study to compare" with the other studies on primates, humans, and human children. Male Eurasian jays feed their mates during courtship displays, says Ljerka Ostojić, a comparative psychologist and postdoc at the University of Cambridge in the United Kingdom who led the study. Because of that behavior, Ostojić and her colleagues thought that the jays might be good subjects for testing whether these birds understand their mates' desires. The group's previous research had shown that Eurasian jays and scrub jays can plan for the future. "It is commonly thought that any action animals take is determined solely by whatever they want at that moment," Ostojić says, "but the jays also plan for needs in the future." © 2010 American Association for the Advancement of Science

Keyword: Intelligence; Evolution
Link ID: 17766 - Posted: 02.05.2013

By DOUGLAS QUENQUA Why are some people able to use cocaine without becoming addicted? A new study suggests the answer may lie in the shape of their brains. Sporadic cocaine users tend to have a larger frontal lobe, a region associated with self-control, while cocaine addicts are more likely to have smaller frontal lobes, according to the study, which was published in the journal Biological Psychiatry. The scientists, at the University of Cambridge, collected brain scans and personality tests from people who had used cocaine over several years — some addicted, some not. While the nonaddicts shared a penchant for risk-taking behavior, the increased gray matter seemed to help them resist addiction by allowing them to exert more self-control and make more advantageous decisions. “They could take it or leave it,” said Karen Ersche, the lead author. The researchers believe the differences in brain shape predated the drug use rather than occurring as a result of it. Dr. Ersche said the findings reinforced the idea, now popular among addiction experts, that addiction depends less on character and more on biological makeup. “It’s not the Nancy Reagan approach, just say no or one day or another you will get addicted,” she said. “How the drugs work and how much you are at risk depends on what type of person you are and what type of brain you have.” © 2013 The New York Times Company

Keyword: Drug Abuse
Link ID: 17765 - Posted: 02.05.2013

By Melody Yesterday, Alan Schwarz, reporting for the Sunday edition of The New York Times, published an alarmist piece on Adderall abuse. The story chronicles the short life of Richard Fee, a popular young pre-med who, after dabbling in fast-acting stimulants in college, faked his way into an ADHD diagnosis and, within months of filling his first prescription, began heavily abusing the drug, leading to severe addiction and psychosis, and ultimately to his suicide, two years ago, at the age of twenty-four. The story of Richard Fee is a tragic one, and one that highlights both the dangers of prescribing ADHD drugs to neurotypical adults and some of the problems endemic to psychiatric diagnosis. Regrettably, the reporter seems to believe that these problems are somehow specific to amphetamines, signaling “widespread failings in the system through which five million Americans take medication for ADHD”, and that Richard’s harrowing case, while undoubtedly rare, “underscores aspects of ADHD treatment that are mishandled every day with countless patients”. Schwarz is a Pulitzer Prize-nominated journalist, renowned for exposing the danger of concussive head injuries in football. More recently, he has cast that same critical eye on how attention-deficit disorder is diagnosed. The question is: to what end? Presumably, in the case of this story, to tighten the restrictions on how amphetamines are prescribed to adults, and to guard against the kind of negligence and lack of oversight that characterized Richard’s case. But there is a delicate balance to be struck here between serving the needs of the ADHD population, many of whom benefit tremendously from the regulated use of stimulants, and potential drug addicts, like Richard. It is also far from clear, given the nature of psychiatric nosology, that there are any surefire ways of stopping con artists and addicts from gaming the system. © 2013 Scientific American

Keyword: ADHD; Drug Abuse
Link ID: 17764 - Posted: 02.05.2013

By Laura Hambleton Winter often brings the flu, coughs, ski injuries and shoveling strains. Add to these ailments a more deadly one: heart attacks. A recent study has found that more fatal heart attacks and strokes occur during the winter than at other times of the year. And it doesn’t seem to matter whether the winter is occurring in the warmer climes of Southern California or the frostier ones of Boston. After sifting through about 1.7 million death certificates filed between 2005 and 2008, cardiologists Bryan Schwartz of the University of New Mexico and Robert A. Kloner of the Heart Institute at the Good Samaritan Hospital in Los Angeles found a 26 to 36 percent greater death rate for heart attacks in winter than in summer “despite different locations and climates,” Kloner says. The worst months are December, January, February and the beginning of March. The doctors analyzed the cause of death for people in Texas, Arizona, Georgia, Los Angeles, Washington state, Pennsylvania and Massachusetts. Of those who died of heart disease, the winter weather pattern was clear. In Los Angeles, for example, there were about 70 deaths per day from cardiac disease in winter, Schwartz said. “In the summer, L.A. had an average circulatory death rate of about . . . 55 deaths per day.” The research uncovered the same seasonal pattern in cardiac deaths across “seven different climate patterns,” according to the study, and “death rates at all sites clustered closely together and no one site was statistically different from any other site.” An abstract of the study was published in the American Heart Association journal Circulation. © 1996-2013 The Washington Post

Keyword: Biological Rhythms; Stroke
Link ID: 17763 - Posted: 02.05.2013

By Ian Chant Stress and neglect at home take an obvious toll on kids as they grow up. Many decades of research have documented the psychological consequences in adulthood, including struggles with depression and difficulties maintaining relationships. Now studies are finding that a troubled home life has profound effects on neural development. Kids' brains are exquisitely sensitive. Even sleeping infants are affected by family arguments, a new study concludes. Researchers at the University of Oregon showed with functional MRI scans that infants from families who reported more than the usual levels of conflict in the home were more sensitive to aggressive or angry voices. While asleep, these babies had an uptick in brain activity in response to sentences read in an angry tone of voice, with most of the activity clustered in the parts of the brain responsible for regulating emotions and stress. “Infants are constantly absorbing and learning things, not just when we think we're teaching them,” says Alice Graham, a doctoral student who led the study, forthcoming in the journal Psychological Science. “We should expect that what's going on in the environment is literally shaping the physical connections in their brains.” As with family fighting, neglect leaves no external marks but powerfully affects the architecture of the brain. A Yale University study of teenagers found evidence, using MRI scans, that neglect and emotional abuse during childhood reduce the density of cells in emotion-regulating regions of the brain later on. The teens in the study did not meet the criteria for full-blown psychiatric disorders, according to the paper published in 2011 in the Journal of the American Medical Association, yet many experienced emotional problems such as impulsive behavior and risk taking. © 2013 Scientific American

Keyword: Stress; Development of the Brain
Link ID: 17762 - Posted: 02.05.2013

Marissa Fessenden U.S. business and policy leaders have made it a priority to increase the number of students pursuing degrees in science, technology, engineering and math, collectively known as STEM. But one source of STEM talent is often overlooked: young people with autism spectrum disorders. A study published late last year in the Journal of Autism and Developmental Disorders found that students with autism choose majors in science, technology, engineering and math at higher rates than students in the general population. Yet students with autism enter college at far lower rates. The authors say the results highlight the need to encourage students with autism to pursue a post-secondary education and that doing so may strengthen participation in the STEM fields. The only previous study to directly examine the connection between autism spectrum disorders and STEM majors was limited to a single university in the U.K. That paper, co-authored by Simon Baron-Cohen, director of the Autism Research Center at the University of Cambridge, found a link between autism and mathematical talent. The new study, led by researchers at the independent research institute SRI International, based in Menlo Park, CA, examined 11,000 students across the country and found that more young adults with an autism spectrum disorder choose STEM majors than their peers in the general population (34.31 vs. 22.8 percent) as well as their peers in 10 other disability groups (which included visual disabilities, intellectual disabilities, speech and language impairment and others). Students with autism, however, were unlikely to enroll in college at all—their rate of enrollment was the third lowest of all disability categories. © 2013 Nature Publishing Group

Keyword: Autism
Link ID: 17761 - Posted: 02.05.2013

By C. CLAIBORNE RAY Q. Nearing 70, I have increasing difficulty hearing conversations, yet music in restaurants is too loud. Why? A. Age-related hearing loss, called presbycusis, is characterized by loss of hair cells in the base of the cochlea, or inner ear, that are attuned to capture and transmit high-frequency sounds, said Dr. Anil K. Lalwani, director of otology, neurotology and skull-base surgery at NewYork-Presbyterian Hospital/Columbia University Medical Center. Loss of high-frequency hearing leads to deterioration in the ability to distinguish words in conversation. Additionally, any noise in the environment leads to even greater loss in clarity of hearing. “Contrary to expectation, presbycusis is also associated with sensitivity to loud noises,” Dr. Lalwani said. “This is due to a poorly understood phenomenon called recruitment.” Normally, a specific sound frequency activates a specific population of hair cells located at a specific position within the cochlea. With hearing loss, this specificity is lost, and a much larger population of hair cells in the adjacent areas is “recruited” and also activated, producing sensitivity to noise. “Patients with presbycusis perceive an incremental increase in loudness to be much greater than those with normal hearing,” he said. “This explains why the elderly parent complains that ‘I am not deaf!’ ” when a son or daughter repeats a misheard sentence. © 2013 The New York Times Company

Keyword: Hearing; Development of the Brain
Link ID: 17760 - Posted: 02.05.2013

By James Gallagher Health and science reporter, BBC News A tiny "genetic patch" can be used to prevent a form of deafness which runs in families, according to animal tests. Patients with Usher syndrome have defective sections of their genetic code which cause problems with hearing, sight and balance. A study, published in the journal Nature Medicine, showed the same defects could be corrected in mice to restore some hearing. Experts said it was an "encouraging" start. There are many types of Usher syndrome tied to different errors in a patient's DNA - the blueprint for building every component of the body. One of those mutations runs in families descended from French settlers in North America. When they try to build a protein called harmonin, which is needed to form the tiny hairs in the ear that detect sound, they do not finish the job. It results in hearing loss at birth and has a similar effect in the eye, where it causes a gradual loss of vision. Scientists at the Rosalind Franklin University of Medicine and Science, in Chicago in the US, designed a small strip of genetic material that attaches to the mutation and keeps the body's factories building the protein. There has been something of a flurry of developments in restoring hearing in the past year. BBC © 2013

Keyword: Hearing; Genes & Behavior
Link ID: 17759 - Posted: 02.05.2013

By Erin Wayman The story of the Neandertals may need a new ending, a controversial study suggests. Using improved radiocarbon methods, scientists redated two of the youngest known Neandertal cave sites and concluded that they are at least 10,000 years older than previous studies have found. The findings cast doubt on the reliability of radiocarbon dates from other recent Neandertal sites, the researchers suggest online February 4 in the Proceedings of the National Academy of Sciences. This means the last Neandertals might have died out much earlier than previously thought, which could cause anthropologists to rethink how and why these hominids vanished. Researchers have long debated whether the harsh Ice Age climate, the appearance of modern humans migrating out of Africa, or some other factor drove Neandertals to extinction. “The paper is simply excellent,” says archaeologist Olaf Jöris of the Romano-Germanic Central Museum in Mainz, Germany. The new research supports Jöris’ own review of Neandertal dates, in which he concluded that the most-recent Neandertals probably lived around 42,000 years ago. The standard view suggests that the last of these hominids occupied Europe as recently as about 28,000 years ago. But other archaeologists are not convinced by the new work. “We shouldn’t get too carried away over results that amount to a few radiocarbon dates from two sites,” says Paul Pettitt, an archaeologist at Durham University in England. © Society for Science & the Public 2000 - 2013

Keyword: Evolution
Link ID: 17758 - Posted: 02.05.2013

By Melissa Dahl, NBC News Scarlet fever plays the villain in some of the best children's books: It got "Little Women's" Beth March. It got the child in "The Velveteen Rabbit" (although the kid survives, so, really, the fever got the stuffed rabbit). And it robbed Mary Ingalls, sweet sister of "Little House" series author Laura Ingalls Wilder, of her sight. Or so we were told. But today, the journal Pediatrics asserts that it wasn't scarlet fever that caused Mary's blindness -- it was viral meningoencephalitis, an inflammatory disease that attacks the brain. This is the sort of thing that is extremely interesting if you are interested in this sort of thing. And we'd wager many people are: The "Little House" books have remained in print ever since the initial publication of "Little House in the Big Woods" in 1932, and they're still popular today, with three titles landing on the School Library Journal's 2012 list of best children's chapter books. Even if you never read the books, you probably remember the TV series, which aired from 1974 to 1983. Dr. Beth Tarini, assistant professor of pediatrics at the University of Michigan, and her co-authors make their claim after scouring epidemiological data on blindness and infectious disease around the time of Mary's illness, plus analyzing local newspapers and Laura's unpublished memoir, "Pioneer Girl." For Tarini, it's the culmination of a project she began in medical school 10 years ago, after a confusing conversation with a professor. © 2013 NBCNews.com

Keyword: Vision
Link ID: 17757 - Posted: 02.05.2013

By Tia Ghose The identity of a mysterious patient who helped scientists pinpoint the brain region responsible for language has been discovered, researchers report. The finding, detailed in the January issue of the Journal of the History of the Neurosciences, identifies the patient as Louis Leborgne, a French craftsman who battled epilepsy his entire life. In 1840, a wordless patient was admitted to the Bicêtre Hospital outside Paris with aphasia, an inability to speak. He was essentially just kept there, slowly deteriorating. It wasn’t until 1861 that the man, who was known only as “Monsieur Leborgne” and who was nicknamed “Tan” for the only word he could say, came to physician Paul Broca’s ward at the hospital. Leborgne died shortly after the meeting, and Broca performed his autopsy, during which he found a lesion in a region of the brain tucked back and up behind the eyes. After doing a detailed examination, Broca concluded that Tan’s aphasia was caused by damage to this region and that this particular brain region controlled speech. That part of the brain was later named Broca’s area. At the time, scientists were debating whether different areas of the brain performed separate functions or whether the brain was an undifferentiated lump that did one task, like the liver, said Marjorie Lorch, a neurolinguist in London who was not involved in the study. © 1996-2013 The Washington Post

Keyword: Language; Stroke
Link ID: 17756 - Posted: 02.05.2013