Most Recent Links
By Nicholas Bakalar
Short-term psychotherapy may be an effective way to prevent repeated suicide attempts. Using detailed Danish government health records, researchers studied 5,678 people who had attempted suicide and then received a program of short-term psychotherapy tailored to each patient’s needs, including crisis intervention, cognitive therapy, behavioral therapy, and psychodynamic and psychoanalytic treatment. They compared them with 17,034 people who had attempted suicide but received standard care, including admission to a hospital, referral for treatment or discharge with no referral. The researchers matched the groups on more than 30 genetic, health, behavioral and socioeconomic characteristics. The study is online in Lancet Psychiatry. Treatment focused on suicide prevention and comprised eight to 10 weeks of individual sessions. Over a 20-year follow-up, 16.5 percent of the treated group attempted suicide again, compared with 19.1 percent of the untreated group. In the treated group, 1.6 percent died by suicide, compared with 2.2 percent of the untreated. “Suicide is a rare event,” said the lead author, Annette Erlangsen, an associate professor at the Johns Hopkins Bloomberg School of Public Health, “and you need a huge sample to study it. We had that, and we were able to find a significant effect.” The authors estimate that therapy prevented 145 suicide attempts and 30 deaths by suicide in the group studied. © 2014 The New York Times Company
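A rough check on those estimates, assuming the reported percentages apply uniformly to the treated cohort of 5,678:
(19.1% − 16.5%) × 5,678 ≈ 148 repeat attempts averted
(2.2% − 1.6%) × 5,678 ≈ 34 deaths averted
Both figures are close to the authors’ estimates of 145 and 30.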
Keyword: Depression
Link ID: 20379 - Posted: 12.02.2014
By Sarah C. P. Williams
Craving a stiff drink after the holiday weekend? Your desire to consume alcohol, as well as your body’s ability to break down the ethanol that makes you tipsy, dates back about 10 million years, researchers have discovered. The new finding not only helps shed light on the behavior of our primate ancestors, but also might explain why alcoholism—or even the craving for a single drink—exists in the first place. “The fact that they could put together all this evolutionary history was really fascinating,” says Brenda Benefit, an anthropologist at New Mexico State University, Las Cruces, who was not involved in the study. Scientists knew that the human ability to metabolize ethanol—allowing people to consume moderate amounts of alcohol without getting sick—relies on a set of proteins including the alcohol dehydrogenase enzyme ADH4. Although all primates have ADH4, which performs the crucial first step in breaking down ethanol, not all can metabolize alcohol; lemurs and baboons, for instance, have a version of ADH4 that’s less effective than the human one. Researchers didn’t know how long ago people evolved the more active form of the enzyme. Some scientists suspected it didn’t arise until humans started fermenting foods about 9000 years ago. Matthew Carrigan, a biologist at Santa Fe College in Gainesville, Florida, and colleagues sequenced ADH4 proteins from 19 modern primates and then worked backward to determine the sequence of the protein at different points in primate history. Then they created copies of the ancient proteins coded for by the different gene versions to test how efficiently each metabolized ethanol. They showed that the most ancient forms of ADH4—found in primates as far back as 50 million years ago—only broke down small amounts of ethanol very slowly. But about 10 million years ago, the team reports online today in the Proceedings of the National Academy of Sciences, a common ancestor of humans, chimpanzees, and gorillas evolved a version of the protein that was 40 times more efficient at ethanol metabolism. © 2014 American Association for the Advancement of Science.
Keyword: Drug Abuse; Evolution
Link ID: 20378 - Posted: 12.02.2014
By Nicholas Bakalar
Researchers have found that people diagnosed with diabetes in their 50s are significantly more likely than others to suffer mental decline by their 70s. The study, published Monday in the Annals of Internal Medicine, started in 1990. Scientists examined 13,351 black and white adults, aged 48 to 67, for diabetes and prediabetes using self-reported physician diagnoses and glucose control tests. They also administered widely used tests of memory, reasoning, problem solving and planning. About 13 percent had diabetes at the start. The researchers followed them with five periodic examinations over the following 20 years. By the end, 5,987 participants were still enrolled. After adjusting for numerous health and behavioral factors, and for the large attrition in the study, the researchers found that people with diabetes suffered a 30 percent larger decline in mental acuity than those without the disease. Diabetes can impair blood circulation, and the authors suggest that the association of diabetes with thinking and memory problems may be the result of damage to small blood vessels in the brain. “People may think cognitive decline with age is inevitable, but it’s not,” said the senior author, Elizabeth Selvin, an associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health. “Factors like diabetes are potentially modifiable. If we can better control diabetes, we can stave off cognitive decline and future dementia.” © 2014 The New York Times Company
Keyword: Obesity; Learning & Memory
Link ID: 20377 - Posted: 12.02.2014
By EUGENIA BONE
I TRIED magic mushrooms out of curiosity and in middle age. I’d been on the amateur mycological circuit for a couple of years, but hallucinogenic species were rarely mentioned at the foraging expeditions and conferences I attended. It’s almost as if they were the black sheep of mycology: embarrassing to serious taxonomy jocks. I read some books on the subject, but most were tripper’s guides that didn’t utilize, um, specific language or current science. Psychoactive mushrooms had been in a kind of scientific ghetto ever since they were criminalized in 1968. But now the drug derived from the mushroom, psilocybin, is finally being re-examined for its medical applications. A study published last month in the Journal of the Royal Society Interface compared M.R.I.s of the brains of subjects injected with psilocybin with scans of their normal brain activity. The brains on psilocybin showed radically different connectivity patterns between cortical regions (the parts thought to play an important role in consciousness). The researchers mapped out these connections, revealing the activity of new neural networks between otherwise disconnected brain regions. The researchers suspect that these unusual connections may be responsible for the synesthetic experience trippers describe, of hearing colors, for example, and seeing sounds. The part of the brain that processes sound may be connecting to the part of the brain that processes sight. The study’s leader, Paul Expert at King’s College London, told me that his team doubted that this psilocybin-induced connectivity lasted. They think they are seeing a temporary modification of the subject’s brain function. © 2014 The New York Times Company
Keyword: Depression; Drug Abuse
Link ID: 20376 - Posted: 12.01.2014
by Andy Coghlan
What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils (see image) are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory; Glia
Link ID: 20375 - Posted: 12.01.2014
By CATHERINE SAINT LOUIS
Nearly 55 percent of infants nationwide are put to bed with soft blankets or covered by a comforter, even though such bedding raises the chances of suffocation or sudden infant death syndrome, federal researchers reported Monday. Their study, published in the journal Pediatrics, is the first to estimate how many infants sleep with potentially hazardous quilts, bean bags, blankets or pillows. Despite recommendations to avoid putting anything but a baby in a crib, two-thirds of black and Latino parents still use bedding that is both unnecessary and unsafe, the study also found. “I was startled a little bit by the number of people still using bedding in the sleep area,” said Dr. Michael Goodstein, a neonatologist in York, Pa., who serves on a task force on sleep-related infant deaths at the American Academy of Pediatrics. “Sleeping face down on soft bedding increases the risks of SIDS 21-fold.” Among the risk factors for SIDS, “bedding has fallen through the cracks,” said Dr. Thomas G. Keens, the chairman of the California SIDS Advisory Council. “This article is a wake-up call.” The new analysis looked at data gathered from 1993 to 2010 in the National Infant Sleep Position Study, which surveyed a random sample of nearly 19,000 parents by telephone. Use of infant bedding declined roughly 23 percent annually from 1993 to 2000. In recent years, however, the declines have slowed or stalled entirely. From 2001 to 2010, use of inappropriate bedding for white and Hispanic infants declined just 5 to 7 percent annually. There was no decline in the use of such bedding for black infants. Parents in the new study were not asked their reasons for using bedding. Previous research has found that they worry infants will be cold, or that the crib mattress is too hard. © 2014 The New York Times Company
Keyword: Sleep; Development of the Brain
Link ID: 20374 - Posted: 12.01.2014
Some teenagers appear to show changes in their brains after one season of playing American football, a small study suggests. Even though players were not concussed during the season, researchers found abnormalities similar to the effects of mild traumatic brain injury. Twenty-four players aged between 16 and 18 were studied and devices on their helmets measured head impacts. The study was presented to the Radiological Society of North America. In recent years, a number of reports have expressed concern about the potential effects on young, developing brains of playing contact sports. These studies have tended to focus on brain changes as a result of concussion. But this study focused on the effects of head impacts on the brain, even when players did not suffer concussion at any point during the season. Using detailed scans of the players' brains before the season began and then again after it ended, the researchers were able to identify slight changes to the white matter of the brain. White matter contains millions of nerve fibres which act as communication cables between the brain's regions. Those players who were hit harder and hit more often were more likely to show these changes in post-season brain scans. Dr Alex Powers, co-author and paediatric neurosurgeon at Wake Forest Baptist Medical Centre in North Carolina, said the changes were a direct result of the hits received by the young players during their football season. BBC © 2014
Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 20373 - Posted: 12.01.2014
By BILL PENNINGTON
It happens dozens of times in every N.F.L. game. There is a fierce collision, or perhaps a running back is slammed to the ground. Most of the time, all the players rise to their feet uneventfully. Other times, as the pileup unravels, a player gets up slowly. His gait may be unsteady. For decades in the N.F.L., the operative term for the situation was that someone “got dinged.” It was a cute, almost harmless-sounding description of what was often a concussion or a worrying subconcussive blow to the head. But with the N.F.L. agreeing to pay hundreds of millions of dollars to settle a lawsuit brought by about 5,000 former players who said the league hid from them the dangers of repeated hits to the head, a backpedaling league has corrected its lingo and hastily amended its methodology. The N.F.L. now has a concussion management protocol, outlined in an inches-thick document that commands teams to institute a specific, detailed game-day and postconcussion course of action. Once, the treatment of players with head injuries varied from team to team and could be haphazard. Beginning last season, all players suspected of having a head injury — should they lose consciousness from a collision or experience symptoms like a headache, dizziness or disorientation — were required to go through the concussion protocol system. It features a broad cast: a head-injury spotter in the press box, athletic trainers on the bench, doctors and neuro-trauma specialists on the sideline and experts in neuro-cognitive testing in the locker room. The system is far from foolproof — players with serious symptoms remain in games. But as the N.F.L. grapples with a sobering threat to the welfare of its work force, not to mention a public-relations nightmare, the new concussion protocol is meant to establish a systemic, itemized policy on how potential brain injuries should be handled. © 2014 The New York Times Company
Keyword: Brain Injury/Concussion
Link ID: 20372 - Posted: 12.01.2014
By John Edward Terrell
We will certainly hear it said many times between now and the 2016 elections that the country’s two main political parties have “fundamental philosophical differences.” But what exactly does that mean? At least part of the schism between Republicans and Democrats is rooted in differing conceptions of the role of the individual. We find these differences expressed in the frequent heated arguments about crucial issues like health care and immigration. In a broad sense, Democrats, particularly the more liberal among them, are more likely to embrace the communal nature of individual lives and to strive for policies that emphasize that understanding. Republicans, especially libertarians and Tea Party members on the ideological fringe, however, often trace their ideas about freedom and liberty back to Enlightenment thinkers of the 17th and 18th centuries, who argued that the individual is the true measure of human value, and each of us is naturally entitled to act in our own best interests free of interference by others. Self-described libertarians generally also pride themselves on their high valuation of logic and reasoning over emotion. Philosophers from Aristotle to Hegel have emphasized that human beings are essentially social creatures, that the idea of an isolated individual is a misleading abstraction. So it is not just ironic but instructive that modern evolutionary research, anthropology, cognitive psychology and neuroscience have come down on the side of the philosophers who have argued that the basic unit of human social life is not and never has been the selfish, self-serving individual. Contrary to libertarian and Tea Party rhetoric, evolution has made us a powerfully social species, so much so that the essential precondition of human survival is and always has been the individual plus his or her relationships with others. © 2014 The New York Times Company
Keyword: Evolution
Link ID: 20371 - Posted: 12.01.2014
By Anna North
What is depression? Anyone who has dealt with the condition knows what it can feel like — but what causes it, what sustains it, and what’s the best way to make it subside? Despite the prevalence of the disorder — in one Centers for Disease Control and Prevention study, 9.1 percent of adults met the criteria for depression — experts haven’t fully answered these questions. And to answer them fully, some say, we need entirely new ways of thinking about depression. For Turhan Canli, a professor of integrative neuroscience at Stony Brook University, that means looking at the possibility that depression could be caused by an infection. “I’ve always been struck by the fact that the treatment options did not seem to have dramatically improved over the course of decades,” Dr. Canli told Op-Talk. “I always had a feeling that somehow we seem to be missing the actual treatment of the disease.” He was intrigued by research showing a connection between depression and inflammation in the body, and he started to think about the known causes of inflammation — among them pathogens like bacteria, viruses and parasites. In a paper published in the journal Biology of Mood and Anxiety Disorders, he lays out his case for rethinking depression as a response to infection. He notes that the symptoms of depression are similar to those of infection: “Patients experience loss of energy; they commonly have difficulty getting out of bed and lose interest in the world around them. Although our Western conceptualization puts affective symptoms front-and-center, non-Western patients who meet DSM criteria for major depression report primarily somatic symptoms.” © 2014 The New York Times Company
Keyword: Depression; Neuroimmunology
Link ID: 20370 - Posted: 11.29.2014
Daniel Freeman and Jason Freeman
“Although it is a waste of time to argue with a paranoid patient about his delusions, he may still be persuaded to keep them to himself, to repress them as far as possible and to forgo the aggressive action they might suggest, in general to conduct his life as if they did not exist.” This quote from Clinical Psychiatry, a hugely influential textbook in the 1950s and 1960s, epitomises the way in which unusual mental states were generally understood for much of the 20th century. Delusions (such as paranoid thoughts) and hallucinations (hearing voices, for example) were of interest purely as symptoms of psychosis, or what used to be called madness. Apart from their utility in diagnosis, they were deemed to be meaningless: the incomprehensible effusions of a diseased brain. Or in the jargon: “empty speech acts, whose informational content refers to neither world nor self”. There’s a certain irony here, of course, in experts supposedly dedicated to understanding the way the mind works dismissing certain thoughts as unworthy of attention or explanation. The medical response to these phenomena, which were considered to be an essentially biological problem, was to eradicate them with powerful antipsychotic drugs. This is not to say that other strategies weren’t attempted: in one revealing experiment in the 1970s, patients in a ward for “paranoid schizophrenics” in Vermont, US, were rewarded with tokens for avoiding “delusional talk”. These tokens could be exchanged for items including “meals, extra dessert, visits to the canteen, cigarettes, time off the ward, time in the TV and game room, time in bedroom between 8am and 9pm, visitors, books and magazines, recreation, dances on other wards.” (It didn’t work: most patients modified their behaviour temporarily, but “changes in a patient’s delusional system and general mental status could not be detected by a psychiatrist”.) © 2014 Guardian News and Media Limited
Keyword: Schizophrenia
Link ID: 20369 - Posted: 11.29.2014
By Jason G. Goldman
A sharp cry pierces the air. Soon a worried mother deer approaches the source of the sound, expecting to find her fawn. But the sound is coming from a speaker system, and the call isn't that of a baby deer at all. It's an infant fur seal's. Because deer and seals do not live in the same habitats, mother deer should not know how baby seal screams sound, reasoned biologists Susan Lingle of the University of Winnipeg and Tobias Riede of Midwestern University, who were running the acoustic experiment. So why did a mother deer react with concern? Over two summers, the researchers treated herds of mule deer and white-tailed deer on a Canadian farm to modified recordings of the cries of a wide variety of infant mammals—elands, marmots, bats, fur seals, sea lions, domestic cats, dogs and humans. By observing how mother deer responded, Lingle and Riede discovered that as long as the fundamental frequency was similar to that of their own infants' calls, those mothers approached the speaker as if they were looking for their offspring. Such a reaction suggests deep commonalities among the cries of most young mammals. (The mother deer did not show concern for white noise, birdcalls or coyote barks.) Lingle and Riede published their findings in October in the American Naturalist. Researchers had previously proposed that sounds made by different animals during similar experiences—when they were in pain, for example—would share acoustic traits. “As humans, we often ‘feel’ for the cry of young animals,” Lingle says. That empathy may arise because emotions are expressed in vocally similar ways among mammals. © 2014 Scientific American
Keyword: Language; Evolution
Link ID: 20368 - Posted: 11.29.2014
by Aviva Rutkin
THERE is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.
Keyword: Language; Development of the Brain
Link ID: 20367 - Posted: 11.29.2014
By Virginia Morell
When we listen to someone talking, we hear some sounds that combine to make words and other sounds that convey such things as the speaker’s emotions and gender. The left hemisphere of our brain manages the first task, while the right hemisphere specializes in the second. Dogs also have this kind of hemispheric bias when listening to the sounds of other dogs. But do they have it with human sounds? To find out, two scientists had dogs sit facing two speakers. The researchers then played a recorded short sentence—“Come on, then”—and watched which way the dogs turned. When the animals heard recordings in which individual words were strongly emphasized, they turned to the right—indicating that their left hemispheres were engaged. But when they listened to recordings that had exaggerated intonations, they turned to the left—a sign that the right hemisphere was responding. Thus, dogs seem to process the elements of speech very similarly to the way humans do, the scientists report online today in Current Biology. According to the researchers, the findings support the idea that our canine pals are indeed paying close attention not only to who we are and how we say things, but also to what we say. © 2014 American Association for the Advancement of Science.
Keyword: Animal Communication; Language
Link ID: 20366 - Posted: 11.29.2014
By Amy Ellis Nutt
[Photo caption: Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. Photo by Bill O'Leary/The Washington Post]
It's long been thought that kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research suggesting that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need -- more specifically, the natural light of outdoors -- and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
Keyword: Vision; Development of the Brain
Link ID: 20365 - Posted: 11.29.2014
By BENEDICT CAREY
Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
by Bethany Brookshire
We all experience stress, but some handle it better than others. A lot of research has focused on what makes animals and people susceptible to stress and how that, in turn, can trigger depression. It makes sense to study the condition, not the people who don’t experience it. Depression and susceptibility are the broken state. Resilience seems normal by comparison. But resilience is not just the absence of susceptibility. It turns out that a protein called beta-catenin plays an active role in resilience. A new study, from Eric Nestler’s laboratory at the Mount Sinai School of Medicine in New York City, also identifies a large number of new targets that could help scientists understand why some people are susceptible to stress — and how they might be made more resilient. “When people study stress responses, we often just assume that in an animal that’s stressed, there’s an active process that creates these depression-like behaviors,” says Andre Der-Avakian, a neuroscientist at the University of California, San Diego. “But this study and studies from others have shown that resilience is also an active process.” The nucleus accumbens is an area of the brain most often linked with reward and pleasure from items we enjoy, such as food or drugs. But the area also shows changes in people with depression. “It makes sense — here’s a region important in responding to rewards,” Nestler explains. “One of the symptoms of people with depression is that they don’t derive pleasure from things in life.” © Society for Science & the Public 2000 - 2014
Keyword: Stress; Depression
Link ID: 20363 - Posted: 11.26.2014
Ewen Callaway
Nerve cells that transmit pain, itch and other sensations to the brain have been made in the lab for the first time. Researchers say that the cells will be useful for developing new painkillers and anti-itch remedies, as well as understanding why some people experience unexplained extreme pain and itching. “The short take-home message would be ‘pain and itch in a dish’, and we think that’s very important,” says Kristin Baldwin, a stem-cell scientist at the Scripps Research Institute in La Jolla, California, whose team converted mouse and human cells called fibroblasts into neurons that detect sensations such as pain, itch or temperature. In a second paper, a separate team took a similar approach to making pain-sensing cells. Both efforts were published on 24 November in Nature Neuroscience. Peripheral sensory neurons, as these cells are called, produce specialized ‘receptor’ proteins that detect chemical and physical stimuli and convey them to the brain. The receptor that a cell makes determines its properties — some pain-sensing cells respond to chilli oil, for example, and others respond to different pain-causing chemicals. Mutations in the genes encoding these receptors can cause some people to experience chronic pain or, in rare cases, to become impervious to pain. To create these cells in the lab, independent teams led by Baldwin and by Clifford Woolf, a neuroscientist at Boston Children’s Hospital in Massachusetts, identified combinations of proteins that — when expressed in fibroblasts — transformed them into sensory neurons after several days. Baldwin's team identified neurons that make receptors that detect sensations including pain, itch, and temperature, whereas Woolf’s team looked only at pain-detecting cells. Both teams generated cells that resembled neurons in shape and fired in response to capsaicin, which gives chilli peppers their kick, and mustard oil. © 2014 Nature Publishing Group
Keyword: Pain & Touch
Link ID: 20362 - Posted: 11.26.2014
By Amy Ellis Nutt
In a novel use of video game playing, researchers at Ohio State have found that a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better that lasted at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
Keyword: Vision; Development of the Brain
Link ID: 20361 - Posted: 11.26.2014
By Piercarlo Valdesolo
Google “successful Thanksgiving” and you will get a lot of different recommendations. Most you’ve probably heard before: plan ahead, get help, follow certain recipes. But according to new research from Florida State University, enjoying your holiday also requires a key ingredient that few guests consider as they wait to dive face first into the turkey: a belief in free will. What does free will have to do with whether or not Aunt Sally leaves the table in a huff? These researchers argue that belief in free will is essential to experiencing the emotional state that makes Thanksgiving actually about giving thanks: gratitude. Previous research has shown that our level of gratitude for an act depends on three things: 1) the cost to the benefactor (in time, effort or money), 2) the value of the act to the beneficiary, and 3) the sincerity of the benefactor’s intentions. For example, last week my 4-year-old daughter gave me a drawing of our family. This act was costly (she spent time and effort), valuable (I love the way she draws herself bigger than everyone else in the family), and sincere (she drew it because she knew I would like it). But what if I thought that she drew it for a different reason? What if I thought that she was being coerced by my wife? Or if I thought that this was just an assignment at her pre-school? In other words, what if I thought she had no choice but to draw it? I wouldn’t have defiantly thrown it back in her face, but I surely would have felt differently about the sincerity of the action. It would have diminished my gratitude. © 2014 Scientific American
Keyword: Consciousness; Emotions
Link ID: 20360 - Posted: 11.26.2014