Most Recent Links

Links 8581 - 8600 of 29326

Helen Thomson Genetic changes stemming from the trauma suffered by Holocaust survivors are capable of being passed on to their children, the clearest sign yet that one person’s life experience can affect subsequent generations. The conclusion from a research team at New York’s Mount Sinai hospital led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who had either been interned in a Nazi concentration camp, witnessed or experienced torture or who had had to hide during the second world war. They also analysed the genes of their children, who are known to have an increased likelihood of stress disorders, and compared the results with Jewish families who were living outside of Europe during the war. “The gene changes in the children could only be attributed to Holocaust exposure in the parents,” said Yehuda. Her team’s work is the clearest example in humans of the transmission of trauma to a child via what is called “epigenetic inheritance” - the idea that environmental influences such as smoking, diet and stress can affect the genes of your children and possibly even grandchildren. The idea is controversial, as scientific convention states that genes contained in DNA are the only way to transmit biological information between generations. However, our genes are modified by the environment all the time, through chemical tags that attach themselves to our DNA, switching genes on and off. Recent studies suggest that some of these tags might somehow be passed through generations, meaning our environment could have an impact on our children’s health. © 2015 Guardian News and Media Limited

Keyword: Epigenetics; Stress
Link ID: 21325 - Posted: 08.22.2015

By Christian Jarrett If we’re being honest, most of us have at least some selfish aims – to make money, to win a promotion at work, and so on. But importantly, we pursue these goals while at the same time conforming to basic rules of decency. For example, if somebody helps us out, we’ll reciprocate, even if doing so costs us time or cash. Yet there is a minority of people out there who don’t play by these rules. These selfish individuals consider other people as mere tools to be leveraged in the pursuit of their aims. They think nothing of betrayal or backstabbing, and they basically believe everyone else is in it for themselves too. Psychologists call these people “Machiavellians,” and there’s a questionnaire that tests for this trait (one of the so-called “dark triad” of personality traits along with narcissism and psychopathy). People high in Machiavellianism are more likely to agree with statements like “It is wise to flatter important people” and “The best way to handle people is to tell them what they want to hear.” Calling them Machiavellian is too kind. These people are basically jerks. Now a team of Hungarian researchers from the University of Pécs has scanned the brains of high scorers on Machiavellianism while they played a simple game of trust. Reporting their results in the journal Brain and Cognition, the researchers said they found that Machiavellians’ brains went into overdrive when they encountered a partner who exhibited signs of being fair and cooperative. Why? Tamas Bereczkei and his team say it’s because the Machiavellians are immediately figuring out how to exploit the situation for their own gain. The game involved four stages and the student participants — a mix of high and low scorers on Machiavellianism — played several times with different partners.
First, the participants were given roughly $5 worth of Hungarian currency and had to decide how much to “invest” in their partner. Any money they invested was always tripled as it passed to their partner. © 2015, New York Media LLC.

Keyword: Emotions
Link ID: 21324 - Posted: 08.22.2015

By Catherine Saint Louis People who work 55 hours or more per week have a 33 percent greater risk of stroke and a 13 percent greater risk of coronary heart disease than those working standard hours, researchers reported on Wednesday in The Lancet. The new analysis includes data on more than 600,000 individuals in Europe, the United States and Australia, and is the largest study thus far of the relationship between working hours and cardiovascular health. But the analysis was not designed to draw conclusions about what caused the increased risk and could not account for all relevant confounding factors. “Earlier studies have pointed to heart attacks as a risk of long working hours, but not stroke,” said Dr. Urban Janlert, a professor of public health at Umea University in Sweden, who wrote an accompanying editorial. “That’s surprising.” Mika Kivimaki, a professor of epidemiology at University College London, and his colleagues combined the results of multiple studies and tried to account for factors that might skew the results. In addition to culling data from published studies, the researchers also compiled unpublished information from public databases and asked authors of previous work for additional data. Dr. Steven Nissen, the chief of cardiovascular medicine at the Cleveland Clinic, found the methodology unconvincing. “It’s based upon exclusively observational studies, many of which were unpublished,” and some never peer-reviewed, he said. Seventeen studies of stroke included 528,908 men and women who were tracked for an average of 7.2 years. Some 1,722 nonfatal and deadly strokes were recorded. After controlling for smoking, physical activity and high blood pressure and cholesterol, the researchers found a one-third greater risk of stroke among those workers who reported logging 55 or more hours weekly, compared with those who reported working the standard 35 to 40 hours. © 2015 The New York Times Company

Keyword: Stroke; Stress
Link ID: 21323 - Posted: 08.22.2015

“Almost fully-formed brain grown in a lab.” “Woah: Scientists grow first nearly fully-formed human brain.” “Boffins raise five-week-old fetal human brain in the lab for experimentation.” On Tuesday, all the above appeared as headlines for one particular story. What was it all about? Mini-brains 3 to 4 millimetres across have been grown in the lab before, but if a larger brain had been created – and the press release publicising the claim said it was the size of a pencil eraser – that would be a major breakthrough. New Scientist investigated the claims. The announcement was made by Rene Anand, a neuroscientist at Ohio State University in Columbus, at a military health research meeting in Florida. Anand says he has grown a brain – complete with a cortex, midbrain and brainstem – in a dish, comparable in maturity to that of a fetus aged 5 weeks. Anand and his colleague Susan McKay started with human skin cells, which they turned into induced pluripotent stem cells (iPSCs) using a tried-and-tested method. By applying an undisclosed technique, one for which a patent has been applied, the pair say they were able to encourage these stem cells to form a brain. “We are replicating normal development,” says Anand. He says they hope to be able to create miniature models of brains experiencing a range of diseases, such as Parkinson’s and Alzheimer’s. But not everyone is convinced, especially as Anand hasn’t published his results. Scientists we sent Anand’s poster presentation to said that although the team has indeed grown some kind of miniature collection of cells, or “organoid”, in a dish, the structure isn’t much like a fetal brain. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 21322 - Posted: 08.22.2015

Tina Hesman Saey Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or save them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. FTO’s intron is an enhancer, a stretch of DNA needed to control activity of far-away genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.

Keyword: Obesity; Genes & Behavior
Link ID: 21321 - Posted: 08.20.2015

By Gretchen Reynolds Sticking to a diet requires self-control and a willingness to forgo present pleasures for future benefits. Not surprisingly, almost everyone yields to temptation at least sometimes, opting for the cookie instead of the apple. Wondering why we so often override our resolve, scientists at the Laboratory for Social and Neural Systems Research at the University of Zurich recently considered the role of stress, which is linked to a variety of health problems, including weight gain. (There’s something to the rom-com cliché of the jilted lover eating ice cream directly from the carton.) But just how stress might drive us to sweets has not been altogether clear. It turns out that even mild stress may immediately alter the workings of our brains in ways that undermine willpower. For their study, published this month in Neuron, researchers recruited 51 young men who said they were trying to maintain a healthy diet and lifestyle. The men were divided into two groups, one of which served as a control, and then all were asked to skim through images of different kinds of food on a computer screen, rating them for taste and healthfulness. Next, the men in the experimental group were told to plunge a hand into a bowl of icy water for as long as they could, a test known to induce mild physiological and psychological stress. Relative to the control group, the men developed higher levels of cortisol, a stress hormone. After that, men from each group sat in a brain-scanning machine and watched pictures of paired foods flash across a screen. Generally, one of the two foods was more healthful than the other. The subjects were asked to click rapidly on which food they would choose to eat, knowing that at the end of the test they would actually be expected to eat one of these picks (chosen at random from all of their choices). © 2015 The New York Times Company

Keyword: Obesity; Stress
Link ID: 21320 - Posted: 08.20.2015

Bill McQuay The natural world is abuzz with the sound of animals communicating — crickets, birds, even grunting fish. But scientists learning to decode these sounds say the secret signals of African elephants — their deepest rumblings — are among the most intriguing calls any animal makes. Katy Payne, the same biologist who recognized song in the calls of humpback whales in the 1960s, went on to help create the Elephant Listening Project in the Central African Republic in the 1980s. At the time, Payne's team was living in shacks in a dense jungle inhabited by hundreds of rare forest elephants. That's where one of us — Bill McQuay — first encountered the roar of an elephant in 2002, while reporting a story for an NPR-National Geographic collaboration called Radio Expeditions. Here's how Bill remembers that day in Africa: I was walking through this rainforest to an observation platform built up in a tree — out of the reach of the elephants. I climbed up onto the platform, a somewhat treacherous exercise with all my recording gear. Then I set up my recording equipment, put on the headphones, and started listening. That first elephant roar sounded close. But I was so focused on the settings on my recorder that I didn't bother to look around. The second roar sounded a lot closer. I thought, this is so cool! What I didn't realize was, there was this huge bull elephant standing right underneath me — pointing his trunk up at me, just a few feet away. Apparently he was making a "dominance display." © 2015 NPR

Keyword: Language; Evolution
Link ID: 21319 - Posted: 08.20.2015

Helen Thomson Modafinil is the world’s first safe “smart drug”, researchers at Harvard and Oxford universities have said, after performing a comprehensive review of the drug. They concluded that the drug, which is prescribed for narcolepsy but is increasingly taken without prescription by healthy people, can improve decision-making, problem-solving and possibly even make people think more creatively. While acknowledging that there was limited information available on the effects of long-term use, the reviewers said that the drug appeared safe to take in the short term, with few side effects and no addictive qualities. Modafinil has become increasingly common in universities across Britain and the US. Prescribed in the UK as Provigil, it was licensed in 2002 for use as a treatment for narcolepsy - a brain disorder that can cause a person to suddenly fall asleep at inappropriate times or to experience chronic pervasive sleepiness and fatigue. Used without prescription, and bought through easy-to-find websites, modafinil is what is known as a smart drug - used primarily by people wanting to improve their focus before an exam. A poll of Nature journal readers suggested that one in five have used drugs to improve focus, with 44% citing modafinil as their drug of choice. But despite its increasing popularity, there has been little consensus on the extent of modafinil’s effects in healthy, non-sleep-disordered humans. A new review of 24 of the most recent modafinil studies suggests that the drug has many positive effects in healthy people, including enhancing attention, improving learning and memory and increasing something called “fluid intelligence” - essentially our capacity to solve problems and think creatively. © 2015 Guardian News and Media Limited

Keyword: ADHD; Sleep
Link ID: 21318 - Posted: 08.20.2015

By Mitch Leslie Some microbes that naturally dwell in our intestines might be bad for our eyes, triggering autoimmune uveitis, one of the leading causes of blindness. A new study suggests that certain gut residents produce proteins that enable destructive immune cells to enter the eyes. The idea that gut microbes might promote autoimmune uveitis “has been there in the back of our minds,” says ocular immunologist Andrew Taylor of the Boston University School of Medicine, who wasn’t connected to the research. “This is the first time that it’s been shown that the gut flora seems to be part of the process.” As many as 400,000 people in the United States have autoimmune uveitis, in which T cells—the commanders of the immune system—invade the eye and damage its middle layer. All T cells are triggered by specific molecules called antigens, and for T cells that cause autoimmune uveitis, certain eye proteins are the antigens. Even healthy people carry these T cells, yet they don't usually swarm the eyes and unleash the disease. That's because they first have to be triggered by their matching antigen. However, those proteins don't normally leave the eye. So what could stimulate the T cells? One possible explanation is microbes in the gut. In the new study, immunologist Rachel Caspi of the National Eye Institute in Bethesda, Maryland, and colleagues genetically engineered mice so their T cells recognized one of the same eye proteins targeted in autoimmune uveitis. The rodents developed the disease around the time they were weaned. But dosing the animals with four antibiotics that killed off most of their gut microbes delayed the onset and reduced the severity of the disease. © 2015 American Association for the Advancement of Science.

Keyword: Vision; Obesity
Link ID: 21317 - Posted: 08.19.2015

Helen Thomson An almost fully-formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease. Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases. It could also be used to test drugs for conditions such as Alzheimer’s and Parkinson’s, since the regions they affect are in place during an early stage of brain development. The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed, claimed Rene Anand of Ohio State University, Columbus, who presented the work today at the Military Health System Research Symposium in Fort Lauderdale, Florida. Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these “cerebral organoids” were not complete and only contained certain aspects of the brain. “We have grown the entire brain from the get-go,” said Anand. Anand and his colleagues claim to have reproduced 99% of the brain’s diverse cell types and genes. They say their brain also contains a spinal cord, signalling circuitry and even a retina. The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.” © 2015 Guardian News and Media Limited

Keyword: Development of the Brain
Link ID: 21316 - Posted: 08.19.2015

Daniel Cressey In 2013, Beau Kilmer took on a pretty audacious head count. Citizens in the state of Washington had just voted to legalize marijuana for recreational use, and the state's liquor control board, which would regulate the nascent industry, was anxious to understand how many people were using the drug — and importantly, how much they were consuming. The task was never going to be straightforward. Users of an illicit substance, particularly heavy users, often under-report the amounts they take. So Kilmer, co-director of the RAND Drug Policy Research Center in Santa Monica, California, led a team to develop a web-based survey that would ask people how often they had used cannabis in the past month and year. To help them gauge the amounts, the surveys included scaled pictures showing different quantities of weed. The survey, along with other data the team had collected, revealed a rift between perception and reality. Based on prior data, state officials had estimated use at about 85 tonnes per year; Kilmer's research suggested that it was actually double that, about 175 tonnes1. The take-home message, says Kilmer, was “we're going to have to start collecting more data”. Scientists around the world would echo that statement. Laws designed to legalize cannabis or lessen the penalties associated with it are taking effect around the world. They are sweeping the sale of the drug out of stairwells and shady alleys and into modern shopfronts under full view of the authorities. In 2013, Uruguay became the first nation to legalize marijuana trade. And several countries in Europe — Spain and Italy among them — have moved away from tough penalties for use and possession. Thirty-nine US states plus Washington DC have at least some provisions for medicinal use of the drug. Washington, Colorado, Alaska and Oregon have gone further, legalizing the drug for recreational consumption. 
A handful of other states including California and Massachusetts are expected to vote on similar recreational-use measures by the end of 2016. © 2015 Nature Publishing Group

Keyword: Drug Abuse
Link ID: 21315 - Posted: 08.19.2015

By Lisa Rapaport (Reuters Health) - U.S. teens who try electronic cigarettes may be more than twice as likely to move on to smoking conventional cigarettes as those who have never tried the devices, report researchers from the University of Southern California. The findings, published August 18 in JAMA, offer some of the best evidence yet at establishing a link between e-cigarettes and smoking, said Dr. Nancy Rigotti, an expert in tobacco research at Massachusetts General Hospital and author of an editorial accompanying the study. "Adolescent brains appear to be especially susceptible to becoming addicted to nicotine when exposed," Rigotti told Reuters Health in an email. About 2 million middle- and high-school students tried e-cigarettes in 2014, triple the number of teen users in 2013, the Centers for Disease Control and Prevention reported in April. The data sparked alarm among tobacco control advocates who fear e-cigarettes will create a new generation of nicotine addicts who may eventually switch to conventional cigarettes. Big tobacco companies, including Altria Group Inc, Lorillard Tobacco Co and Reynolds American Inc, are all developing e-cigarettes. The battery-powered devices feature a glowing tip and a heating element that turns liquid nicotine and other flavorings into a cloud of vapor that users inhale. An international review of published research by the Cochrane Review in December concluded that the devices could help smokers quit but said much of the existing evidence on e-cigarettes was thin. © 2015 Scientific American

Keyword: Drug Abuse
Link ID: 21314 - Posted: 08.19.2015

By Chris Mooney It is still considered highly uncool to ascribe a person's political beliefs, even in part, to that person's biology: hormones, physiological responses, even brain structures and genes. And no wonder: Doing so raises all kinds of thorny, non-PC issues involving free will, determinism, toleration, and much else. There's just one problem: Published scientific research keeps going there, with ever increasing audacity (not to mention growing stacks of data). The past two weeks have seen not one but two studies published in scientific journals on the biological underpinnings of political ideology. And these studies go straight at the role of genes and the brain in shaping our views, and even our votes. First, in the American Journal of Political Science, a team of researchers including Peter Hatemi of Penn State University and Rose McDermott of Brown University studied the relationship between our deep-seated tendencies to experience fear—tendencies that vary from person to person, partly for reasons that seem rooted in our genes—and our political beliefs. What they found is that people who have a more fearful disposition also tend to be more politically conservative, and less tolerant of immigrants and people of races different from their own. As McDermott carefully emphasizes, that does not mean that every conservative has a high fear disposition. "It's not that conservative people are more fearful, it's that fearful people are more conservative," as she puts it. I interviewed the paper's lead author, Peter Hatemi, about his research for my 2012 book The Republican Brain. Hatemi is both a political scientist and also a microbiologist, and as he stressed to me, "nothing is all genes, or all environment." These forces combine to make us who we are, in incredibly intricate ways. ©2015 Mother Jones

Keyword: Emotions; Genes & Behavior
Link ID: 21313 - Posted: 08.19.2015

Helen Thomson Serious mood disorders such as bipolar may be the price humans have had to pay for our intelligence and creativity. That’s according to new research which links high childhood IQ to an increased risk of experiencing manic bipolar traits in later life. Researchers examined data from a large birth cohort to identify the IQ of 1,881 individuals at age eight. These same individuals were then assessed for manic traits at the age of 22 or 23. The statements they provided were part of a checklist widely used to diagnose bipolar disorder. Each person was given a score out of 100 related to how many manic traits they had previously experienced. Individuals who scored in the top 10% of manic features had a childhood IQ almost 10 points higher than those who scored in the lowest 10%. This correlation appeared strongest for those with high verbal IQ. “Our study offers a possible explanation for how bipolar disorder may have been selected through generations,” said Daniel Smith of the University of Glasgow, who led the study. “There is something about the genetics underlying the disorder that are advantageous. One possibility is that serious disorders of mood - such as bipolar disorder - are the price that human beings have had to pay for more adaptive traits such as intelligence, creativity and verbal proficiency.” Smith emphasises that as things stand, having a high IQ is only an advantage: “A high IQ is not a clear-cut risk factor for bipolar, but perhaps the genes that confer intelligence can get expressed as illness in the context of other risk factors, such as exposure to maternal influenza in the womb or childhood sexual abuse.” © 2015 Guardian News and Media Limited

Keyword: Schizophrenia; Genes & Behavior
Link ID: 21312 - Posted: 08.19.2015

By ANDREW POLLACK The first prescription drug to enhance women’s sexual drive won regulatory approval on Tuesday, clinching a victory for a lobbying campaign that had accused the Food and Drug Administration of gender bias for ignoring the sexual needs of women. The drug — Addyi from Sprout Pharmaceuticals — is actually the first drug approved to treat a flagging or absent libido for either sex. Viagra and other drugs available for men are approved to help achieve erections, or to treat certain deficiencies of the hormone testosterone, not to increase desire. Advocates who pressed for approval of Addyi, many of them part of a coalition called Even the Score, said that a drug to improve women’s sex lives was long overdue, given the many options available to men. “This is the biggest breakthrough for women’s sexual health since the pill,” said Sally Greenberg, executive director of the National Consumers League. But critics said the campaign behind Addyi had made a mockery of the system that regulates pharmaceuticals and had co-opted the women’s movement to pressure the F.D.A. into approving a drug that was at best minimally effective and could cause side effects like low blood pressure, fainting, nausea, dizziness and sleepiness. In announcing the approval, Dr. Janet Woodcock, a senior F.D.A. official, said the agency was “committed to supporting the development of safe and effective treatments for female sexual dysfunction.” The F.D.A. decision on Tuesday was not a surprise since an advisory committee of outside experts had recommended by a vote of 18 to 6 in June that the drug be approved, albeit with precautions required to try to limit the risks and ensure that it was not overused. © 2015 The New York Times Company

Keyword: Sexual Behavior
Link ID: 21311 - Posted: 08.19.2015

By Zoe Kleinman Technology reporter, BBC News More than 200 academics have signed an open letter criticising controversial new research suggesting a link between violent video games and aggression. The findings were released by the American Psychological Association. It set up a taskforce that reviewed hundreds of studies and papers published between 2005 and 2013. The American Psychological Association concluded that while there was "no single risk factor" to blame for aggression, violent video games did contribute. "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression," said the report. "It is the accumulation of risk factors that tends to lead to aggressive or violent behaviour. The research reviewed here demonstrates that violent video game use is one such risk factor." However, a large group of academics said they felt the methodology of the research was deeply flawed, as a significant part of the material included in the study had not been subjected to peer review. "I fully acknowledge that exposure to repeated violence may have short-term effects - you would be a fool to deny that - but the long-term consequences of crime and actual violent behaviour, there is just no evidence linking violent video games with that," Dr Mark Coulson, associate professor of psychology at Middlesex University and one of the signatories of the letter, told the BBC. "If you play three hours of Call of Duty you might feel a little bit pumped, but you are not going to go out and mug someone." © 2015 BBC

Keyword: Aggression
Link ID: 21310 - Posted: 08.19.2015

Alexander Christie-Miller You could say they sent the first tweets. An ancient whistling language that sounds a little like birdsong has been found to use both sides of the brain – challenging the idea that the left side is all important for communicating. The whistling language is still used by around 10,000 people in the mountains of north-east Turkey, and can carry messages as far as 5 kilometres. Researchers have now shown that this language involves the brain’s right hemisphere, which was already known to be important for understanding music. Until recently, it was thought that the task of interpreting language fell largely to the brain’s left hemisphere. Onur Güntürkün of Ruhr University Bochum in Germany wondered whether the musical melodies and frequencies of whistled Turkish might require people to use both sides of their brain to communicate. His team tested 31 fluent whistlers by playing slightly different spoken or whistled syllables into their left and right ears at the same time, and asking them to say what they heard. The left hemisphere depends slightly more on sounds received by the right ear, and vice versa for the right hemisphere. By comparing the number of times the whistlers reported the syllables that had been played into either their right or left ear, they could tell how often each side of the brain was dominant. As expected, when the syllables were spoken, the right ear and left hemisphere were dominant 75 per cent of the time. But when syllables were whistled, the split between right and left dominance was about even. © Copyright Reed Business Information Ltd.

Keyword: Language; Laterality
Link ID: 21309 - Posted: 08.18.2015

By Perri Klass A little more than a year ago, the American Academy of Pediatrics issued a policy statement saying that all pediatric primary care should include literacy promotion, starting at birth. That means pediatricians taking care of infants and toddlers should routinely be advising parents about how important it is to read to even very young children. The policy statement, which I wrote with Dr. Pamela C. High, included a review of the extensive research on the links between growing up with books and reading aloud, and later language development and school success. But while we know that reading to a young child is associated with good outcomes, there is only limited understanding of what the mechanism might be. Two new studies examine the unexpectedly complex interactions that happen when you put a small child on your lap and open a picture book. This month, the journal Pediatrics published a study that used functional magnetic resonance imaging to measure brain activity in 3- to 5-year-old children as they listened to age-appropriate stories. The researchers found differences in brain activation according to how much the children had been read to at home. Children whose parents reported more reading at home and more books in the home showed significantly greater activation of brain areas in a region of the left hemisphere called the parietal-temporal-occipital association cortex. This brain area is “a watershed region, all about multisensory integration, integrating sound and then visual stimulation,” said the lead author, Dr. John S. Hutton, a clinical research fellow at Cincinnati Children’s Hospital Medical Center. This region of the brain is known to be very active when older children read to themselves, but Dr. Hutton notes that it also lights up when younger children are hearing stories.
What was especially novel was that children who were exposed to more books and home reading showed significantly more activity in the areas of the brain that process visual association, even though the child was in the scanner just listening to a story and could not see any pictures. © 2015 The New York Times Company

Keyword: Language; Development of the Brain
Link ID: 21308 - Posted: 08.18.2015

Every brain cell has a nucleus, or a central command station. Scientists have shown that the passage of molecules through the nucleus of a star-shaped brain cell, called an astrocyte, may play a critical role in health and disease. The study, published in the journal Nature Neuroscience, was partially funded by the National Institutes of Health (NIH). “Unexpectedly we may have discovered a hidden pathway to understanding how astrocytes respond to injury and control brain processes. The pathway may be common to many brain diseases and we’re just starting to follow it,” said Katerina Akassoglou, Ph.D., a senior investigator at the Gladstone Institute for Neurological Disease, a professor of neurology at the University of California, San Francisco, and a senior author of the study. Some neurological disorders are associated with higher than normal brain levels of the growth factor TGF-beta, including Alzheimer's disease and brain injury. Previous studies found that after brain injury, astrocytes produce greater amounts of p75 neurotrophin receptor (p75NTR), a protein that helps cells detect growth factors. The cells also react to TGF-beta by changing their shapes and secreting proteins that alter neuronal activity. Dr. Akassoglou’s lab showed that eliminating the p75NTR gene prevented hydrocephalus in mice genetically engineered to have astrocytes that produce higher levels of TGF-beta. Hydrocephalus is a disorder in which excess cerebrospinal fluid accumulates in the brain. Eliminating the p75NTR gene also prevented astrocytes in the brains of the mice from forming scars after injuries and restored gamma oscillations, which are patterns of neuronal activity associated with learning and memory.

Keyword: Brain Injury/Concussion; Glia
Link ID: 21307 - Posted: 08.18.2015

By Kate Kelland LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post-traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.

Keyword: Learning & Memory; Genes & Behavior
Link ID: 21306 - Posted: 08.18.2015