Most Recent Links
Robin McKie Scientists have found an unexpected use for virtual reality headsets: to help pinpoint people who may later develop Alzheimer’s disease. The devices, widely used by computer gamers, display images that can be used to test the navigational skills of people thought to be at risk of dementia. Those who do worse in the tests will be the ones most likely to succumb to Alzheimer’s later in life, scientists now believe. By identifying potential patients far earlier than is possible at present, researchers hope it will become easier in the long term to develop treatments aimed at halting or slowing the condition. “It is usually thought memory is the first attribute affected in Alzheimer’s,” said project leader Dennis Chan, a neuroscientist based at Cambridge University. “But difficulty with navigation is increasingly recognised as one of the very earliest symptoms. This may predate the onset of other symptoms. “By pinpointing those who are beginning to lose their navigational skills, we hope to show that we can target people at a much earlier stage of the condition and one day become far more effective in treating them.” The discovery that loss of navigational skills was associated with Alzheimer’s disease was made several years ago by Chan and colleagues based at several centres in the UK. These studies used tablet computers to test navigational skills. © 2018 Guardian News and Media Limited
Keyword: Alzheimers
Link ID: 25790 - Posted: 12.17.2018
By Benedict Carey For the past two decades, scientists have been exploring the genetics of schizophrenia, autism and other brain disorders, looking for a path toward causation. If the biological roots of such ailments could be identified, treatments might follow, or at least tests that could reveal a person’s risk level. In the 1990s, researchers focused on genes that might be responsible for mental distress, but then hit a wall. Choosing so-called candidate genes up front proved to be fruitless. In the 2000s, using new techniques to sample the entire genome, scientists hit many walls: Hundreds of common gene variants seemed to contribute some risk, but no subset stood out. Even considered together, all of those potential contributing genes — some 360 have been identified for schizophrenia — offered nothing close to a test for added risk. The inherited predisposition was real, but the intricate mechanisms by which all those genes somehow led to symptoms such as psychosis or mania were a complete mystery. Now, using more advanced tools, brain scientists have begun to fill out the picture. In a series of 11 papers, published in Science and related journals, a consortium of researchers has produced the most richly detailed model of the brain’s genetic landscape to date, one that incorporates not only genes but also gene regulators, cellular data and developmental information across the human life span. The work is a testament to how far brain biology has come, and how much further it has to go, toward producing anything of practical value to doctors or patients, experts said. © 2018 The New York Times Company
Keyword: Schizophrenia; Genes & Behavior
Link ID: 25789 - Posted: 12.15.2018
Laura Sanders An Alzheimer’s protein found in contaminated vials of human growth hormone can spread in the brains of mice. That finding, published online December 13 in Nature, adds heft to the idea that, in very rare cases, amyloid-beta can travel from one person’s brain to another’s. Decades ago, over a thousand young people in the United Kingdom received injections of growth hormone derived from cadavers’ brains as treatment for growth deficiencies. Four of these people died with unusually high levels of A-beta in their brains, a sign of Alzheimer’s disease (SN: 10/17/15, p. 12). The results hinted that A-beta may have been delivered along with the growth hormone. Now researchers have confirmed not only that A-beta was in some of those old vials, but also that it can spark A-beta accumulation in mice’s brains. Neurologist John Collinge of University College London and colleagues found that brain injections of the contaminated growth hormone led to clumps of A-beta in the brains of mice genetically engineered to produce the protein, while brain injections with synthetic growth hormone did not. The results suggest that A-beta can “seed” the protein in people’s brains, under the right circumstances. Still, that doesn’t mean that Alzheimer’s disease is transmissible in day-to-day life. |© Society for Science & the Public 2000 - 2018
Keyword: Alzheimers; Prions
Link ID: 25788 - Posted: 12.15.2018
Ewen Callaway No human has the brain of a Neanderthal — but some have hints of its shape. The brain shape of some people with European ancestry is influenced by Neanderthal DNA acquired through interbreeding tens of thousands of years ago, researchers report on 13 December in Current Biology. These DNA variants seem to affect the expression of two genes in such a way as to make the brains of some humans slightly less round, and more like the Neanderthals’ elongated brains. “It’s a really subtle shift in the overall roundedness,” says team member Philipp Gunz, a palaeoanthropologist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. “I don’t think you would see it with your naked eye. These are not people that would look Neanderthal-like.” The Neanderthal DNA variants alter gene expression in brain regions involved in planning, coordination and learning of movements. These faculties are used in speech and language, but there is no indication that the Neanderthal DNA affects cognition in modern humans. Instead, the researchers say, their discovery points to biological changes that might have endowed the human brain with its distinct shape. Earlier this year, Gunz and two colleagues determined that the rounded brain shape of modern humans evolved gradually, reaching its current appearance between 35,000 and 100,000 years ago. The earliest human fossils from across Africa, dating to around 200,000–300,000 years ago, have large yet elongated brains. “There really is something going on in the brain that changes over time in the Homo sapiens lineage,” says Gunz. © 2018 Springer Nature Publishing AG
Keyword: Evolution; Development of the Brain
Link ID: 25787 - Posted: 12.15.2018
By Robert F. Service BOSTON—Implanted electronics can steady hearts, calm tremors, and heal wounds—but at a cost. These machines are often large, obtrusive contraptions with batteries and wires, which require surgery to implant and sometimes need replacement. That's changing. At a meeting of the Materials Research Society here last month, biomedical engineers unveiled bioelectronics that can do more in less space, require no batteries, and can even dissolve when no longer needed. "Huge leaps in technology [are] being made in this field," says Shervanthi Homer-Vanniasinkam, a biomedical engineer at University College London. By making bioelectronics easier to live with, these advances could expand their use. "If you can tap into this, you can bring a new approach to medicine beyond pharmaceuticals," says Bernhard Wolfrum, a neuroelectronics expert at the Technical University of Munich in Germany. "There are a lot of people moving in this direction." One is John Rogers, a materials scientist at Northwestern University in Evanston, Illinois, who is trying to improve on an existing device that surgeons use to stimulate healing of damaged peripheral nerves in trauma patients. During surgery, doctors suture severed nerves back together and then provide gentle electrical stimulation by placing electrodes on either side of the repair. But because surgeons close wounds as soon as possible to prevent infection, they typically provide this stimulation for an hour or less. © 2018 American Association for the Advancement of Science
Keyword: Obesity; Robotics
Link ID: 25786 - Posted: 12.13.2018
Tom Goldman Tim Green first noticed the symptoms about five years ago. The former NFL player, whose strength was a job requirement, suddenly found his hands weren't strong enough to use a nail clipper. His words didn't come out as fast as he was thinking them. "I'm a strange guy," Tim says. "I get something in my head and I can just run with it. I was really afraid I had ALS. But there was enough doubt that I said, 'Alright, I don't. Let's not talk about it. Let's not do anything.' " Denying pain and injury had been a survival strategy in football. "I was well trained in that verse," he says. But a diagnosis in 2016 made denial impossible. Doctors confirmed that Tim, also a former NPR commentator, had ALS, known as Lou Gehrig's disease. The degenerative illness attacks the body's motor nerve cells, weakening muscles in the arms and legs as well as the muscles that control speech, swallowing and breathing. Tim tried to keep it private — he didn't want people feeling sorry for him. But he says, "I got to a point where I couldn't hide it anymore." So Tim went on 60 Minutes and revealed his illness. "What we said is, you either write your own history or someone's going to write it for you," says his 24-year-old son, Troy Green. When one isn't enough: I was one of Tim Green's producers for his Morning Edition commentaries back in the 1990s. We went to dinner once when he was in Washington, D.C., for a game — his Atlanta Falcons were playing Washington. Tim had a huge plate of pasta. When we finished, the waiter came over and asked, "Anything else?" Tim pointed to his clean plate and said, "Yeah. Let's do it again." © 2018 npr
Keyword: ALS-Lou Gehrig's Disease
Link ID: 25785 - Posted: 12.13.2018
It started when Andi Dreher was only three years old. Her head slumped over, her face went blank. It was the first of many epileptic seizures that the Ontario child would endure. At the beginning, Andi would have a couple of seizures a year, but the condition slowly progressed. By the time she turned seven, she was having up to 150 seizures a day. Her family has come to call them "glitches." "The other day at school, she had 27 glitches in less than an hour," said her mom, Lori Dreher. The seizures make it difficult for Andi to do even the simplest tasks, such as walking, talking and eating. "She knows she used to play soccer and she used to do cheerleading — that she used to do these things and now she can't. That's hard," her mom said. Among serious neurological conditions in children, epilepsy is the most common. For most, the condition can be controlled by medications. "But about one-third of children who have epilepsy don't respond to medication. A subset of them can potentially be helped by a variety of surgical treatment," said Dr. George Ibrahim, the pediatric neurosurgeon at the Hospital for Sick Children who operated on Andi. When Andi and her family came from Kitchener to meet him last year, Ibrahim said he was struck by the severity of her case. "Her brain is developed in a very unique way," he said. ©2018 CBC/Radio-Canada
Keyword: Epilepsy
Link ID: 25784 - Posted: 12.13.2018
Aimee Cunningham Babies born dependent on opioids have smaller heads than babies not exposed to the drugs in the womb. The finding, published online December 10 in Pediatrics, raises concerns that the drugs are impairing brain growth during development. And it highlights questions about the safest approach to managing opioid addiction during pregnancy, researchers say. Pregnant women who use opioids — or the drugs methadone or buprenorphine, opioids taken to treat addiction — pass the drugs through the bloodstream to babies. Infants can become dependent on the drugs in the womb, and experience withdrawal symptoms after birth. The disorder, marked by excessive crying, tremors or difficulty sleeping or feeding, is called neonatal abstinence syndrome, or NAS (SN: 6/10/17, p. 16). In the new study, researchers compared the head sizes of close to 860 babies born from 2014 to 2016, half with NAS and half from mothers who had not taken opioids while pregnant. Newborns with NAS had a head circumference nearly 1 centimeter smaller, on average, than babies not exposed to the drugs, the team found. And of the NAS babies, 30 percent had especially small heads. That was true for only 12 percent of babies without the condition. A smaller head is a possible sign of a smaller brain. The new work suggests that for those NAS babies who later have learning and behavioral problems, a contributing factor may be the effect of opioids on brain growth and development, says neonatologist Jonathan Davis. But it will be essential to determine “the actual impact of the smaller heads on how these children are developing,” says Davis, of Floating Hospital for Children at Tufts Medical Center in Boston. |© Society for Science & the Public 2000 - 2018
Keyword: Drug Abuse
Link ID: 25783 - Posted: 12.13.2018
By Ryan Dalton You might be forgiven for having never heard of the NotPetya cyberattack. It didn’t clear out your bank account, or share your social media passwords, or influence an election. But it was one of the most costly and damaging cyberattacks in history, for what it did target: shipping through ports. By the time the engineers at Maersk realized that their computers were infected with a virus, it was too late: worldwide shipping would grind to a halt for days. Imagine a similar situation, in which the target was another port: the synapse, the specialized port of communication between neurons. Much of our ability to learn and remember comes down to the behavior of synapses. What would happen, then, if one neuron infected another with malware? Ports and synapses both run on rules, meant to ensure that their cargo can be exchanged not only quickly and reliably, but also adaptably, so that they can quickly adjust to current conditions and demands. This ‘synaptic plasticity’ is fundamental to the ability of animals to learn, and without it we would no more be able to tie our shoes than to remember our own names. Just as shipping rules are determined by treaties and laws, the rules of synaptic plasticity are written into a multitude of genes in our DNA. For example, one gene might be involved in turning up the volume on one side of the synapse, while another gene might ask the other side of the synapse to turn up the gain. Studying the function of these genes has been one of the core approaches to understanding what it is, at the microscopic level, to learn and to remember. © 2018 Scientific American
Keyword: Learning & Memory; Intelligence
Link ID: 25782 - Posted: 12.12.2018
Diana Kwon The effects of antidepressant exposure during early development can pass down through three generations of offspring—at least in zebrafish. A new study, published today (December 10) in PNAS, reveals that fluoxetine, a commonly used antidepressant that goes by the brand name Prozac, can alter hormone levels and blunt stress responses in an exposed embryo and its descendants. “The paper is very intriguing,” says Tim Oberlander, a developmental pediatrician at the British Columbia Children’s Hospital who was not involved in this work. The question of whether these medications have a transgenerational effect is “a really important one that requires further study in other animal models, and ultimately, when we have the data, we need to figure out whether it’s also true in humans.” Fluoxetine is a selective serotonin reuptake inhibitor (SSRI), a class of drugs widely used to treat depression as well as other conditions such as obsessive-compulsive disorder and anxiety disorders. Recent data from the US National Health and Nutrition Examination Survey show increasing antidepressant use, from approximately 7.7 percent of the population in 1999–2002 to 12.7 percent from 2011–2014. SSRIs are often prescribed as the first-line treatment for pregnant women with depression, and prior studies in humans suggest infants exposed to SSRIs while in the womb may experience developmental disturbances such as delayed motor development and increased levels of anxiety later in childhood. Oberlander, whose research is focused on the influence of prenatal exposure to these medications, notes that it has been unclear whether those correlations represent a direct result of the drugs or if other factors, such as a genetic propensity for those outcomes or growing up with a parent with a mood disorder, may also play a part. © 1986 - 2018 The Scientist
Keyword: Epigenetics; Depression
Link ID: 25781 - Posted: 12.12.2018
In his enthralling 2009 collection of parables, Sum: Forty Tales from the Afterlives, the neuroscientist David Eagleman describes a world in which a person only truly dies when they are forgotten. After their bodies have crumbled and they leave Earth, all deceased must wait in a lobby and are allowed to pass on only after someone says their name for the last time. “The whole place looks like an infinite airport waiting area,” Eagleman writes. “But the company is terrific.” Most people leave just as their loved ones arrive — for it was only the loved ones who were still remembering. But the truly famous have to hang around for centuries; some, keen to be off, are with an “aching heart waiting for statues to fall”. Eagleman’s tale is an interpretation of what psychologists and social scientists call collective memory. Continued and shared attention to people and events is important because it can help to shape identity — how individuals see themselves as part of a group — and because the choice of what to commemorate, and so remember, influences the structures and priorities of society. This week in Nature Human Behaviour, researchers report a surprising discovery about collective memory: the pattern of its decay follows a mathematical law (C. Candia et al. Nature Hum. Behav. http://doi.org/cxq2; 2018). The attention we pay to academic papers, films, pop songs and tennis players decays in two distinct stages. In theory, the findings could help those who compete for society’s continued attention — from politicians and companies to environmental campaigners — to find ways to stay in the public eye, or at least in the public’s head. © 2018 Springer Nature Publishing AG
Keyword: Learning & Memory
Link ID: 25780 - Posted: 12.12.2018
By Joshua Tan Recently, Scientific American published a blog post by Tam Hunt that provocatively declared, “The Hippies Were Right: It’s All About Vibrations, Man.” Hunt’s claim is that consciousness emerges from resonant effects found in nature at a wide range of scales. This is reminiscent of arguments that have been made since the development of the science of thermodynamics more than two hundred years ago. In brief, very intriguing and surprising characteristics of complex systems have been discovered and rigorously defined with such tantalizing terms as “emergence,” “resonance” and “self-organization.” These kinds of features of the natural world are so amazing—even uncanny—that they have inspired wild speculation as to their possible implications. Are there deep connections between these phenomena and the more mysterious aspects of our existence such as life, consciousness, and intelligence? Might they even provide us with insight into possible answers to expansively fundamental questions like why there is something rather than nothing? Speculating on such mysteries is an understandable pastime. Diverse thinkers from physicists to philosophers, psychologists to theologians have written libraries’ worth of treatises attempting to shed light on the possible answers to these deep questions. Along the way, ideas inspired by scientific results have had varying degrees of success. Concepts such as animal magnetism, vitalism, synchronicity, and quantum mysticism all had their day in the Sun, only to end up debunked or dismissed by skeptics and scientists who either pointed out a lack of empirical data supporting the claims or showed that the ideas were incompatible with what we have discovered about the natural world. © 2018 Scientific American
Keyword: Consciousness
Link ID: 25779 - Posted: 12.12.2018
Bruce Bower A nearly complete hominid skeleton known as Little Foot has finally been largely freed from the stony shell in which it was discovered in a South African cave more than 20 years ago. And in the first formal analyses of the fossils, researchers say the 3.67-million-year-old Little Foot belonged to its own species. In four papers posted online at bioRxiv.org between November 29 and December 5, paleoanthropologist Ronald Clarke of the University of the Witwatersrand in Johannesburg and colleagues assign Little Foot to a previously proposed species, Australopithecus prometheus, that has failed to gain traction among many researchers. Clarke has held that controversial view for more than a decade (SN: 5/2/15, p. 8). He found the first of Little Foot’s remains in a storage box of fossils from a site called Sterkfontein in 1994. Excavations of the rest of the skeleton began in 1997. Many other researchers, however, regard Little Foot as an early member of a hominid species called Australopithecus africanus. Anthropologist Raymond Dart first identified A. africanus in 1924 from an ancient youngster’s skull called the Taung Child. Hundreds of A. africanus fossils have since been found in South African caves, including Sterkfontein. One of those caves, Makapansgat, produced a partial braincase that Dart assigned to A. prometheus in 1948. But Dart dropped that label after 1955, assigning the braincase and another Makapansgat fossil to A. africanus. |© Society for Science & the Public 2000 - 2018.
Keyword: Evolution
Link ID: 25778 - Posted: 12.12.2018
Jef Akst Alan McElligott, an animal behavior researcher at the University of Roehampton in the UK, continues to be impressed by goats. Since he started studying the charismatic ungulates a decade ago, he’s found that mothers remember the calls of their kids several months after they’ve been separated, and that goats can solve a two-step puzzle box akin to those typically used in primate research—and remember how to do it a year later. Now his team has found that goats at the Buttercups Sanctuary in Kent, UK, can distinguish between happy and angry human expressions. “Given some of the other things that we’ve found out about goats, I guess we shouldn’t really be that surprised,” says McElligott, who’s hoping to improve welfare guidelines for the animals by revealing their smart and social nature. McElligott’s experiment was simple. Working with 20 goats at the sanctuary, he and his colleagues presented each with two black-and-white images—one of a person smiling, and the other of the same person making an angry expression—then sat back and watched what the animal did. “If the goats ignored the photographs, for example, or walked up to the photographs and ripped them off metal panels and chewed on them, would I have been shocked? Possibly not,” says McElligott. “But . . . the goats did seem to take the time to have a look at these photographs and actually study them, believe it or not.” And based on the time they spent interacting with each image, the goats seemed to prefer the happy snapshot (R Soc Open Sci, 5:180491, 2018). © 1986 - 2018 The Scientist
Keyword: Emotions; Attention
Link ID: 25777 - Posted: 12.12.2018
By Tom Garlinghouse Male and female bees may look similar, but they have dramatically different dining habits, according to a new study. Despite both needing nectar to survive, they get this nutrient from different flowers—so different, in fact, that males and females might as well belong to separate species. To make the find, researchers spent 11 weeks observing the foraging habits of 152 species of bees in several flower-rich New Jersey fields. Then they brought the insects—nearly 19,000 in all—back to the lab and meticulously identified their species and sex. Males and females rarely drank nectar from the same type of flower, the team will report in Animal Behaviour. Using a statistical test, the researchers found that male and female bee diets overlap significantly less than would be expected at random. © 2018 American Association for the Advancement of Science
Keyword: Sexual Behavior
Link ID: 25776 - Posted: 12.12.2018
By Gina Kolata You’d think that scientists at an international conference on obesity would know by now which diet is best, and why. As it turns out, even the experts still have widely divergent opinions. At a recent meeting of the Obesity Society, organizers held a symposium during which two leading scientists presented the somewhat contradictory findings of two high-profile diet studies. A moderator tried to sort things out. In one study, by Christopher Gardner, a professor of medicine at Stanford, patients were given low-fat or low-carb diets with the same number of calories. After a year, weight loss was the same in each group, Dr. Gardner reported. Another study, by Dr. David Ludwig of Boston Children’s Hospital, reported that a low-carbohydrate diet was better than a high-carbohydrate diet in helping subjects keep weight off after they had dieted and lost weight. The low-carbohydrate diet, he found, enabled participants to burn about 200 extra calories a day. So does a low-carbohydrate diet help people burn more calories? Or is the composition of the diet irrelevant if the calories are the same? Does it matter if the question is how to lose weight or how to keep it off? There was no consensus at the end of the session. But here are a few certainties about dieting amid the sea of unknowns. What we know: People vary — a lot — in how they respond to dieting. Some people thrive on low-fat diets, others do best on low-carb diets. Still others succeed with gluten-free diets or Paleo diets or periodic fasts or ketogenic diets or other options on the seemingly endless menu of weight-loss plans. Most studies comparing diets have produced results like Dr. Gardner’s: no difference. © 2018 The New York Times Company
Keyword: Obesity
Link ID: 25775 - Posted: 12.11.2018
By Kelly Servick If you’ve ever unwittingly grabbed a hot pan, you know our bodies have exquisite reflexes for avoiding or minimizing injuries. But once the damage is done, we also have a spontaneous urge to soothe the pain—to blow on a burned hand, cradle a broken toe, or suck on a cut finger. A new study reveals a neural circuit behind this soothing response in mice. Many common animal tests of pain don’t involve this circuit, the authors contend, which could explain why some painkillers that seem to work in mice prove ineffective in people. “We know there is not just one ‘pain pathway’ or a single brain site involved in processing pain,” says Kathleen Sluka, a neuroscientist at the University of Iowa in Iowa City, who was not involved in the new work. “Understanding the different pathways that underlie unique behaviors could one day help us to individualize treatments” for patients based on how they respond to pain. Harvard University neurobiologist Qiufu Ma and his team wanted to tease apart different aspects of pain, not just in the brain but in the neurons throughout our bodies that relay signals up the spinal cord. Ma and his collaborators previously proposed two general groups of sensory neurons: ones that project to the outermost layer of skin and ones that branch to deeper tissue throughout the body—the underlying skin layers, bones, joints, and muscles. Ma suggests the first group is a first-line defense that monitors our surroundings for danger and prompts us to pull away from a hot pan or a sharp prick. The deeper nerves, he suggests, are attuned to the lasting pain of an injury or illness—and may drive the experience of unpleasantness and distress that comes with pain. Our reflexes avoid potential harm, Ma explains; “the suffering of pain is very different.” © 2018 American Association for the Advancement of Science
Keyword: Pain & Touch
Link ID: 25774 - Posted: 12.11.2018
Alison Abbott Doris Tsao launched her career deciphering faces — but for a few weeks in September, she struggled to control the expression on her own. Tsao had just won a MacArthur Foundation ‘genius’ award, an honour that comes with more than half a million dollars to use however the recipient wants. But she was sworn to secrecy — even when the foundation sent a film crew to her laboratory at the California Institute of Technology (Caltech) in Pasadena. Thrilled and embarrassed at the same time, she had to invent an explanation, all while keeping her face in check. It was her work on faces that won Tsao awards and acclaim. Last year, she cracked the code that the brain uses to recognize faces from a multitude of minuscule differences in shapes, distances between features, tones and textures. The simplicity of the coding surprised and impressed the neuroscience community. “Her work has been transformative,” says Tom Mrsic-Flogel, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. But Tsao doesn’t want to be remembered just as the scientist who discovered the face code. It is a means to an end, she says, a good tool for approaching the question that really interests her: how does the brain build up a complete, coherent model of the world by filling in gaps in perception? “This idea has an elegant mathematical formulation,” she says, but it has been notoriously hard to put to the test. Tsao now has an idea of how to begin. © 2018 Springer Nature Publishing AG
Keyword: Attention
Link ID: 25773 - Posted: 12.11.2018
By Benedict Carey A generation ago, parents worried about the effects of TV; before that, it was radio. Now, the concern is “screen time,” a catchall term for the amount of time that children, especially preteens and teenagers, spend interacting with TVs, computers, smartphones, digital pads, and video games. This age group draws particular attention because screen immersion rises sharply during adolescence, and because brain development accelerates then, too, as neural networks are pruned and consolidated in the transition to adulthood. On Sunday evening, CBS’s “60 Minutes” reported on early results from the A.B.C.D. Study (for Adolescent Brain Cognitive Development), a $300 million project financed by the National Institutes of Health. The study aims to reveal how brain development is affected by a range of experiences, including substance use, concussions, and screen time. As part of an exposé on screen time, “60 Minutes” reported that heavy screen use was associated with lower scores on some aptitude tests and with accelerated “cortical thinning” — a natural process — in some children. But the data is preliminary, and it’s unclear whether the effects are lasting or even meaningful. Does screen addiction change the brain? Yes, but so does every other activity that children engage in: sleep, homework, playing soccer, arguing, growing up in poverty, reading, vaping behind the school. The adolescent brain continually changes, or “rewires” itself, in response to daily experience, and that adaptation continues into the early to mid 20s. What scientists want to learn is whether screen time, at some threshold, causes any measurable differences in adolescent brain structure or function, and whether those differences are meaningful. Do they cause attention deficits, mood problems, or delays in reading or problem-solving ability? © 2018 The New York Times Company
Keyword: Development of the Brain; Learning & Memory
Link ID: 25772 - Posted: 12.11.2018
Doing crossword puzzles and Sudoku does not appear to protect against mental decline, according to a new study. The idea of "use it or lose it" when it comes to our brains in later life has previously been widely accepted. The new Scottish study showed that people who regularly do intellectual activities throughout life have higher mental abilities. This provides a "higher cognitive point" from which to decline, say the researchers. But the study did not show that they decline any slower. The work, published in the BMJ, was undertaken by Dr Roger Staff at Aberdeen Royal Infirmary and the University of Aberdeen. It looked at 498 people born in 1936 who had taken part in a group intelligence test at the age of 11. This current study started when they were about 64 years old and they were recalled for memory and mental-processing-speed testing up to five times over a 15-year period. It found engagement in problem solving did not protect an individual from decline. However, engaging in intellectually stimulating activities on a regular basis was linked to level of mental ability in old age. The study uses modelling to look at associations and cannot prove any causal link. Also, many of the participants were unable to complete the whole study - some dropped out, others died. Previously, some studies have found that cognitive training can improve some aspects of memory and thinking, particularly for people who are middle-aged or older. They found so-called brain training may help older people to manage their daily tasks better. No studies have shown that brain training prevents dementia. And last year a report from the Global Council on Brain Health recommended that people should take part in stimulating activities such as learning a musical instrument, designing a quilt or gardening rather than brain training to help their brain function in later life. © 2018 BBC
Keyword: Alzheimers; Learning & Memory
Link ID: 25771 - Posted: 12.11.2018


.gif)

