Chapter 13. Memory, Learning, and Development




By James Gorman

There’s something about a really smart dog that makes it seem as if there might be hope for the world. China is in the midst of a frightening disease outbreak and nobody knows how far it will spread. The warming of the planet shows no signs of stopping; it reached a record 70 degrees in Antarctica last week. Not to mention international tensions and domestic politics. But there’s a dog in Norway that knows not only the names of her toys, but also the names of different categories of toys, and she learned all this just by hanging out with her owners and playing her favorite game. So who knows what other good things could be possible? Right?

This dog’s name is Whisky. She is a Border collie that lives with her owners and almost 100 toys, so it seems like things are going pretty well for her. Even though I don’t have that many toys myself, I’m happy for her. You can’t be jealous of a dog. Or at least you shouldn’t be.

Whisky’s toys have names. Most are dog-appropriate, like “the colorful rope” or “the small Frisbee.” However, her owner, Helge O. Svela, said on Thursday that since the research was done, her toys have grown in number from 59 to 91, and he has had to give some toys “people” names, like Daisy or Wenger. “That’s for the plushy toys that resemble animals like ducks or elephants (because the names Duck and Elephant were already taken),” he said.

During the research, Whisky proved in tests that she knew the names for at least 54 of her 59 toys. That’s not just the claim of a proud owner (and Mr. Svela is quite proud of Whisky) but the finding of Claudia Fugazza, an animal behavior researcher from Eötvös Loránd University in Budapest, who tested her. That alone makes Whisky part of a very select group, although not a champion. You may recall Chaser, another Border collie that knew the names of more than 1,000 objects and also knew words for categories of objects. And there are a few other dogs with shockingly large vocabularies, Dr. Fugazza said, including mixed breeds and a Yorkie. These canine verbal prodigies are, however, few and far between. “It is really, really unusual, and it is really difficult to teach object names to dogs,” Dr. Fugazza said.

© 2020 The New York Times Company

Keyword: Language; Learning & Memory
Link ID: 27063 - Posted: 02.21.2020

By Viviane Callier

In 1688 Irish philosopher William Molyneux wrote to his colleague John Locke with a puzzle that continues to draw the interest of philosophers and scientists to this day. The idea was simple: Would a person born blind, who has learned to distinguish objects by touch, be able to recognize them purely by sight if he or she regained the ability to see? The question, known as Molyneux’s problem, probes whether the human mind has a built-in concept of shapes that is so innate that such a blind person could immediately recognize an object with restored vision. The alternative is that the concepts of shapes are not innate but have to be learned by exploring an object through sight, touch and other senses, a process that could take a long time when starting from scratch.

An attempt was made to resolve this puzzle a few years ago by testing Molyneux’s problem in children who were congenitally blind but then regained their sight, thanks to cataract surgery. Although the children were not immediately able to recognize objects, they quickly learned to do so. The results were equivocal. Some learning was needed to identify an object, but it appeared that the study participants were not starting completely from scratch.

Lars Chittka of Queen Mary University of London and his colleagues have taken another stab at finding an answer, this time using another species. To test whether bumblebees can form an internal representation of objects, Chittka and his team first trained the insects to discriminate spheres and cubes using a sugar reward. The bees were trained in the light, where they could see but not touch the objects, which were isolated inside a closed petri dish. Then they were tested in the dark, where they could touch but not see the spheres or cubes. The researchers found that the invertebrates spent more time in contact with the shape they had been trained to associate with the sugar reward, even though they had to rely on touch rather than sight to discriminate the objects.

© 2020 Scientific American

Keyword: Development of the Brain; Vision
Link ID: 27061 - Posted: 02.21.2020

By Kim Tingley

Hearing loss has long been considered a normal, and thus acceptable, part of aging. It is common: Estimates suggest that it affects two out of three adults age 70 and older. It is also rarely treated. In the U.S., only about 14 percent of adults who have hearing loss wear hearing aids. An emerging body of research, however, suggests that diminished hearing may be a significant risk factor for Alzheimer’s disease and other forms of dementia — and that the association between hearing loss and cognitive decline potentially begins at very low levels of impairment.

In November, a study published in the journal JAMA Otolaryngology — Head and Neck Surgery examined data on hearing and cognitive performance from more than 6,400 people 50 and older. Traditionally, doctors diagnose impairment when someone experiences a loss in hearing of at least 25 decibels, a somewhat arbitrary threshold. But for the JAMA study, researchers included hearing loss down to around zero decibels in their analysis and found that even these slight deficits predicted correspondingly lower scores on cognitive tests. “It seemed like the relationship starts the moment you have imperfect hearing,” says Justin Golub, the study’s lead author and an ear, nose and throat doctor at the Columbia University Medical Center and NewYork-Presbyterian. Now, he says, the question is: Does hearing loss actually cause the cognitive problems it has been associated with, and if so, how?

Preliminary evidence linking dementia and hearing loss was published in 1989 by doctors at the University of Washington, Seattle, who compared 100 patients with Alzheimer’s-like dementia with 100 demographically similar people without it. They found that those who had dementia were more likely to have hearing loss, and that the extent of that loss seemed to correspond with the degree of cognitive impairment. But that possible connection wasn’t rigorously investigated until 2011, when Frank Lin, an ear, nose and throat doctor at Johns Hopkins School of Medicine, and colleagues published the results of a longitudinal study that tested the hearing of 639 older adults who were dementia-free and then tracked them for an average of nearly 12 years, during which time 58 developed Alzheimer’s or another cognitive impairment. They discovered that a subject’s likelihood of developing dementia increased in direct proportion to the severity of his or her hearing loss at the time of the initial test. The relationship seems to be “very, very linear,” Lin says, meaning that the greater the hearing deficit, the greater the risk a person will develop the condition.

© 2020 The New York Times Company

Keyword: Hearing; Alzheimers
Link ID: 27057 - Posted: 02.20.2020

By Judith Graham, Kaiser Health News

Do I know I’m at risk for developing dementia? You bet. My father died of Alzheimer’s disease at age 72; my sister was felled by frontotemporal dementia at 58. And that’s not all: Two maternal uncles had Alzheimer’s, and my maternal grandfather may have had vascular dementia. (In his generation, it was called senility.) So what happens when I misplace a pair of eyeglasses or can’t remember the name of a movie I saw a week ago? “Now comes my turn with dementia,” I think. Then I talk myself down from that emotional cliff.

Am I alone in this? Hardly. Many people, like me, who’ve watched this cruel illness destroy a family member, dread the prospect that they, too, might become demented. The lack of a cure or effective treatments only adds to the anxiety, as does the now-familiar refrain that yet another treatment to stop Alzheimer’s has failed. How do we cope as we face our fears and peer into our future?

Andrea Kline, whose mother, as well as her aunt and uncle, had Alzheimer’s disease, just turned 71 and lives in Boynton Beach, Fla. She’s a retired registered nurse who teaches yoga to seniors at community centers and assisted-living facilities. “I worry about dementia incessantly: Every little thing that goes wrong, I’m convinced it’s the beginning,” she told me. Because Ms. Kline has had multiple family members with Alzheimer’s, she’s more likely to have a genetic vulnerability than someone with a single occurrence in their family. But that doesn’t mean this condition lies in her future. A risk is just that: It’s not a guarantee. The age of onset is also important. People with close relatives struck by dementia early — before age 65 — are more likely to be susceptible genetically.

Ms. Kline was the primary caregiver for her mother, Charlotte Kline, who received an Alzheimer’s diagnosis in 1999 and passed away in 2007 at age 80. “I try to eat very healthy. I exercise. I have an advance directive, and I’ve discussed what I want” in the way of care “with my son,” she said.

© 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27056 - Posted: 02.20.2020

Maternal obesity may increase a child’s risk for attention-deficit hyperactivity disorder (ADHD), according to an analysis by researchers from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), part of the National Institutes of Health. The researchers found that mothers — but not fathers — who were overweight or obese before pregnancy were more likely to report that their children had been diagnosed with ADHD or had symptoms of hyperactivity, inattentiveness or impulsiveness at ages 7 to 8 years old. The study appears in The Journal of Pediatrics.

The study team analyzed data from the NICHD Upstate KIDS Study, which recruited mothers of young infants and followed the children through age 8. In this analysis of nearly 2,000 children, the team found that women who were obese before pregnancy were approximately twice as likely to report that their child had ADHD or symptoms of hyperactivity, inattention or impulsiveness, compared with women of normal weight before pregnancy.

The authors suggest that, if their findings are confirmed by additional studies, healthcare providers may want to screen children of obese mothers for ADHD so that the children can be offered earlier interventions. The authors also note that healthcare providers could use evidence-based strategies to counsel women considering pregnancy on diet and lifestyle. Resources for plus-size pregnant women and their healthcare providers are available as part of NICHD’s Pregnancy for Every Body initiative.

Keyword: ADHD; Development of the Brain
Link ID: 27055 - Posted: 02.20.2020

Amy Schleunes

New Zealand’s North Island robins (Petroica longipes), known as toutouwai in Maori, are capable of remembering a foraging task taught to them by researchers for up to 22 months in the wild, according to a study published on February 12 in Biology Letters. These results echo the findings of a number of laboratory studies of long-term memory in animals, but offer a rare example of a wild animal retaining a learned behavior with no additional training. The study also has implications for conservation and wildlife management: given the birds’ memory skills, researchers might be able to teach them about novel threats and resources in their constantly changing habitat.

“This is the first study to show [memory] longevity in the wild,” says Vladimir Pravosudov, an animal behavior researcher at the University of Nevada, Reno, who was not involved in the study. Rachael Shaw, a coauthor and behavioral ecologist at Victoria University in New Zealand, says she was surprised that the birds remembered the new skill she had taught them. “Wild birds have so much that they have to contend with in their daily lives,” she says. “You don’t really expect that it’s worth their while to retain this learned task they hardly had the opportunity to do, and they can’t predict that they will have an opportunity to do again.”

Shaw is generally interested in the cognitive abilities of animals and the evolution of intelligence, and the toutouwai, trainable food-caching birds that can live up to roughly 10 years, make perfect subjects for her behavioral investigations. “They’ve got this kind of boldness and curiosity that a lot of island bird species share,” says Shaw. These qualities make them vulnerable to predation by invasive cats, rats, and ermines (also known as stoats), but also inquisitive and relatively unafraid of humans, an ideal disposition for testing memory retention in the field.

© 1986–2020 The Scientist

Keyword: Learning & Memory; Evolution
Link ID: 27053 - Posted: 02.20.2020

Ian Sample, science editor

Consuming a western diet for as little as one week can subtly impair brain function and encourage slim and otherwise healthy young people to overeat, scientists claim. Researchers found that after seven days on a high-fat, high-added-sugar diet, volunteers in their 20s scored worse on memory tests and found junk food more desirable immediately after they had finished a meal. The finding suggests that a western diet makes it harder for people to regulate their appetite, and points to disruption in a brain region called the hippocampus as the possible cause.

“After a week on a western-style diet, palatable food such as snacks and chocolate becomes more desirable when you are full,” said Richard Stevenson, a professor of psychology at Macquarie University in Sydney. “This will make it harder to resist, leading you to eat more, which in turn generates more damage to the hippocampus and a vicious cycle of overeating.”

Previous work in animals has shown that junk food impairs the hippocampus, a brain region involved in memory and appetite control. It is unclear why, but one idea is that the hippocampus normally blocks or weakens memories about food when we are full, so looking at a cake does not flood the mind with memories of how nice cake can be. “When the hippocampus functions less efficiently, you do get this flood of memories, and so food is more appealing,” Stevenson said.

To investigate how the western diet affects humans, the scientists recruited 110 lean and healthy students, aged 20 to 23, who generally ate a good diet. Half were randomly assigned to a control group who ate their normal diet for a week. The other half were put on a high-energy western-style diet, which featured a generous intake of Belgian waffles and fast food.

© 2020 Guardian News & Media Limited

Keyword: Learning & Memory; Obesity
Link ID: 27050 - Posted: 02.19.2020

Blake Richards

Despite billions of dollars spent and decades of research, computation in the human brain remains largely a mystery. Meanwhile, we have made great strides in the development of artificial neural networks, which are designed to loosely mimic how brains compute. We have learned a lot about the nature of neural computation from these artificial brains, and it’s time to take what we’ve learned and apply it back to the biological ones. Neurological diseases are on the rise worldwide, making a better understanding of computation in the brain a pressing problem. Given the ability of modern artificial neural networks to solve complex problems, a framework for neuroscience guided by machine learning insights may unlock valuable secrets about our own brains and how they can malfunction.

Our thoughts and behaviours are generated by computations that take place in our brains. To effectively treat neurological disorders that alter our thoughts and behaviours, like schizophrenia or depression, we likely have to understand how the computations in the brain go wrong. However, understanding neural computation has proven to be an immensely difficult challenge. When neuroscientists record activity in the brain, it is often indecipherable.

In a paper published in Nature Neuroscience, my co-authors and I argue that the lessons we have learned from artificial neural networks can guide us down the right path of understanding the brain as a computational system rather than as a collection of indecipherable cells.

© 2010–2020, The Conversation US, Inc.

Keyword: Brain imaging; Robotics
Link ID: 27042 - Posted: 02.14.2020

By Gina Kolata

The study aimed to show that Alzheimer’s disease could be stopped if treatment began before symptoms emerged. The participants were the best candidates that scientists could find: still healthy, but with a rare genetic mutation that guaranteed they would develop dementia. For five years, on average, the volunteers received monthly infusions or injections of one of two experimental drugs, along with annual blood tests, brain scans, spinal taps and cognitive tests.

Now, the verdict is in: The drugs did nothing to slow or stop cognitive decline in these subjects, dashing the hopes of scientists. Dr. Randall Bateman, a neurologist at Washington University in St. Louis and principal investigator of the study, said he was “shocked” when he first saw the data: “It was really crushing.” The results are a deep disappointment, scientists said — but not a knockout punch. The drugs did not work, but the problems may be fixable: perhaps the doses were too low, or the drugs should have been given to much younger patients. Few experts want to give up on the hypothesis that amyloid plaques in the brain are intimately involved in Alzheimer’s disease.

The data from this international study, called DIAN-TU, are still being analyzed and are to be presented at scientific conferences in Vienna in April and in Amsterdam in July. The trial was sponsored by Washington University in St. Louis; two drug companies that supplied the drugs — Eli Lilly and Roche, with a subsidiary, Genentech; the National Institutes of Health; and philanthropies, including the Alzheimer’s Association.

© 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27038 - Posted: 02.13.2020

By Perri Klass, M.D.

Whenever I write about attention deficit hyperactivity disorder — whether I’m writing generally about the struggles facing these children and their families or dealing more specifically with medications — I know that some readers will write in to say that A.D.H.D. is not a real disorder. They say that the rising numbers of children taking stimulant medication to treat attentional problems are all victims, sometimes of modern society and its unfair expectations, sometimes of doctors, and most often of the rapacious pharmaceutical industry.

I do believe that A.D.H.D. is a valid diagnosis, though a diagnosis that has to be made with care, and I believe that some children struggle with it mightily. Although medication should be neither the first nor the only treatment used, some children find that the stimulants significantly change their educational experiences, and their lives, for the better.

Dr. Mark Bertin, a developmental pediatrician in Pleasantville, N.Y., who is the author of “Mindful Parenting for A.D.H.D.,” said, “On a practical level, we know that correctly diagnosed A.D.H.D. is real, and we know that when they’re used properly, medications can be both safe and effective.” The choice to use medications can be a difficult one for families, he said, and is made even more difficult by “the public perception that they’re not safe, or that they fundamentally change kids.” He worries, he says, that marketing is really effective, and wants to keep it “at arm’s length,” far away from his own clinical decisions — not allowing drug reps in the office, not accepting gifts — but acknowledging, all the same, that it’s probably not possible to avoid the effects of marketing entirely.

Still, he said, when it comes to stimulants, “the idea that we’re only using them because of the pharmaceutical industry is totally off base,” and can make it much harder to talk with parents about the potential benefits — and the potential problems — of treating a particular child with a particular medication. “When it comes to A.D.H.D. in particular, it’s a hard enough thing for families to be dealing with without all the fear and judgment added on.”

© 2020 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 27030 - Posted: 02.10.2020

By Chris Woolston

Sometimes it takes multitudes to reveal scientific truth. Researchers followed more than 7,000 subjects to show that a Mediterranean diet can lower the risk of heart disease. And the Women’s Health Initiative enlisted more than 160,000 women to show, among other findings, that postmenopausal hormone therapy put women at risk of breast cancer and stroke. But meaningful, scientifically valid insights don’t always have to come from studies of large groups. A growing number of researchers around the world are taking a singular approach to pain, nutrition, psychology and other highly personal health issues. Instead of looking for trends in many people, they’re designing studies for one person at a time.

A study of one person — also called an N of 1 trial — can uncover subtle, important results that would be lost in a large-scale study, says geneticist Nicholas Schork of the Translational Genomics Research Institute in Phoenix. The results, he says, can be combined to provide insights for the population at large. But with N of 1 studies, the individual matters above all. “People differ at fundamental levels,” says Schork, who discussed the potential of N of 1 studies in a 2017 issue of the Annual Review of Nutrition. And the only way to understand individuals is to study them.

Case studies of individuals in odd circumstances have a long history in medical literature. But the concept of a clinical-medicine N of 1 study gathering the same level of information as a large study goes back to an article published in the New England Journal of Medicine in 1986. Hundreds of N of 1 studies have been published since then, and the approach is gaining momentum, says Suzanne McDonald, N of 1 research coordinator at the University of Queensland in Brisbane, Australia.

Keyword: Genes & Behavior; Schizophrenia
Link ID: 27027 - Posted: 02.10.2020

By Laura Sanders

Immune cells in the brain chew up memories, a new study in mice shows. The finding, published in the Feb. 7 Science, points to a completely new way that the brain forgets, says neuroscientist Paul Frankland of the Hospital for Sick Children Research Institute in Toronto, who wasn’t involved in the study. That may sound like a bad thing, but forgetting is just as important as remembering. “The world constantly changes,” Frankland says, and getting rid of unimportant memories — such as a breakfast menu from two months ago — allows the brain to collect newer, more useful information.

Exactly how the brain stores memories is still debated, but many scientists suspect that connections between large groups of nerve cells are important (SN: 1/24/18). Forgetting likely involves destroying or changing these large webs of precise connections, called synapses, other lines of research have suggested. The new result shows that microglia, immune cells that can clear debris from the brain, “do exactly that,” Frankland says.

Microglia are master brain gardeners that trim extra synapses away early in life, says Yan Gu, a neuroscientist at Zhejiang University School of Medicine in Hangzhou, China. Because synapses have a big role in memory storage, “we started to wonder whether microglia may induce forgetting by eliminating synapses,” Gu says. Gu’s team first gave mice an unpleasant memory: mild foot shocks, delivered in a particular cage. Five days after the shocks, the mice would still freeze in fear when they were placed in the cage. But 35 days later, they had begun to forget and froze less often in the room.

© Society for Science & the Public 2000–2020

Keyword: Learning & Memory; Neuroimmunology
Link ID: 27026 - Posted: 02.07.2020

Abby Olena

Researchers have shown previously that excessive proliferation of brain cells, which can cause macrocephaly, or large head size, is associated with autism. Now, the authors of a study published in Cell Stem Cell last week (January 30) have connected that overgrowth with replication stress, subsequent DNA damage, and dysfunction in neural progenitor cells derived from induced pluripotent stem cells from patients with autism spectrum disorder. “It is striking,” Bjoern Schwer, a molecular biologist at the University of California, San Francisco, who studies DNA repair and genomic stability in neural cells and did not participate in the study, writes in an email to The Scientist. “These are fascinating findings with many implications for autism spectrum disorder—and potentially for other neurodevelopmental disorders too.”

In 2016, a group led by Schwer and Frederick Alt of Boston Children’s Hospital showed that mice have clusters of double-strand DNA breaks in the genomes of their neural progenitor cells. These hotspots are concentrated in neural-specific genes, which tend to be longer than genes expressed in other cell types and have also been associated with neurological diseases.

Rusty Gage, a neuroscientist at the Salk Institute, Meiyan Wang, a graduate student in the Gage lab, and their colleagues collaborated with Alt to explore whether these same damaged clusters would show up in the genomes of human neural progenitor cells. Wang went to the Alt lab to learn how to map genome-wide double-strand breaks. Then she used the technique on several neural progenitor cell lines that had been previously derived in the Gage lab: three from patients with macrocephalic autism spectrum disorder and three from neurotypical controls.

© 1986–2020 The Scientist

Keyword: Autism; Genes & Behavior
Link ID: 27025 - Posted: 02.07.2020

Sarah O’Meara

Xiaoming Zhou is a neurobiologist at East China Normal University in Shanghai. Here he speaks to Nature about his research into age-related hearing loss, and explains why he hopes that brain training could help to lessen declines in sensory perception generally, and so ward off neurodegenerative diseases.

What is your current research focus?

We want to better understand the neural basis for why a person’s hearing function declines as they grow older. For example, we have performed research to see whether we can reverse age-related changes to the auditory systems of rodents. We gave the animals a set of tasks, such as learning to discriminate between sounds of different frequencies or intensities. These exercises caused the rodents’ hearing to improve, and also promoted changes to the hippocampus, a brain structure closely associated with learning and memory. The relationship with the hippocampus suggests that new kinds of brain training might help to attenuate our declines in perception and other brain functions, such as learning and memory, as we grow older — and so have the potential to stave off neurodegenerative diseases.

How is ageing-related science developing in China?

As has happened in the rest of the world, a rapidly ageing population has brought significant concern to policymakers. However, as far as I know, only a few neuroscience laboratories in China are specifically focused on learning more about the underlying mechanisms that cause changes in brain function as we age. This is despite the fact that such research could have a considerable impact on the welfare of older people in the future.

© 2020 Springer Nature Limited

Keyword: Alzheimers
Link ID: 27023 - Posted: 02.07.2020

By Charles Zanor

We all know people who say they have “no sense of direction,” and our tendency is almost always to minimize such claims rather than take them at full force. Yet for some people that description is literally true, and true in all circumstances: If they take a single wrong turn on an established route they often become totally lost. This happens even when they are just a few miles from where they live. Ellen Rose had been a patient of mine for years before I realized that she had this life-long learning disability.

I was made aware of it not long after I moved my psychology office from Agawam, Massachusetts, to Suffield, Connecticut, just five miles away. I gave Ellen a fresh set of directions from the Springfield, Massachusetts, area that took her south on Interstate 91 to Exit 47W, then across the Connecticut River to Rte 159 in Suffield. I thought it would pose no problem at all for her.

A few minutes past her scheduled appointment time she called to say that she was lost. She had come south on Route 91 and had taken the correct exit, but she got confused and almost immediately hooked a right onto a road going directly north, bringing her back over the Massachusetts line to the town of Longmeadow. She knew this was wrong but did not know how to correct it, so I repeated the directions to get on 91 South and so on. Minutes passed, and then more minutes passed, and she called again to say that somehow she had driven by the exit she was supposed to take and was in Windsor, Connecticut. I kept her on the phone and guided her turn by turn to my office.

When I asked her why she hadn’t taken Exit 47W, she said that she saw it but it came up sooner than she expected, so she kept going. This condition — developmental topographic disorientation — didn’t even have a formal name until 2009, when Giuseppe Iaria reported his first case in the journal Neuropsychologia. To understand DTD it is best to begin by saying that there are two main ways that successful travelers use to navigate their environment.

© 2020 Scientific American

Keyword: Learning & Memory; Development of the Brain
Link ID: 27021 - Posted: 02.05.2020

Jon Hamilton

Scientists have found a clue to how autism spectrum disorder disrupts the brain's information highways. The problem involves cells that help keep the traffic of signals moving smoothly through brain circuits, a team reported Monday in the journal Nature Neuroscience. The team found that in both mouse and human brains affected by autism, there's an abnormality in cells that produce a substance called myelin.

That's a problem because myelin provides the "insulation" for brain circuits, allowing them to quickly and reliably carry electrical signals from one area to another. And having either too little or too much of this myelin coating can result in a wide range of neurological problems. For example, multiple sclerosis occurs when the myelin around nerve fibers is damaged. The results, which vary from person to person, can affect not only the signals that control muscles, but also the ones involved in learning and thinking.

The finding could help explain why autism spectrum disorders include such a wide range of social and behavioral features, says Brady Maher, a lead investigator at the Lieber Institute for Brain Development and an associate professor in the psychiatry department at Johns Hopkins School of Medicine. "Myelination could be a problem that ties all of these autism spectrum disorders together," Maher says. And if that's true, he says, it might be possible to prevent or even reverse the symptoms using drugs that affect myelination.

© 2020 npr

Keyword: Autism; Glia
Link ID: 27019 - Posted: 02.04.2020

Alison Abbott

Researchers studying the biological basis of mental illness have conducted the first genomic analysis of schizophrenia in an African population, and have identified multiple rare mutations that occur more frequently in people with the condition. The mutations are mainly in genes that are important for brain development and the brain’s synapses, tiny structures that coordinate communication between neurons. The genes match those identified in other similar studies of schizophrenia — but nearly all previous research has been conducted in European or Asian populations. The latest work was published in Science on 31 January.

This research is particularly important because Africa has represented a big gap in the populations that geneticists have studied, says psychiatric geneticist Andreas Meyer-Lindenberg, director of the Central Institute of Mental Health in Mannheim, Germany. He says that the work lends support to current hypotheses about the biological origins of schizophrenia, which can cause a range of symptoms including hallucinations, delusions and disordered thinking. Researchers think that each mutation might contribute a small amount to the overall risk of developing the condition, and that disruption to synapses could be crucial to the disease’s development.

Over the past decade, as studies that use genome sequencing to identify the genetic basis of diseases have flourished, geneticists have come under increasing fire for failing to sample diverse populations, largely neglecting African people. Around 80% of participants in genetic studies are of European descent, and less than 3% are of African descent.

© 2020 Springer Nature Limited

Keyword: Schizophrenia; Genes & Behavior
Link ID: 27016 - Posted: 02.04.2020

By Nicholas Bakalar Flavonols, a large class of compounds found in most fruits and vegetables, may be associated with a reduced risk for Alzheimer’s disease. Flavonols are known to have antioxidant and anti-inflammatory effects, and animal studies have suggested they may improve memory and learning. A study in Neurology involved 921 men and women, average age 81 and free of dementia, who reported their diet using well-validated food questionnaires. During an average follow-up of six years, 220 developed Alzheimer’s disease. People with the highest levels of flavonol intake tended to have higher levels of education and were more physically active. But after controlling for these factors plus age, sex, the Apo E4 gene (which increases the risk for dementia) and late-life cognitive activity, the scientists found that compared with those in the lowest one-fifth for flavonol intake, those in the highest one-fifth had a 48 percent reduced risk for Alzheimer’s disease. The study covered four types of flavonols: kaempferol, quercetin, isorhamnetin and myricetin. All except quercetin showed a strong association with Alzheimer’s risk reduction. These flavonols are available as supplements, but the lead author, Dr. Thomas M. Holland, a professor of medicine at Rush Medical College in Chicago, said that foods are a better source. “You get a broader intake of vitamins, minerals and bioactives in food than in the supplements,” he said. © 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27015 - Posted: 02.04.2020

Timothy Bella The headaches had become so splitting for Gerardo Moctezuma that the pain caused him to vomit violently. The drowsiness that came with them had intensified for months. But it wasn’t until Moctezuma, 40, fainted without explanation at a soccer match in Central Texas last year that he decided to figure out what was going on. When Jordan Amadio looked down at his MRI results, the neurosurgeon recognized — but almost couldn’t believe — what looked to be lodged in Moctezuma’s brain. As he opened up Moctezuma’s skull during an emergency surgery in May 2019, he was able to confirm what it was that had uncomfortably set up shop next to the man’s brain stem: a tapeworm measuring about an inch and a half. “It’s very intense, very strong, because it made me sweat too, sweat from the pain,” Moctezuma told KXAN. The clear and white parasite came from tapeworm larva that Amadio believes Moctezuma, who moved from Mexico to the U.S. 14 years before his diagnosis, might have had in his brain for more than a decade undetected. His neurological symptoms had intensified because of his neurocysticercosis, which was the direct result of the tapeworm living in his brain. The cyst triggered hydrocephalus, an accumulation of cerebrospinal fluid that increased pressure in the skull to the point that the blockage and pain had become life-threatening. “It’s a remarkable case where a patient came in and, if he had not been treated urgently, he would have died from tremendous pressure in the brain,” Amadio, attending neurosurgeon at the Ascension Seton Brain and Spine Institute in Austin, told The Washington Post on Thursday night.

Keyword: Development of the Brain
Link ID: 27013 - Posted: 02.01.2020

Madeline Andrews, Aparna Bhaduri, Arnold Kriegstein What was going on with our brain organoids? As neuroscientists, we use these three-dimensional clusters of cells grown in petri dishes to learn more about how the human brain works. Researchers culture various kinds of organoids from stem cells – cells that have the potential to become one of many different cell types found throughout the body. We use chemical signals to direct stem cells to produce brain-like cells that together resemble certain structural aspects of a real brain. While they are not “brains in a dish” – organoids cannot function or think independently – the idea is that organoid models let scientists see developmental processes that may yield insights into how the human brain works. If researchers better understand normal development, we may be able to understand when and how things go wrong in diseases. When we recently compared our lab’s organoid cells to normal brain cells, we were surprised to find that they didn’t look as similar as we’d expected. Our brain organoids, each a few millimeters across, were stressed out. Our investigation into why has important implications for this popular new method, since many labs are using it to study brain function and neurological disease. Without accurate models of the brain, scientists will not be able to work toward disease treatments. Our lab is particularly interested in the human cerebral cortex – the brain’s bumpy exterior – because it is so different in humans than in any other species. The human cortex is proportionally bigger than that of our closest living relatives, the great apes, and contains more and different types of cells. It’s the source of many unique human abilities, including our cognitive capacity. © 2010–2020, The Conversation US, Inc.

Keyword: Development of the Brain
Link ID: 27011 - Posted: 01.31.2020