Chapter 13. Memory and Learning
By Sara Talpos It’s been more than a decade since scientists first started publishing papers on neural organoids, the small clusters of cells grown in labs and designed to mimic various parts of the human brain. Since then, organoids have been used to study everything from bipolar disorder and Alzheimer’s disease to tumors and parasitic infections. Because these new tools have the potential to reduce the use of animals in research — a goal of the current Trump administration — the field’s future may be more financially secure than other areas of scientific research. In September, for example, the federal government announced an $87 million investment in organoid research broadly. Matthew Owen brings a unique perspective to this emerging field. As a philosopher of mind, he focuses on trying to understand both what the mind is and how it relates to the body and the brain. He draws on the work of historical philosophers and applies some of their ideas to modern-day science. In 2020, as a visiting scholar in a neuroscience lab at McGill University, he was introduced to researchers working with organoids. Owen, who also does research in bioethics, wanted to help them address a perhaps unsettling question: Could these miniature cell clusters ever develop consciousness? Some experts believe that organoid consciousness is not likely to happen anytime in the near future, if at all. Still, certain experiments are prompting the question. In 2022, for example, researchers, including Brett Kagan of the Australian start-up Cortical Labs, published a paper explaining how they had taught their lab-grown brain cells to play a ping-pong-like video game. (Because the cells were placed in a single layer, the structures were not technically organoids, though they are expected to have similar capabilities.)
In the process, the authors wrote, the tiny cell clusters displayed “sentience.” Undark recently spoke with Owen about this particular experiment and about his own writing on organoids.
Keyword: Consciousness; Development of the Brain
Link ID: 30048 - Posted: 12.13.2025
Alison Abbott For decades, neuroscientists focused almost exclusively on half of the cells in the brain. Neurons were the main players, they thought, and everything else was made up of uninteresting support systems. By the 2010s, memory researcher Inbal Goshen was beginning to question that assumption. She was inspired by innovative molecular tools that would allow her to investigate the contributions of another, more mysterious group of cells called astrocytes. What she discovered about their role in learning and memory excited her even more. At the beginning, she felt like an outsider, especially at conferences. She imagined colleagues thinking, “Oh, that’s the weird one who works on astrocytes,” says Goshen, whose laboratory is at the Hebrew University of Jerusalem. A lot of people were sceptical, she says. But not any more. A rush of studies from labs in many subfields is revealing just how important these cells are in shaping our behaviour, mood and memory. Long thought of as support cells, astrocytes are emerging as key players in health and disease. “Neurons and neural circuits are the main computing units of the brain, but it’s now clear just how much astrocytes shape that computation,” says neurobiologist Nicola Allen at the Salk Institute for Biological Studies in La Jolla, California, who has spent her career researching astrocytes and other non-neuronal cells, collectively called glial cells. “Glial meetings are now consistently oversubscribed.” As far back as the nineteenth century, scientists could see with their simple microscopes that mammalian brains included two major types of cell — neurons and glia — in roughly equal numbers. © 2025 Springer Nature Limited
Keyword: Glia
Link ID: 30038 - Posted: 12.03.2025
By Pam Belluck A recently recognized form of dementia is changing the understanding of cognitive decline, improving the ability to diagnose patients and underscoring the need for a wider array of treatments. Patients are increasingly being diagnosed with the condition, known as LATE, and guidelines advising doctors how to identify it were published this year. LATE is now estimated to affect about a third of people 85 and older and 10 percent of those 65 and older, according to those guidelines. Some patients who have been told they have Alzheimer’s may actually have LATE, dementia experts say. “In about one out of every five people that come into our clinic, what previously was thought to maybe be Alzheimer’s disease actually appears to be LATE,” said Dr. Greg Jicha, a neurologist and an associate director of the University of Kentucky’s Sanders-Brown Center on Aging. “It can look like Alzheimer’s clinically — they have a memory problem,” Dr. Jicha said. “It looks like a duck, walks like a duck, but then it doesn’t quack, it snorts instead.” On its own, LATE, shorthand for Limbic-predominant age-related TDP-43 encephalopathy, is usually less severe than Alzheimer’s and unfolds more slowly, said Dr. Pete Nelson, an associate director of the Sanders-Brown Center, who helped galvanize efforts to identify the disorder. That can be reassuring to patients and their families. But there is no specific treatment for LATE. Also, many older people have more than one type of dementia pathology, and when LATE occurs in conjunction with Alzheimer’s, it exacerbates symptoms and speeds decline, he said. © 2025 The New York Times Company
Keyword: Alzheimers
Link ID: 30033 - Posted: 11.29.2025
By Angie Voyles Askham Rats, like people, jump at the chance to repeat a task that rewards them handsomely, but they are less eager when the reward is paltry: They learn from past experience and update their behavior accordingly. That learning is shaped by the hormone estradiol, according to a new study. And when estradiol levels peak during the estrus cycle, female rats adapt their behavior in response to reward size more quickly than they do during other phases—and faster than males overall. The female rats also have a larger release of dopamine in response to an unexpected reward, along with reduced expression of dopamine transporters in a reward center of their brain after the hormone peaks, the new work shows. “It’s giving mechanistic insight into how estrogen modulates reinforcement learning—all the way down to the molecular mechanism,” says Ilana Witten, professor of neuroscience at Princeton University and Howard Hughes Medical Institute investigator, who was not involved in the study. The team behind the new work used a task that measures how much an animal values an anticipated reward: Thirsty rats poke their nose into a central port and then listen for a tone that indicates how much water one of two side ports will dispense. The animals choose to either hold out at the cued location for the reward or to abandon the trial and start a new one by poking their nose into the other side. Rats learn to initiate their next trial more quickly when the experiment is doling out large rewards and to hold off on initiating new trials when rewards are small, previous work from the group has shown. “It takes a lot of energy to initiate a trial, so if there are small rewards, it’s not as motivating,” says study investigator Carla Golden, a postdoctoral researcher in Christine Constantinople’s lab at New York University. © 2025 Simons Foundation
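The task described above is a form of reinforcement learning, in which a reward-prediction error updates the animal's estimate of a port's value. A minimal delta-rule sketch (illustrative numbers only, not the study's model or data) shows how a larger learning rate — a stand-in for the faster adaptation reported after the estradiol peak — tracks a drop in reward size more quickly:

```python
def delta_rule(value, reward, alpha, n_trials):
    """Rescorla-Wagner / delta-rule update: on each trial, nudge the
    value estimate toward the observed reward by a fraction alpha of
    the prediction error."""
    for _ in range(n_trials):
        value += alpha * (reward - value)  # prediction-error update
    return value

# The reward at a port drops from large (40 uL) to small (10 uL).
# Compare a slow learner with a fast one over the same 10 trials;
# the alpha values are illustrative, not fitted to the rat data.
slow = delta_rule(40.0, reward=10.0, alpha=0.05, n_trials=10)
fast = delta_rule(40.0, reward=10.0, alpha=0.30, n_trials=10)
# The fast learner's estimate ends much closer to the new reward size,
# so it would throttle back trial initiation sooner.
```

An animal whose value estimate falls faster would, as in the task above, start holding off on initiating new trials sooner once rewards shrink.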
Keyword: Hormones & Behavior; Attention
Link ID: 30028 - Posted: 11.26.2025
Hannah Devlin Science correspondent Scientists have identified five major “epochs” of human brain development in one of the most comprehensive studies to date of how neural wiring changes from infancy to old age. The study, based on the brain scans of nearly 4,000 people aged under one to 90, mapped neural connections and how they evolve during our lives. This revealed five broad phases, split up by four pivotal “turning points” in which brain organisation moves on to a different trajectory, at around the ages of nine, 32, 66 and 83 years. “Looking back, many of us feel our lives have been characterised by different phases. It turns out that brains also go through these eras,” said Prof Duncan Astle, a researcher in neuroinformatics at Cambridge University and senior author of the study. “Understanding that the brain’s structural journey is not a question of steady progression, but rather one of a few major turning points, will help us identify when and how its wiring is vulnerable to disruption.” The childhood period of development was found to last from birth until the age of nine, when it transitions to the adolescent phase – an era that lasts up to the age of 32, on average. In a person’s early 30s the brain’s neural wiring shifts into adult mode – the longest era, lasting more than three decades. A third turning point around the age of 66 marks the start of an “early ageing” phase of brain architecture. Finally, the “late ageing” brain takes shape at around 83 years old. The scientists quantified brain organisation using 12 different measures, including the efficiency of the wiring, how compartmentalised it is and whether the brain relies heavily on central hubs or has a more diffuse connectivity network. From infancy through childhood, our brains are defined by “network consolidation”, as the wealth of synapses – the connectors between neurons – in a baby’s brain are whittled down, with the more active ones surviving.
During this period, the study found, the efficiency of the brain’s wiring decreases. © 2025 Guardian News & Media Limited
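The “efficiency of the wiring” in studies like this is typically global efficiency, a standard graph measure: the mean inverse shortest-path length over all node pairs. A stdlib-only sketch on a toy six-node “connectome” (illustrative, not the study’s data or code) shows how it is computed:

```python
from collections import deque
from itertools import combinations

def path_length(adj, src, dst):
    """Shortest path between two nodes of an unweighted graph,
    found by breadth-first search."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return dist[node]
        for nb in adj[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return float("inf")  # disconnected pair contributes zero efficiency

def global_efficiency(adj):
    """Mean inverse shortest-path length over all node pairs."""
    pairs = list(combinations(adj, 2))
    return sum(1.0 / path_length(adj, a, b) for a, b in pairs) / len(pairs)

# Toy "connectome": two tight three-node modules bridged by one edge (2-3).
toy = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
eff = global_efficiency(toy)  # 1.0 would mean every pair is directly linked
```

Removing an edge from such a graph can only lengthen shortest paths, so synaptic pruning of this kind lowers the score — consistent with the decrease in wiring efficiency the study reports during childhood.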
Keyword: Development of the Brain; Brain imaging
Link ID: 30027 - Posted: 11.26.2025
By Gina Kolata Hopes were high. In retrospect, perhaps too high. On Monday, Novo Nordisk announced that two large studies failed to find any effect of the drug semaglutide on cognition and functioning in people with mild cognitive impairment — an early stage of Alzheimer’s — or with dementia. The participants were randomly assigned to take a pill of semaglutide, the compound at the heart of the weight-loss injections Ozempic and Wegovy, or a placebo for two years. “Today we announced that our efforts to slow down the progression of Alzheimer’s disease has come to an end,” said Maziar Mike Doustdar, chief executive at Novo Nordisk, in a video posted on LinkedIn. He added, “Based on the indicative data points we had, this is not the outcome we had hoped for.” The studies, involving 1,855 people in one trial and 1,953 in the other, seemed to stem an initial phase of optimism. The drugs appeared miraculous in their treatment of obesity, diabetes, heart disease and kidney disease. Alzheimer’s and other brain illnesses looked like the next frontier. But there had been other recent warnings, in two smaller studies of brain diseases. One, done by researchers in Britain, asked if a similar drug could help with Parkinson’s disease. That drug had no effect. Another study found that semaglutide did not help with cognitive impairment in people with major depression, a severe form of the disease. The company will present more detailed results from its Alzheimer’s study at a conference on Dec. 3, and another in March of 2026. Novo Nordisk’s stock was down nearly 6 percent on Monday, deepening a monthslong slump for the once-surging company. “We always knew there would be a low likelihood of success, but it was important to determine if semaglutide could take on one of medicine’s most challenging frontiers,” Mr. Doustdar said. © 2025 The New York Times Company
Keyword: Alzheimers; Obesity
Link ID: 30026 - Posted: 11.26.2025
By Meghan Rosen Taking just a few thousand steps daily could potentially stave off Alzheimer’s disease. People with the disease tend to experience debilitating cognitive challenges, like memory loss and difficulty communicating, that worsen over time. But physical activity may slow that steady downward march. In an observational study of people at risk for Alzheimer’s, researchers linked walking between 3,000 and 5,000 steps per day to a three-year delay in cognitive decline, compared with sedentary individuals. For people who walked between 5,000 and 7,500 steps per day, the reprieve appeared to last even longer — seven years, Harvard Medical School behavioral neurologist Jasmeer Chhatwal and his colleagues report November 3 in Nature Medicine. The association still needs to be tested in a clinical trial, Chhatwal says, but his team’s results hint at something important. Quality of life for people with Alzheimer’s and their families often plummets in the later stages of the disease. “If the disease can be delayed,” he says, “that can have a very big impact on people’s lives.” Previous studies have reported links between physical activity and delayed Alzheimer’s progression, says Deborah Barnes, an epidemiologist who studies dementia at the University of California, San Francisco, and who was not part of the research team. But the new study pinpoints the step count where people begin to see benefits. It also “helps to explain how,” she says. Chhatwal’s team reported a connection between exercise and less accumulation of certain Alzheimer’s proteins in the brain. It’s a mechanism that illustrates how physical activity probably works to slow Alzheimer’s progression, Barnes says. © Society for Science & the Public 2000–2025
Keyword: Alzheimers
Link ID: 30025 - Posted: 11.26.2025
On 19 November 2025, the U.S. Centers for Disease Control and Prevention changed language on a “vaccine safety” page on its website to assert that the statement “vaccines do not cause autism” is not evidence based. The updated CDC page now incorrectly suggests that a link between infant vaccination and autism exists, and it casts doubt on a wealth of research that has produced evidence to the contrary. The updated language contradicts decades of research findings that show vaccines do not cause autism. The move has also prompted backlash from multiple groups, including the Coalition of Autism Scientists and the Autism Science Foundation. “These sort of claims have been repeatedly debunked by good science and multiple independent replications of negative studies, and for years no scientist has opined that more research is needed,” Eric Fombonne, professor emeritus of psychiatry at Oregon Health & Science University, told The Transmitter. He noted several problems with the arguments presented on the CDC website, including the citation of “fringe studies executed by uncredentialed authors with poor methodologies and published in low-quality journals.” Fombonne described the authors of the page as having “cherry pick[ed data] … in support of their preconceived beliefs” and mischaracterizing well-conducted and replicated research. Experts The Transmitter spoke with raised many concerns about the agency’s statements, including how those statements could confuse families and whether they indicate shifts in priorities that threaten solid scientific research. “Families deserve honest answers,” says David Mandell, professor of psychiatry at the University of Pennsylvania and director of the Penn Center for Mental Health. © 2025 Simons Foundation
Keyword: Autism; Neuroimmunology
Link ID: 30020 - Posted: 11.22.2025
By Oliver Whang Owen Collumb was paralyzed in 1993, when he was 21 years old. A tire on his motorbike blew out and he fell into a ravine, breaking a single bone in his spine. When he recovered, he couldn’t move his legs and could control only the biceps in his arms, meaning that he could lift his hands but, to put them down, he had to twist his shoulders and let gravity unbend his elbows. He spent years in an assisted living home before petitioning to move to his own place in Dublin, with the help of home aides. Living alone was liberating; he could choose what he ate and when he woke in the morning. He began working multiple jobs for foundations and advocating for people with disabilities. One of his assistants, Sylwia Filipiek, a Polish immigrant to Ireland, had been employed at a printing factory. She had no experience with home care and struggled to help Mr. Collumb into his wheelchair at first. But, over the years, they learned how to work together, and grew close. In the summer of 2024, Mr. Collumb and Ms. Filipiek flew to Bath, England, to train for the Cybathlon, an international competition run every four years to encourage the development of assistive technologies. The competition, hosted in Switzerland by the university ETH Zurich, consists of eight races for teams and their pilots (which is what the primary competitors, with varying disabilities, are called), each targeting different innovations, such as arm prostheses, leg prostheses and vision assistance. Each race consists of remote tasks that are supposed to simulate everyday life for the pilots: walking across a room, picking up a grocery bag, throwing a ball. One of Cybathlon’s founders, Roland Sigrist, compared it to Formula 1. Teams are encouraged to develop prototypes toward the ultimate goal of “the independence of people with disabilities,” but the competition is straightforward and real, with all its accompaniments: nerves, heartbreak, glory. 
The pilots are the ones who put themselves on the line. “They’re the masters of the technology, and not the other way around,” Mr. Sigrist said. © 2025 The New York Times Company
Keyword: Robotics
Link ID: 30018 - Posted: 11.19.2025
By Lauren Schenkman The purported autism-microbiome connection is having a moment. It’s the focus of a new $50-million call for proposals from Wellcome Leap—a research initiative of the Wellcome Trust—and a 2024 Netflix documentary portrays fecal microbiota transplants as a promising treatment for autism-related traits. “It seems to have captured the public’s imagination,” says Kevin Mitchell, associate professor of genetics and neuroscience at Trinity College in Dublin. But Mitchell says he has long been skeptical. Eventually, he and some colleagues “collectively got exasperated enough by this that we felt that we had to say something about it,” Mitchell says. Today, they published a comprehensive review in Neuron of more than 30 studies on the autism-microbiome connection, including preclinical experiments in mice, human observational studies and clinical trials. After accounting for statistical, technical and conceptual flaws, the team reached a clear conclusion: “There’s nothing there,” Mitchell says. Research projects that include the keywords “autism” and “microbiome” have netted about $20 million to $25 million in U.S. federal funding annually since 2018, Mitchell’s team found using the funding database NIH RePORTER. It’s worrying that funders assume “there’s a solid foundation of work,” Mitchell says. “It’s just this huge amount of scientific effort and funding going into exploring these ideas.” Mitchell spoke with The Transmitter about the problems he sees with studies that claim to show a microbiome-autism link, and how neuroscientists can read them with an analytical eye. © 2025 Simons Foundation
Keyword: Autism
Link ID: 30015 - Posted: 11.19.2025
By Dana Rubi Levy, Kevin Mastro, Michael Ryan Any seasoned baker knows the importance of being flexible. If you are missing an ingredient or hosting a guest with dietary restrictions, you might need to swap yogurt for eggs or oil for butter. The final product may differ, but it can still be rich and satisfying. In much the same way, our brain constantly makes substitutions and adjustments in response to the inevitable changes in our internal and external environments. To understand these changes, scientists often compare the brain and behavior of older people, aged 60 and up, with those of younger people, aged 20 to 30. Despite considerable individual variability, older people—on average—have slower processing speeds, rely more on past experience to solve problems, and have less behavioral flexibility. These findings have shaped our theories about how age-related changes in the brain drive behavior. In recent years, however, a conceptual shift has emerged, raising questions about whether some age-related changes are not solely the result of cognitive decline. Instead, some may be adaptive and address age-related constraints, such as changes in metabolism and increased inflammation. Moreover, scientists have begun to question whether young adulthood, characterized by a period of highly flexible decision-making, is the right benchmark to assess cognition across the lifespan. Given the evolving landscape of the aging brain, change is necessary, and not all deviations from the young-adult “benchmark” should be seen as decline. The main challenge for neuroscientists is to determine which of these age-related adaptations are beneficial and which are detrimental. In other words, which substitutions retain the original flavors, and which result in a dish that falls flat? © 2025 Simons Foundation
Keyword: Development of the Brain; Learning & Memory
Link ID: 30012 - Posted: 11.15.2025
David Adam In a town on the shores of Lake Geneva sit clumps of living human brain cells for hire. These blobs, about the size of a grain of sand, can receive electrical signals and respond to them — much as computers do. Research teams from around the world can send the blobs tasks, in the hope that they will process the information and send a signal back. Welcome to the world of wetware, or biocomputers. In a handful of academic laboratories and companies, researchers are growing human neurons and trying to turn them into functional systems equivalent to biological transistors. These networks of neurons, they argue, could one day offer the power of a supercomputer without the outsized power consumption. The results so far are limited. But keen scientists are already buying or borrowing online access to these brain-cell processors — or even investing tens of thousands of dollars to secure their own models. Some want to use these biocomputers as straightforward replacements for ordinary computers, whereas others want to use them to study how brains work. “Trying to understand biological intelligence is a very interesting scientific problem,” says Benjamin Ward-Cherrier, a robotics researcher at the University of Bristol, UK, who rents time on the Swiss brain blobs. “And looking at it from the bottom up — with simple small versions of our brain and building those up — I think is a better way of doing it than top down.” Biocomputing advocates claim that these systems could one day rival the capability of artificial intelligence and the potential of quantum computers. Other researchers who work with human neurons are more sceptical of what’s possible. And they warn that hype — and the science-fictional allure of what are sometimes labelled brain-in-a-jar systems — could even be counterproductive. If the idea that these systems possess sentience and consciousness takes hold, there could be repercussions for the research community. © 2025 Springer Nature Limited
Keyword: Learning & Memory; Robotics
Link ID: 30010 - Posted: 11.12.2025
By Kevin Berger Steve Ramirez was feeling on top of the world in 2015. His father, Pedro Ramirez, had snuck into the United States in the 1980s to escape the civil war in El Salvador. Pedro Ramirez held jobs as a door-to-door salesman for tombstones, a janitor in a diner, and a technician in an animal lab. After years of ’round-the-clock work, Pedro Ramirez became a U.S. citizen. And here was his son, born in America, with a Ph.D. from the Massachusetts Institute of Technology, still in his 20s, being celebrated as one of the most exciting and promising neuroscientists in the country. Steve Ramirez had published research papers with his MIT mentor Xu Liu that reported how they used lasers to erase fear memories, spur positive memories, and even fabricate new memories in the brain. The experiments were only in mice. But they were impressive. Memories are made of networks of brain cells called engrams. The lasers targeted specific cells in engrams. Zap those cells and the whole engram was muted. The pair of neuroscientists gave a popular TED Talk on memory manipulation and were featured in international press stories that invariably mentioned that the plotlines of the movies Eternal Sunshine of the Spotless Mind and Inception could be real. Bad memories could be deleted. New memories could be implanted. One night in 2013 Ramirez and Liu were celebrating the publication of one of their papers in a jazz lounge at the top of the Prudential Building in Boston. The music was grooving, and the city below glittered like stars. Ramirez thought, I’ve never been so happy and so fully alive. In early 2015, Liu, age 37, died suddenly. There had been no warning signs. Ramirez had never had a friend like Liu. Liu opened his mind to experiences in science he couldn’t have imagined. Their relationship felt organic from Ramirez’s first day in the lab. Liu joked they would always have chemistry doing science together. Grief is when the future your brain plans for is cut off.
Ramirez’s thoughts of doing science without Liu became a trapdoor that landed him in a cellar of pain. © 2025 NautilusNext Inc.
Keyword: Learning & Memory; Drug Abuse
Link ID: 30007 - Posted: 11.12.2025
Katie Kavanagh Speaking multiple languages could slow down brain ageing and help to prevent cognitive decline, a study of more than 80,000 people has found. The work, published in Nature Aging on 10 November, suggests that people who are multilingual are half as likely to show signs of accelerated biological ageing as are those who speak just one language. “We wanted to address one of the most persistent gaps in ageing research, which is if multilingualism can actually delay ageing,” says study co-author Agustín Ibáñez, a neuroscientist at the Adolfo Ibáñez University in Santiago, Chile. Previous research in this area has suggested that speaking multiple languages can improve cognitive functions such as memory and attention, which boosts brain health as we get older. But many of these studies rely on small sample sizes and use unreliable methods of measuring ageing, which leads to results that are inconsistent and not generalizable. “The effects of multilingualism on ageing have always been controversial, but I don’t think there has been a study of this scale before, which seems to demonstrate them quite decisively,” says Christos Pliatsikas, a cognitive neuroscientist at the University of Reading, UK. The paper’s results could “bring a step change to the field”, he adds. They might also “encourage people to go out and try to learn a second language, or keep that second language active”, says Susan Teubner-Rhodes, a cognitive psychologist at Auburn University in Alabama. © 2025 Springer Nature Limited
Keyword: Language; Alzheimers
Link ID: 30005 - Posted: 11.12.2025
By Nora Bradford Here are three words: pine, crab, sauce. There’s a fourth word that combines with each of the others to create another common word. What is it? When the answer finally comes to you, it’ll likely feel instantaneous. You might even say “Aha!” This kind of sudden realization is known as insight, and a research team recently uncovered how the brain produces it, which suggests why insightful ideas tend to stick in our memory. Maxi Becker, a cognitive neuroscientist at Duke University, first got interested in insight after reading the landmark 1962 book The Structure of Scientific Revolutions by the historian and philosopher of science Thomas Kuhn. “He describes how some ideas are so powerful that they can completely shift the way an entire field thinks,” she said. “That got me wondering: How does the brain come up with those kinds of ideas? How can a single thought change how we see the world?” Such moments of insight are written across history. According to the Roman architect and engineer Vitruvius, in the third century BCE the Greek mathematician Archimedes suddenly exclaimed “Eureka!” after he slid into a bathtub and saw the water level rise by an amount equal to his submerged volume (although this tale may be apocryphal). In the 17th century, according to lore, Sir Isaac Newton had a breakthrough in understanding gravity after an apple fell on his head. In the early 1900s, Einstein came to a sudden realization that “if a man falls freely, he would not feel his weight,” which led him to his theory of relativity, as he later described in a lecture. Insights are not limited to geniuses: We have these cognitive experiences all the time when solving riddles or dealing with social or intellectual problems.
They are distinct from analytical problem-solving, such as the process of doing formulaic algebra, in which you arrive at a solution slowly and gradually as if you’re getting warmer. Instead, insights often follow periods of confusion. You never feel as if you’re getting warmer; rather, you go from cold to hot, seemingly in an instant. Or, as the neuropsychologist Donald Hebb, known for his work building neurobiological models of learning, wrote in the 1940s, sometimes “learning occurs as a single jump, an all-or-none affair.” © 2025 Simons Foundation
Keyword: Attention; Learning & Memory
Link ID: 30004 - Posted: 11.08.2025
By Carl Zimmer In Paola Arlotta’s lab at Harvard is a long, windowless hallway that is visited every day by one of her scientists. They go there to inspect racks of scientific muffin pans. In every cavity of every pan is a pool of pink liquid, at the bottom of which are dozens of translucent nuggets no bigger than peppercorns. The nuggets are clusters of neurons and other cells, as many as two million, normally found in the human brain. On their daily rounds, the scientists check that the nuggets are healthy and well-fed. “No first-year students walk in that corridor,” Dr. Arlotta said. “You have to be experienced enough to go there, because the risk is very high that you’re going to mess up the work that took years to build.” The oldest nuggets are now seven years old. Back in 2018, Dr. Arlotta and her colleagues created them from skin cells originally donated by volunteers. A chemical cocktail transformed them into the progenitor cells normally found in the fetal human brain. The cells multiplied into neurons and other types of brain cells. They wrapped their branches around each other and pulsed with electrical activity, much like the pulses that race around inside our heads. One such nugget can contain more neurons than the entire brain of a honeybee. But Dr. Arlotta is quick to stress that they are not brains. She and her colleagues call them brain organoids. “It’s so important to call them organoids and not brains, because they’re no such thing,” she said. “They are reductionist replicas that can show us some things that are the same, and many others that are not.” And yet the similarities are often remarkable, as Dr. Arlotta and her colleagues recently demonstrated in a new report on their long-lived organoids. After the organoids started growing in 2018, their neurons began behaving like those in a fetal human brain, down to the way their genes switched on and off. And as the months passed, the neurons matured to resemble the neurons in a baby after birth.
© 2025 The New York Times Company
Keyword: Development of the Brain
Link ID: 30003 - Posted: 11.08.2025
Miryam Naddaf Scientists have created the most detailed maps yet of how our brains differentiate from stem cells during embryonic development and early life. In a Nature collection including five papers published yesterday, researchers tracked hundreds of thousands of early brain cells in the cortices of humans and mice, and captured with unprecedented precision the molecular events that give rise to a mixture of neurons and supporting cells. “It’s really the initial first draft of any ‘cell atlases’ for the developing brain,” says Hongkui Zeng, executive vice-president and director of the Allen Institute for Brain Science in Seattle, Washington, and a co-author of two papers in the collection. These atlases could offer new ways to study neurological conditions such as autism and schizophrenia. Researchers can now “mine the data, find genes that may be critical for a particular event in a particular cell type and at a particular time point”, says Zeng. “We have a very exciting time coming,” adds Zoltán Molnár, a developmental neuroscientist at the University of Oxford, UK, who was not involved with any of the studies. The work is part of the BRAIN Initiative Cell Atlas Network (BICAN) — a project launched in 2022 by the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative at the US National Institutes of Health with US$500 million in funding to build reference maps of mammalian brains. Patterns of development Two of the papers map parts of the mouse cerebral cortex — the area of the brain involved in cognitive functions and perception. Zeng and her colleagues focused on how the visual cortex develops from 11.5-day-old embryos to 56-day-old mice. They created an atlas of 568,654 individual cells and identified 148 cell clusters and 714 subtypes. “It’s the first complete high-resolution atlas of the cortical development, including both prenatal and postnatal” phases, says Zeng. © 2025 Springer Nature Limited
Keyword: Development of the Brain; Neurogenesis
Link ID: 30002 - Posted: 11.08.2025
By Holly Barker

Multiple mouse and human brain atlases track the emergence of distinct cell types during development and uncover some of the pathways that decide a cell's fate, according to a collection of Nature papers published today. The papers highlight the timing and location of cell diversification and offer fresh insights into the evolution of those cells.

Neuronal subtypes emerge at starkly different times in distinct brain regions, according to multiple mouse studies. And developmental maps of the human brain upend ideas about cell migration, including the notion that a portion of cortical neurons are made on site. "This is a dramatic revision of the fundamental principles that we thought were true in the cerebral cortex," says Tomasz Nowakowski, associate professor of neurological surgery, anatomy and psychiatry, and of behavioral sciences at the University of California, San Francisco, and an investigator on one of the new studies.

The special issue comprises 12 papers, including 6 newly published ones, from groups working as part of the BRAIN Initiative Cell Atlas Network. The work builds on the network's complete cell census, published in 2023, which cataloged 34 classes and 5,322 unique cell types in the adult mouse brain. "Those cell types don't appear out of a vacuum at the same time," says Nowakowski, who co-authored a commentary on the new collection. Pinpointing when those cells emerge and where they originate was the "obvious next question," he says.

At birth, the mouse brain contains all the initial cell classes that diversify into the multitude of neurons and glia found in older rodents. But precisely when that diversification occurs varies among brain regions: in the visual cortex, new cell types emerge weeks after birth and peak twice, once when the animal first opens its eyes and again at the onset of the critical period, according to one study.

© 2025 Simons Foundation
Keyword: Development of the Brain; Neurogenesis
Link ID: 30001 - Posted: 11.08.2025
By Paula Span

For years, the two patients had come to the Penn Memory Center at the University of Pennsylvania, where doctors and researchers follow people with cognitive impairment as they age, as well as a group with normal cognition. Both patients, a man and a woman, had agreed to donate their brains after they died for further research.

"An amazing gift," said Dr. Edward Lee, the neuropathologist who directs the brain bank at the university's Perelman School of Medicine. "They were both very dedicated to helping us understand Alzheimer's disease."

The man, who died at 83 with dementia, had lived in the Center City neighborhood of Philadelphia with hired caregivers. The autopsy showed large amounts of amyloid plaques and tau tangles, the proteins associated with Alzheimer's disease, spreading through his brain. Researchers also found infarcts, small spots of damaged tissue, indicating that he had suffered several strokes.

By contrast, the woman, who was 84 when she died of brain cancer, "had barely any Alzheimer's pathology," Dr. Lee said. "We had tested her year after year, and she had no cognitive issues at all."

The man had lived a few blocks from Interstate 676, which slices through downtown Philadelphia. The woman had lived a few miles away in the suburb of Gladwyne, Pa., surrounded by woods and a country club. Her exposure to air pollution, specifically the level of fine particulate matter known as PM2.5, was less than half his.

Was it a coincidence that he had developed severe Alzheimer's while she had remained cognitively normal? With increasing evidence that chronic exposure to PM2.5, a neurotoxin, not only damages lungs and hearts but is also associated with dementia, probably not.

© 2025 The New York Times Company
Keyword: Alzheimers; Neurotoxins
Link ID: 30000 - Posted: 11.05.2025
Ian Sample, science editor

Even modest amounts of daily exercise may slow the progression of Alzheimer's disease in older people who are at risk of developing the condition, researchers have said. People are often encouraged to clock up 10,000 steps a day as part of a healthy routine, but scientists found that 3,000 steps or more appeared to delay the brain changes and cognitive decline that Alzheimer's patients experience.

Results from the 14-year study showed cognitive decline was delayed by an average of three years in people who walked 3,000 to 5,000 steps a day, and by seven years in those who managed 5,000 to 7,000 steps daily. "We're encouraging older people who are at risk of Alzheimer's to consider making small changes to their activity levels, to build sustained habits that protect or benefit their brain and cognitive health," said Dr Wai-Ying Yau, the first author on the study at Mass General Brigham hospital in Boston.

Dementia affects an estimated 50 million people worldwide, with Alzheimer's disease the most common cause. In the UK, more than 500,000 people have Alzheimer's. The condition is linked to the buildup of two toxic forms of protein in the brain: amyloid-beta plaques and tau tangles.

Yau and her colleagues analysed data from 296 people aged 50 to 90 who were cognitively unimpaired at the beginning of the study. The data included annual cognitive assessments, step counts measured by pedometers, and PET imaging to detect levels of amyloid and tau in the volunteers' brains.

People with little brain amyloid at the start showed very little cognitive decline or buildup of tau protein over the course of the study. The risk of Alzheimer's was greater for those with elevated amyloid at baseline, and among them, higher step counts were linked to slower cognitive decline and a delayed buildup of tau. In sedentary individuals, the buildup of tau and the rate of cognitive decline were substantially faster, the researchers report in the journal Nature Medicine.

© 2025 Guardian News & Media Limited
Keyword: Alzheimers
Link ID: 29998 - Posted: 11.05.2025