Chapter 13. Memory and Learning

Jon Hamilton New tests of blood and spinal fluid could help doctors quickly identify patients who would most benefit from treatment. When doctors suspect Alzheimer's, they can order a blood test to learn whether a patient's brain contains the sticky amyloid plaques that are a hallmark of the disease. But the results of that test won't tell the whole story, says Dr. Randall Bateman, a neurology professor at Washington University in St. Louis. "People can have a head full of amyloid, but no dementia or memory loss," Bateman says. So he and a team of scientists have developed a new blood test that can show whether Alzheimer's has actually begun to affect a person's thinking and memory. It joins another new test, this one of spinal fluid, that can predict whether the brain changes associated with Alzheimer's are likely to affect cognitive function. "It's a strong indicator of memory impairment," says Tony Wyss-Coray, a neurology professor at Stanford University. Both tests, described in the journal Nature Medicine, could help doctors identify patients who are likely to benefit from drugs that can clear the brain of amyloid plaques. Both were developed with funding from the National Institutes of Health. © 2025 npr

Keyword: Alzheimers
Link ID: 29725 - Posted: 04.02.2025

By Sergiu P. Pasca The unbearable inaccessibility of the human brain has been a major barrier to understanding both how the human nervous system assembles itself and how psychiatric and neurological disorders emerge. But thanks to new advances, it is becoming possible to access functional aspects of human brain development and function that were previously out of reach. This progress has been driven primarily by advances in stem cell technologies, which make it possible to recapitulate developmental processes outside the human body. The journey began decades ago with the ability to grow stem cells in a dish, followed by the use of developmental signals to guide them into becoming neural cells. The field was truly catalyzed by the discovery of cell reprogramming and the democratization of stem cell technologies it enabled. Starting more than 15 years ago, my team and others began creating neurons from patients—initially rather inefficiently, but then with increasing ease as culture systems became more sophisticated. For example, cortical neurons derived from people with Timothy syndrome—a genetic form of autism and epilepsy caused by a mutation in a calcium channel present in excitable cells—revealed calcium deficits following depolarization. Some of these defects became more apparent when moving beyond traditional 2D preparations, such as when looking at the morphology of human neurons. For more than a decade, we and others have developed methods for growing these cells into more complex 3D structures, known as organoids, that mimic some of the structure and function of regions of the nervous system, offering a new window into human neurobiology and disease. Giving cells this third dimension of freedom unleashes self-organization: Mirroring in-vivo development, organoids generate diverse neural and glial cell types, starting from radial glia to intermediate progenitors, deep and superficial layer neurons and then astrocytes. These organoids can be maintained in vitro for years. Fascinatingly, developmental timing in organoids is largely preserved. For example, neurons maintained in culture for about nine months can transition to a postnatal state simply by surviving long enough in the dish. This observation in organoids offers a fundamental insight into development: Brain cells have an intrinsic, species-specific developmental “timer.” © 2025 Simons Foundation

Keyword: Development of the Brain
Link ID: 29724 - Posted: 04.02.2025

By Christina Caron Health Secretary Robert F. Kennedy Jr. has often criticized prescription stimulants, such as Adderall, that are primarily used to treat attention deficit hyperactivity disorder. “We have damaged this entire generation,” he said last year during a podcast, referring to the number of children taking psychiatric medications. “We have poisoned them.” In February, the “Make America Healthy Again” commission, led by Mr. Kennedy, announced plans to evaluate the “threat” posed by drugs like prescription stimulants. But are they a threat? And if so, to whom? Like many medications, prescription stimulants have potential side effects, and there are people who misuse them. Yet these drugs are also considered some of the most effective and well-researched treatments that psychiatry has to offer, said Dr. Jeffrey H. Newcorn, the director of the Division of A.D.H.D. and Learning Disorders at the Icahn School of Medicine at Mount Sinai in New York. Here are some answers to common questions and concerns about stimulants. What are prescription stimulants? Prescription stimulants are drugs that help change the way the brain works by increasing the communication among neurons. They are divided into two classes: methylphenidates (like Ritalin, Focalin and Concerta) and amphetamines (like Vyvanse and Adderall). © 2025 The New York Times Company

Keyword: ADHD; Drug Abuse
Link ID: 29723 - Posted: 04.02.2025

By RJ Mackenzie New footage documents microglia pruning synapses at high resolution and in real time. The recordings, published in January, add a new twist to a convoluted debate about the range of these cells’ responsibilities. Microglia are the brain’s resident immune cells. For about a decade, some have also credited them with pruning excess synaptic connections during early brain development. But that idea was based on static images showing debris from destroyed synapses within the cells—which left open the possibility that microglia clean up after neurons do the actual pruning. In the January movies, though, a microglia cell expressing a green fluorescent protein clearly reaches out a ghostly green tentacle to a budding presynapse on a neuron and lifts it away, leaving the neighboring blue axon untouched. “Their imaging is superb,” says Amanda Sierra, a researcher at the Achucarro Basque Center for Neuroscience, who was not involved in the work. But “one single video, or even two single videos, however beautiful they are, are not sufficient evidence that this is the major mechanism of synapse elimination,” she says. In the new study, researchers isolated microglia and neurons from mice and grew them in culture with astrocytes, labeling the microglia, synapses and axons with different fluorescent dyes. Their approach ensured that the microglia formed ramified processes—thin, branching extensions that don’t form when they are cultured in isolation, says Ryuta Koyama, director of the Department of Translational Neurobiology at Japan’s National Center of Neurology and Psychiatry, who led the work. “People now know that ramified processes of microglia are really necessary to pick up synapses,” he says. “In normal culture systems, you can’t find ramified processes. They look amoeboid.” © 2025 Simons Foundation

Keyword: Learning & Memory; Glia
Link ID: 29720 - Posted: 03.27.2025

By Catherine Offord Scientists say they have found a long-sought-after population of stem cells in the retina of human fetuses that could be used to develop therapies for one of the leading causes of blindness. The use of fetal tissue, a source of ethical debate and controversy in some countries, likely wouldn't be necessary for an eventual therapy: Transplanting similar human cells generated in the lab into the eyes of mice with retinal disease protected the animals' vision, the team reported this week in Science Translational Medicine. "I see this as potentially a very interesting advancement of this field, where we are really in need of a regenerative treatment for retinal diseases," says Anders Kvanta, a retinal specialist at the Karolinska Institute who was not involved in the work. He and others note that more evidence is needed to show the therapeutic usefulness of the newly described cells. The retina, a layer of light-sensing tissue at the back of the eye, can degenerate with age or because of an inherited condition such as retinitis pigmentosa, a rare disease that causes gradual breakdown of retinal cells. Hundreds of millions of people worldwide are affected by retinal degeneration, and many suffer vision loss or blindness as a result. Most forms can't be treated. Scientists have long seen a potential solution in stem cells, which can regenerate and repair injured tissue. Several early-stage clinical trials are already evaluating the safety and efficacy of transplanting stem cells derived from cell lines established from human embryos, for example, or adult human cells that have been reprogrammed to a stem-like state. Other approaches include transplanting so-called retinal progenitor cells (RPCs)—immature cells that give rise to photoreceptors and other sorts of retinal cells—from aborted human fetuses. Some researchers have argued that another type of cell, sometimes referred to as retinal stem cells (RSCs), could also treat retinal degeneration. These cells' long lifespans and ability to undergo numerous cell divisions could make them better candidates to regenerate damaged tissue than RPCs. RSCs have been found in the eyes of zebrafish and some other vertebrates, but evidence for their existence in mammals has been controversial. Reports announcing their discovery in adult mice in the early 2000s were later discounted.

Keyword: Vision; Stem Cells
Link ID: 29719 - Posted: 03.27.2025

Ari Daniel Tristan Yates has no doubt about her first memory, even if it is a little fuzzy. "I was about three and a half in Callaway Gardens in Georgia," she recalls, "just running around with my twin sister trying to pick up Easter eggs." But she has zero memories before that, which is typical. This amnesia of our babyhood is pretty much the rule. "We have memories from what happened earlier today and memories from what happened earlier last week and even from a few years ago," says Yates, who's a cognitive neuroscientist at Columbia University. "But all of us lack memories from our infancy." Is that because we don't make memories when we're babies, or is there something else responsible? Now, in new research published by Yates and her colleagues in the journal Science, they propose that babies are able to form memories, even if they become inaccessible later in life. These results might reveal something crucial about the earliest moments of our development. "That's the time when we learn who our parents are, that's when we learn language, that's when we learn how to walk," Yates says. "What happens in your brain in the first two years of life is magnificent," says Nick Turk-Browne, a cognitive neuroscientist at Yale University. "That's the period of by far the greatest plasticity across your whole life span. And better understanding how your brain learns and remembers in infancy lays the foundation for everything you know and do for the rest of your life." © 2025 npr

Keyword: Learning & Memory; Development of the Brain
Link ID: 29715 - Posted: 03.22.2025

By Paula Span Joan Presky worries about dementia. Her mother lived with Alzheimer’s disease for 14 years, the last seven in a memory-care residence, and her maternal grandfather developed dementia, too. “I’m 100 percent convinced that this is in my future,” said Ms. Presky, 70, a retired attorney in Thornton, Colo. Last year, she spent almost a full day with a neuropsychologist, undergoing an extensive evaluation. The results indicated that her short-term memory was fine — which she found “shocking and comforting” — and that she tested average or above in every cognitive category but one. She’s not reassured. “I saw what Alzheimer’s was like,” she said of her mother’s long decline. “The memory of what she went through is profound for me.” The prospect of dementia, which encompasses Alzheimer’s disease and a number of other cognitive disorders, so frightens Americans that a recent study projecting steep increases in cases over the next three decades drew enormous public attention. The researchers’ findings, published in January in Nature Medicine, even showed up as a joke on the Weekend Update segment of “Saturday Night Live.” “Dementia is a devastating condition, and it’s very much related to the oldest ages,” said Dr. Josef Coresh, director of the Optimal Aging Institute at NYU Langone Health and the senior author of the study. “The globe is getting older.” Now the findings are being challenged by other dementia researchers who say that while increases are coming, they will be far smaller than Dr. Coresh and his co-authors predicted. © 2025 The New York Times Company

Keyword: Alzheimers
Link ID: 29713 - Posted: 03.22.2025

By Laura Sanders There are countless metaphors for memory. It’s a leaky bucket, a steel trap, a file cabinet, words written in sand. But one of the most evocative — and neuroscientifically descriptive — invokes Lego bricks. A memory is like a Lego tower. It’s built from the ground up, then broken down, put away in bins and rebuilt in a slightly different form each time it’s taken out. This metaphor is beautifully articulated by psychologists Ciara Greene and Gillian Murphy in their new book, Memory Lane. Perhaps the comparison speaks to me because I have watched my kids create elaborate villages of Lego bricks, only to see them dismantled, put away (after much nagging) and reconstructed, always with a similar overall structure but with minor and occasionally major changes. These villages’ blueprints are largely stable, but also fluid and flexible, subject to the material whims of the builders at any point in time. Memory works this way, too, Greene and Murphy propose. Imagine your own memory lane as a series of buildings, modified in ways both small and big each time you call them to mind. “As we walk down Memory Lane, the buildings we pass — our memories of individual events — are under constant reconstruction,” Greene and Murphy write. In accessible prose, the book covers a lot of ground, from how we form memories to how delicate those memories really are. Readers may find it interesting (or perhaps upsetting) to learn how bad we all are at remembering why we did something, from trivial choices, like buying an album, to consequential ones, such as a yes or no vote on an abortion referendum. People change their reasoning — or at least, their memories of their reasoning — on these sorts of events all the time. © Society for Science & the Public 2000–2025

Keyword: Learning & Memory
Link ID: 29712 - Posted: 03.22.2025

By Claudia López Lloreda For a neuroscientist, the opportunity to record single neurons in people doesn’t knock every day. It is so rare, in fact, that after 14 years of waiting by the door, Florian Mormann says he has recruited just 110 participants—all with intractable epilepsy. All participants had electrodes temporarily implanted in their brains to monitor their seizures. But the slow work to build this cohort is starting to pay off for Mormann, a group leader at the University of Bonn, and for other researchers taking a similar approach, according to a flurry of studies published in the past year. For instance, certain neurons selectively respond not only to particular scents but also to the words and images associated with them, Mormann and his colleagues reported in October. Other neurons help to encode stimuli, form memories and construct representations of the world, recent work from other teams reveals. Cortical neurons encode specific information about the phonetics of speech, two independent teams reported last year. Hippocampal cells contribute to working memory and map out time in novel ways, two other teams discovered last year, and some cells in the region encode information related to a person’s changing knowledge about the world, a study published in August found. These studies offer the chance to answer questions about human brain function that remain challenging to answer using animal models, says Ziv Williams, associate professor of neurosurgery at Harvard Medical School, who led one of the teams that worked on speech phonetics. “Concept cells,” he notes by way of example, such as those Mormann identified, or the “Jennifer Aniston” neurons famously described in a 2005 study, have proved elusive in the monkey brain. © 2025 Simons Foundation

Keyword: Attention; Learning & Memory
Link ID: 29709 - Posted: 03.19.2025

Nicola Davis Science correspondent Cat owners are being asked to share their pet’s quirky traits and even post researchers their fur in an effort to shed light on how cats’ health and behaviour are influenced by their genetics. The scientists behind the project, Darwin’s Cats, are hoping to enrol 100,000 felines, from pedigrees to moggies, with the DNA of 5,000 cats expected to be sequenced in the next year. The team say the goal is to produce the world’s largest feline genetic database. “Unlike most existing databases, which tend to focus on specific breeds or veterinary applications, Darwin’s Cats is building a diverse, large-scale dataset that includes pet cats, strays and mixed breeds from all walks of life,” said Dr Elinor Karlsson, the chief scientist at the US nonprofit organisation Darwin’s Ark, director of the vertebrate genomics group at the Broad Institute of MIT and Harvard and associate professor at the UMass Chan medical school. “It’s important to note, this is an open data project, so we will share the data with other scientists as the dataset grows,” she added. The project follows on the heels of Darwin’s Dogs, a similar endeavour that has shed light on aspects of canine behaviour, disease and the genetic origins of modern breeds. Darwin’s Cats was launched in mid-2024 and already has more than 3,000 cats enrolled, although not all have submitted fur samples. Participants from all parts of the world are asked to complete a number of free surveys about their pet’s physical traits, behaviour, environment, and health. © 2025 Guardian News & Media Limited

Keyword: Genes & Behavior; Development of the Brain
Link ID: 29708 - Posted: 03.19.2025

By Gina Kolata Women’s brains are superior to men’s in at least one respect — they age more slowly. And now, a group of researchers reports that they have found a gene in mice that rejuvenates female brains. Humans have the same gene. The discovery suggests a possible way to help both women and men avoid cognitive declines in advanced age. The study was published Wednesday in the journal Science Advances. The journal also published two other studies on women’s brains, one on the effect of hormone therapy on the brain and another on how age at the onset of menopause shapes the risk of getting Alzheimer’s disease. The evidence that women’s brains age more slowly than men’s seemed compelling. Researchers, looking at the way the brain uses blood sugar, had already found that the brains of aging women are years younger, in metabolic terms, than the brains of aging men. Other scientists, examining markings on DNA, found that female brains are a year or so younger than male brains. And careful cognitive studies of healthy older people found that women had better memories and cognitive function than men of the same age. Dr. Dena Dubal, a professor of neurology at the University of California, San Francisco, set out to understand why. “We really wanted to know what could underlie this female resilience,” Dr. Dubal said. So she and her colleagues focused on the one factor that differentiates females and males: the X chromosome. Females have two X chromosomes; males have one X and one Y chromosome. Early in pregnancy, one of the X chromosomes in females shuts down and its genes go nearly silent. But that silencing changes in aging, Dr. Dubal found. © 2025 The New York Times Company

Keyword: Alzheimers; Sexual Behavior
Link ID: 29704 - Posted: 03.12.2025

By Kelly Servick New York City—A recent meeting here on consciousness started from a relatively uncontroversial premise: A newly fertilized human egg isn’t conscious, and a preschooler is, so consciousness must emerge somewhere in between. But the gathering, sponsored by New York University (NYU), quickly veered into more unsettled territory. At the Infant Consciousness Conference from 28 February to 1 March, researchers explored when and how consciousness might arise, and how to find out. They also considered hints from recent brain imaging studies that the capacity for consciousness could emerge before birth, toward the end of gestation. “Fetal consciousness would have been a less central topic at a meeting like this a few years ago,” says Claudia Passos-Ferreira, a bioethicist at NYU who co-organized the gathering. The conversation has implications for how best to care for premature infants, she says, and intersects with thorny issues such as abortion. “Whatever you claim about this, there are some moral implications.” How to define consciousness is itself the subject of debate. “Each of us might have a slightly different definition,” neuroscientist Lorina Naci of Trinity College Dublin acknowledged at the meeting before describing how she views consciousness—as the capacity to have an experience or a subjective point of view. There’s also vigorous debate about where consciousness arises in the brain and what types of neural activity define it. That makes it hard to agree on specific markers of consciousness in beings—such as babies—that can’t talk about their experience. Further complicating the picture, the nature of consciousness could be different for infants than adults, researchers noted at the meeting. And it may emerge gradually versus all at once, on different timescales for different individuals.

Keyword: Consciousness; Development of the Brain
Link ID: 29703 - Posted: 03.12.2025

By Angie Voyles Askham Synaptic plasticity in the hippocampus involves both strengthening relevant connections and weakening irrelevant ones. That sapping of synaptic links, called long-term depression (LTD), can occur through two distinct routes: the activity of either NMDA receptors or metabotropic glutamate receptors (mGluRs). The mGluR-dependent form of LTD, required for immediate translation of mRNAs at the synapse, appears to go awry in fragile X syndrome, a genetic condition that stems from loss of the protein FMRP and is characterized by intellectual disability and often autism. Possibly as a result, mice that model fragile X exhibit altered protein synthesis regulation in the hippocampus, an increase in dendritic spines and overactive neurons. Treatments for fragile X that focus on dialing down the mGluR pathway and tamping down protein synthesis at the synapse have shown success in quelling those traits in mice, but they have repeatedly failed in human clinical trials. But the alternative pathway—via the NMDA receptor—may provide better results, according to a new study. Signaling through the NMDA receptor subunit GluN2B can also decrease spine density and alleviate fragile-X-linked traits in mice, the work shows. “You don’t have to modulate the protein synthesis directly,” says Lynn Raymond, professor of psychiatry and chair in neuroscience at the University of British Columbia, who was not involved in the work. Instead, activation of part of the GluN2B subunit can indirectly shift the balance of mRNAs that are translated at the synapse. “It’s just another piece of the puzzle, but I think it’s a very important piece,” she says. Whether this insight will advance fragile X treatments remains to be seen, says Wayne Sossin, professor of neurology and neurosurgery at Montreal Neurological Institute-Hospital, who was not involved in the study. Multiple groups have cured fragile-X-like traits in mice by altering what happens at the synapse, he says. “Altering translation in a number of ways seems to change the balance that is off when you lose FMRP. And it’s not really clear how specific that is for FMRP.” © 2025 Simons Foundation

Keyword: Development of the Brain; Learning & Memory
Link ID: 29700 - Posted: 03.12.2025

By Tim Vernimmen On a rainy day in July 2024, Tim Bliss and Terje Lømo are in the best of moods, chuckling and joking over brunch, occasionally pounding the table to make a point. They’re at Lømo’s house near Oslo, Norway, where they’ve met to write about the late neuroscientist Per Andersen, in whose lab they conducted groundbreaking experiments more than 50 years ago. The duo only ever wrote one research paper together, in 1973, but that work is now considered a turning point in the study of learning and memory. Published in the Journal of Physiology, it was the first demonstration that when a neuron — a cell that receives and sends signals throughout the nervous system — signals to another neuron frequently enough, the second neuron will later respond more strongly to new signals, not for just seconds or minutes, but for hours. It would take decades to fully understand the implications of their research, but Bliss and Lømo had discovered something momentous: a phenomenon called long-term potentiation, or LTP, which researchers now know is fundamental to the brain’s ability to learn and remember. Today, scientists agree that LTP plays a major role in the strengthening of neuronal connections, or synapses, that allow the brain to adjust in response to experience. And growing evidence suggests that LTP may also be crucially involved in a variety of problems, including memory deficits and pain disorders. Bliss and Lømo never wrote another research article together. In fact, they would soon stop working on LTP — Bliss for about a decade, Lømo for the rest of his life. Although the researchers knew they had discovered something important, at first the paper “didn’t make a big splash,” Bliss says. By the early 1970s, neuroscientist Eric Kandel had demonstrated that some simple forms of learning can be explained by chemical changes in synapses — at least in a species of sea slug. But scientists didn’t yet know if such findings applied to mammals, or if they could explain more complex and enduring types of learning, such as the formation of memories that may last for years.

Keyword: Learning & Memory
Link ID: 29694 - Posted: 03.05.2025
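The core LTP finding described above — brief high-frequency input produces a lasting increase in how strongly the downstream neuron responds — can be caricatured in a few lines of code. The sketch below is purely illustrative and is not a model of the 1973 Bliss and Lømo experiment; the frequency threshold and the size of the weight change are arbitrary assumptions.

```python
# Toy illustration of long-term potentiation (LTP): if presynaptic input
# arrives at a high enough frequency, the synaptic weight is stepped up
# and stays elevated for subsequent low-frequency test pulses.
# All numbers are arbitrary; this is a cartoon, not the Bliss-Lomo model.

def stimulate(weight, frequency_hz, ltp_threshold_hz=100.0, boost=1.5):
    """Return the synaptic weight after one stimulation episode."""
    if frequency_hz >= ltp_threshold_hz:   # tetanic (high-frequency) input
        weight *= boost                    # potentiation persists afterward
    return weight

w = 1.0
w = stimulate(w, frequency_hz=1.0)    # low-frequency test pulse: no change
w = stimulate(w, frequency_hz=200.0)  # brief tetanus: weight is potentiated
print(w)  # 1.5 -> later test pulses now evoke a larger response
```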

By Lola Butcher Last September, Eliezer Masliah, a prominent Alzheimer’s disease researcher, stepped away from his influential position at the National Institutes of Health after the organization, where he oversaw a $2.6 billion budget for neuroscience research, found falsified or fabricated images in his scientific articles. That same month, the Securities and Exchange Commission announced neuroscientist Lindsay Burns, her boss, and their company would pay more than $40 million to settle charges they had made misleading statements about research results from their clinical trial of a possible treatment for Alzheimer’s disease. Also in September: A $30 million clinical trial to study a stroke treatment developed by Berislav Zlokovic, a well-known Alzheimer’s expert, and his colleagues was canceled amid an investigation into whether he had manipulated images and data in research publications. Shortly thereafter, Zlokovic, director of the Zilkha Neurogenetic Institute at the University of Southern California medical school, was placed on indefinite administrative leave. Is there a pattern here? And, if there is, can neurology patients trust treatments that are based on published scientific research? That is what Charles Piller, an investigative reporter for Science magazine, examines in “Doctored: Fraud, Arrogance, and Tragedy in the Quest to Cure Alzheimer’s,” and his analysis is not comforting. As for the first question — is there a pattern? — Piller’s relentless reporting reveals that dozens of neuroscientists, including some of the most prominent in the world, appear to be responsible for inaccurate images in their published research. Those problematic images have prompted many of their articles to be retracted, corrected, or flagged as being “of concern” by the journals in which they were published.

Keyword: Alzheimers
Link ID: 29687 - Posted: 03.01.2025

By Ingrid Wickelgren After shuffling the cards in a standard 52-card deck, Alex Mullen, a three-time world memory champion, can memorize their order in under 20 seconds. As he flips through the cards, he takes a mental walk through a house. At each point in his journey — the mailbox, front door, staircase and so on — he attaches a card. To recall the cards, he relives the trip. This technique, called “method of loci” or “memory palace,” is effective because it mirrors the way the brain naturally constructs narrative memories: Mullen’s memory for the card order is built on the scaffold of a familiar journey. We all do something similar every day, as we use familiar sequences of events, such as the repeated steps that unfold during a meal at a restaurant or a trip through the airport, as a home for specific details — an exceptional appetizer or an object flagged at security. The general narrative makes the noteworthy features easier to recall later. “You are taking these details and connecting them to this prior knowledge,” said Christopher Baldassano, a cognitive neuroscientist at Columbia University. “We think this is how you create your autobiographical memories.” Psychologists empirically introduced this theory some 50 years ago, but proof of such scaffolds in the brain was missing. Then, in 2018, Baldassano found it: neural fingerprints of narrative experience, derived from brain scans, that replay sequentially during standard life events. He believes that the brain builds a rich library of scripts for expected scenarios — restaurant or airport, business deal or marriage proposal — over a person’s lifetime. These standardized scripts, and departures from them, influence how and how well we remember specific instances of these event types, his lab has found. And recently, in a paper published in Current Biology in fall 2024, they showed that individuals can select a dominant script for a complex, real-world event — for example, while watching a marriage proposal in a restaurant, we might opt, subconsciously, for either a proposal or a restaurant script — which determines what details we remember. © 2025 Simons Foundation

Keyword: Learning & Memory; Attention
Link ID: 29685 - Posted: 02.26.2025

Jon Hamilton People who inherit one very rare gene mutation are virtually guaranteed to develop Alzheimer's before they turn 50. Except for Doug Whitney. "I'm 75 years old, and I think I'm functioning fairly well," says Whitney, who lives near Seattle. "I'm still not showing any of the symptoms of Alzheimer's." Now a team of scientists is trying to understand how Whitney's brain has defied his genetic destiny. "If we are able to learn what is causing the protection here, then we could translate that to therapeutic approaches and apply that to the more common forms of the disease," says Dr. Jorge Llibre-Guerra, an assistant professor of neurology at Washington University School of Medicine in St. Louis. One possibility is high levels of heat shock proteins found in Whitney's brain, the team reports in the journal Nature Medicine. There are hints that these proteins can prevent the spread of a toxic protein that is one of the hallmarks of Alzheimer's, Llibre-Guerra says. A genetic surprise Early-onset Alzheimer's is everywhere in Whitney's family. His mother and 11 of her 13 siblings all had the disease by about age 50. "None of them lasted past 60," Whitney says. Whitney's wife, Ione, saw this up close. "We went home for Thanksgiving, and his mom couldn't remember the pumpkin pie recipe," she says. "A year later when we went back, she was already wandering off and not finding her way back home." © 2025 npr

Keyword: Alzheimers; Genes & Behavior
Link ID: 29675 - Posted: 02.19.2025

By Michael S. Rosenwald In early February, Vishvaa Rajakumar, a 20-year-old Indian college student, won the Memory League World Championship, an online competition pitting people against one another with challenges like memorizing the order of 80 random numbers faster than most people can tie a shoelace. The renowned neuroscientist Eleanor Maguire, who died in January, studied mental athletes like Mr. Rajakumar and found that many of them used the ancient Roman “method of loci,” a memorization trick also known as the “memory palace.” The technique takes several forms, but it generally involves visualizing a large house and assigning memories to rooms. Mentally walking through the house fires up the hippocampus, the seahorse-shaped engine of memory deep in the brain that consumed Dr. Maguire’s career. We asked Mr. Rajakumar about his strategies of memorization. His answers, lightly edited and condensed for clarity, are below. Q. How do you prepare for the Memory League World Championship? Hydration is very important because it helps your brain. When you memorize things, you usually sub-vocalize, and it helps to have a clear throat. Let’s say you’re reading a book. You’re not reading it out loud, but you are vocalizing within yourself. If you don’t drink a lot of water, your speed will be a bit low. If you drink a lot of water, it will be more and more clear and you can read it faster. Q. What does your memory palace look like? Let’s say my first location is my room where I sleep. My second location is the kitchen. And the third location is my hall. The fourth location is my veranda. Another location is my bathroom. Let’s say I am memorizing a list of words. Let’s say 10 words. What I do is, I take a pair of words, make a story out of them and place them in a location. And I take the next two words. I make a story out of them. I place them in the second location. The memory palace will help you to remember the sequence. © 2025 The New York Times Company

Keyword: Learning & Memory; Attention
Link ID: 29673 - Posted: 02.15.2025
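The pairing-and-placement routine Mr. Rajakumar describes above maps neatly onto a small data structure: chunk the word list into pairs and pin each pair to the next stop on a fixed mental route. The sketch below is a toy illustration of that bookkeeping only; the locations and words are invented examples, not taken from the interview.

```python
# Toy sketch of the "memory palace" bookkeeping described above:
# words are taken in pairs and each pair is assigned to the next location
# on a fixed mental route. Locations and words are invented examples.

locations = ["bedroom", "kitchen", "hall", "veranda", "bathroom"]
words = ["anchor", "violin", "cactus", "lantern", "kettle",
         "parrot", "ladder", "mirror", "comet", "drum"]

# Chunk the word list into pairs, then lay the pairs onto the route in order.
pairs = [words[i:i + 2] for i in range(0, len(words), 2)]
palace = dict(zip(locations, pairs))

for place, pair in palace.items():
    print(f"{place}: imagine a story linking {pair[0]} and {pair[1]}")
```

Recalling the sequence then amounts to walking the route in order and reading each pair back out of its location.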

By Angie Voyles Askham Identifying what a particular neuromodulator does in the brain—let alone how such molecules interact—has vexed researchers for decades. Dopamine agonists increase reward-seeking, whereas serotonin agonists decrease it, for example, suggesting that the two neuromodulators act in opposition. And yet, neurons in the brain’s limbic regions release both chemicals in response to a reward (and also to a punishment), albeit on different timescales, electrophysiological recordings have revealed, pointing to a complementary relationship. This dual response suggests that the interplay between dopamine and serotonin may be important for learning. But no tools existed to simultaneously manipulate the neuromodulators and test their respective roles in a particular area of the brain—at least, not until now—says Robert Malenka, professor of psychiatry and behavioral sciences at Stanford University. As it turns out, serotonin and dopamine join forces in the nucleus accumbens during reinforcement learning, according to a new study Malenka led, yet they act in opposition: dopamine as a gas pedal and serotonin as a brake on signaling that a stimulus is rewarding. The mice he and his colleagues studied learned faster and performed more reliably when the team optogenetically pressed on the animals’ dopamine “gas” as they simultaneously eased off the serotonin “brake.” “It adds a very rich and beguiling picture of the interaction between dopamine and serotonin,” says Peter Dayan, director of computational neuroscience at the Max Planck Institute for Biological Cybernetics. In 2002, Dayan proposed a different framework for how dopamine and serotonin might work in opposition, but he was not involved in the new study. The new work “partially recapitulates” that 2002 proposal, Dayan adds, “but also poses many more questions.” © 2025 Simons Foundation

Keyword: Learning & Memory
Link ID: 29672 - Posted: 02.15.2025
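One way to picture the "gas pedal versus brake" idea from the entry above is as two knobs on the size of a reward-driven learning update. The sketch below is a deliberately simplified, invented toy under that assumption; it is not the model used in the Stanford study, and the parameter names (dopamine_gain, serotonin_brake) are hypothetical labels for illustration only.

```python
# Simplified reward-learning loop illustrating the gas-pedal vs. brake analogy:
# dopamine scales the reward-driven update up, serotonin scales it down.
# This is an invented toy, not the model from the study described above.

def learn_value(n_trials, reward, dopamine_gain, serotonin_brake, alpha=0.2):
    value = 0.0
    for _ in range(n_trials):
        effective = reward * dopamine_gain * (1.0 - serotonin_brake)
        value += alpha * (effective - value)  # simple delta-rule update
    return value

# More "gas" and less "brake" drives the learned value up faster,
# loosely mirroring the faster learning reported in the mice.
print(learn_value(10, reward=1.0, dopamine_gain=1.0, serotonin_brake=0.5))
print(learn_value(10, reward=1.0, dopamine_gain=1.5, serotonin_brake=0.1))
```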

By Michael S. Rosenwald Eleanor Maguire, a cognitive neuroscientist whose research on the human hippocampus — especially those belonging to London taxi drivers — transformed the understanding of memory, revealing that a key structure in the brain can be strengthened like a muscle, died on Jan. 4 in London. She was 54. Her death, at a hospice facility, was confirmed by Cathy Price, her colleague at the U.C.L. Queen Square Institute of Neurology. Dr. Maguire was diagnosed with spinal cancer in 2022 and had recently developed pneumonia. Working for 30 years in a small, tight-knit lab, Dr. Maguire obsessed over the hippocampus — a seahorse-shaped engine of memory deep in the brain — like a meticulous, relentless detective trying to solve a cold case. An early pioneer of using functional magnetic resonance imaging (f.M.R.I.) on living subjects, Dr. Maguire was able to look inside human brains as they processed information. Her studies revealed that the hippocampus can grow, and that memory is not a replay of the past but rather an active reconstructive process that shapes how people imagine the future. “She was absolutely one of the leading researchers of her generation in the world on memory,” Chris Frith, an emeritus professor of neuropsychology at University College London, said in an interview. “She changed our understanding of memory, and I think she also gave us important new ways of studying it.” In 1995, while she was a postdoctoral fellow in Dr. Frith’s lab, she was watching television one evening when she stumbled on “The Knowledge,” a quirky film about prospective London taxi drivers memorizing the city’s 25,000 streets to prepare for a three-year-long series of licensing tests. Dr. Maguire, who said she rarely drove because she feared never arriving at her destination, was mesmerized. “I am absolutely appalling at finding my way around,” she once told The Daily Telegraph. “I wondered, ‘How are some people so bloody good and I am so terrible?’” In the first of a series of studies, Dr. Maguire and her colleagues scanned the brains of taxi drivers while quizzing them about the shortest routes between various destinations in London. © 2025 The New York Times Company

Keyword: Learning & Memory
Link ID: 29671 - Posted: 02.15.2025