Chapter 17. Learning and Memory
Lynne Peeples

Sometimes the hardest part of doing an unpleasant task is simply getting started — typing the first word of a long report, lifting a dirty dish off the top of an overfilled sink or removing clothes from an unused exercise machine. The obstacle isn’t necessarily a lack of interest in completing a task, but the brain’s resistance to taking the first step. Now, scientists might have identified the neural circuit behind this resistance, and a way to ease it.

In a study1 published today in Current Biology, researchers describe a pathway in the brain that seems to act as a ‘motivation brake’, dampening the drive to begin a task. When the team selectively suppressed this circuit in macaque monkeys, goal-directed behaviour rebounded. “The change after this modulation was dramatic,” says study co-author Ken-ichi Amemori, a neuroscientist at Kyoto University in Japan.

The motivation brake, which can be particularly stubborn for people with certain psychiatric conditions, such as schizophrenia and major depressive disorder, is distinct from the avoidance of tasks driven by risk aversion in anxiety disorders. Pearl Chiu, a computational psychiatrist at Virginia Tech in Roanoke, who was not involved in the study, says that understanding this difference is essential for developing new treatments and refining current ones. “Being able to restore motivation, that’s especially exciting,” she says.

Motivated macaques

Previous work on task initiation has implicated a neural circuit connecting two parts of the brain known as the ventral striatum and ventral pallidum, both of which are involved in processing motivation and reward2,3,4. But attempts to isolate the circuit’s role have fallen short. Electrical stimulation, for example, inadvertently activates downstream regions, affecting not only motivation but also anxiety.

© 2026 Springer Nature Limited
Keyword: Learning & Memory; Emotions
Link ID: 30079 - Posted: 01.14.2026
By Holly Barker

In early life, astrocytes help to mold neural pathways in response to the environment. In adulthood, however, those cells curb plasticity by secreting a protein that stabilizes circuits, according to a mouse study published last month in Nature.

“It’s a new and unique take on the field,” says Ciaran Murphy-Royal, assistant professor of neuroscience at Montreal University, who was not involved in the study. Most research focuses on how glial cells drive plasticity but “not how they apply the brakes,” he says.

Astrocytes promote synaptic remodeling during the development of sensory circuits by secreting factors and exerting physical control—in humans, a single astrocyte can clamp onto 2 million synapses, previous studies suggest. But the glial cells are also responsible for shutting down critical periods for vision and motor circuits in mice and fruit flies, respectively. It has been unclear whether this loss of plasticity can be reversed. Some evidence hints that modifying the neuronal environment—through matrix degradation or transplantation of young neurons—can rekindle flexibility in adult brains.

The new findings confirm that in adulthood, plasticity is only dormant, rather than lost entirely, says Nicola Allen, professor of molecular neurobiology at the Salk Institute for Biological Studies and an investigator on the new paper. “Neurons don’t lose an intrinsic ability to remodel, but that process is controlled by secreted factors in the environment,” she says.

Specifically, astrocytes orchestrate that dormancy by releasing CCN1, a protein that stabilizes circuits by prompting the maturation of inhibitory neurons and glial cells, Allen’s team found. The findings suggest that astrocytes have an active role in stabilizing adult brain circuits.

© 2026 Simons Foundation
Keyword: Learning & Memory; Glia
Link ID: 30069 - Posted: 01.07.2026
Alison Abbott

For decades, neuroscientists focused almost exclusively on only half of the cells in the brain. Neurons were the main players, they thought, and everything else was made up of uninteresting support systems.

By the 2010s, memory researcher Inbal Goshen was beginning to question that assumption. She was inspired by innovative molecular tools that would allow her to investigate the contributions of another, more mysterious group of cells called astrocytes. What she discovered about their role in learning and memory excited her even more.

At the beginning, she felt like an outsider, especially at conferences. She imagined colleagues thinking, “Oh, that’s the weird one who works on astrocytes,” says Goshen, whose laboratory is at the Hebrew University of Jerusalem. A lot of people were sceptical, she says.

But not any more. A rush of studies from labs in many subfields is revealing just how important these cells are in shaping our behaviour, mood and memory. Long thought of as support cells, astrocytes are emerging as key players in health and disease.

“Neurons and neural circuits are the main computing units of the brain, but it’s now clear just how much astrocytes shape that computation,” says neurobiologist Nicola Allen at the Salk Institute for Biological Studies in La Jolla, California, who has spent her career researching astrocytes and other non-neuronal cells, collectively called glial cells. “Glial meetings are now consistently oversubscribed.”

As far back as the nineteenth century, scientists could see with their simple microscopes that mammalian brains included two major types of cell — neurons and glia — in roughly equal numbers.

© 2025 Springer Nature Limited
Keyword: Glia
Link ID: 30038 - Posted: 12.03.2025
By Angie Voyles Askham

Rats, like people, jump at the chance to repeat a task that rewards them handsomely, but they are less eager when the reward is paltry: They learn from past experience and update their behavior accordingly.

That learning is shaped by the hormone estradiol, according to a new study. And when estradiol levels peak during the estrus cycle, female rats adapt their behavior in response to reward size more quickly than they do during other phases—and faster than males overall. The female rats also have a larger release of dopamine in response to an unexpected reward, along with reduced expression of dopamine transporters in a reward center of their brain after the hormone peaks, the new work shows.

“It’s giving mechanistic insight into how estrogen modulates reinforcement learning—all the way down to the molecular mechanism,” says Ilana Witten, professor of neuroscience at Princeton University and Howard Hughes Medical Institute investigator, who was not involved in the study.

The team behind the new work used a task that measures how much an animal values an anticipated reward: Thirsty rats poke their nose into a central port and then listen for a tone that indicates how much water one of two side ports will dispense. The animals choose to either hold out at the cued location for the reward or to abandon the trial and start a new one by poking their nose into the other side.

Rats learn to initiate their next trial more quickly when the experiment is doling out large rewards and to hold off on initiating new trials when rewards are small, previous work from the group has shown. “It takes a lot of energy to initiate a trial, so if there are small rewards, it’s not as motivating,” says study investigator Carla Golden, a postdoctoral researcher in Christine Constantinople’s lab at New York University.

© 2025 Simons Foundation
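The trial-by-trial updating described above is classic reinforcement learning. As a loose illustration only (not the authors' model), the delta-rule sketch below shows how a larger learning rate, standing in here for the faster behavioural updating reported around the estradiol peak, lets a reward estimate track a change in reward size in fewer trials; all names and numbers are hypothetical.

```python
# Minimal delta-rule sketch (illustrative, not the study's model): a running
# estimate of reward size, nudged toward each observed reward by a fraction
# alpha (the learning rate).
def update_reward_estimate(estimate, reward, alpha):
    """Move the estimate toward the latest reward by a fraction alpha."""
    return estimate + alpha * (reward - estimate)

def simulate(rewards, alpha, estimate=0.0):
    """Return the reward estimate after each trial in sequence."""
    history = []
    for r in rewards:
        estimate = update_reward_estimate(estimate, r, alpha)
        history.append(estimate)
    return history

# Rewards switch from large to small halfway through. A higher learning
# rate (fast) tracks the switch in fewer trials than a lower one (slow),
# mirroring quicker behavioural adaptation.
large_then_small = [10] * 5 + [2] * 5
fast = simulate(large_then_small, alpha=0.5)
slow = simulate(large_then_small, alpha=0.1)
```

After the switch, the high-alpha estimate ends much closer to the new small reward than the low-alpha estimate does, which is the sense in which a larger learning rate means faster adaptation.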
Keyword: Hormones & Behavior; Attention
Link ID: 30028 - Posted: 11.26.2025
By Oliver Whang

Owen Collumb was paralyzed in 1993, when he was 21 years old. A tire on his motorbike blew out and he fell into a ravine, breaking a single bone in his spine. When he recovered, he couldn’t move his legs and could control only the biceps in his arms, meaning that he could lift his hands but, to put them down, he had to twist his shoulders and let gravity unbend his elbows.

He spent years in an assisted living home before petitioning to move to his own place in Dublin, with the help of home aides. Living alone was liberating; he could choose what he ate and when he woke in the morning. He began working multiple jobs for foundations and advocating for people with disabilities.

One of his assistants, Sylwia Filipiek, a Polish immigrant to Ireland, had been employed at a printing factory. She had no experience with home care and struggled to help Mr. Collumb into his wheelchair at first. But, over the years, they learned how to work together, and grew close.

In the summer of 2024, Mr. Collumb and Ms. Filipiek flew to Bath, England, to train for the Cybathlon, an international competition run every four years to encourage the development of assistive technologies. The competition, hosted in Switzerland by the university ETH Zurich, consists of eight races for teams and their pilots (which is what the primary competitors, with varying disabilities, are called), each targeting different innovations, such as arm prostheses, leg prostheses and vision assistance. Each race consists of remote tasks that are supposed to simulate everyday life for the pilots: walking across a room, picking up a grocery bag, throwing a ball.

One of Cybathlon’s founders, Roland Sigrist, compared it to Formula 1. Teams are encouraged to develop prototypes toward the ultimate goal of “the independence of people with disabilities,” but the competition is straightforward and real, with all its accompaniments: nerves, heartbreak, glory.
The pilots are the ones who put themselves on the line. “They’re the masters of the technology, and not the other way around,” Mr. Sigrist said.

© 2025 The New York Times Company
Keyword: Robotics
Link ID: 30018 - Posted: 11.19.2025
By Dana Rubi Levy, Kevin Mastro and Michael Ryan

Any seasoned baker knows the importance of being flexible. If you are missing an ingredient or hosting a guest with dietary restrictions, you might need to swap yogurt for eggs or oil for butter. The final product may differ, but it can still be rich and satisfying. In much the same way, our brain constantly makes substitutions and adjustments in response to the inevitable changes in our internal and external environments.

To understand these changes, scientists often compare the brain and behavior of older people, aged 60 and up, with those of younger people, aged 20 to 30. Despite considerable individual variability, older people—on average—have slower processing speeds, rely more on past experience to solve problems, and have less behavioral flexibility. These findings have shaped our theories about how age-related changes in the brain drive behavior.

In recent years, however, a conceptual shift has emerged, raising questions about whether some age-related changes are not solely the result of cognitive decline. Instead, some may be adaptive and address age-related constraints, such as changes in metabolism and increased inflammation. Moreover, scientists have begun to question whether young adulthood, characterized by a period of highly flexible decision-making, is the right benchmark to assess cognition across the lifespan.

Given the evolving landscape of the aging brain, change is necessary, and not all deviations from the young-adult “benchmark” should be seen as decline. The main challenge for neuroscientists is to determine which of these age-related adaptations are beneficial and which are detrimental. In other words, which substitutions retain the original flavors, and which result in a dish that falls flat?

© 2025 Simons Foundation
Keyword: Development of the Brain; Learning & Memory
Link ID: 30012 - Posted: 11.15.2025
David Adam

In a town on the shores of Lake Geneva sit clumps of living human brain cells for hire. These blobs, about the size of a grain of sand, can receive electrical signals and respond to them — much as computers do. Research teams from around the world can send the blobs tasks, in the hope that they will process the information and send a signal back.

Welcome to the world of wetware, or biocomputers. In a handful of academic laboratories and companies, researchers are growing human neurons and trying to turn them into functional systems equivalent to biological transistors. These networks of neurons, they argue, could one day offer the power of a supercomputer without the outsized power consumption.

The results so far are limited. But keen scientists are already buying or borrowing online access to these brain-cell processors — or even investing tens of thousands of dollars to secure their own models. Some want to use these biocomputers as straightforward replacements for ordinary computers, whereas others want to use them to study how brains work.

“Trying to understand biological intelligence is a very interesting scientific problem,” says Benjamin Ward-Cherrier, a robotics researcher at the University of Bristol, UK, who rents time on the Swiss brain blobs. “And looking at it from the bottom up — with simple small versions of our brain and building those up — I think is a better way of doing it than top down.”

Biocomputing advocates claim that these systems could one day rival the capability of artificial intelligence and the potential of quantum computers. Other researchers who work with human neurons are more sceptical of what’s possible. And they warn that hype — and the science-fictional allure of what are sometimes labelled brain-in-a-jar systems — could even be counterproductive. If the idea that these systems possess sentience and consciousness takes hold, there could be repercussions for the research community.

© 2025 Springer Nature Limited
Keyword: Learning & Memory; Robotics
Link ID: 30010 - Posted: 11.12.2025
By Kevin Berger

Steve Ramirez was feeling on top of the world in 2015. His father, Pedro Ramirez, had snuck into the United States in the 1980s to escape the civil war in El Salvador. Pedro Ramirez held jobs as a door-to-door salesman for tombstones, a janitor in a diner, and a technician in an animal lab. After years of ’round-the-clock work, Pedro Ramirez became a U.S. citizen. And here was his son, born in America, with a Ph.D. from the Massachusetts Institute of Technology, still in his 20s, being celebrated as one of the most exciting and promising neuroscientists in the country.

Steve Ramirez had published research papers with his MIT mentor Xu Liu that reported how they used lasers to erase fear memories, spur positive memories, and even fabricate new memories in the brain. The experiments were only in mice. But they were impressive. Memories are made of networks of brain cells called engrams. The lasers targeted specific cells in engrams. Zap those cells and the whole engram was muted. The pair of neuroscientists gave a popular TED Talk on memory manipulation and were featured in international press stories that invariably mentioned the plotlines in the movies Eternal Sunshine of the Spotless Mind and Inception could be real. Bad memories could be deleted. New memories could be implanted.

One night in 2013 Ramirez and Liu were celebrating the publication of one of their papers in a jazz lounge at the top of the Prudential Building in Boston. The music was grooving, and the city below glittered like stars. Ramirez thought, I’ve never been so happy and so fully alive.

In early 2015, Liu, age 37, died suddenly. There had been no warning signs. Ramirez had never had a friend like Liu. Liu opened his mind to experiences in science he couldn’t have imagined. Their relationship felt organic from Ramirez’s first day in the lab. Liu joked they would always have chemistry doing science together.

Grief is when the future your brain plans for is cut off.
Ramirez’s thoughts of doing science without Liu became a trapdoor that landed him in a cellar of pain.

© 2025 NautilusNext Inc.
Keyword: Learning & Memory; Drug Abuse
Link ID: 30007 - Posted: 11.12.2025
By Nora Bradford

Here are three words: pine, crab, sauce. There’s a fourth word that combines with each of the others to create another common word. What is it? When the answer finally comes to you, it’ll likely feel instantaneous. You might even say “Aha!”

This kind of sudden realization is known as insight, and a research team recently uncovered how the brain produces it, which suggests why insightful ideas tend to stick in our memory.

Maxi Becker, a cognitive neuroscientist at Duke University, first got interested in insight after reading the landmark 1962 book The Structure of Scientific Revolutions by the historian and philosopher of science Thomas Kuhn. “He describes how some ideas are so powerful that they can completely shift the way an entire field thinks,” she said. “That got me wondering: How does the brain come up with those kinds of ideas? How can a single thought change how we see the world?”

Such moments of insight are written across history. According to the Roman architect and engineer Vitruvius, in the third century BCE the Greek mathematician Archimedes suddenly exclaimed “Eureka!” after he slid into a bathtub and saw the water level rise by an amount equal to his submerged volume (although this tale may be apocryphal). In the 17th century, according to lore, Sir Isaac Newton had a breakthrough in understanding gravity after an apple fell on his head. In the early 1900s, Einstein came to a sudden realization that “if a man falls freely, he would not feel his weight,” which led him to his theory of relativity, as he later described in a lecture. Insights are not limited to geniuses: We have these cognitive experiences all the time when solving riddles or dealing with social or intellectual problems.
They are distinct from analytical problem-solving, such as the process of doing formulaic algebra, in which you arrive at a solution slowly and gradually as if you’re getting warmer. Instead, insights often follow periods of confusion. You never feel as if you’re getting warmer; rather, you go from cold to hot, seemingly in an instant. Or, as the neuropsychologist Donald Hebb, known for his work building neurobiological models of learning, wrote in the 1940s, sometimes “learning occurs as a single jump, an all-or-none affair.” © 2025 Simons Foundation
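The opening puzzle (spoiler: the answer is “apple”, as in pineapple, crab apple and applesauce) can of course also be solved the slow, analytic way. A hypothetical brute-force sketch in Python, using a toy vocabulary rather than a real dictionary, makes the contrast with instantaneous insight concrete:

```python
# Brute-force sketch of the three-cue puzzle (illustrative only; the
# vocabulary and candidate list are toy data, not a real dictionary).
VOCAB = {"pineapple", "crabapple", "applesauce", "pinecone", "crabgrass",
         "soysauce"}
CANDIDATES = ["cone", "grass", "soy", "apple"]

def solve(cues, candidates, vocab):
    """Return candidates that form a vocabulary word with every cue,
    attached either before or after it."""
    hits = []
    for cand in candidates:
        if all(cue + cand in vocab or cand + cue in vocab for cue in cues):
            hits.append(cand)
    return hits

print(solve(["pine", "crab", "sauce"], CANDIDATES, VOCAB))  # → ['apple']
```

This is exactly the “getting warmer” process the passage contrasts with insight: the program checks each candidate in turn, whereas the brain that says “Aha!” seems to jump straight to the answer.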
Keyword: Attention; Learning & Memory
Link ID: 30004 - Posted: 11.08.2025
By Sara Talpos

As a new Ph.D. student in 2011, Steve Ramirez and his mentor performed a groundbreaking experiment in the field of memory manipulation. They placed a mouse in a small distinctive box and administered a mild electrical shock to its feet. When the rodent was placed in the box a second time, it froze up — anticipating another shock.

From there, the young neuroscientists placed the mouse in a different box, one where nothing bad had happened. They then directed pulses of light to a very specific region in the mouse’s brain that had been genetically modified to respond to the light. This caused the mouse to immediately freeze. Ramirez and his mentor, it turned out, had found a way to artificially activate a fear-inducing memory.

“How to Change a Memory: One Neuroscientist’s Quest to Alter the Past,” by Steve Ramirez will be available on November 4, 2025 (Princeton University Press, 256 pages).

What was the point? A central goal of such science is to learn how memories form and function in the brain and to then apply this knowledge to treat brain disorders, writes Ramirez in his forthcoming book, “How to Change a Memory: One Neuroscientist’s Quest to Alter the Past.” Perhaps one day, he suggests, it will be possible to activate positive memories to curb depression or to retrieve memories that have seemingly been lost to Alzheimer’s disease.

In the book, Ramirez explores the fascinating science of memory while tracing his own journey to becoming a successful professor at Boston University. His path was not without challenges, including the sudden death of his mentor and a decade-long struggle with alcohol addiction. “This book,” he writes, “is my attempt to make sense of the enigma of memory — the snippets of remembrances, the brief moments in time, the decisions we make, the blackouts, the imagined, and the dreamt of — all the things the brain does to breathe life into the past so that we can heal and become whole again.”
Keyword: Depression; Learning & Memory
Link ID: 29988 - Posted: 10.29.2025
Jon Hamilton

Scientists are reporting the first compelling evidence in people that cognitive training can boost levels of a brain chemical that typically declines with age.

A 10-week study of people 65 or older found that doing rigorous mental exercises for 30 minutes a day increased levels of the chemical messenger acetylcholine by 2.3% in a brain area involved in attention and memory.

The increase "is not huge," says Étienne de Villers-Sidani, a neurologist at McGill University in Montreal. "But it's significant, considering that you get a 2.5% decrease per decade normally just with aging." So, at least in this brain area, cognitive training appeared to turn back the clock by about 10 years.

The chemical change observed after intensive brain training is persuasive, says Michael Hasselmo, director of the Center for Systems Neuroscience at Boston University, who was not involved in the study. "It was compelling enough that I thought, 'Maybe I need to be doing this,'" he says.

The result backs earlier research in animals showing that environments that stimulate the brain can increase levels of certain neurotransmitters. Studies of people have suggested that cognitive training can improve thinking and memory.

Never skip brain day

The study, funded by the National Institutes of Health, comes amid a proliferation of online brain-training programs, including Lumosity, Elevate, Peak, CogniFit and BrainHQ.

© 2025 npr
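The arithmetic behind that "about 10 years" claim is simple enough to check; this back-of-envelope snippet uses only the two figures quoted in the story:

```python
# Back-of-envelope check: a 2.3% acetylcholine increase, set against a
# typical decline of 2.5% per decade, works out to roughly a decade of
# age-related loss offset.
increase_pct = 2.3            # boost measured after 10 weeks of training
decline_pct_per_decade = 2.5  # typical loss with normal aging

years_offset = increase_pct / (decline_pct_per_decade / 10)
print(round(years_offset, 1))  # → 9.2, i.e. "about 10 years"
```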
Keyword: Alzheimers; Learning & Memory
Link ID: 29976 - Posted: 10.22.2025
Katie Kavanagh

Why are we able to remember emotional events so well? According to a study published today in Nature1, a type of cell in the brain called an astrocyte is a key player in stabilizing memories for long-term recall.

Astrocytes were thought to simply support neurons in creating the physical traces of memories in the brain, but the study found that they have a much more active role — and can even be directly triggered by repeated emotional experiences. The researchers behind the finding suggest that the cells could be a fresh target for treating memory conditions such as those associated with post-traumatic stress disorder and Alzheimer’s disease.

“We provide an answer to the question of how a specific memory is stored for the long term,” says study co-author Jun Nagai, a neuroscientist at RIKEN Center for Brain Science in Wako, Japan. By studying astrocytes, Nagai said, the study identifies how the brain selectively filters important memories at the cellular level.

Stable memories

Nagai and his colleagues focused on the question of memory stabilization: how a short-term memory becomes more permanent in the brain. Previous research had found physical traces of memories in neuronal networks in brain regions such as the hippocampus and amygdala2. But it was unclear how these ‘engrams’ were stored in the brain as lasting memories after repeated exposure to the same stimulus.

To dig deeper, the researchers developed a method for measuring activation patterns in astrocytes across the whole brain of a mouse as it completes a memory task. They measured the upregulation of a gene called Fos — an early marker of cell activity that is associated with the physical traces of memories in the brain3.

© 2025 Springer Nature Limited
Keyword: Learning & Memory; Emotions
Link ID: 29975 - Posted: 10.18.2025
By David Adam

In February of this year, George Mentis and his colleagues published data from a small clinical trial they said showed that degraded motor neurons aren’t irreparable. In the study, electrical stimulation to the spine in three people with spinal muscular atrophy (SMA) appeared to resuscitate lost motor neurons, the authors said, as well as restore some of the cellular processes needed to activate muscle.

“It was incredible,” says Mentis, professor of pathology and cell biology (in neurology) at Columbia University. “We’re unleashing or tapping on the potential of dysfunctional neurons to show plasticity.”

The authors wrote that the results showed it was possible to “effectively rescue motor neuron function” and that the electrical stimulation had rebuilt neuronal circuitry and reversed—at least for a while—some degeneration.

Mentis and his team think their results are coalescing into a theory, even if they don’t fully understand it yet. The researchers are essentially altering the electrical properties of the motor neurons so they start to behave better and closer to normal, says Genís Prat-Ortega, a postdoctoral associate in the Rehab Neural Engineering Labs at the University of Pittsburgh and an investigator on the study. “The motor neurons change and repair,” he says. “Somehow, we are reversing a neurodegenerative process.”

Not everyone is so sure. Tim Hagenacker, professor of neurology at the University of Duisburg-Essen, says rebuilding the neural circuit is “not entirely convincing” as an explanation for the study’s results. He thinks that “other cell types play a crucial co-role” in restoring neuronal plasticity or that dysfunctional motor neurons could exist in some form of hibernation.

© 2025 Simons Foundation
Keyword: ALS-Lou Gehrig's Disease; Regeneration
Link ID: 29965 - Posted: 10.11.2025
Violeta Ruiz

On 25 November 1915, the American newspaper The Review published the extraordinary case of an 11-year-old boy with prodigious mathematical abilities. Perched on a hill close to a set of railroad tracks, he could memorise all the numbers of the train carriages that sped by at 30 mph, add them up, and provide the correct total sum. What was remarkable about the case was not just his ability to calculate large numbers (and read them on a moving vehicle), but the fact that he could barely eat unassisted or recognise the faces of people he met. The juxtaposition between his supposed arrested development and his numerical facility made his mathematical feats even more impressive. ‘How can you account for it?’ asked the article’s author.

The answer took the form of a medical label: the boy was what 19th-century medicine termed an ‘idiot savant’. He possessed an exceptional talent, despite a profound impairment of the mental faculties that affected both his motor and social skills.

A century after The Review relayed the prodigious child’s mathematical abilities, trying to understand ‘how they do it’ still drives psychological research into savantism or ‘savant syndrome’ to this day. The SSM Health Treffert Centre in Wisconsin – named after Darold Treffert (1933-2020), one of the leading experts in the field – defines the savant phenomenon as ‘a rare condition in which persons with various developmental disorders, including autistic disorder, have an amazing ability and talent’. Today, savantism is largely understood through the lens of neurodivergence, since the association between savantism and autism is strong: roughly one in 10 people with autism exhibit some savant skills, while savantism in the absence of autism is much rarer.
Psychological studies by Simon Baron-Cohen and Michael Lombardo, for example, have focused on the neurological basis of ‘systemising’, where exceptional mathematical or musical skills exist among people diagnosed with autism: such people are ‘hypersystemisers’, that is, they are especially good at identifying ‘laws, rules, and/or regularities’. It is believed that their brain’s systemising mechanisms are ‘tuned to very high levels’, making them acutely sensitive to sensory input and also capable of intense attentional focus and rule-learning. © Aeon Media Group Ltd. 2012-2025.
Keyword: Intelligence; Learning & Memory
Link ID: 29964 - Posted: 10.11.2025
Gemma Conroy

Whether it’s dancing the tango or playing the guitar, engaging in a creative pastime can slow brain ageing, according to a study of dancers, musicians, artists and video game players from multiple countries.

The analysis used brain clocks — models that measure the difference between a person’s chronological age and the age their brain appears to be — to assess whether creative activities help to maintain neurological youth. In brain regions that are most susceptible to ageing, engaging in creative activities increased connections with different areas of the brain. Although experts had ‘younger’ brains than their less-experienced counterparts did, even learning a creative skill from scratch had an anti-ageing effect on the brain. The findings were published on 3 October in Nature Communications1.

Song and dance

Previous studies suggest that engaging in creative activities can help to keep the brain young and foster emotional well-being. But few have investigated the biological basis of these brain benefits or what drives them, says study co-author Agustín Ibáñez, a neuroscientist at Adolfo Ibáñez University in Santiago, Chile. “There is really poor mechanistic evidence,” he says.

To address this gap, Ibáñez and his colleagues created brain clocks using neuroimaging data of brain activity taken from 1,240 participants across 10 countries. These machine-learning models used functional connectivity, a measure of how brain regions work together, to estimate brain age. The researchers then applied their brain clocks to 232 tango dancers, musicians, visual artists and video game players of different ages and experience levels to calculate their ‘brain age gap’ — the difference between their predicted brain age and their actual age.

© 2025 Springer Nature Limited
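The 'brain age gap' described above is just a difference score: predicted brain age minus chronological age, with negative values indicating a 'younger'-looking brain. A minimal sketch, with made-up ages purely for illustration:

```python
# The brain age gap as described: model-predicted brain age minus
# chronological age. Negative = brain appears younger than the person is.
def brain_age_gap(predicted_age, chronological_age):
    return predicted_age - chronological_age

# Hypothetical participants of the same chronological age (65):
print(brain_age_gap(58.4, 65))  # negative gap: brain appears younger
print(brain_age_gap(67.1, 65))  # positive gap: brain appears older
```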
Keyword: Alzheimers; Learning & Memory
Link ID: 29954 - Posted: 10.04.2025
Tobi Thomas, health and inequalities correspondent

Scientists have linked the impact of living in an unequal society to structural changes in the brains of children – regardless of individual wealth – for the first time.

A study of more than 10,000 young people in the US discovered altered brain development in children from wealthy and lower-income families in areas with higher rates of inequality, which was also associated with poorer mental health.

The data was gathered from the Adolescent Brain Cognitive Development study and published in the journal Nature Mental Health. Researchers at King’s College London, Harvard University and the University of York then measured inequality within a particular US state by scoring how evenly income is distributed. States with higher levels of inequality included New York, Connecticut, California and Florida, while Utah, Wisconsin, Minnesota and Vermont were more equal.

MRI scans were analysed to study the surface area and thickness of regions in the cortex, including those involved in higher cognitive functions such as memory, emotion, attention and language. Connections between different regions of the brain were also analysed by the scans, where changes in blood flow indicate brain activity.

The research found that children living in areas with higher levels of societal inequality – socioeconomic imbalances and deprivation, for example – tended to have a reduced surface area of the brain’s cortex and altered connections between multiple regions of the brain. The findings, the first to reveal the impact societal inequality has on the structures of the brain, also provided evidence that the affected neurodevelopment might relate to future mental health and cognitive function. Notably, these brain changes in children were seen regardless of their economic background.

© 2025 Guardian News & Media Limited
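The article does not name the index the researchers used to score states; the Gini coefficient is the standard measure of how evenly income is distributed (0 means perfectly equal, values near 1 mean highly unequal), and a minimal sketch of it, with illustrative values only, looks like this:

```python
# Gini coefficient sketch (assumption: the standard index for income
# inequality; the article does not specify which score the study used).
def gini(incomes):
    """Return the Gini coefficient of a list of incomes (0 = equal)."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Closed form over sorted values: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n,
    # with ranks i starting at 1.
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([10, 10, 10, 10]))  # → 0.0: income perfectly evenly distributed
print(gini([0, 0, 0, 100]))    # → 0.75: one person holds all the income
```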
Keyword: Development of the Brain; Learning & Memory
Link ID: 29951 - Posted: 10.01.2025
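The study above scores states by how evenly income is distributed. The article does not name the exact metric, but the standard measure for this is the Gini coefficient, which runs from 0 (everyone earns the same) to 1 (one person earns everything); the sketch below computes it from a list of incomes, assuming that is the kind of score the researchers used.

```python
def gini(incomes):
    """Gini coefficient of a list of incomes.

    0.0 means perfectly equal incomes; values approaching 1.0 mean
    income is concentrated in fewer and fewer hands.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Rank-weighted formula over the sorted incomes:
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n, with ranks i = 1..n.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n


print(gini([10, 10, 10, 10]))  # 0.0  — perfectly even distribution
print(gini([0, 0, 0, 100]))    # 0.75 — highly unequal distribution
```

A state-level comparison would apply the same score to each state’s income data; higher values correspond to the more unequal states the article lists, such as New York and Florida.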
By Calli McMurray Studying animal behavior in the wild often gets hairy, with little experimental control and an abundance of extraneous data. And when multiple animals get together, the way they look, act and smell all influence one another, making it difficult to parse complex social interactions, says Andres Bendesky, associate professor of ecology, evolution and environmental biology at Columbia University. Robotic or animated partners, however, can simplify that equation. Studying animal-robot interaction gives researchers complete control over one partner during any tête-à-tête, Bendesky says. It makes it possible to present the same stimulus to an animal repeatedly or compare how different individuals react. And the method complements observation-based research: Scientists can use a robot- or animation-based paradigm to test ideas gleaned from studies that use artificial-intelligence tools to track behavior. Bendesky is part of a growing cohort of neuroscientists turning to robots to help them decode social interactions. The quirks are still being ironed out, but the approach is already helping several groups tackle questions about schooling, fighting and chatting behaviors. The rigor of the results depends on whether a critter believes what it sees, says Tim Landgraf, professor of artificial and collective intelligence at Freie Universität Berlin, who uses robots to study group behavior in guppies. That can be hard to gauge; there’s no handbook that describes what traits make a robot believable, he says. But researchers can compare how animals act toward a real peer versus a counterfeit one, says Steve Chang, associate professor of psychology and neuroscience at Yale University, who doesn’t work with robots but studies the social behavior of macaques and marmosets. © 2025 Simons Foundation
Keyword: Robotics; Sexual Behavior
Link ID: 29936 - Posted: 09.20.2025
By Sujata Gupta Anne-Laure Le Cunff was something of a wild child. As a teenager, she repeatedly disabled the school fire alarm to sneak smoke breaks and helped launch a magazine filled with her teachers’ fictional love lives. Later, as a young adult studying neuroscience, Le Cunff would spend hours researching complex topics but struggled to complete simple administrative tasks. And she often obsessed over random projects before abruptly abandoning them. Then, three years ago, a colleague asked Le Cunff if she might have attention-deficit/hyperactivity disorder, or ADHD, a condition marked by distractibility, hyperactivity and impulsivity. Doctors confirmed her colleague’s suspicions. But fearing professional stigma, Le Cunff — by then a postdoctoral fellow in the ADHD Lab at King’s College London — kept her diagnosis secret until this year. Le Cunff knew all too well about the deficits associated with ADHD. But her research — and personal experience — hinted at an underappreciated upside. “I started seeing … breadcrumbs pointing at a potential association between curiosity and ADHD,” she says. People within the ADHD community have long recognized that the condition can be both harmful and helpful. Researchers, though, have largely focused on the harms. And those studying treatments tend to define success as a reduction in ADHD symptoms, with little regard to possible benefits. That’s starting to change. For instance, Norwegian researchers asked 50 individuals with ADHD to describe their positive experiences with the disorder as part of an effort to develop more holistic treatments. People cited their creativity, energy, adaptability, resilience and curiosity, researchers reported in BMJ Open in October 2023. © Society for Science & the Public 2000–2025.
Rachel Fieldhouse Deep in the rainforests of the Democratic Republic of the Congo, Mélissa Berthet found bonobos doing something thought to be uniquely human. During the six months that Berthet observed the primates, they combined calls in several ways to make complex phrases1. In one example, bonobos (Pan paniscus) that were building nests together added a yelp, meaning ‘let’s do this’, to a grunt that says ‘look at me’. “It’s really a way to say: ‘Look at what I’m doing, and let’s do this all together’,” says Berthet, who studies primates and linguistics at the University of Rennes, France. In another case, a peep that means ‘I would like to do this’ was followed by a whistle signalling ‘let’s stay together’. The bonobos combine the two calls in sensitive social contexts, says Berthet. “I think it’s to bring peace.” The study, reported in April, is one of several examples from the past few years that highlight just how sophisticated vocal communication in non-human animals can be. In some species of primate, whale2 and bird, researchers have identified features and patterns of vocalization that have long been considered defining characteristics of human language. These results challenge ideas about what makes human language special — and even how ‘language’ should be defined. Perhaps unsurprisingly, many scientists turn to artificial intelligence (AI) tools to speed up the detection and interpretation of animal sounds, and to probe aspects of communication that human listeners might miss. “It’s doing something that just wasn’t possible through traditional means,” says David Robinson, an AI researcher at the Earth Species Project, a non-profit organization based in Berkeley, California, that is developing AI systems to decode communication across the animal kingdom. As the research advances, there is increasing interest in using AI tools not only to listen in on animal speech, but also to potentially talk back. © 2025 Springer Nature Limited
Keyword: Animal Communication; Language
Link ID: 29931 - Posted: 09.17.2025
Jon Hamilton People who inherit two copies of a gene variant called APOE4 have a 60% chance of developing Alzheimer's by age 85. Only about 2% to 3% of people in the U.S. have this genetic profile, and most of them don't know it because they've never sought genetic testing. But three scientists are among those who did get tested, and learned that they are in the high-risk group. Now, each is making an effort to protect not only their own brain, but the brains of others with the genotype known as APOE4-4. "I just felt like the end of the world," says June, who asked to use only her first name out of fear that making her genetic status public could affect her job or health insurance. June was 57 when she found out. As someone with a doctorate in biochemistry, she quickly understood what the results meant. "People with our genotype are almost destined to get the disease," she says. "We tend to get symptoms 7 to 10 years earlier than the general population, which means that I had about seven years left before I may get the disease." At first, June spent sleepless nights online, reading academic papers about Alzheimer's and genetics. She even looked into physician-assisted suicide in an effort to make sure she would not become a burden to her adult son. © 2025 npr
Keyword: Alzheimers; Genes & Behavior
Link ID: 29913 - Posted: 09.03.2025