Most Recent Links
By Alexa Robles-Gil Having an imaginary friend, playing house or daydreaming about the future were long considered uniquely human abilities. Now, scientists have conducted the first study indicating that apes have the ability to play pretend as well. The findings, published Thursday in the journal Science, suggest that imagination is within the cognitive potential of an ape and can possibly be traced back to our common evolutionary ancestors. “This is one of those things that we assume is distinct about our species,” said Christopher Krupenye, a cognitive scientist at Johns Hopkins University and an author of the study. “This kind of finding really shows us that there’s much more richness to these animals’ minds than people give them credit for,” he said. Researchers knew that apes were capable of certain kinds of imagination. If an ape watches someone hide food in a cup, it can imagine that the food is there despite not seeing it. Because that perception is the reality — the food is actually there — it requires the ape to sustain only one view of the world, the one that it knows to be true. “This kind of work goes beyond it,” Dr. Krupenye said. “Because it suggests that they can, at the same time, consider multiple views of the world and really distinguish what’s real from what’s imaginary.” Bonobos, an endangered species found only in the Democratic Republic of Congo, are difficult to study in the wild. For this research, Dr. Krupenye and Amalia Bastos, a cognitive scientist at the University of St. Andrews, relied on an organization known as the Ape Initiative to study Kanzi, a male bonobo famous for demonstrating some understanding of spoken English. (Kanzi was an enculturated ape born in captivity; he died last year at age 44.) © 2026 The New York Times Company
Keyword: Consciousness; Evolution
Link ID: 30112 - Posted: 02.07.2026
By Nora Bradford For more than a century, psychologists thought that the infant experience was, as the psychologist and philosopher William James famously put it, a “blooming, buzzing confusion.” But new research suggests babies are born with a surprisingly sophisticated neurological toolkit that can organize the visual world into categories and pick out the beat in a song. In the first of two new studies, neuroscientists managed a rare feat: performing functional MRI (fMRI) scans on more than 100 awake 2-month-old infants to see how their brains categorize visual objects. fMRI requires near-stillness, which makes scanning babies notoriously difficult. While the infants lay in the machines, images of animals, food, household objects and other familiar items appeared above their heads like “an IMAX for babies,” says Cliona O’Doherty, a developmental neuroscientist at Stanford University who conducted the work at Trinity College Dublin. “MRI is difficult even under ‘ideal’ circumstances when research participants can follow instructions to hold still,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. “Babies can’t take instruction, so these researchers must have the patience of saints.” The imaging showed that a brain region called the ventral visual cortex, responsible for recognizing what we see, already responded similarly to that of adults, O’Doherty and colleagues report February 2 in Nature Neuroscience. In both adults and 2-month-olds, the ventral visual cortex’s activity is distinct for different categories of objects, pushing back against the traditional view that the brain gradually learns to distinguish between categories throughout development. © Society for Science & the Public 2000–2026
Keyword: Hearing; Development of the Brain
Link ID: 30111 - Posted: 02.07.2026
By Natalia Mesa A region of the cerebellum shows language specificity akin to that of cortical language regions, indicating that it might be part of the broader language network, according to a new brain-imaging study. “This is the first time we see an area outside of the core left-hemisphere language areas that behaves so similarly to those core areas,” says study investigator Ev Fedorenko, associate professor of brain and cognitive sciences at the Massachusetts Institute of Technology. Initially thought to coordinate only movement, the cerebellum also contributes to cognitive processes, such as social reward, abstract reasoning and working memory, according to studies from the past decade. But despite the fact that people with cerebellar lesions have subtle language struggles, the region’s contributions to that skill have been ignored until recently, Fedorenko says. With this new work, “I think it becomes harder to dismiss language responses as somehow artifactual.” Fedorenko and her team analyzed nearly 1,700 whole-brain functional MRI experiments conducted over the course of 15 years. They originally collected and analyzed those scans to identify language-selective regions of the neocortex, but they reanalyzed many of them to determine the cerebellum’s role in linguistic processing. Four cerebellar regions activated robustly when participants performed language-related tasks, such as reading passages of text or listening to someone else reading the passages aloud, in line with previous work. But only one region responded exclusively to these language-related tasks; it did not activate during a variety of nonlinguistic tasks—including movement, arithmetic tasks and a spatial working memory task—or when participants listened to music or watched videos of faces and bodies. The findings were published last month in Neuron. © 2026 Simons Foundation
Keyword: Language
Link ID: 30110 - Posted: 02.07.2026
By Molly Glick Not long after upending federal diet guidelines in order to prioritize “real food” on our plates, United States Health and Human Services Secretary Robert F. Kennedy Jr. has offered a new piece of questionable advice. During a tour to promote these dietary recommendations, Kennedy recently claimed that a keto diet can cure schizophrenia—an assertion that experts have quickly thrown cold water on. The ketogenic diet promotes fat-rich meals and low amounts of carbohydrates. While keto eating has skyrocketed in popularity in recent years—it was the most-Googled diet in the U.S. in 2020—it was initially designed in the early 20th century for patients with epilepsy. More recent studies have confirmed that the diet is effective for certain types of epilepsy because it can control seizures. Meanwhile, we have much less evidence for its impacts on symptoms of schizophrenia. So far, small studies have offered some early evidence that ketogenic diets may help people with the condition. “There is currently no credible evidence that ketogenic diets cure schizophrenia,” Mark Olfson, a psychiatrist at Columbia University, told The New York Times. Kennedy also proclaimed that the diet can essentially cure bipolar disorder, according to studies he recently read. But as with schizophrenia, keto’s impacts on bipolar disorder have only been examined in limited numbers of patients so far. Preliminary findings have also hinted that a keto diet could ease symptoms of depression. It may offer “small antidepressant benefits” for people who don’t respond to medication, according to a recently published JAMA Psychiatry paper. But this work is in the early stages as well and remains far from conclusive. © 2026 NautilusNext Inc.
Keyword: Schizophrenia; Depression
Link ID: 30109 - Posted: 02.07.2026
Peter Lukacs Popular wisdom holds we can ‘rewire’ our brains: after a stroke, after trauma, after learning a new skill, even with 10 minutes a day on the right app. The phrase is everywhere, offering something most of us want to believe: that when the brain suffers an assault, it can be restored with mechanical precision. But ‘rewiring’ is a risky metaphor. It borrows its confidence from engineering, where a faulty system can be repaired by swapping out the right component; it also smuggles that confidence into biology, where change is slower, messier and often incomplete. The phrase has become a cultural mantra that is easier to comprehend than the scientific term, neuroplasticity – the brain’s ability to change and form new neural connections throughout life. But what does it really mean to ‘rewire’ the brain? Is it a helpful shorthand for describing the remarkable plasticity of our nervous system or has it become a misleading oversimplification that distorts our grasp of science? After all, ‘rewiring your brain’ sounds like more than metaphor. It implies an engineering project: a system whose parts can be removed, replaced and optimised. The promise is both alluring and oddly mechanical. The metaphor actually did come from engineering. To an engineer, rewiring means replacing old and faulty circuits with new ones. As the vocabulary of technology crept into everyday life, it brought with it a new way of thinking about the human mind. Medical roots of the phrase trace back to 1912, when the British surgeon W Deane Butcher compared the body’s neural system to a house’s electrical wiring, describing how nerves connect to muscles much like wires connect appliances to a power source. By the 1920s, the Harvard psychologist Leonard Troland was referring to the visual system as ‘an extremely intricate telegraphic system’, reinforcing the comparison between brain function and electrical networks. © Aeon Media Group Ltd. 2012-2026.
Keyword: Learning & Memory; Development of the Brain
Link ID: 30108 - Posted: 02.04.2026
Elizabeth Quill Think about your breakfast this morning. Can you imagine the pattern on your coffee mug? The sheen of the jam on your half-eaten toast? Most of us can call up such pictures in our minds. We can visualize the past and summon images of the future. But for an estimated 4% of people, this mental imagery is weak or absent. When researchers ask them to imagine something familiar, they might have a concept of what it is, and words and associations might come to mind, but they describe their mind’s eye as dark or even blank. Systems neuroscientist Mac Shine at the University of Sydney, Australia, first realized that his mental experience differed in this way in 2013. He and his colleagues were trying to understand how certain types of hallucination come about, and were discussing the vividness of mental imagery. “When I close my eyes, there’s absolutely nothing there,” Shine recalls telling his colleagues. They immediately asked him what he was talking about. “Whoa. What’s going on?” Shine thought. Neither he nor his colleagues had realized how much variation there is in the experiences people have when they close their eyes. This moment of revelation is common to many people who don’t form mental images. They report that they might never have thought about this aspect of their inner life if not for a chance conversation, a high-school psychology class or an article they stumbled across (see ‘How do you imagine?’). Although scientists have known for more than a century that mental imagery varies between people, the topic received a surge of attention when, a decade ago, an influential paper coined the term aphantasia to describe the experience of people with no mental imagery. © 2026 Springer Nature Limited
Keyword: Attention; Consciousness
Link ID: 30107 - Posted: 02.04.2026
By Ellen Barry A new analysis of birth cohorts in the Canadian province of Ontario has found a striking rise in the incidence of psychotic disorders among young people, a finding that its authors said could reflect teens’ increasing use of substances like cannabis, stimulants and hallucinogens. The study, published on Monday in The Canadian Medical Association Journal, found that the rate of new diagnoses of psychotic disorders among people ages 14 to 20 increased by 60 percent between 1997 and 2023, while new diagnoses at older ages plateaued or declined. Compared with people born in the late 1970s, those born in the early 2000s were about twice as likely to have been diagnosed with a psychotic disorder by age 20. The researchers included 12 million people born in Ontario between 1960 and 2009, of whom 0.9 percent were diagnosed with a psychotic disorder during the study period. The study was epidemiological and did not try to identify a cause for the rising prevalence. There are a number of possible explanations, among them older paternal age, the stress of migration, neonatal health problems and early intervention programs that now regularly identify the disorders at younger ages, the authors note. But Dr. Daniel Myran, one of the study’s authors, said he undertook the study, in part, to follow up on concerns that the legalization of cannabis might increase population-level rates of schizophrenia and other psychotic disorders. “I was expecting to see some increases in these younger folks, but I was quite surprised by the scale,” said Dr. Myran, a family physician and research chair at North York General Hospital. He said the results suggested a need for more research into the impact of expanding cannabis use by young people. © 2026 The New York Times Company
Keyword: Schizophrenia; Drug Abuse
Link ID: 30106 - Posted: 02.04.2026
By Marla Vacek Broadfoot Nearly 1 in 8 dementia cases — about half a million nationwide — may be linked to insomnia. The new findings, reported December 27 in the Journals of Gerontology: Series A, add weight to growing evidence that sleep is a modifiable risk factor for dementia, akin to hearing loss and hypertension. The study does not establish a direct cause-and-effect relationship between insomnia and dementia for individuals, says Yuqian Lin, a data analyst at Massachusetts General Hospital in Boston. Rather, she says, it looks at the overall extent to which insomnia may contribute to dementia across the population. Lin and her colleagues analyzed data from the National Health and Aging Trends Study, or NHATS, a long-running survey of 5,900 U.S. adults ages 65 and older. Participants reported whether they had difficulty falling asleep, staying asleep or both. Dementia was identified using standard research tools that rely on cognitive testing and reports from family members or caregivers. To estimate the impact of insomnia on the population, Lin and her team calculated the proportion of dementia cases that could theoretically be prevented if insomnia-related sleep disturbances were eliminated. The calculation combined the prevalence of insomnia and dementia in the NHATS population with relative risk estimates drawn from recent large meta-analyses linking insomnia to dementia later in life. © Society for Science & the Public 2000–2026.
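The population-level estimate described above is the population attributable fraction, conventionally computed with Levin's formula from exposure prevalence and relative risk. As a minimal sketch of that calculation (the prevalence and relative-risk values below are illustrative placeholders, not numbers from the NHATS study):

```python
def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1)).

    prevalence    -- proportion of the population exposed (here, insomnia symptoms)
    relative_risk -- risk of the outcome (dementia) in exposed vs. unexposed people
    """
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical inputs: 30% of older adults with insomnia symptoms, RR of 1.5.
paf = population_attributable_fraction(0.30, 1.5)
print(f"{paf:.1%} of cases theoretically attributable to the exposure")
```

The fraction answers the counterfactual question the researchers posed: what share of cases would be avoided if the exposure were eliminated, assuming the relative-risk estimate is causal.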
Keyword: Sleep; Alzheimers
Link ID: 30105 - Posted: 02.04.2026
By Jake Buehler Though fearsome predators, snakes can go weeks or even months without eating. Now, scientists think they may know how they do it. Snakes have lost the genes to produce ghrelin, a key hormone that regulates appetite, digestion, and fat storage, researchers report today in Royal Society Open Biology. Chameleons and a group of desert lizards called toadhead agamas, which also go long stretches between meals, have lost the same genes, hinting that cutting off ghrelin is a key way to excel at fasting, possibly by suppressing appetite and holding onto fat stores. “I give [the researchers] a lot of credit for looking more deeply into the data that was staring us all in the face—myself included,” says Todd Castoe, a genomicist at the University of Texas at Arlington not involved with the study. The hormone is ubiquitous across vertebrates, from fish to mammals. So finding that reptiles have repeatedly ditched it is “pretty remarkable,” he says. When scientists first discovered ghrelin nearly 30 years ago, they thought this “hunger hormone” could be key to fighting obesity in humans. But it hasn’t been that simple. Since then, researchers have found that ghrelin has a complicated role within a network of hormones constantly tweaking hunger and energy stores. And even though ghrelin is common across vertebrates, it’s been unclear how the hormone has evolved in different lineages. So in the new study, Rui Resende Pinto, an evolutionary biologist at the University of Porto, and his colleagues focused on reptiles, many of which can go long periods without food. The researchers scanned the genomes of 112 species. In snakes, chameleons, and toadhead agamas, ghrelin genes were either missing or so warped by mutations they could no longer encode the hormone, the team found. 
The degree of the genes’ erosion also varied considerably between snake families: Some snakes such as boas and pythons had malformed ghrelin genes, but others, such as vipers, cobras, and their relatives, barely had anything left.
Keyword: Obesity; Evolution
Link ID: 30104 - Posted: 02.04.2026
By Ingrid Wickelgren The human brain is a vast network of billions of neurons. By exchanging signals to depress or excite each other, they generate patterns that ripple across the brain up to 1,000 times per second. For more than a century, that dizzyingly complex neuronal code was thought to be the sole arbiter of perception, thought, emotion, and behavior, as well as related health conditions. If you wanted to understand the brain, you turned to the study of neurons: neuroscience. But a recent body of work from several labs, published as a trio of papers in Science in 2025, provides the strongest evidence yet that a narrow focus on neurons is woefully insufficient for understanding how the brain works. The experiments, in mice, zebrafish, and fruit flies, reveal that the large brain cells called astrocytes serve as supervisors. Once viewed as mere support cells for neurons, astrocytes are now thought to help tune brain circuits and thereby control overall brain state or mood — say, our level of alertness, anxiousness, or apathy. Astrocytes, which outnumber neurons in many brain regions, have complex and varied shapes, and sometimes tendrils, that can envelop hundreds of thousands or millions of synapses, the junctions where neurons exchange molecular signals. This anatomical arrangement perfectly positions astrocytes to affect information flow, though whether or how they alter activity at synapses has long been controversial, in part because the mechanisms of potential interactions weren’t fully understood. In revealing how astrocytes temper synaptic conversations, the new studies make astrocytes’ influence impossible to ignore. “We live in the age of connectomics, where everyone loves to say [that] if you understand the connections [between neurons], we can understand how the brain works. 
That’s not true,” said Marc Freeman, the director of the Vollum Institute, an independent neuroscience research center at Oregon Health and Science University, who led one of the new studies. “You can get dramatic changes in firing patterns of neurons with zero changes in [neuronal] connectivity.” © 2026 Simons Foundation
Keyword: Glia; Learning & Memory
Link ID: 30103 - Posted: 01.31.2026
By Amy X. Wang Alice, fumbling through Wonderland, comes across a mushroom. One bite of it shrinks her down in size. Chowing on the other side makes her swell up, huge, taller than the treetops. Urgently, Alice sets to work “nibbling first at one and then at the other, and growing sometimes taller and sometimes shorter,” until finally she succeeds in “bringing herself down to her usual height” — whereupon everything feels “quite strange.” Is this Lewis Carroll’s 1865 fantasy tale or … the average body-conscious, improvement-obsessed 2026 Whole Foods shopper? Mushrooms, long venerated in literature as dark transformative forces, have become Goopified. Nowadays, you can chug “adaptogenic mushroom coffee,” slurp “functional mushroom cocoa,” doze off with “mushroom sleep drops” or ingest/imbibe any number of other tinctures in the billion-dollar fungal supplements market that promise to fine-tune, or even totally recalibrate, the self. The latest and hottest items in this booming new retail category are mushroom gummies, gushed over by wellness influencers, spilling out from supermarket shelves right there next to your standard cough drops and protein bars. Fungi have aided medical advances like antibiotics and statins, it’s true, and certain species have shown promising results in fighting Parkinson’s or cancer — but what these pastel gumdrops proffer is a broader, more elliptical “cellular well-being.” The mystique feels intentional on product-makers’ part: Like Carroll’s baffled heroine, maybe you’re meant to be in a bit of thrall to the mysterious, almighty mushroom — lurching through Wonderland, charmed and confused by design. After all, you wonder, what are these ancient, alien creatures, growing in the secret dark? Hippocrates was supposedly using them to cauterize wounds around the 5th century B.C.E. In the Super Mario video games, mushrooms might give you extra lives; in HBO’s “The Last of Us,” they bring about the ruin of human civilization. 
© 2026 The New York Times Company
Keyword: Attention; Drug Abuse
Link ID: 30102 - Posted: 01.31.2026
By Calli McMurray In 2010, Ardem Patapoutian unmasked a piece of cellular machinery that had long evaded identification: PIEZO channels, pores wrenched open by changes in a cell’s membrane tension to allow ions to flow through, thereby converting mechanical force into electrical activity. The discovery marked a turning point for the field of mechanosensation—a process that can be unwieldy to study, says Arthur Beyder, associate professor of physiology and medicine at the Mayo Clinic, because “it reaches its fingers into everything.” The field needed “something to grab onto,” he says, to untangle these processes from other sensory ones—and PIEZO channels provided the first handhold. The PIEZO discovery garnered much attention, and since then, a flurry of studies have outlined how the channels contribute to touch, itch and proprioception. In 2021, Patapoutian shared the Nobel Prize in Physiology or Medicine for his contributions to this work. Now, a growing cadre of researchers is using these receptors as a tool to explore interoception, or the brain’s sense of what the internal organs are doing. “We’re seeing a resurgence and an expansion of research in this area,” says Miriam Goodman, professor of molecular and cellular physiology at Stanford University. The field, she adds, is in the middle of a “PIEZO-driven renaissance.” Even a body at rest is in constant motion: The heart pumps blood, the lungs expand and contract, the gut squeezes food, and the bladder stretches with urine. Biologists had intuited that mechanical force was a key part of these processes—and also part of how organs communicate with the brain—but for decades they did not have a way to dive into the molecular mechanisms behind them. © 2026 Simons Foundation
Keyword: Pain & Touch
Link ID: 30101 - Posted: 01.31.2026
By Azeen Ghorayshi Health Secretary Robert F. Kennedy Jr. has overhauled a panel that helps the federal government set priorities for autism research and social services, installing several members who have said that vaccines can cause autism despite decades of research that has failed to establish such a link. The panel, the Interagency Autism Coordinating Committee, was established in 2000 and has historically included autistic people, parents, scientists and clinicians, as well as federal employees, who hold public meetings to debate how federal funds should best be allocated to support people with autism. The 21 new public members selected by Mr. Kennedy include many outspoken activists, among them a former employee of a super PAC that supported Mr. Kennedy’s presidential campaign, a doctor who has been sued over dangerous heavy metal treatments for a young child with autism, a political economist who has testified against vaccines before a congressional committee, and parents who have spoken publicly about their belief that their children’s autism was caused by vaccines. The group, which also includes 21 government members across many federal agencies, will advise the federal government on how to prioritize the $2 billion allocated by Congress toward autism research and services over the next five years. Though it’s not yet clear what the committee will do — or what it can do, given that it serves only an advisory function — many longtime autism advocates and researchers said they were alarmed by the fact that the committee seemed stacked to advance Mr. Kennedy’s priorities on vaccines. “The new committee does not represent the autism community,” said Alison Singer, who served on the committee from 2007 to 2019. Ms. Singer, whose 28-year-old daughter has profound autism, is the head of the Autism Science Foundation. 
“It disproportionately, excruciatingly so, represents an extremely small subset of families who believe vaccines cause autism.” © 2026 The New York Times Company
Keyword: Autism
Link ID: 30100 - Posted: 01.31.2026
By Simon Makin Positive thinking may boost the body’s defenses against disease. Increasing activity in a brain region that controls motivation and expectation, specifically the brain’s reward system, is linked with making more antibodies after receiving a vaccine. The finding suggests these boosts were related to the placebo effect, researchers report January 19 in Nature Medicine. “Placebo is a self-help mechanism, and here we actually harness it,” says Talma Hendler, a neuroscientist at Tel Aviv University. “This suggests we could use the brain to help the body fight illness.” The work is important because it “is first-in-human evidence of a relationship between brain reward systems and immune function,” says Tor Wager, a neuroscientist at Dartmouth College in Hanover, N.H., who was not involved in the study. The study was not designed to test vaccine effectiveness. Larger studies, including more complete immune assessments, will be required to test this association as a medical intervention. Scientists have found many links between the brain and bodily health. Both negative and positive mental states can affect the immune system, and studies in rodents have suggested that the brain’s reward network is involved in these effects. To find out if the same circuitry was at play in humans, Hendler and colleagues trained healthy volunteers to regulate their brain activity using neurofeedback, a technique that uses brain imaging to show users the activity of the area they are trying to boost. The team randomly assigned 85 participants to receive training aimed at increasing activity in either their reward network or a different network, or to receive no training. © Society for Science & the Public 2000–2026.
Keyword: Neuroimmunology
Link ID: 30099 - Posted: 01.31.2026
By Alessio Cozzolino After a heart attack, the heart “talks” to the brain. And that conversation may make recovery worse. Shutting down nerve cells that send messages from injured heart cells to the brain boosted the heart’s ability to pump and decreased scarring, experiments in mice show. Targeting inflammation in a part of the nervous system where those “damage” messages wind up also improved heart function and tissue repair, scientists report January 27 in Cell. “This research is another great example highlighting that we cannot look at one organ and its disease in isolation,” says Wolfram Poller, an interventional cardiologist at Massachusetts General Hospital and Harvard Medical School who was not involved in the study. “And it opens the door to new therapeutic strategies and targets that go beyond the heart.” Someone in the United States has a heart attack about every 40 seconds, according to the U.S. Centers for Disease Control and Prevention. That adds up to about 805,000 people each year. A heart attack is a mechanical problem caused by the obstruction of a coronary artery, usually by a blood clot. If the blockage lasts long enough, the affected cells may start to die. Heart attacks can have long-term effects such as a weakened heart, a reduced ability to pump blood, irregular heart rhythms, and a higher risk of heart failure or another heart attack. Although experts knew from previous research that the nervous and immune systems could amplify inflammation and slow healing, the key players and pathways involved were unknown, says Vineet Augustine, a neurobiologist at the University of California, San Diego. © Society for Science & the Public 2000–2026
Keyword: Neuroimmunology
Link ID: 30098 - Posted: 01.28.2026
By Jackie Flynn Mogensen Everyone who menstruates and lives long enough experiences menopause in one form or another. Yet despite that, research into what happens during this natural cessation of menstruation and why is limited. Scientists know that menopause can cause a myriad of neurological symptoms, from hot flashes to poor sleep to depression. But what is going on in people’s brains during this period is still murky. Now new research offers clues to a link between menopause and changes in the brain’s gray matter, as well as anxiety and depression. Using brain scans from 10,873 people in the U.K., the researchers found that postmenopausal participants showed lower volumes of gray matter in the entorhinal cortex and hippocampus, which are involved in storing and retrieving memories, and in the anterior cingulate, which is involved in emotional regulation. The researchers also looked at whether hormone replacement therapy (HRT), a frontline but still rarely prescribed treatment for symptoms of menopause, might ameliorate some of these changes. Barbara Sahakian, a psychiatry professor at the University of Cambridge and an author of the study, explains that she and her colleagues theorized HRT might influence people’s experiences, tamping down their neurological symptoms, for instance. “That was the hypothesis,” she says, “but it didn’t seem to pan out completely that way.” They found that people who were treated with HRT for menopause showed lower volumes of gray matter in some areas of the brain than those who did not receive HRT. The HRT group also showed higher rates of anxiety and depression—importantly, Sahakian says their work doesn’t find that HRT treatment causes brain changes or menopause symptoms. Previous research suggests HRT prescribed during the run-up to menopause and early postmenopause can reduce anxiety, depending on the kind of HRT and dose, in at least some women. 
And a subsequent analysis found that participants who were prescribed HRT were more likely to have reported anxiety and depression before HRT treatment, the study explains. © 2025 Scientific American
Keyword: Hormones & Behavior; Development of the Brain
Link ID: 30097 - Posted: 01.28.2026
By Joshua P. Johansen Growing up in the 1980s in Santa Cruz, California, where redwood-covered mountains descend to the rocky edge of the Pacific, might sound idyllic. But in the dark wake of the drug-fueled ’70s, the beach town could also be frightening. There was a bully at my high school who once chased me down the street threatening to hurt me. Unsurprisingly, catching sight of him in the hallways or at the skate park filled me with dread. Just walking past his house would trigger a wave of anxiety. Yet if I saw him in class, with teachers present, I felt more at ease. How did my brain know to fear him only in specific circumstances? More broadly, how did I infer emotional significance from the world around me? The fact that I or anyone can make these judgments suggests that emotion arises from an internal model in the brain that supports inference, abstraction and flexible, context-dependent evaluations of threat or safety. These model-based emotion systems helped me infer danger from otherwise innocuous features of the environment, such as the bully’s house, or to downgrade my alarm, as I did when an adult was present. Understanding the neural basis of emotion is a central question in neuroscience, with profound implications for the treatment of anxiety, trauma and mood disorders. Yet the field remains divided over what emotions are and how they should be defined, limiting progress. On one side are neurobiologists focused on the neural underpinnings of simple learned and innate defensive behaviors. On the other are psychological theorists who view emotions as subjective experiences arising from complex conceptual brain models of the world that are unique to humans. This divide fuels persistent arguments over whether emotion should be defined primarily as a conscious state or not. Though subjective feelings are undeniably important, limiting our definitions to conscious phenomena prevents us from studying the underlying mechanisms in nonhuman species. 
To move forward, we need to identify the conserved neural processes that support higher-order, internal-model-based emotional experiences across species, regardless of whether they rise to consciousness. © 2026 Simons Foundation
Keyword: Emotions; Consciousness
Link ID: 30096 - Posted: 01.28.2026
Jon Hamilton At a press conference in late 2025, federal officials made some big claims about leucovorin, a prescription drug usually reserved for people on cancer chemotherapy. "We're going to change the label to make it available [to children with autism spectrum disorder]," said Dr. Marty Makary, commissioner of the Food and Drug Administration. "Hundreds of thousands of kids, in my opinion, will benefit." The Trump administration has suggested that leucovorin, a drug used in cancer treatment, might have some benefit for children with autism. Many researchers and families aren't so sure. The FDA still hasn't made that label change. Since Makary's remarks, though, more than 25,000 people have joined a Facebook group called Leucovorin for Autism. Most members appear to be parents seeking the drug for their autistic children. Also since the press conference, some doctors have begun writing off-label prescriptions for autistic children, against the advice of medical groups including the American Academy of Pediatrics. The buzz about leucovorin has led to a shortage of the drug. In response, the FDA is temporarily allowing imports of tablets that are made in Spain and sold in Canada, but not approved in the U.S. All of this is part of a familiar cycle for Dr. Paul Offit, who directs the vaccine education center at Children's Hospital of Philadelphia. Offit says he realized years ago that leucovorin's popularity was far ahead of the science. © 2026 npr
Keyword: Autism
Link ID: 30095 - Posted: 01.28.2026
By Yasemin Saplakoglu On a remote island in the Indian Ocean, six closely watched bats took to the star-draped skies. As they flew across the seven-acre speck of land, devices implanted in their brains pinged data back to a group of sleepy-eyed neuroscientists monitoring them from below. The researchers were working to understand how these flying mammals, who have brains not unlike our own, develop a sense of direction while navigating a new environment. The research, published in Science, reported that the bats used a network of brain cells that informed their sense of direction around the island. Their “internal compass” was tuned by neither the Earth’s magnetic field nor the stars in the sky, but rather by landmarks that informed a mental map of the animal’s environment. These first-ever wild experiments in mammalian mapmaking confirm decades of lab results and support one of two competing theories about how an internal neural compass anchors itself to the environment. “Now we’re understanding a basic principle about how the mammalian brain works” under natural, real-world conditions, said the behavioral neuroscientist Paul Dudchenko, who studies spatial navigation at the University of Stirling in the United Kingdom and was not involved in the study. “It will be a paper people will be talking about for 50 years.” Follow-up experiments that haven’t yet been published show that other cells critical to navigation encode much more information in the wild than they do in the lab, emphasizing the need to test neurobiological theories in the real world. Neuroscientists believe that a similar internal compass, composed of neurons known as “head direction cells,” might also exist in the human brain — though they haven’t yet been located. If they are someday found, the mechanism could shed light on common sensations such as getting “turned around” and quickly reorienting oneself. 
It might even explain why some of us are so bad at finding our way. © 2026 Simons Foundation
Keyword: Learning & Memory
Link ID: 30094 - Posted: 01.24.2026
By Claudia López Lloreda The U.S. Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative is kicking off a new phase. In a road map published in November, it identified four research priorities for the next decade: integrating its databases, informing precision circuit therapies, understanding human neuroscience and advancing NeuroAI. The plan shows a thoughtful effort to “protect a very important initiative,” says J. Anthony Movshon, professor of neural science and psychology at New York University—at a time when its future seems unsettled. The BRAIN Initiative is co-led by the directors of the National Institute of Mental Health and the National Institute of Neurological Disorders and Stroke. But the NIMH has had an acting director since June 2024. Last month, the Trump administration terminated the initiative’s other co-director—Walter Koroshetz—from his role as director of the National Institute of Neurological Disorders and Stroke. And it is not clear whether the initiative will have sufficient funding or support to undertake this decade-long effort, says Joshua Sanes, professor emeritus of molecular and cellular biology at Harvard University and contributing editor for The Transmitter. “My guess is that if things continue politically the way they’re going now, [these goals] would not be accomplished in the United States in the next 10 years.” Even if the BRAIN Initiative receives the amount of funding it is expecting, many neuroscientists are too busy grappling with the fallout of grant cancellations, hiring freezes and the loss of training programs to think about the future, says Eve Marder, university professor of biology at Brandeis University. 
“I’m talking to all these people who are struggling to keep their labs open.” “You can have all the dreams in the universe,” but these big-picture speculations, which may require vast resources, are hard to reconcile with the erosion and destruction of academic science and training programs for young investigators, she adds. “It is difficult to look at a 10-year horizon, and [it] may be a waste of time and effort when we don’t know what is happening to science funding in the next year.” © 2026 Simons Foundation
Keyword: Brain imaging
Link ID: 30093 - Posted: 01.24.2026