Most Recent Links
Heidi Ledford Telltale features in standard brain images can reveal how quickly a person is ageing, a study of more than 50,000 brain scans has shown. Pivotal features include the thickness of the cerebral cortex — a region that controls language and thinking — and the volume of grey matter that it contains. These and other characteristics can predict how quickly a person’s ability to think and remember will decline with age, as well as their risk of frailty, disease and death. Although it’s too soon to use the new results in the clinic, the test provides advantages over previously reported ‘clocks’ — typically based on blood tests — that purport to measure the pace of ageing, says Mahdi Moqri, a computational biologist who studies ageing at Harvard Medical School in Boston, Massachusetts. “Imaging offers unique, direct insights into the brain’s structural ageing, providing information that blood-based or molecular biomarkers alone can’t capture,” says Moqri, who was not involved in the study. The results were published today in Nature Aging. Genetics, environment and disease all affect the speed of biological ageing. As a result, chronological age does not always reflect the pace at which time takes its toll on the body. Researchers have been racing to develop measures to fill that gap. Ageing clocks could be used early in life to assess an individual’s risk of age-related illness, when it might still be possible to intervene. They could also aid testing of treatments aimed at slowing ageing, by providing a marker to track the effects of the intervention in real time. © 2025 Springer Nature Limited
Keyword: Development of the Brain; Brain imaging
Link ID: 29848 - Posted: 07.02.2025
By Mohana Ravindranath A new analysis of data gathered from a small Indigenous population in the Bolivian Amazon suggests some of our basic assumptions about the biological process of aging might be wrong. Inflammation is a natural immune response that protects the body from injury or infection. Scientists have long believed that long-term, low-grade inflammation — also known as “inflammaging” — is a universal hallmark of getting older. But this new data raises the question of whether inflammation is directly linked to aging at all, or if it’s linked to a person’s lifestyle or environment instead. The study, which was published today, found that people in two nonindustrialized areas experienced a different kind of inflammation throughout their lives than more urban people — likely tied to infections from bacteria, viruses and parasites rather than the precursors of chronic disease. Their inflammation also didn’t appear to increase with age. Scientists compared inflammation signals in existing data sets from four distinct populations in Italy, Singapore, Bolivia and Malaysia; because they didn’t collect the blood samples directly, they couldn’t make exact apples-to-apples comparisons. But if validated in larger studies, the findings could suggest that diet, lifestyle and environment influence inflammation more than aging itself, said Alan Cohen, an author of the paper and an associate professor of environmental health sciences at Columbia University. “Inflammaging may not be a direct product of aging, but rather a response to industrialized conditions,” he said, adding that this was a warning to experts like him that they might be overestimating its pervasiveness globally. “How we understand inflammation and aging health is based almost entirely on research in high-income countries like the U.S.,” said Thomas McDade, a biological anthropologist at Northwestern University. 
But a broader look shows that there’s much more global variation in aging than scientists previously thought, he added. © 2025 The New York Times Company
Keyword: Development of the Brain; Neuroimmunology
Link ID: 29847 - Posted: 07.02.2025
By Laura Sanders GLP-1 drugs may possess a new power: Easing migraines. In a small, preliminary study, a GLP-1 drug nearly halved the number of days people spent with a migraine in a given month. The results, presented June 21 at the European Academy of Neurology Congress in Helsinki, Finland, expand the possible benefits of the powerful new class of obesity and diabetes drugs. These pernicious, debilitating headaches are estimated to affect one billion people worldwide. Earlier studies have shown that GLP-1 agonists can reduce the pressure inside the skull, a squeeze that’s been implicated in migraines. Neurologist Simone Braca of the University of Naples Federico II in Italy and his colleagues explored whether liraglutide, an older relative of Ozempic and Wegovy, might help migraine sufferers. Thirty-one adults, 26 of them women, got daily injections of liraglutide for 12 weeks. These adults all had obesity and continued to take their current migraine medicines too. At the start of the experiment, participants had headaches on about 20 days out of a month. After 12 weeks of liraglutide, the average number dropped to about 11 days. “Basically, we observed that patients saw their days with headache halved, which is huge,” Braca says. Participants’ weight stayed about the same during the trial, suggesting that headache reductions weren’t tied to weight loss. If the results hold up in larger studies, they may point to treatments for migraine sufferers who aren’t helped by existing drugs. The results may also lead to a deeper understanding of the role of pressure inside the head in migraines, Braca says. © Society for Science & the Public 2000–2025.
Keyword: Obesity; Pain & Touch
Link ID: 29846 - Posted: 07.02.2025
By Claudia López Lloreda When it comes to cognition and behavior, neurons usually take center stage. They famously drive everything from thoughts to movements by way of synaptic communication, with the help of neuromodulators such as dopamine, norepinephrine and certain immune molecules that regulate neuronal activity and plasticity. But astrocytes play essential roles in these processes behind the scenes, according to four independent studies published in the past two months. Rather than acting solely on neurons, neuromodulators also act on astrocytes to influence neuronal function and behavior—making astrocytes crucial intermediates in activities previously attributed to direct communication between neurons, the studies suggest. For instance, norepinephrine sensitizes astrocytes to neurotransmitters and prompts them to regulate circuit computations, synapse function and various behaviors across diverse animal models, three of the studies—all published last month in Science—show. “Do neurons actually signal through astrocytes in a meaningful way during normal behavior or normal circuit function?” asks Marc Freeman, senior scientist at Oregon Health & Science University and principal investigator on one of the Science studies. These new findings “argue very strongly the answer is yes.” Astrocytes can also detect peripheral inflammation and modify the neurons that drive a stress-induced fear behavior in mice, according to the fourth study, published in April in Nature. Although astrocytes are no longer thought of as simply support cells, they were still “not really considered for having a real plasticity and a real important role,” says Caroline Menard, associate professor of psychiatry and neurosciences at the University of Laval, who was not involved in any of the new studies. Now “there’s more consideration from the field that behavior is not only driven by neurons, but there’s other cell types involved.” © 2025 Simons Foundation
Keyword: Glia; Learning & Memory
Link ID: 29845 - Posted: 07.02.2025
Humberto Basilio Mindia Wichert has taken part in plenty of brain experiments as a cognitive-neuroscience graduate student at the Humboldt University of Berlin, but none was as challenging as one he faced in 2023. Inside a stark white room, he stared at a flickering screen that flashed a different image every 10 seconds. His task was to determine what familiar object appeared in each image. But, at least at first, the images looked like nothing more than a jumble of black and white patches. “I’m very competitive with myself,” says Wichert. “I felt really frustrated.” Cognitive neuroscientist Maxi Becker, now at Duke University in Durham, North Carolina, chose the images in an attempt to spark a fleeting mental phenomenon that people often experience but can’t control or fully explain. Study participants puzzling out what is depicted in the images — known as Mooney images, after a researcher who published a set of them in the 1950s — can’t rely on analytical thinking. Instead, the answer must arrive all at once, like a flash of lightning in the dark. Becker asked some of the participants to view the images while lying inside a functional magnetic resonance imaging (fMRI) scanner, so she could track tiny shifts in blood flow corresponding to brain activity. She hoped to determine which regions produce ‘aha!’ moments. Over the past two decades, scientists studying such moments of insight — also known as eureka moments — have used the tools of neuroscience to reveal which regions of the brain are active and how they interact when discovery strikes. They’ve refined the puzzles they use to trigger insight and the measurements they take, in an attempt to turn a self-reported, subjective experience into something that can be documented and rigorously studied. 
This foundational work has led to new questions, including why some people are more insightful than others, what mental states could encourage insight and how insight might boost memory. © 2025 Springer Nature Limited
Keyword: Attention; Learning & Memory
Link ID: 29844 - Posted: 06.28.2025
By Katrina Miller Take a look at this video of a waiting room. Do you see anything strange? Perhaps you saw the rug disappear, or the couch pillows transform, or a few ceiling panels evaporate. Or maybe you didn’t. In fact, dozens of objects change in this video, which won second place in the Best Illusion of the Year Contest in 2021. Voting for the latest version of the contest opened on Monday. Illusions “are the phenomena in which the physical reality is divorced from perception,” said Stephen Macknik, a neuroscientist at SUNY Downstate Health Sciences University in Brooklyn. He runs the contest with his colleague and spouse, Susana Martinez-Conde. By studying the disconnect between perception and reality, scientists can better understand which brain regions and processes help us interpret the world around us. The illusion above highlights change blindness, the brain’s failure to notice shifts in the environment, especially when they occur gradually. To some extent, all sensory experience is illusory, Dr. Martinez-Conde asserts. “We are always constructing a simulation of reality,” she said. “We don’t have direct access to that reality. We live inside the simulation that we create.” She and Dr. Macknik have run the illusion contest since 2005. What began as a public outreach event at an academic conference has since blossomed into an annual competition open to anyone in the world. They initially worried that people would run out of illusions to submit. “But that actually never happened,” Dr. Martinez-Conde said. “What ended up happening instead is that people started developing illusions, actually, with an eye to competing in the contest.” © 2025 The New York Times Company
Keyword: Vision; Attention
Link ID: 29843 - Posted: 06.28.2025
By Gordy Slack, MindSite News Lauren Kennedy West was still a teenager when she began to smell and hear things that weren’t there. Then to see things, too, that were invisible to others. Meanwhile, her moods began to intensify, sometimes turning very, very dark. “It was confusing, disturbing, and depressing,” she recalls. She had periods of elation, too. But when she came down from these, she’d keep descending until she hit emotional bottom. It got so bad that in her early 20s, at college, Kennedy West tried to end her life twice. Finally, when she was 25, she was diagnosed with schizoaffective disorder, a form of schizophrenia with powerful mood swings. The medications she was prescribed eased her worst symptoms, she said, but they also had troubling side effects that ranged from extreme weight gain and “dry mouth” to feeling lethargic and an episodic condition called oculogyric crisis, which causes people to continually, involuntarily, gaze upward. Worst of all, she said, was the feeling of being “emotionally blunted.” Learning that she’d likely be taking those medications for the rest of her life was a blow, but the diagnosis gave Kennedy West a meaningful framework for her struggle. To be as stable, happy, and engaged as possible she would have to cultivate acceptance of her condition and the limitations it imposed, she was told. Driven by a hope that others might be spared the disabling confusion and depression she suffered before her diagnosis, Kennedy West and her partner started a YouTube channel, which they called “Living Well with Mental Illness” (now “Living Well with Schizophrenia”). In frequent posts, Kennedy West recounted her own struggles and triumphs and interviewed experts on mental illness and related subjects. In early 2023, Christopher Palmer was a guest on the channel.
Keyword: Schizophrenia
Link ID: 29842 - Posted: 06.28.2025
By Sydney Wyatt The shape and density of dendritic spines fluctuate in step with the estrous cycle in the hippocampus of living mice, a new study shows. And these structural changes coincide with shifts in the stability of place fields encoded by place cells. “You can literally see these oscillations in hippocampal spines, and they keep time with the endocrine rhythms being produced by the ovaries,” says study investigator Emily Jacobs, associate professor of psychological and brain sciences at the University of California, Santa Barbara. She and her colleagues used calcium imaging and surgically implanted microperiscopes to view the dynamics of the dendritic spines in real time. The findings, published in Neuron in May, replicate and expand upon a series of cross-sectional studies of rat brain tissue in the early 1990s that documented sex hormone receptors in the hippocampus and showed that changes in estradiol levels across the estrous cycle track with differences in dendritic spine density. “The field of neuroendocrinology was really changed in the early ’90s because of this discovery,” Jacobs says. The new work is a “very important advancement,” says John Morrison, professor of neurology at the University of California, Davis, who was not involved in the research. It shows that spines change across the natural cycle of living mice, supporting estradiol’s role in this process, and it links these changes to electrophysiological differences, he says. “The most surprising part of this study is that everything seems to follow each other. Usually biology doesn’t cooperate like this,” Morrison says. Before the early 1990s, estrogens were viewed only as reproductive hormones, and their effects in the brain were thought to be limited to the hypothalamus, says Catherine Woolley, professor of neurobiology at Northwestern University, who worked on the classic rat hippocampus studies when she was a graduate student in the lab of the late Bruce McEwen. 
For that reason, her rat hippocampus results were initially met with “resistance,” she adds. A leader in the field once told her to “get some better advice” from her adviser “because estrogens are reproductive hormones, and they don’t have effects in the hippocampus,” she recalls. © 2025 Simons Foundation
Keyword: Hormones & Behavior; Learning & Memory
Link ID: 29841 - Posted: 06.28.2025
By Nazeefa Ahmed Humans prefer fruit at its sweetest, whereas many birds happily snack on the sourest of the bunch, from zesty lemons to unripe honey mangoes. Researchers may now know why. A study published today in Science suggests birds have evolved a specialized taste receptor that’s suppressed by high acidity, which effectively dulls the sharp, sour taste of fruits they eat. The finding reveals the evolutionary history of the pucker-inducing diets of many fruit-eating birds around the world—and may also help explain birds’ knack for survival, by broadening their potential food sources. The study is a “robust” addition to our understanding of how birds taste sour foods, which is still a research area in its infancy, says Leanne Grieves, an ornithologist at Cornell University’s Lab of Ornithology. Scientists identified a sour taste receptor in vertebrates—known as OTOP1—only 7 years ago, and few studies focus on why birds eat what they eat, rather than simply what they eat. Grieves, who studies birds’ sense of smell but who was not involved with the current work, adds that the new study “provides a really nice starting point.” To examine how birds approach sour-tasting foods, scientists exposed OTOP1 receptors from mice, domestic pigeons, and canaries to various acidic solutions. The activity of the mouse version of the receptor increased with greater acidity—meaning more acidic foods register to mice, and other mammals like us, as increasingly sour. However, the pigeon and canary versions of OTOP1 became less active in solutions about as acidic as a lemon. As a result, the birds wouldn’t perceive as much of a sour taste, allowing them to take advantage of the fruits mammals can’t stomach. Determining why bird OTOP1 reacted differently was a challenge, according to study author Hao Zhang, an evolutionary biologist at the Chinese Academy of Sciences (CAS). 
So, the researchers mutated sections of the gene that encodes the OTOP1 receptor, which let them identify four candidate amino acids within the protein that are responsible for sour tolerance. One of them, known as G378, is found almost exclusively in songbirds such as the canary—a species that showed greater sour tolerance than the pigeon, which lacks this variant. “A single amino acid in the bird OTOP1 can increase sour tolerance,” says study author Lei Luo, a biologist at CAS. © 2025 American Association for the Advancement of Science.
Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 29840 - Posted: 06.21.2025
James Doubek Researchers have some new evidence about why birds make so much noise early in the morning, and it's not for some of the reasons they previously thought. For decades, a dominant theory about why birds sing at dawn — called the "dawn chorus" — has been that they can be heard farther and more clearly at that time. Sound travels faster in humid air and it's more humid early in the morning. It's less windy, too, which is thought to lessen any distortion of their vocalizations. But scientists from the Cornell Lab of Ornithology's K. Lisa Yang Center for Conservation Bioacoustics and Project Dhvani in India combed through audio recordings of birds in the rainforest. They say they didn't find evidence to back up this "acoustic transmission hypothesis." It is one of several hypotheses involving environmental factors. Another is that birds spend their time singing at dawn because there's low light and it's a bad time to look for food. "We basically didn't find much support for some of these environmental cues which have been purported in literature as hypotheses" for why birds sing more at dawn, says Vijay Ramesh, a postdoctoral research associate at Cornell and the study's lead author. The study, called "Why is the early bird early? An evaluation of hypotheses for avian dawn-biased vocal activity," was published this month in the peer-reviewed journal Philosophical Transactions of the Royal Society B. The researchers didn't definitively point to one reason for why the dawn chorus is happening, but they found support for ideas that the early morning racket relates to birds marking their territory after being inactive at night, and communicating about finding food. © 2025 npr
Keyword: Animal Communication; Evolution
Link ID: 29839 - Posted: 06.21.2025
By Nala Rogers Coffer illusion: What do you see when you stare at this grid of line segments: a series of rectangles, or a series of circles? The way you perceive this optical illusion, known as the Coffer illusion, may tie back to the visual environment that surrounds you, a recent preprint suggests. (Image: Anthony Norcia/Smith-Kettlewell Eye Research Institute) Himba people from rural Namibia can see right through optical illusions that trick people from the United States and United Kingdom. Even when there’s no “right” or “wrong” way to interpret an image, what Himba people see is often vastly different from what people see in industrialized societies, a new preprint suggests. That could mean people’s vision is fundamentally shaped by the environments they’re raised in—an old but controversial idea that runs counter to the way human perception is often studied. For example, when presented with a grid of line segments that can be seen as either rectangles or circles—an optical illusion known as the Coffer illusion—people from the U.S. and U.K. almost always see rectangles first, and they often struggle to see circles. The researchers suspect this is because they are surrounded by rectangular architecture, an idea known as the carpentered world hypothesis. In contrast, the traditional villages of Himba people are composed of round huts surrounding a circular livestock corral. People from these villages almost always see circles first, and about half don’t see rectangles even when prompted. “I’m surprised that you can’t see the round ones,” says Uapwanawa Muhenije, a Himba woman from a village in northern Namibia, speaking through an interpreter over a Zoom interview. “I wonder how you can’t see them.” Muhenije didn’t participate in the research because her village is less remote than those in the study, and it includes rectangular as well as circular buildings. She sees both shapes in the Coffer illusion easily. 
Although the study found dramatic differences in how people see four illusions, “the one experiment that’s going to overwhelm people is this Coffer,” says Jules Davidoff, a psychologist at the University of London who was not involved in the study. “There are other striking cultural differences in perception, but the one that they’ve produced here is a real humdinger.” The findings were published as a preprint on PsyArXiv in February and updated this week. © 2025 American Association for the Advancement of Science.
Keyword: Vision; Development of the Brain
Link ID: 29838 - Posted: 06.21.2025
Katie Kavanagh Scientists have identified a group of neurons that might explain the mechanism behind how stress gives rise to problems with sleep and memory. The study — published last week in The Journal of Neuroscience — shows that neurons in a brain area called the hypothalamus mediate the effects of stress on sleep and memory, potentially providing a new target for the treatment of stress-related sleep disorders. Previous work has shown that in the hypothalamus, neurons in a structure called the paraventricular nucleus communicate with other areas important for sleep and memory. The neurons of the paraventricular nucleus release a hormone called corticotropin and have a role in regulating stress. But the neural mechanisms underlying the effect of stress on sleep and memory have remained elusive. For co-author Shinjae Chung, a neuroscientist at the University of Pennsylvania in Philadelphia, the question of exactly how stress affects these processes is personal, because, she says, “I experience a lot of sleep problems when I’m stressed”. She adds that “when I have an exam deadline, I have a tendency to have bad sleep that really affects my score the next day”. To study how neurons in the paraventricular nucleus translate stress into sleep and memory problems, the researchers put laboratory mice through a stressful experience by physically restraining the animals in a plastic tube. The team then tested the creatures’ spatial memory and monitored their brain activity as they slept. © 2025 Springer Nature Limited
Diana Kwon There might be a paradox in the biology of ageing. As humans grow older, their metabolisms tend to slow, they lose muscle mass and they burn many fewer calories. But certain cells in older people appear to do the exact opposite — they consume more energy than when they were young. These potential energy hogs are senescent cells, older cells that have stopped dividing and no longer perform the essential functions that they used to. Because they seem idle, biologists had assumed that zombie-like senescent cells use less energy than their younger, actively replicating counterparts, says Martin Picard, a psychobiologist at Columbia University in New York City. But in 2022, Gabriel Sturm, a former graduate student of Picard’s, painstakingly observed the life course of human skin cells cultured in a dish and, in findings that have not yet been published in full, found that cells that had stopped dividing had a metabolic rate about double that of younger cells. For Picard and his colleagues, the energetic mismatch wasn’t a paradox at all: ageing cells accumulate energetically costly forms of damage, such as alterations in DNA, and they initiate pro-inflammatory signalling. How that corresponds with the relatively low energy expenditure for ageing organisms is still unclear, but the researchers hypothesize that this tension might be an important driver of many of the negative effects of growing old, and that the brain might be playing a key part as mediator. As some cells get older and require more energy, the brain reacts by stripping resources from other biological processes, which ultimately results in outward signs of ageing, such as greying hair or a reduction in muscle mass (see ‘Energy management and ageing’). Picard and his colleagues call this concept the ‘brain–body energy-conservation model’. 
And although many parts of the hypothesis are still untested, scientists are working to decipher the precise mechanisms that connect the brain to processes associated with ageing, such as senescence, inflammation and the shortening of telomeres — the stretches of repetitive DNA that cap the ends of chromosomes and protect them. © 2025 Springer Nature Limited
By Michael A. Yassa For nearly three decades, Alzheimer’s disease has been framed as a story about amyloid: A toxic protein builds up, forms plaques, kills neurons and slowly robs people of their memories and identity. The simplicity of this “amyloid cascade hypothesis” gave us targets, tools and a sense of purpose. It felt like a clean story. Almost too clean. We spent decades chasing it, developing dozens of animal models and pouring billions into anti-amyloid therapies, most of which failed. The few that made it to market offer only modest benefits, often with serious side effects. Whenever I think about this, I can’t help but picture Will Ferrell’s Buddy the Elf, in the movie “Elf,” confronting the mall Santa: “You sit on a throne of lies.” Not because anyone meant to mislead people (though maybe some did). But because we wanted so badly for the story to be true. So what happened? This should have worked … right? I would argue it was never going to work because we have been thinking about Alzheimer’s the wrong way. For decades, we have treated it as a single disease with a single straight line from amyloid to dementia. But what if that’s not how it works? What if Alzheimer’s only looks like one disease because we keep trying to force it into a single narrative? If that’s the case, then the search for a single cause—and a single cure—was always destined to fail. Real progress, I believe, requires two major shifts in how we think. First, we have to let go of our obsession with amyloid. © 2025 Simons Foundation
Keyword: Alzheimers
Link ID: 29835 - Posted: 06.18.2025
By Amber Dance Back in 2008, neurovirologist Renée Douville observed something weird in the brains of people who’d died of the movement disorder ALS: virus proteins. But these people hadn’t caught any known virus. Instead, ancient genes originally from viruses, and still lurking within these patients’ chromosomes, had awakened and started churning out viral proteins. Our genomes are littered with scraps of long-lost viruses, the descendants of viral infections often from millions of years ago. Most of these once-foreign DNA bits are a type called retrotransposons; they make up more than 40 percent of the human genome. (Figure: a pie chart showing that retrotransposons, which copy themselves via RNA intermediates, make up nearly half the human genome; a smaller portion of these ancient jumping genes are cut-and-paste DNA transposons.) Many retrotransposons seem to be harmless, most of the time. But Douville and others are pursuing the possibility that some reawakened retrotransposons may do serious damage: They can degrade nerve cells and fire up inflammation and may underlie some instances of Alzheimer’s disease and ALS (amyotrophic lateral sclerosis, or Lou Gehrig’s disease). The theory linking retrotransposons to neurodegenerative diseases — conditions in which nerve cells decline or die — is still developing; even its proponents, while optimistic, are cautious. “It’s not yet the consensus view,” says Josh Dubnau, a neurobiologist at the Renaissance School of Medicine at Stony Brook University in New York. And retrotransposons can’t explain all cases of neurodegeneration. Yet evidence is building that they may underlie some cases. Now, after more than a decade of studying this possibility in human brain tissue, fruit flies and mice, researchers are putting their ideas to the ultimate test: clinical trials in people with ALS, Alzheimer’s and related conditions. 
These trials, which borrow antiretroviral medications from the HIV pharmacopeia, have yielded preliminary but promising results. © 2025 Annual Reviews
Keyword: ALS-Lou Gehrig's Disease; Alzheimers
Link ID: 29834 - Posted: 06.18.2025
By Andrew Jacobs When Gov. Greg Abbott of Texas approved legislation this week to spend $50 million in state money researching ibogaine, a powerful psychedelic, he put the spotlight on a still-illegal drug that has shown promise in treating opioid addiction, traumatic brain injury and depression. Interest in ibogaine therapy has surged in recent years, driven in large part by veterans who have had to travel to other countries for the treatment. The measure, which passed the Texas Legislature with bipartisan support, seeks to leverage an additional $50 million in private investment to fund clinical trials that supporters hope will provide a pathway for ibogaine therapy to win approval from the Food and Drug Administration, a process that could take years. The legislation directs the state to work with Texas universities and hospitals and tries to ensure that the state retains a financial stake in any revenue from the drug’s development. “You can’t put a price on a human life, but if this is successful and ibogaine becomes commercialized, it will help people all across the country and provide an incredible return on investment for the people of Texas,” said State Senator Tan Parker, a Republican who sponsored the bill. The initiative, one of the largest government investments in psychedelic medicine to date, is a watershed moment for a field that continues to gain mainstream acceptance. Regulated psilocybin clinics have opened in Oregon and Colorado, and ketamine has become widely available across the country as a treatment for depression and anxiety. There have been speed bumps. Last year, the F.D.A. rejected MDMA-assisted therapy for PTSD, the first psychedelic compound to make it through much of the agency’s rigorous drug review process. © 2025 The New York Times Company
Keyword: Drug Abuse; Stress
Link ID: 29833 - Posted: 06.18.2025
Associated Press Prairie dogs bark to alert each other to the presence of predators, with different cries depending on whether the threat is airborne or approaching by land. But their warnings also seem to help a vulnerable grassland bird. Curlews have figured out that if they eavesdrop on alarms from US prairie dog colonies, they may get a jump on predators coming for them, too, according to research published on Thursday in the journal Animal Behaviour. “Prairie dogs are on the menu for just about every predator you can think of – golden eagles, red-tailed hawks, foxes, badgers, even large snakes,” said Andy Boyce, a research ecologist in Montana at the Smithsonian’s National Zoo and Conservation Biology Institute. Such animals also gladly snack on grassland nesting birds such as the long-billed curlew, so the birds have adapted. Previous research has shown birds frequently eavesdrop on other bird species to glean information about food sources or danger, said Georgetown University ornithologist Emily Williams, who was not involved in the study. But, so far, scientists have documented only a few instances of birds eavesdropping on mammals. “That doesn’t necessarily mean it’s rare in the wild,” she said. “It just means we haven’t studied it yet.” Prairie dogs, a type of ground squirrel, live in large colonies with a series of burrows that may stretch for miles underground, especially on the vast US plains. When they hear each other’s barks, they either stand alert watching or dive into their burrows. “Those little barks are very loud; they can carry quite a long way,” said research co-author Andrew Dreelin, who also works for the Smithsonian. © 2025 Guardian News & Media Limited
Keyword: Animal Communication; Language
Link ID: 29832 - Posted: 06.18.2025
By Sofia Quaglia When octopuses extend their eight arms into hidden nooks and crannies in search of a meal, they are not just feeling around in the dark for their food. They are tasting their prey, and with even more sensory sophistication than scientists had already imagined. Researchers reported on Tuesday in the journal Cell that octopus arms are fine-tuned to “eavesdrop into the microbial world,” detecting microbiomes on the surfaces around them and deriving information from them, said Rebecka Sepela, a molecular biologist at Harvard and an author of the new study. Where octopus eyes cannot see, their arms can go to identify prey and make sense of their surroundings. Scientists knew that those eight arms (not tentacles) sense whether their eggs are healthy or need to be pruned. And the hundreds of suckers on each arm have over 10,000 chemotactile sensory receptors each, working with 500 million neurons to pick up that information and relay it throughout the nervous system. Yet what exactly the octopus is tasting by probing and prodding — and how its arms can distinguish, say, a rock from an egg, a healthy egg in its clutch from a sick one or a crab that’s safe to eat from a rotting, toxic one — has long baffled scientists. What is it about the surfaces that they are perceiving? For Dr. Sepela, this question was heightened when her team discovered 26 receptors along the octopuses’ arms that didn’t have a known function. She supposed those receptors were tuned only to molecules found on surfaces, rather than those diffused in water. So she and her colleagues collected swabs of molecules coating healthy and unhealthy crabs and octopus eggs. They grew and cultured the microbes from those surfaces in the lab, then tested 300 microbial strains, one by one, on two of those 26 receptors. During the screening, only particular microbes could switch the receptors open, and these microbes were more abundant on the decaying crabs and dying eggs than on their healthy counterparts. 
© 2025 The New York Times Company
Keyword: Chemical Senses (Smell & Taste); Neuroimmunology
Link ID: 29831 - Posted: 06.18.2025
By Tina Hesman Saey People trying to lose weight often count calories, carbs, steps and reps and watch the scales. Soon, they may have another number to consider: a genetic score indicating how many calories a person needs to feel full during a meal. This score may help predict whether someone will lose more weight on the drugs liraglutide or phentermine-topiramate, researchers report June 6 in Cell Metabolism. A separate study, posted to medRxiv.org in November, suggests that individuals with a higher genetic propensity for obesity benefit less from semaglutide than those with a lower genetic predisposition. Such genetic tests may one day help doctors and patients select personalized weight-loss treatments, some researchers say. But the genetic scores “are not perfect predictors of drug response,” says Paul Franks, a genetic epidemiologist at Queen Mary University of London who was not involved in either study. “They show a tendency.” For the Cell Metabolism study, Mayo Clinic researchers measured how many calories it took for about 700 adults with obesity to feel full when given an all-you-can-eat meal of lasagna, pudding and milk. The calorie intake varied widely, ranging from about 140 to 2,200 calories, with men generally needing more than women. The team used machine learning to compile a genetic score based on variants of 10 genes associated with obesity. That score is designed to reflect the calories people required to feel full. Then, the Mayo team and colleagues from Phenomix Sciences Inc., headquartered in Menlo Park, Calif., conducted two clinical trials. In one 16-week trial, people with obesity received either a placebo or liraglutide — a GLP-1 drug branded as Saxenda. GLP-1s are a class of diabetes drugs that have shown promise with weight loss. People with a lower genetic score lost more weight on liraglutide than those with higher genetic scores. © Society for Science & the Public 2000–2025.
Keyword: Obesity; Genes & Behavior
Link ID: 29830 - Posted: 06.14.2025
Elie Dolgin Sheree had maintained a healthy weight for 15 years, thanks to a surgery that wrapped a silicone ring around the top of her stomach. But when the gastric band repeatedly slipped and had to be removed, the weight came back — fast. She gained nearly 20 kilograms in just 2 months. Frustrated, she turned to the latest generation of anti-obesity medications, hoping to slow the rapid weight gain. She cycled through various formulations of the blockbuster therapies semaglutide (sold under the brand names Ozempic and Wegovy) and tirzepatide (sold as Zepbound for weight loss), finding some success with higher doses of these drugs, which mimic the effects of the appetite-suppressing hormone GLP-1. But each time, drug shortages disrupted her treatment, forcing her to start again with a new formulation or to go without the drugs for weeks. Tired of the uncertainty around the therapies, she decided to try something different. Sheree, who asked that her middle name be used to protect her privacy, underwent two minimally invasive procedures designed to reduce the size of her stomach and to blunt hunger cues. Developed over the past two decades, these ‘endoscopic’ procedures — performed using flexible tubes inserted through the mouth, and no scalpels — are just one part of a growing toolkit to help people who want to move away from GLP-1 therapy. More-conventional bariatric surgeries, used routinely since the 1980s to reroute the flow of food through the gut or to restrict the stomach’s size, might also gain wider appeal. And the search is picking up for other drugs that could offer lasting alternatives for a post-GLP-1 population. That momentum is driven by a convergence of factors: chronic shortages of GLP-1 therapies, high costs, insurance barriers and debilitating side effects. As a result, many people who start the drugs ultimately stop — with discontinuation rates in clinical trials ranging from 37% to 81% in the first year1. 
And once treatment ends, the weight lost often piles back on. © 2025 Springer Nature Limited
Keyword: Obesity
Link ID: 29829 - Posted: 06.14.2025