Most Recent Links
By Alexa Robles-Gil

Every elephant has about 1,000 whiskers on its trunk. They play a crucial role for the animals, which have thick skin and poor eyesight. Elephants cannot regrow these hairs, so a lost whisker creates a permanent sensory blind spot on the trunk, which they use for almost everything in daily life. The whiskers are also unique among mammalian facial hairs. "Elephant whiskers are aliens," said Andrew Schulz, a mechanical engineer at the Max Planck Institute for Intelligent Systems in Germany.

In a study published Thursday in the journal Science, Dr. Schulz and his colleagues identified the structural features that give elephant whiskers a kind of "built-in" intelligence, providing the sensitivity that the largest land mammals need to navigate their world. While other animals like rats can move their whiskers around, a behavior known as "whisking," elephants lack the necessary muscles. That leaves their whiskers essentially stationary, even though they protrude from the flexible trunk. This puzzled Dr. Schulz, who had previously studied the movement of elephant trunks. "If elephant trunk whiskers can't move, there's probably something built into them that allows them to" function in a way similar to mammals that whisk, Dr. Schulz said.

To find out, Dr. Schulz gathered scientists from many fields: engineers, neuroscientists, biologists and materials scientists studied whiskers from baby and adult Asian elephants. (All elephant whiskers came from animals that had died naturally and were donated by a zoo veterinarian; "We did not go up and pluck whiskers from elephants," Dr. Schulz said.)

© 2026 The New York Times Company
Keyword: Pain & Touch; Evolution
Link ID: 30123 - Posted: 02.14.2026
By Lauren Schneider

Microglia may be a key mediator between maternal immune activation and a pup's memory of contextual fear conditioning in early infancy, a new mouse study reports. The findings sharpen the picture of memory formation in early life, but the study's approach to microglia has raised questions. That scrutiny comes as scientists reevaluate concepts such as synaptic pruning, through which microglia may shape neuronal circuits in early life and beyond.

Humans and rodents are unable to recall some of the earliest memories formed after birth. This period of infantile amnesia offers researchers a window to test conditions that may alter the survival of engrams, the changes in the brain tied to memory formation. "Development is an experiment that nature does for you," says study investigator Tomás Ryan, professor of neuroscience at Trinity College Dublin.

Activation of the immune system during pregnancy in mice leads to autism-like behaviors in their pups and reduces infantile amnesia, according to a previous study by Ryan's team. Blocking microglial activity allows some infantile memories to persist in mice, Ryan and his colleagues report in the new paper, published 20 January in PLOS Biology. Administering minocycline, an antibiotic that suppresses microglial activation, in water daily starting one day before foot-shock conditioning at postnatal day 17 led to greater fear memory at postnatal day 25, after the typical onset of infantile amnesia. This recall was accompanied by a reactivation of associated engrams in the basolateral amygdala and the central amygdalar nucleus.

© 2026 Simons Foundation
Keyword: Glia; Development of the Brain
Link ID: 30122 - Posted: 02.14.2026
Andrew Gregory, Health editor

Reading, writing and learning a language or two can lower your risk of dementia by almost 40%, according to a study that suggests millions of people could prevent or delay the condition.

Dementia is one of the world's biggest health threats. The number of people living with the condition is forecast to triple to more than 150 million globally by 2050, and experts say it presents a large and rapidly growing challenge to health and social care systems in every community, country and continent.

US researchers found that engaging in intellectually stimulating activities throughout life, such as reading, writing or learning a new language, was associated with a lower risk of Alzheimer's disease, the most common form of dementia, and slower cognitive decline. The study author Andrea Zammit, of Rush University Medical Center in Chicago, said the discovery suggested cognitive health in later life was "strongly influenced" by lifelong exposure to intellectually stimulating environments. "Our findings are encouraging, suggesting that consistently engaging in a variety of mentally stimulating activities throughout life may make a difference in cognition. Public investments that expand access to enriching environments, like libraries and early education programs designed to spark a lifelong love of learning, may help reduce the incidence of dementia."

Researchers tracked 1,939 people with an average age of 80 who did not have dementia at the start of the study and followed them for an average of eight years. Participants completed surveys about their cognitive activities and learning resources at three stages of life.

© 2026 Guardian News & Media Limited
Keyword: Alzheimers; Learning & Memory
Link ID: 30121 - Posted: 02.14.2026
Mariana Lenharo

Exercise pumps up your muscles — but it might also be pumping up your neurons. According to a study published today in Neuron, repeated exercise sessions on a treadmill strengthen the wiring in a mouse's brain, making certain neurons quicker to activate. This 'rewiring' was essential for mice in the study to gradually improve their running endurance.

The work reveals that the brain — in mice and, presumably, in humans — is actively involved in the development of endurance, the ability to get better at a physical activity with repeated practice, says Nicholas Betley, a neuroscientist at the University of Pennsylvania in Philadelphia and a co-author of the paper. "You go for a run, and your lungs expand, your heart gets pumping better, your muscles break down and rebuild. All this great stuff happens, and the next time, it gets easier," Betley says. "I didn't expect that the brain was coordinating all of that."

Betley and his colleagues were curious about what happens in the brain as people get stronger through exercise. They decided to focus on the ventromedial hypothalamus, a brain region that regulates appetite and blood sugar. The team then zeroed in on a group of neurons in that region that produce a protein called steroidogenic factor 1 (SF1), which is known to play a part in regulating metabolism. A previous study found that deleting the gene that codes for SF1 impairs endurance in mice.

© 2026 Springer Nature Limited
Keyword: Obesity; Learning & Memory
Link ID: 30120 - Posted: 02.14.2026
Rachael Seidler and Tianyi (Erik) Wang

Going to space is harsh on the human body, and as a new study from our research team finds, the brain shifts upward and backward and deforms inside the skull after spaceflight. The extent of these changes was greater for those who spent longer in space. As NASA plans longer space missions, and space travel expands beyond professional astronauts, these findings will become more relevant.

Why it matters

On Earth, gravity constantly pulls fluids in your body and your brain toward the center of the Earth. In space, that force disappears. Body fluids shift toward the head, which gives astronauts a puffy face. Under normal gravity, the brain, cerebrospinal fluid and surrounding tissues reach a stable balance. In microgravity, that balance changes. Without gravity pulling downward, the brain floats in the skull and experiences various forces from the surrounding soft tissues and the skull itself.

Earlier studies showed that the brain appears higher in the skull after spaceflight. But most of those studies focused on average or whole-brain measures, which can hide important effects within different areas of the brain. Our goal was to look more closely. We analyzed brain MRI scans from 26 astronauts who spent different lengths of time in space, from a few weeks to over a year. To focus on the brain's movement, we aligned each person's skull across scans taken before and after spaceflight.

© 2010–2026, The Conversation US, Inc.
Keyword: Biomechanics
Link ID: 30119 - Posted: 02.14.2026
Yuki Noguchi

At just over 5 feet 5 inches, Christie Woodard weighs a lean 125 pounds. She's also open about relying on a low-dose GLP-1 to keep her weight there. She says sometimes people question why she's on the drug, "because they look at me and think I'm at a healthy weight, or maybe they even think I'm thin."

What people don't see is Woodard's previous struggle with obesity, which began in her 30s and landed her at 260 pounds. She took up running half-marathons, but at that weight, it was painful. "I was not fast," she says. "I had massive issues; I was in physical therapy constantly. I tore my meniscus." Woodard, now 53 and living in Easton, Md., got gastric bypass surgery four years ago and cut her weight in half. Elated, she set a goal of completing half-marathons in all 50 states.

Her weight remained stable until last year, when the pounds began creeping back despite a strict diet and lots of exercise. "I feel it in my knees, and mainly I feel it in my soul," she says. "I feel it in my confidence. It's messing with my head in a big way. I was terrified that I was going to go back to what I was." So her bariatric surgeon, Dr. Betsy Dovec, prescribed a low dose of the drug Zepbound, even though Woodard's body mass index didn't technically classify her as overweight. Dovec says Woodard isn't her only normal-weight patient on GLP-1s. "I prescribe medications for all types of people," she says. Though, she clarifies, she does not give the drugs to people for purely aesthetic reasons, such as someone trying to shed a few pounds before an event.

© 2026 npr
Keyword: Obesity
Link ID: 30118 - Posted: 02.14.2026
By Amber Dance

Real estate agents will tell you that a home's most important feature is "location, location, location." It's similar in neuroscience: "Location is everything in the brain," said Bosiljka Tasic, a self-described "biological cartographer." Brain injury in one spot could knock out memory; damage in another could interfere with personality. Neuroscientists and doctors are lost without a good map.

Researchers have been mapping the brain for more than a century. By tracing cellular patterns that are visible under a microscope, they've created colorful charts and models that delineate regions and have been able to associate them with functions. In recent years, they've added vastly greater detail: They can now go cell by cell and define each one by its internal genetic activity. But no matter how carefully they slice and how deeply they analyze, their maps of the brain seem incomplete, muddled, inconsistent. For example, some large brain regions have been linked to many different tasks; scientists suspect that they should be subdivided into smaller regions, each with its own job. So far, mapping these cellular neighborhoods from enormous genetic datasets has been both a challenge and a chore.

Recently, Tasic, a neuroscientist and genomicist at the Allen Institute for Brain Science, and her collaborators recruited artificial intelligence for the sorting and mapmaking effort. They fed genetic data from five mouse brains — 10.4 million individual cells with hundreds of genes per cell — into a custom machine learning algorithm. The program delivered maps that are a neuro-realtor's dream, with known and novel subdivisions within larger brain regions. Humans couldn't delineate such borders in several lifetimes, but the algorithm did it in hours. The authors published their methods in Nature Communications in October.

© 2026 Simons Foundation
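The study's algorithm is custom and not described in this entry, but its core move, grouping millions of cells by the similarity of their gene-expression profiles, can be illustrated with off-the-shelf tools. Below is a minimal sketch assuming a toy cells-by-genes count matrix; scikit-learn's KMeans stands in for the paper's method, and the matrix size, cluster count and random data are placeholders, not the study's.

```python
# Minimal sketch of expression-based cell clustering, the kind of step
# cell-by-cell brain mapmaking relies on. All data here are placeholders;
# the study used a custom algorithm, not KMeans.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy expression matrix: rows are cells, columns are per-gene counts
# (the real dataset had 10.4 million cells and hundreds of genes each).
expression = rng.poisson(lam=2.0, size=(10_000, 300)).astype(float)

# Normalize each cell's profile so clustering reflects expression pattern,
# not total transcript count.
expression /= expression.sum(axis=1, keepdims=True)

# Group cells into putative neighborhoods by expression similarity.
labels = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(expression)
print(np.bincount(labels))  # number of cells assigned to each subdivision
```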
Keyword: Brain imaging; Development of the Brain
Link ID: 30117 - Posted: 02.11.2026
Jon Hamilton

Parkinson's disease does more than cause tremor and trouble walking. It can also affect sleep, smell, digestion and even thinking. That may be because the disease disrupts communication in a brain network that links the body and mind, a team reports in the journal Nature. "It almost feels like a tunnel is jammed, so no traffic can go normally," says Hesheng Liu, a brain scientist at Changping Laboratory and Peking University in Beijing and an author of the study.

The finding fits nicely with growing evidence that Parkinson's is a network disorder, rather than one limited to brain areas that control specific movements, says Peter Strick, a professor and chair of neurobiology at the University of Pittsburgh who was not involved in the study. Other degenerative brain diseases affect other brain networks in different ways. Alzheimer's, for example, tends to reduce connectivity in the default mode network, which supports memory and sense of self. ALS (amyotrophic lateral sclerosis) primarily damages the motor system network, which controls movement. Understanding the network disrupted by Parkinson's, which affects about 1 million people in the United States, could change the way doctors treat the disease.

A mystery solved?

People with Parkinson's often have symptoms that vary in ways that are hard to explain. For example, someone who usually is unable to stand may suddenly leap when faced with an emergency. And Parkinson's patients who can still walk may freeze if they try to carry on a conversation.

© 2026 npr
Keyword: Parkinsons
Link ID: 30116 - Posted: 02.11.2026
By Corinna da Fonseca-Wollheim

The placid chords of a Debussy prelude splashed through a darkened auditorium during a recital by the pianist Nicolas Namoradze at the University of California, San Francisco, on a November evening. A translucent image of Namoradze's brain appeared above him on a screen: Electrical currents of different wavelengths, associated with varying levels of alertness, registered as colorful activity coursing through the model like storm fronts on a weather map. With each chord, clouds of green and blue bloomed, then faded as the sound receded.

As the recital progressed with works by Bach, Beethoven and Scriabin, the image of the gently rotating brain showed a complex choreography of signals that sometimes ping-ponged between different areas or flickered simultaneously across the organ's hemispheres. As a visual spectacle accompanying Namoradze's pellucid playing, it was mesmerizing: an X-ray, seemingly, of virtuosity at work.

But to the scientists in the audience, attendees at a conference on the neuroscience of music and dance, it was more than entertainment. It was evidence of a breakthrough in experiment design — one that opens up possibilities in an area that has long eluded scientific study: how music activates the brain, not in listeners, but in performers. It was also a reminder of the value artists can bring to scientific inquiry as active participants shaping studies of their craft. The neuroscientist Theodore Zanto, a member of the Neuroscape lab at U.C.S.F. that created the "Glass Brain" animations, said in an interview the next day that he was surprised — and moved — by the result. "It's probably the cleanest real-time representation of what's happening inside the brain during a piano performance," he said.

© 2026 The New York Times Company
Keyword: Hearing; Brain imaging
Link ID: 30115 - Posted: 02.11.2026
By Holly Barker

Synaptic proteins degrade more slowly in aged mice than in younger mice, a new study finds. Microglia appear to unburden the neurons of the excess proteins, but that accumulation may turn toxic, the findings suggest.

To function properly, cells need to periodically clear out old and damaged proteins, but that process stalls with age: Protein turnover is about 20 percent slower in the brains of older rodents than in youthful ones, according to an analysis of whole-brain samples. The new study is the first to probe protein clearance specifically in neurons in living animals. "Neurons face unique challenges to protein turnover," says study investigator Ian Guldner, a postdoctoral fellow in Tony Wyss-Coray's lab at Stanford University. For instance, neurons are long-lived and do not divide, so they cannot dilute old proteins among daughter cells. And unlike other proteins on the path to degradation, neuronal components must first navigate the axon—sometimes traveling as far as 1 meter, Guldner says.

In the new study, Guldner and his colleagues engineered mice to express a modified version of aminoacyl-tRNA synthetase—a component of the protein-synthesis machinery—in excitatory neurons. Every day for one week, mice of different ages received injections of chemically altered amino acids compatible only with that mutant enzyme. Neurons used the labeled amino acids to replenish proteins, enabling the group to track how quickly those proteins degraded over the subsequent two weeks. "The achievement lies in the technical advance, namely by being able to look at protein degradation and aggregation specifically in neuronal cells," says F. Ulrich Hartl, director of the Max Planck Institute of Biochemistry, who was not involved in the study.

© 2026 Simons Foundation
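The entry doesn't give the underlying math, but pulse-chase measurements like these are commonly summarized by fitting a first-order exponential decay to the labeled-protein signal and comparing rate constants across ages. A minimal sketch under that assumption, with invented numbers chosen only to echo the roughly 20 percent slower turnover mentioned above:

```python
# Sketch of a first-order decay fit, signal(t) = A * exp(-k * t), a standard
# way to summarize pulse-chase labeling data. The measurements below are
# invented for illustration; they are NOT the study's data.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, rate):
    return amplitude * np.exp(-rate * t)

days = np.array([0, 3, 7, 10, 14], dtype=float)
young_signal = np.array([1.00, 0.74, 0.50, 0.37, 0.25])  # hypothetical
aged_signal = np.array([1.00, 0.79, 0.57, 0.45, 0.33])   # hypothetical

for name, signal in [("young", young_signal), ("aged", aged_signal)]:
    (amplitude, rate), _ = curve_fit(decay, days, signal, p0=(1.0, 0.1))
    half_life = np.log(2) / rate
    print(f"{name}: labeled-protein half-life ~{half_life:.1f} days")
# With these toy numbers, the aged half-life comes out roughly 20-25% longer,
# i.e. turnover is correspondingly slower.
```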
Keyword: Development of the Brain; Glia
Link ID: 30114 - Posted: 02.11.2026
Ian Sample, Science editor

People who have a couple of teas or coffees a day have a lower risk of dementia and marginally better cognitive performance than those who avoid the drinks, researchers say. Health records for more than 130,000 people showed that over 40 years, those who routinely drank two to three cups of caffeinated coffee or one to two cups of caffeinated tea daily had a 15-20% lower risk of dementia than those who went without. The caffeinated coffee drinkers also reported slightly less cognitive decline than those who opted for decaf and performed better on some objective tests of brain function, according to a report published in the Journal of the American Medical Association.

The findings suggest habitual tea and coffee drinking is good for the brain, but the research cannot prove it, as caffeine drinkers may be less prone to dementia for other reasons. A similar link would arise if poor sleepers, who appear to have a greater risk of cognitive decline, steered clear of caffeine to get a better night's rest. "Our study alone can't prove causality, but to our knowledge, it is the best evidence to date looking at coffee and tea intake and cognitive health, and it is consistent with plausible biology," said the lead author, Yu Zhang, who studies nutritional epidemiology at Harvard University.

Coffee and tea contain caffeine and polyphenols that may protect against brain ageing by improving vascular health and reducing inflammation and oxidative stress, where harmful atoms and molecules called free radicals damage cells and tissues. Substances in the drinks could also work by improving metabolic health. Caffeine, for example, is linked to lower rates of type 2 diabetes, a known risk factor for dementia.

© 2026 Guardian News & Media Limited
Keyword: Drug Abuse; Alzheimers
Link ID: 30113 - Posted: 02.11.2026
By Alexa Robles-Gil

Having an imaginary friend, playing house or daydreaming about the future were long considered uniquely human abilities. Now, scientists have conducted the first study indicating that apes have the ability to play pretend as well. The findings, published Thursday in the journal Science, suggest that imagination is within the cognitive potential of an ape and can possibly be traced back to our common evolutionary ancestors.

"This is one of those things that we assume is distinct about our species," said Christopher Krupenye, a cognitive scientist at Johns Hopkins University and an author of the study. "This kind of finding really shows us that there's much more richness to these animals' minds than people give them credit for," he said.

Researchers knew that apes were capable of certain kinds of imagination. If an ape watches someone hide food in a cup, it can imagine that the food is there despite not seeing it. Because that perception is the reality — the food is actually there — it requires the ape to sustain only one view of the world, the one that it knows to be true. "This kind of work goes beyond it," Dr. Krupenye said. "Because it suggests that they can, at the same time, consider multiple views of the world and really distinguish what's real from what's imaginary."

Bonobos, an endangered species found only in the Democratic Republic of Congo, are difficult to study in the wild. For this research, Dr. Krupenye and Amalia Bastos, a cognitive scientist at the University of St. Andrews, relied on an organization known as the Ape Initiative to study Kanzi, a male bonobo famous for demonstrating some understanding of spoken English. (Kanzi was an enculturated ape born in captivity; he died last year at age 44.)

© 2026 The New York Times Company
Keyword: Consciousness; Evolution
Link ID: 30112 - Posted: 02.07.2026
By Nora Bradford

For more than a century, psychologists thought that the infant experience was, as the psychologist and philosopher William James famously put it, a "blooming, buzzing confusion." But new research suggests babies are born with a surprisingly sophisticated neurological toolkit that can organize the visual world into categories and pick out the beat in a song.

In the first of two new studies, neuroscientists managed a rare feat: performing functional MRI (fMRI) scans on more than 100 awake 2-month-old infants to see how their brains categorize visual objects. fMRI requires near-stillness, which makes scanning babies notoriously difficult. While the infants lay in the machines, images of animals, food, household objects and other familiar items appeared above their heads like "an IMAX for babies," says Cliona O'Doherty, a developmental neuroscientist at Stanford University who conducted the work at Trinity College Dublin. "MRI is difficult even under 'ideal' circumstances when research participants can follow instructions to hold still," says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. "Babies can't take instruction, so these researchers must have the patience of saints."

The imaging showed that a brain region called the ventral visual cortex, responsible for recognizing what we see, already responded similarly to that of adults, O'Doherty and colleagues report February 2 in Nature Neuroscience. In both adults and 2-month-olds, the ventral visual cortex's activity is distinct for different categories of objects, pushing back against the traditional view that the brain gradually learns to distinguish between categories throughout development.

© Society for Science & the Public 2000–2026
Keyword: Hearing; Development of the Brain
Link ID: 30111 - Posted: 02.07.2026
By Natalia Mesa

A region of the cerebellum shows language specificity akin to that of cortical language regions, indicating that it might be part of the broader language network, according to a new brain-imaging study. "This is the first time we see an area outside of the core left-hemisphere language areas that behaves so similarly to those core areas," says study investigator Ev Fedorenko, associate professor of brain and cognitive sciences at the Massachusetts Institute of Technology.

Initially thought to coordinate only movement, the cerebellum also contributes to cognitive processes, such as social reward, abstract reasoning and working memory, according to studies from the past decade. But even though people with cerebellar lesions show subtle language difficulties, the region's contributions to language have been ignored until recently, Fedorenko says. With this new work, "I think it becomes harder to dismiss language responses as somehow artifactual."

Fedorenko and her team analyzed nearly 1,700 whole-brain functional MRI experiments conducted over the course of 15 years. They originally collected and analyzed those scans to identify language-selective regions of the neocortex, but they reanalyzed many of them to determine the cerebellum's role in linguistic processing. Four cerebellar regions activated robustly when participants performed language-related tasks, such as reading passages of text or listening to someone else read the passages aloud, in line with previous work. But only one region responded exclusively to these language-related tasks; it did not activate during a variety of nonlinguistic tasks—including movement, arithmetic and spatial working memory tasks—or when participants listened to music or watched videos of faces and bodies. The findings were published last month in Neuron.

© 2026 Simons Foundation
Keyword: Language
Link ID: 30110 - Posted: 02.07.2026
By Molly Glick

Not long after upending federal diet guidelines to prioritize "real food" on our plates, United States Health and Human Services Secretary Robert F. Kennedy Jr. has offered a new piece of questionable advice. During a tour to promote these dietary recommendations, Kennedy recently claimed that a keto diet can cure schizophrenia—an assertion that experts have quickly thrown cold water on.

The ketogenic diet promotes fat-rich meals and low amounts of carbohydrates. While keto eating has skyrocketed in popularity in recent years—it was the most Googled diet in the U.S. in 2020—it was initially designed in the early 20th century for patients with epilepsy. More recent studies have confirmed that the diet is effective for certain types of epilepsy because it can control seizures. Meanwhile, we have much less evidence for its effects on symptoms of schizophrenia. So far, small studies have offered some early evidence that ketogenic diets may help people with the condition. "There is currently no credible evidence that ketogenic diets cure schizophrenia," Mark Olfson, a psychiatrist at Columbia University, told The New York Times.

Kennedy also proclaimed that the diet can essentially cure bipolar disorder, citing studies he recently read. But as with schizophrenia, keto's effects on bipolar disorder have been examined in only limited numbers of patients so far. Preliminary findings have also hinted that a keto diet could ease symptoms of depression. It may offer "small antidepressant benefits" for people who don't respond to medication, according to a recently published JAMA Psychiatry paper. But this work is in the early stages as well and remains far from conclusive.

© 2026 NautilusNext Inc.
Keyword: Schizophrenia; Depression
Link ID: 30109 - Posted: 02.07.2026
Peter Lukacs

Popular wisdom holds that we can 'rewire' our brains: after a stroke, after trauma, after learning a new skill, even with 10 minutes a day on the right app. The phrase is everywhere, offering something most of us want to believe: that when the brain suffers an assault, it can be restored with mechanical precision.

But 'rewiring' is a risky metaphor. It borrows its confidence from engineering, where a faulty system can be repaired by swapping out the right component, and it smuggles that confidence into biology, where change is slower, messier and often incomplete. The phrase has become a cultural mantra that is easier to comprehend than the scientific term, neuroplasticity – the brain's ability to change and form new neural connections throughout life. But what does it really mean to 'rewire' the brain? Is it a helpful shorthand for describing the remarkable plasticity of our nervous system, or has it become a misleading oversimplification that distorts our grasp of the science? After all, 'rewiring your brain' sounds like more than metaphor. It implies an engineering project: a system whose parts can be removed, replaced and optimised. The promise is both alluring and oddly mechanical.

The metaphor did indeed come from engineering. To an engineer, rewiring means replacing old and faulty circuits with new ones. As the vocabulary of technology crept into everyday life, it brought with it a new way of thinking about the human mind. The medical roots of the phrase trace back to 1912, when the British surgeon W Deane Butcher compared the body's neural system to a house's electrical wiring, describing how nerves connect to muscles much as wires connect appliances to a power source. By the 1920s, the Harvard psychologist Leonard Troland was referring to the visual system as 'an extremely intricate telegraphic system', reinforcing the comparison between brain function and electrical networks.

© Aeon Media Group Ltd. 2012-2026.
Keyword: Learning & Memory; Development of the Brain
Link ID: 30108 - Posted: 02.04.2026
Elizabeth Quill

Think about your breakfast this morning. Can you imagine the pattern on your coffee mug? The sheen of the jam on your half-eaten toast? Most of us can call up such pictures in our minds. We can visualize the past and summon images of the future. But for an estimated 4% of people, this mental imagery is weak or absent. When researchers ask them to imagine something familiar, they might have a concept of what it is, and words and associations might come to mind, but they describe their mind's eye as dark or even blank.

Systems neuroscientist Mac Shine at the University of Sydney, Australia, first realized that his mental experience differed in this way in 2013. He and his colleagues were trying to understand how certain types of hallucination come about, and were discussing the vividness of mental imagery. "When I close my eyes, there's absolutely nothing there," Shine recalls telling his colleagues. They immediately asked him what he was talking about. "Whoa. What's going on?" Shine thought. Neither he nor his colleagues had realized how much variation there is in the experiences people have when they close their eyes.

This moment of revelation is common to many people who don't form mental images. They report that they might never have thought about this aspect of their inner life if not for a chance conversation, a high-school psychology class or an article they stumbled across (see 'How do you imagine?'). Although scientists have known for more than a century that mental imagery varies between people, the topic received a surge of attention when, a decade ago, an influential paper coined the term aphantasia to describe the experience of people with no mental imagery.

© 2026 Springer Nature Limited
Keyword: Attention; Consciousness
Link ID: 30107 - Posted: 02.04.2026
By Ellen Barry

A new analysis of birth cohorts in the Canadian province of Ontario has found a striking rise in the incidence of psychotic disorders among young people, a finding that its authors said could reflect teens' increasing use of substances like cannabis, stimulants and hallucinogens.

The study, published on Monday in The Canadian Medical Association Journal, found that the rate of new diagnoses of psychotic disorders among people ages 14 to 20 increased by 60 percent between 1997 and 2023, while new diagnoses at older ages plateaued or declined. Compared with people born in the late 1970s, those born in the early 2000s were about twice as likely to have been diagnosed with a psychotic disorder by age 20. The researchers included 12 million people born in Ontario between 1960 and 2009, of whom 0.9 percent were diagnosed with a psychotic disorder during the study period.

The study was epidemiological and did not try to identify a cause for the rising prevalence. There are a number of possible explanations, among them older paternal age, the stress of migration, neonatal health problems and early-intervention programs that now regularly identify the disorders at younger ages, the authors note. But Dr. Daniel Myran, one of the study's authors, said he undertook the study, in part, to follow up on concerns that the legalization of cannabis might increase population-level rates of schizophrenia and other psychotic disorders. "I was expecting to see some increases in these younger folks, but I was quite surprised by the scale," said Dr. Myran, a family physician and research chair at North York General Hospital. He said the results suggested a need for more research into the impact of expanding cannabis use by young people.

© 2026 The New York Times Company
Keyword: Schizophrenia; Drug Abuse
Link ID: 30106 - Posted: 02.04.2026
By Marla Vacek Broadfoot

Nearly 1 in 8 dementia cases — about half a million nationwide — may be linked to insomnia. The new findings, reported December 27 in the Journals of Gerontology: Series A, add weight to growing evidence that sleep is a modifiable risk factor for dementia, akin to hearing loss and hypertension.

The study does not establish a direct cause-and-effect relationship between insomnia and dementia for individuals, says Yuqian Lin, a data analyst at Massachusetts General Hospital in Boston. Rather, she says, it looks at the overall extent to which insomnia may contribute to dementia across the population. Lin and her colleagues analyzed data from the National Health and Aging Trends Study, or NHATS, a long-running survey of 5,900 U.S. adults ages 65 and older. Participants reported whether they had difficulty falling asleep, staying asleep or both. Dementia was identified using standard research tools that rely on cognitive testing and reports from family members or caregivers.

To estimate the impact of insomnia on the population, Lin and her team calculated the proportion of dementia cases that could theoretically be prevented if insomnia-related sleep disturbances were eliminated. The calculation combined the prevalence of insomnia and dementia in the NHATS population with relative risk estimates drawn from recent large meta-analyses linking insomnia to dementia later in life.

© Society for Science & the Public 2000–2026.
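The "proportion of cases that could theoretically be prevented" described above is known as the population attributable fraction. The entry doesn't give the study's actual inputs, so the sketch below uses Levin's standard formula with hypothetical prevalence and relative-risk values that happen to land near the reported "nearly 1 in 8":

```python
# Minimal sketch of a population attributable fraction (PAF) calculation.
# The inputs are illustrative placeholders, NOT the study's figures.

def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: the share of cases that would not occur if the
    exposure (here, insomnia) were eliminated from the population."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: 30% of older adults report insomnia symptoms, with a
# relative risk of dementia of 1.5 compared with good sleepers.
paf = population_attributable_fraction(prevalence=0.30, relative_risk=1.5)
print(f"PAF: {paf:.1%}")  # ~13%, on the order of the reported 'nearly 1 in 8'
```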
Keyword: Sleep; Alzheimers
Link ID: 30105 - Posted: 02.04.2026
By Jake Buehler

Though fearsome predators, snakes can go weeks or even months without eating. Now, scientists think they may know how they do it. Snakes have lost the genes to produce ghrelin, a key hormone that regulates appetite, digestion and fat storage, researchers report today in Royal Society Open Biology. Chameleons and a group of desert lizards called toadhead agamas, which also go long stretches between meals, have lost the same genes, hinting that cutting off ghrelin is a key way to excel at fasting, possibly by suppressing appetite and holding onto fat stores.

"I give [the researchers] a lot of credit for looking more deeply into the data that was staring us all in the face—myself included," says Todd Castoe, a genomicist at the University of Texas at Arlington who was not involved with the study. The hormone is ubiquitous across vertebrates, from fish to mammals, so finding that reptiles have repeatedly ditched it is "pretty remarkable," he says.

When scientists first discovered ghrelin nearly 30 years ago, they thought this "hunger hormone" could be key to fighting obesity in humans. But it hasn't been that simple. Since then, researchers have found that ghrelin has a complicated role within a network of hormones constantly tweaking hunger and energy stores. And even though ghrelin is common across vertebrates, it has been unclear how the hormone evolved in different groups. So in the new study, Rui Resende Pinto, an evolutionary biologist at the University of Porto, and his colleagues focused on reptiles, many of which can go long periods without food. The researchers scanned the genomes of 112 species. In snakes, chameleons and toadhead agamas, ghrelin genes were either missing or so warped by mutations that they could no longer encode the hormone, the team found. The degree of the genes' erosion also varied considerably between snake families: Some snakes, such as boas and pythons, had malformed ghrelin genes, but others, such as vipers, cobras and their relatives, had barely anything left.
Keyword: Obesity; Evolution
Link ID: 30104 - Posted: 02.04.2026



