Most Recent Links
By Veronique Greenwood Encased in the skull, perched atop the spine, the brain has a carefully managed existence. It receives only certain nutrients, filtered through the blood-brain barrier; an elaborate system of protective membranes surrounds it. That privileged space contains a mystery. For more than a century, scientists have wondered: If it’s so hard for anything to get into the brain, how does waste get out? The brain has one of the highest metabolisms of any organ in the body, and that process must yield by-products that need to be removed. In the rest of the body, blood vessels are shadowed by a system of lymphatic vessels. Molecules that have served their purpose in the blood move into these fluid-filled tubes and are swept away to the lymph nodes for processing. But blood vessels in the brain have no such outlet. Several hundred kilometers of them, all told, seem to thread their way through this dense, busily working tissue without a matching waste system. However, the brain’s blood vessels are surrounded by open, fluid-filled spaces. In recent decades, the cerebrospinal fluid, or CSF, in those spaces has drawn a great deal of interest. “Maybe the CSF can be a highway, in a way, for the flow or exchange of different things within the brain,” said Steven Proulx, who studies the CSF system at the University of Bern. A recent paper in Cell contains a new report about what is going on around the brain and in its hidden cavities. A team at the University of Rochester led by the neurologist Maiken Nedergaard asked whether the slow pumping of the brain’s blood vessels might be able to push the fluid around, among, and in some cases through cells, to potentially drive a system of drainage. In a mouse model, researchers injected a glowing dye into CSF, manipulated the blood vessel walls to trigger a pumping action, and saw the dye concentration increase in the brain soon after. They concluded that the movement of blood vessels might be enough to move CSF, and possibly the brain’s waste, over long distances. © 2025 Simons Foundation.
Keyword: Brain imaging; Sleep
Link ID: 29722 - Posted: 03.27.2025
Nora Bradford Scientists have created the first map of the crucial structures called mitochondria throughout the entire brain ― a feat that could help to unravel age-related brain disorders [1]. The results show that mitochondria, which generate the energy that powers cells, differ in type and density in different parts of the brain. For example, the evolutionarily oldest brain regions have a lower density of mitochondria than newer regions. The map, which the study’s authors call the MitoBrainMap, is “both technically impressive and conceptually groundbreaking”, says Valentin Riedl, a neurobiologist at Friedrich-Alexander University in Erlangen, Germany, who was not involved in the project. The brain’s mitochondria are not just bit-part players. “The biology of the brain, we know now, is deeply intertwined with the energetics of the brain,” says Martin Picard, a psychobiologist at Columbia University in New York City, and a co-author of the study. And the brain accounts for 20% of the human body’s energy usage [2]. Wielding a tool typically used for woodworking, the study’s authors divided a slice of frozen human brain ― from a 54-year-old donor who died of a heart attack ― into 703 tiny cubes. Each cube measured 3 × 3 × 3 millimetres, which is comparable to the size of the units that make up standard 3D images of the brain. “The most challenging part was having so many samples,” says Picard. The team used biochemical and molecular techniques to determine the density of mitochondria in each of the 703 samples. In some samples, the researchers also estimated the mitochondria’s efficiency at producing energy. To extend their findings beyond a single brain slab, the authors developed a model to predict the numbers and types of mitochondria across the entire brain. They fed it brain-imaging data and the brain-cube data. To check their model, they applied it to other samples of the frozen brain slice and found that it accurately predicted the samples’ mitochondrial make-up. © 2025 Springer Nature
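In modeling terms, the step the authors describe is a standard supervised-learning workflow: learn a mapping from imaging-derived features of each tissue cube to its measured mitochondrial density, then confirm that mapping on cubes held out of training. The sketch below illustrates only that workflow; the features, labels, and regressor are synthetic placeholders invented here, not the study's data, model, or code.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Stand-in data: 703 tissue cubes, each summarized by a handful of
# imaging-derived features. The density is wired to depend on the
# features so the example has signal; none of this is real MitoBrainMap data.
n_cubes, n_features = 703, 12
X = rng.normal(size=(n_cubes, n_features))
weights = rng.normal(size=n_features)
y = X @ weights + 0.5 * rng.normal(size=n_cubes)  # "mitochondrial density"

# Fit on most cubes, then score predictions on held-out cubes, mirroring
# the authors' check that the model predicted unseen slab samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out R^2: {r2_score(y_test, model.predict(X_test)):.2f}")
```

A high score on cubes the model never saw is what licenses extrapolating the map beyond the single frozen slab.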
Keyword: Brain imaging
Link ID: 29721 - Posted: 03.27.2025
By RJ Mackenzie New footage documents microglia pruning synapses at high resolution and in real time. The recordings, published in January, add a new twist to a convoluted debate about the range of these cells’ responsibilities. Microglia are the brain’s resident immune cells. For about a decade, some researchers have also credited them with pruning excess synaptic connections during early brain development. But that idea was based on static images showing debris from destroyed synapses within the cells—which left open the possibility that microglia clean up after neurons do the actual pruning. In the January movies, though, a microglial cell expressing a green fluorescent protein clearly reaches out a ghostly green tentacle to a budding presynapse on a neuron and lifts it away, leaving the neighboring blue axon untouched. “Their imaging is superb,” says Amanda Sierra, a researcher at the Achucarro Basque Center for Neuroscience, who was not involved in the work. But “one single video, or even two single videos, however beautiful they are, are not sufficient evidence that this is the major mechanism of synapse elimination,” she says. In the new study, researchers isolated microglia and neurons from mice and grew them in culture with astrocytes, labeling the microglia, synapses and axons with different fluorescent dyes. Their approach ensured that the microglia formed ramified processes—thin, branching extensions that don’t form when they are cultured in isolation, says Ryuta Koyama, director of the Department of Translational Neurobiology at Japan’s National Center of Neurology and Psychiatry, who led the work. “People now know that ramified processes of microglia are really necessary to pick up synapses,” he says. “In normal culture systems, you can’t find ramified processes. They look amoeboid.” © 2025 Simons Foundation
Keyword: Learning & Memory; Glia
Link ID: 29720 - Posted: 03.27.2025
By Catherine Offord Scientists say they have found a long-sought-after population of stem cells in the retina of human fetuses that could be used to develop therapies for one of the leading causes of blindness. The use of fetal tissue, a source of ethical debate and controversy in some countries, likely wouldn’t be necessary for an eventual therapy: Transplanting similar human cells generated in the lab into the eyes of mice with retinal disease protected the animals’ vision, the team reported this week in Science Translational Medicine. “I see this as potentially a very interesting advancement of this field, where we are really in need of a regenerative treatment for retinal diseases,” says Anders Kvanta, a retinal specialist at the Karolinska Institute who was not involved in the work. He and others note that more evidence is needed to show the therapeutic usefulness of the newly described cells. The retina, a layer of light-sensing tissue at the back of the eye, can degenerate with age or because of an inherited condition such as retinitis pigmentosa, a rare disease that causes gradual breakdown of retinal cells. Hundreds of millions of people worldwide are affected by retinal degeneration, and many suffer vision loss or blindness as a result. Most forms can’t be treated. Scientists have long seen a potential solution in stem cells, which can regenerate and repair injured tissue. Several early-stage clinical trials are already evaluating the safety and efficacy of transplanting stem cells derived from cell lines established from human embryos, for example, or adult human cells that have been reprogrammed to a stem-like state. Other approaches include transplanting so-called retinal progenitor cells (RPCs)—immature cells that give rise to photoreceptors and other sorts of retinal cells—from aborted human fetuses. Some researchers have argued that another type of cell, sometimes referred to as retinal stem cells (RSCs), could also treat retinal degeneration. These cells’ long lifespans and ability to undergo numerous cell divisions could make them better candidates to regenerate damaged tissue than RPCs. RSCs have been found in the eyes of zebrafish and some other vertebrates, but evidence for their existence in mammals has been controversial. Reports announcing their discovery in adult mice in the early 2000s were later discounted.
Keyword: Vision; Stem Cells
Link ID: 29719 - Posted: 03.27.2025
Anna Bawden Health and social affairs correspondent Researchers have developed ultra-powerful scans that could enable surgery for previously treatment-resistant epilepsy. Globally, about 50 million people have epilepsy. In England, epileptic seizures are the sixth most common reason for hospital admission. About 360,000 people in the UK have focal epilepsy, which causes recurring seizures in a specific part of the brain. Many patients successfully manage their condition with medication, but for more than 100,000 patients the symptoms do not improve with drugs, leaving surgery as the only option. Finding brain lesions, a significant cause of epilepsy, can be tricky. Ultra-powerful MRI scanners are capable of identifying even tiny lesions in patients’ brains. These 7T MRI scanners produce much higher-resolution brain scans, enabling better detection of lesions. If surgeons can see the lesions on MRI scans, this can double the chances of the patient being free of seizures after surgery. But 7T scanners are also susceptible to “dark patches” known as signal dropouts. Now researchers in Cambridge and Paris have developed a new technique to overcome the problem. Scientists at the University of Cambridge’s Wolfson Brain Imaging Centre and the Université Paris-Saclay used eight transmitters around the brain, rather than the usual one, to “parallel transmit” MRI images, which significantly reduced the number of black spots. In the first study to use this approach, doctors at Addenbrooke’s hospital, Cambridge, trialled the technique with 31 drug-resistant epilepsy patients to see whether the parallel transmit 7T scanner was better than conventional 3T scanners at detecting brain lesions. The research, published in the journal Epilepsia, found that the parallel transmit 7T scanner identified previously unseen structural lesions in nine patients. © 2025 Guardian News & Media Limited
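At 7T, the radio-frequency field from a single transmitter interferes destructively in parts of the head, which is one way the “dark patch” dropouts arise; driving several transmitters with individually chosen amplitudes and phases can fill those nulls in. The toy sketch below illustrates that idea with invented field maps and numbers; it is a minimal illustration of multi-channel RF shimming, not the method used by the Cambridge and Paris teams.

```python
import numpy as np

n = 64  # toy 2-D "head" grid, n x n pixels
ys, xs = np.mgrid[0:n, 0:n]
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)  # 8 coils on a ring

# Hypothetical complex B1+ map per coil: amplitude decays and phase
# accrues with distance from the coil (a crude stand-in for wave effects).
maps = []
for a in angles:
    cx = n / 2 + 0.45 * n * np.cos(a)
    cy = n / 2 + 0.45 * n * np.sin(a)
    r = np.hypot(xs - cx, ys - cy)
    maps.append(np.exp(-r / 25) * np.exp(-1j * r / 5))
A = np.stack(maps, axis=-1).reshape(-1, 8)  # (pixels, channels)

# Baseline: all channels driven identically; phase differences cancel
# in places, producing low-signal "dark patch" pixels.
baseline = np.abs(A @ np.ones(8))

# Shim: least-squares complex channel weights that push the combined
# field toward a uniform target, the essence of parallel transmit.
w, *_ = np.linalg.lstsq(A, np.ones(A.shape[0], dtype=complex), rcond=None)
shimmed = np.abs(A @ w)

def dark_pixels(field):
    """Count pixels darker than 30% of the field's own median."""
    return int((field < 0.3 * np.median(field)).sum())

print("dark pixels, equal drive:  ", dark_pixels(baseline))
print("dark pixels, shimmed drive:", dark_pixels(shimmed))
```

In this toy setup the re-weighted drive typically leaves far fewer sub-threshold pixels than the equal drive, which is the qualitative effect the parallel-transmit scanner exploits.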
Keyword: Brain imaging; Epilepsy
Link ID: 29716 - Posted: 03.22.2025
Ari Daniel Tristan Yates has no doubt about her first memory, even if it is a little fuzzy. "I was about three and a half in Callaway Gardens in Georgia," she recalls, "just running around with my twin sister trying to pick up Easter eggs." But she has zero memories before that, which is typical. This amnesia of our babyhood is pretty much the rule. "We have memories from what happened earlier today and memories from what happened earlier last week and even from a few years ago," says Yates, who's a cognitive neuroscientist at Columbia University. "But all of us lack memories from our infancy." Is that because we don't make memories when we're babies, or is there something else responsible? Now, in new research published in the journal Science, Yates and her colleagues propose that babies are able to form memories, even if those memories become inaccessible later in life. These results might reveal something crucial about the earliest moments of our development. "That's the time when we learn who our parents are, that's when we learn language, that's when we learn how to walk," Yates says. "What happens in your brain in the first two years of life is magnificent," says Nick Turk-Browne, a cognitive neuroscientist at Yale University. "That's the period of by far the greatest plasticity across your whole life span. And better understanding how your brain learns and remembers in infancy lays the foundation for everything you know and do for the rest of your life." © 2025 npr
Keyword: Learning & Memory; Development of the Brain
Link ID: 29715 - Posted: 03.22.2025
By Christina Caron On TikTok, misinformation about attention deficit hyperactivity disorder can be tricky to spot, according to a new study. The study, published on Wednesday in the journal PLOS One, found that fewer than 50 percent of the claims made in some of the most popular A.D.H.D. videos on TikTok offered information that matched diagnostic criteria or professional treatment recommendations for the disorder. And, the researchers found, even study participants who had already been diagnosed with A.D.H.D. had trouble discerning which information was most reliable. About half of the TikTok creators included in the study were using the platform to sell products, such as fidget spinners, or services like coaching. None of them were licensed mental health professionals. The lack of nuance is concerning, said Vasileia Karasavva, a Ph.D. student in clinical psychology at the University of British Columbia in Vancouver and the lead author of the study. If TikTok creators talk about difficulty concentrating, she added, they don’t typically mention that the symptom is not specific to A.D.H.D. or that it could also be a manifestation of a different mental disorder, like depression or anxiety. “The last thing we want to do is discourage people from expressing how they’re feeling, what they’re experiencing and finding community online,” Ms. Karasavva said. “At the same time, it might be that you self-diagnose with something that doesn’t apply to you, and then you don’t get the help that you actually need.” Ms. Karasavva’s results echo those of a 2022 study that also analyzed 100 popular TikTok videos about A.D.H.D. and found that half of them were misleading. “The data are alarming,” said Stephen P. Hinshaw, a professor of psychology and an expert in A.D.H.D. at the University of California, Berkeley, who was not involved in either study. The themes of the videos might easily resonate with viewers, he added, but “accurate diagnosis takes access, time and money.” © 2025 The New York Times Company
Keyword: ADHD
Link ID: 29714 - Posted: 03.22.2025
By Paula Span Joan Presky worries about dementia. Her mother lived with Alzheimer’s disease for 14 years, the last seven in a memory-care residence, and her maternal grandfather developed dementia, too. “I’m 100 percent convinced that this is in my future,” said Ms. Presky, 70, a retired attorney in Thornton, Colo. Last year, she spent almost a full day with a neuropsychologist, undergoing an extensive evaluation. The results indicated that her short-term memory was fine — which she found “shocking and comforting” — and that she tested average or above in every cognitive category but one. She’s not reassured. “I saw what Alzheimer’s was like,” she said of her mother’s long decline. “The memory of what she went through is profound for me.” The prospect of dementia, which encompasses Alzheimer’s disease and a number of other cognitive disorders, so frightens Americans that a recent study projecting steep increases in cases over the next three decades drew enormous public attention. The researchers’ findings, published in January in Nature Medicine, even showed up as a joke on the Weekend Update segment of “Saturday Night Live.” “Dementia is a devastating condition, and it’s very much related to the oldest ages,” said Dr. Josef Coresh, director of the Optimal Aging Institute at NYU Langone Health and the senior author of the study. “The globe is getting older.” Now the findings are being challenged by other dementia researchers who say that while increases are coming, they will be far smaller than Dr. Coresh and his co-authors predicted. © 2025 The New York Times Company
Keyword: Alzheimers
Link ID: 29713 - Posted: 03.22.2025
By Laura Sanders There are countless metaphors for memory. It’s a leaky bucket, a steel trap, a file cabinet, words written in sand. But one of the most evocative — and neuroscientifically descriptive — invokes Lego bricks. A memory is like a Lego tower. It’s built from the ground up, then broken down, put away in bins and rebuilt in a slightly different form each time it’s taken out. This metaphor is beautifully articulated by psychologists Ciara Greene and Gillian Murphy in their new book, Memory Lane. Perhaps the comparison speaks to me because I have watched my kids create elaborate villages of Lego bricks, only for those villages to be dismantled, put away (after much nagging) and reconstructed, always with a similar overall structure but with minor and occasionally major changes. These villages’ blueprints are largely stable, but also fluid and flexible, subject to the material whims of the builders at any point in time. Memory works this way, too, Greene and Murphy propose. Imagine your own memory lane as a series of buildings, modified in ways both small and big each time you call them to mind. “As we walk down Memory Lane, the buildings we pass — our memories of individual events — are under constant reconstruction,” Greene and Murphy write. In accessible prose, the book covers a lot of ground, from how we form memories to how delicate those memories really are. Readers may find it interesting (or perhaps upsetting) to learn how bad we all are at remembering why we did something, from trivial choices, like buying an album, to consequential ones, such as a yes or no vote on an abortion referendum. People change their reasoning — or at least, their memories of their reasoning — on these sorts of events all the time. © Society for Science & the Public 2000–2025
Keyword: Learning & Memory
Link ID: 29712 - Posted: 03.22.2025
By Diana Kwon Charlene Sunkel was 19 when she started hearing voices and strange thoughts began filling her head. People wanted to infiltrate her mind, to poison her, to rat her out to the police. She stopped making eye contact, convinced that it would enable others to steal her thoughts. Once sociable and outgoing, Sunkel withdrew from friends and family, worried that they were conspiring against her. On her way to work, she had visions of men in hoods from the corner of her eye. As the illness progressed, she lost the ability to understand what people were saying, and when she spoke, the words would not come out right. About a year after her symptoms started, Sunkel was diagnosed with schizophrenia. Delusions, hallucinations and disordered thinking are collectively known as psychosis. These “positive” symptoms are among the most widely recognized aspects of schizophrenia. For about two thirds of patients with schizophrenia—which affects approximately 23 million people around the world—traditional antipsychotic drugs are often highly effective at treating psychosis. But these drugs frequently come with problematic side effects. And they do little to help with the so-called negative symptoms of schizophrenia, such as emotional flatness and social withdrawal, or with other issues involving thinking and memory referred to as cognitive problems. Until quite recently, all antipsychotics worked in essentially the same way. They blocked the activity of dopamine, a chemical messenger in the brain involved in motivation, learning, habit formation, and other processes. The successful treatment of psychosis with dopamine blockers led many clinicians to believe that they understood schizophrenia and that its underlying cause was an imbalance in dopamine. When a particular antipsychotic did not work in a patient, all doctors needed to do, they thought, was up the dosage or try another dopamine-targeting drug. But the arrival last September of a new drug, KarXT, supports an emerging awareness among clinicians that schizophrenia is more complex than most of them had realized. KarXT is the first antipsychotic to target a molecule other than dopamine. © 2024 Scientific American
Keyword: Schizophrenia
Link ID: 29711 - Posted: 03.19.2025
By Ellen Barry On a recent Friday morning, Daniel, a lawyer in his early 40s, was in a Zoom counseling session describing tapering off lithium. Earlier that week he had awakened with racing thoughts, so anxious that he could not read, and he counted the hours before sunrise. At those moments, Daniel doubted his decision to wean off the cocktail of psychiatric medications that had been part of his life since his senior year in high school, when he was diagnosed with bipolar disorder. Was this his body adjusting to the lower dosage? Was it a reaction to the taco seasoning he had eaten the night before? Or was it what his psychiatrist would have called it: a relapse? “It still does go to the place of — what if the doctors are right?” said Daniel. On his screen, Laura Delano nodded sympathetically. Ms. Delano is not a doctor; her main qualification, she likes to say, is having been “a professional psychiatric patient between the ages of 13 and 27.” During those years, when she attended Harvard and was a nationally ranked squash player, she was prescribed 19 psychiatric medications, often in combinations of three or four at a time. Then Ms. Delano decided to walk away from psychiatric care altogether, a journey she detailed in a new memoir, “Unshrunk: A Story of Psychiatric Treatment Resistance.” Fourteen years after taking her last psychotropic drug, Ms. Delano projects radiant good health that also serves as her argument — living proof that, all along, her psychiatrists were wrong. Since then, to the alarm of some physicians, an online DIY subculture focused on quitting psychiatric medications has expanded and begun to mature into a service industry. © 2025 The New York Times Company
Keyword: Schizophrenia
Link ID: 29710 - Posted: 03.19.2025
By Claudia López Lloreda For a neuroscientist, the opportunity to record single neurons in people doesn’t knock every day. It is so rare, in fact, that after 14 years of waiting by the door, Florian Mormann says he has recruited just 110 participants—all with intractable epilepsy. All participants had electrodes temporarily implanted in their brains to monitor their seizures. But the slow work to build this cohort is starting to pay off for Mormann, a group leader at the University of Bonn, and for other researchers taking a similar approach, according to a flurry of studies published in the past year. For instance, certain neurons selectively respond not only to particular scents but also to the words and images associated with them, Mormann and his colleagues reported in October. Other neurons help to encode stimuli, form memories and construct representations of the world, recent work from other teams reveals. Cortical neurons encode specific information about the phonetics of speech, two independent teams reported last year. Hippocampal cells contribute to working memory and map out time in novel ways, two other teams discovered last year, and some cells in the region encode information related to a person’s changing knowledge about the world, a study published in August found. These studies offer the chance to answer questions about human brain function that remain challenging to answer using animal models, says Ziv Williams, associate professor of neurosurgery at Harvard Medical School, who led one of the teams that worked on speech phonetics. He notes by way of example that “concept cells,” such as those Mormann identified, or the “Jennifer Aniston” neurons famously described in a 2005 study, have proved elusive in the monkey brain. © 2025 Simons Foundation
Keyword: Attention; Learning & Memory
Link ID: 29709 - Posted: 03.19.2025
Nicola Davis Science correspondent Cat owners are being asked to share their pet’s quirky traits and even post researchers their fur in an effort to shed light on how cats’ health and behaviour are influenced by their genetics. The scientists behind the project, Darwin’s Cats, are hoping to enrol 100,000 felines, from pedigrees to moggies, with the DNA of 5,000 cats expected to be sequenced in the next year. The team say the goal is to produce the world’s largest feline genetic database. “Unlike most existing databases, which tend to focus on specific breeds or veterinary applications, Darwin’s Cats is building a diverse, large-scale dataset that includes pet cats, strays and mixed breeds from all walks of life,” said Dr Elinor Karlsson, the chief scientist at the US nonprofit organisation Darwin’s Ark, director of the vertebrate genomics group at the Broad Institute of MIT and Harvard and associate professor at the UMass Chan medical school. “It’s important to note, this is an open data project, so we will share the data with other scientists as the dataset grows,” she added. The project follows on the heels of Darwin’s Dogs, a similar endeavour that has shed light on aspects of canine behaviour, disease and the genetic origins of modern breeds. Darwin’s Cats was launched in mid-2024 and already has more than 3,000 cats enrolled, although not all have submitted fur samples. Participants from all parts of the world are asked to complete a number of free surveys about their pet’s physical traits, behaviour, environment, and health. © 2025 Guardian News & Media Limited
Keyword: Genes & Behavior; Development of the Brain
Link ID: 29708 - Posted: 03.19.2025
By Evan Bush, Aria Bendix and Denise Chow “This is simply the end.” That was the five-word message that Rick Huganir, a neuroscientist at Johns Hopkins University in Baltimore, received from a colleague just before 6 p.m. two Fridays ago, with news that would send a wave of panic through the scientific community. When Huganir clicked on the link in the email, from fellow JHU neuroscientist Alex Kolodkin, he saw a new National Institutes of Health policy designed to slash federal spending on the indirect costs that keep universities and research institutes operating, including for new equipment, maintenance, utilities and support staff. “Am I reading this right 15%??” Huganir wrote back in disbelief, suddenly worried the cut could stall 25 years of work. In 1998, Huganir discovered a gene called SYNGAP1. About 1% of all children with intellectual disabilities have a mutation of the gene. He’s working to develop drugs to treat these children, who often have learning differences, seizures and sleep problems. He said his research is almost entirely reliant on NIH grants. The search for a cure for these rare disorders is a race against time, because researchers think treatment will be most effective if administered when patients are children. “We’re developing therapeutics for the kids and may have a therapeutic that could be curing these kids in the next several years, but that research is going to be compromised,” Huganir said in an interview, estimating that scientists in his field could start a Phase 1 clinical trial within the next five years. “Any delay or anything that inhibits our research is devastating to the parents.”
Keyword: Miscellaneous
Link ID: 29707 - Posted: 03.15.2025
By Emily Kwong You probably know the feeling of having a hearty meal at a restaurant, and feeling full and satisfied … only to take a peek at the dessert menu and decide the cheesecake looks just irresistible. So why is it that you just absolutely couldn't have another bite, but you somehow make an exception for a sweet treat? Or as Jerry Seinfeld might put it back in the day, "Whhaaaat's the deal with dessert?!" Scientists now have a better understanding of the neural origins of this urge thanks to a recent study published in the journal Science. Working with mice, researchers tried to set up a scenario similar to the human experience described above. They started by offering a standard chow diet to mice who hadn't eaten since the previous day. That "meal" period lasted for 90 minutes, and the mice ate until they couldn't eat any more. Then it was time for a 30-minute "dessert" period. In the first round of the experiment, researchers offered the mice more chow for dessert, and the mice ate just a little bit more. The second time around, during the "dessert" period, they offered a high-sugar feed to the mice for 30 minutes. The mice really went for the sugary feed, consuming six times more calories than when they had regular chow for dessert. In the mice, researchers monitored the activity of neurons that are associated with feelings of fullness, called POMC neurons. They're located in a part of the brain called the hypothalamus, which is "very important for promoting satiety," says Henning Fenselau, one of the study authors and a researcher at the Max Planck Institute for Metabolism Research in Cologne, Germany. © 2025 npr
Keyword: Obesity
Link ID: 29706 - Posted: 03.15.2025
By Christina Caron Victoria Ratliff, the wealthy financier’s wife on season 3 of HBO’s “The White Lotus,” has a problem: She keeps popping pills. And her drug of choice, the anti-anxiety medication lorazepam, has left her a little loopy. In the show, which follows guests vacationing at a fictional resort, Victoria pairs her medication with wine, which leads her to nod off at the dinner table. Sometimes she slurs her words. When she notices that her pill supply is mysteriously dwindling, she asks her children if they’re stealing them. “You don’t have enough lorazepam to get through one week at a wellness spa?” her daughter, Piper, asks. “The White Lotus” is not the only show to recently feature these drugs. The new Max series “The Pitt,” which takes place in an emergency department, includes a story line about a benzodiazepine called Librium. This isn’t a case of Hollywood taking dramatic liberties. Benzodiazepines such as lorazepam and chlordiazepoxide are notorious for having the potential to be highly addictive. They may also come with difficult — sometimes fatal — withdrawal symptoms. The characters’ misuse of benzodiazepine drugs is not uncommon, said Dr. Ian C. Neel, a geriatrician at UC San Diego Health. “We definitely see that a lot in real life as well.” And in recent years, he added, studies have shown that it’s a bigger problem than doctors initially realized. The drugs, which are often called benzos or downers, are commonly used to treat anxiety, panic attacks and sleep disorders like restless leg syndrome. But they can also be used for other reasons, such as to help people manage alcohol withdrawal. © 2025 The New York Times Company
Keyword: Drug Abuse; Stress
Link ID: 29705 - Posted: 03.15.2025
By Gina Kolata Women’s brains are superior to men’s in at least one respect — they age more slowly. And now, a group of researchers reports that they have found a gene in mice that rejuvenates female brains. Humans have the same gene. The discovery suggests a possible way to help both women and men avoid cognitive declines in advanced age. The study was published Wednesday in the journal Science Advances. The journal also published two other studies on women’s brains, one on the effect of hormone therapy on the brain and another on how age at the onset of menopause shapes the risk of getting Alzheimer’s disease. The evidence that women’s brains age more slowly than men’s seemed compelling. Researchers, looking at the way the brain uses blood sugar, had already found that the brains of aging women are years younger, in metabolic terms, than the brains of aging men. Other scientists, examining markings on DNA, found that female brains are a year or so younger than male brains. And careful cognitive studies of healthy older people found that women had better memories and cognitive function than men of the same age. Dr. Dena Dubal, a professor of neurology at the University of California, San Francisco, set out to understand why. “We really wanted to know what could underlie this female resilience,” Dr. Dubal said. So she and her colleagues focused on the one factor that differentiates females and males: the X chromosome. Females have two X chromosomes; males have one X and one Y chromosome. Early in pregnancy, one of the X chromosomes in females shuts down and its genes go nearly silent. But that silencing changes in aging, Dr. Dubal found. © 2025 The New York Times Company
Keyword: Alzheimers; Sexual Behavior
Link ID: 29704 - Posted: 03.12.2025
By Kelly Servick New York City—A recent meeting here on consciousness started from a relatively uncontroversial premise: A newly fertilized human egg isn’t conscious, and a preschooler is, so consciousness must emerge somewhere in between. But the gathering, sponsored by New York University (NYU), quickly veered into more unsettled territory. At the Infant Consciousness Conference from 28 February to 1 March, researchers explored when and how consciousness might arise, and how to find out. They also considered hints from recent brain imaging studies that the capacity for consciousness could emerge before birth, toward the end of gestation. “Fetal consciousness would have been a less central topic at a meeting like this a few years ago,” says Claudia Passos-Ferreira, a bioethicist at NYU who co-organized the gathering. The conversation has implications for how best to care for premature infants, she says, and intersects with thorny issues such as abortion. “Whatever you claim about this, there are some moral implications.” How to define consciousness is itself the subject of debate. “Each of us might have a slightly different definition,” neuroscientist Lorina Naci of Trinity College Dublin acknowledged at the meeting before describing how she views consciousness—as the capacity to have an experience or a subjective point of view. There’s also vigorous debate about where consciousness arises in the brain and what types of neural activity define it. That makes it hard to agree on specific markers of consciousness in beings—such as babies—that can’t talk about their experience. Further complicating the picture, the nature of consciousness could be different for infants than adults, researchers noted at the meeting. And it may emerge gradually versus all at once, on different timescales for different individuals.
Keyword: Consciousness; Development of the Brain
Link ID: 29703 - Posted: 03.12.2025
By Mark Humphries There are many ways neuroscience could end. Prosaically, society may just lose interest. Of all the ways we can use our finite resources, studying the brain has only recently become one; it may one day return to dust. Other things may take precedence, like feeding the planet or preventing an asteroid strike. Or neuroscience may end as an incidental byproduct, one of the consequences of war or of thoughtlessly disassembling a government or of being sideswiped by a chunk of space rock. We would prefer it to end on our own terms. We would like neuroscience to end when we understand the brain. Which raises the obvious question: Is this possible? For the answer to be yes, three things need to be true: that there is a finite amount of stuff to know, that stuff is physically accessible and that we understand all the stuff we obtain. But each of these we can reasonably doubt. The existence of a finite amount of knowledge is not a given. Some arguments suggest that an infinite amount of knowledge is not only possible but inevitable. Physicist David Deutsch proposes the seemingly innocuous idea that knowledge grows when we find a good explanation for a phenomenon, an explanation whose details are hard to vary without changing its predictions and hence breaking it as an explanation. Bad explanations are those whose details can be varied without consequence. Ancient peoples attributing the changing seasons to the gods is a bad explanation, for those gods and their actions can be endlessly varied without altering the existence of four seasons occurring in strict order. Our attributing the changing seasons to the Earth’s tilt in its orbit of the sun is a good explanation, for if we omit the tilt, we lose the four seasons and the opposite patterns of seasons in the Northern and Southern hemispheres. A good explanation means we have nailed down some property of the universe sufficiently well that something can be built upon it. © 2025 Simons Foundation
Keyword: Consciousness
Link ID: 29702 - Posted: 03.12.2025
By Meghan Rosen and Laura Sanders Millions of Americans take antidepressants to help manage everything from depression and anxiety to post-traumatic stress disorder. Now, the Trump administration has announced that these drugs, which have been in use for decades and gone through rigorous testing, will be subject to new scrutiny. Invoking a burden of chronic disease, including in children, the administration has pledged to, in its words, “assess the prevalence of and threat posed by” certain commonly prescribed medications. In the coming months, its “Make America Healthy Again” commission plans to review a slew of existing medications, including SSRIs, or selective serotonin reuptake inhibitors. More than 10 percent of U.S. adults took antidepressants over the previous 30 days, data from 2015 to 2018 show. And SSRIs are among the most widely prescribed of those drugs. U.S. Health and Human Services Secretary Robert F. Kennedy Jr. has long questioned the safety of antidepressants and other psychiatric medicines, making misleading and unsubstantiated claims about the drugs. For instance, as recently as his January confirmation hearings, he likened taking SSRIs to having a heroin addiction. He also has suggested — without evidence — that SSRIs play a role in school shootings. With the executive order and statements like these, “it’s implied there is something nefarious or harmful” about antidepressants and related medications, says Lisa Fortuna, chair of the American Psychiatric Association’s Council on Children, Adolescents and Their Families. “People may think that they’re dangerous drugs.” © Society for Science & the Public 2000–2025
Keyword: Depression; Sexual Behavior
Link ID: 29701 - Posted: 03.12.2025