Most Recent Links
By Rachel E. Gross Estrogen is the Meryl Streep of hormones, its versatility renowned among scientists. Besides playing a key role in sexual and reproductive health, it strengthens bones, keeps skin supple, regulates sugar levels, increases blood flow, lowers inflammation and supports the central nervous system. “You name the organ, and it promotes the health of that organ,” said Roberta Brinton, a neuroscientist who leads the Center for Innovation in Brain Science at the University of Arizona. But appreciation for estrogen’s more expansive role has been slow in coming. The compound was first identified in 1923 and was henceforth known as the “female sex hormone” — a one-dimensional reputation baked into its very name. “Estrogen” comes from the Greek “oestrus,” a literal gadfly known for whipping cattle into a mad frenzy. Scientifically, estrus has come to mean the period in the reproductive cycles of some mammals when females are fertile and sexually active. Women don’t enter estrus; they menstruate. Nevertheless, when researchers named estrogen, these were the roles it was cast in: inducing a frenzy and supporting female sexual health. Now, estrogen is gaining recognition for what may be its most important role yet: influencing the brain. Neuroscientists have learned that estrogen is vital to healthy brain development but that it also contributes to conditions including multiple sclerosis and Alzheimer’s. Changes in estrogen levels — either from the menstrual cycle or external sources — can exacerbate migraines, seizures and other common neurological symptoms. “There are a huge number of neurological diseases that can be affected by sex hormone fluctuations,” said Dr. Hyman Schipper, a neurologist at McGill University who listed a dozen of them in a recent review in the journal Brain Medicine. 
“And many of the therapies that are used in reproductive medicine should be repurposed for these neurological diseases.” Today, the insight that sex hormones are also brain hormones is transforming how doctors approach brain health and disease — helping them guide treatment, avoid harmful interactions and develop new hormone-based therapies. © 2025 The New York Times Company
Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 29757 - Posted: 04.23.2025
By Elise Cutts Food poisoning isn’t an experience you’re likely to forget — and now, scientists know why. A study published April 2 in Nature has unraveled neural circuitry in mice that makes food poisoning so memorable. “We’ve all experienced food poisoning at some point … And not only is it terrible in the moment, but it leads us to not eat those foods again,” says Christopher Zimmerman of Princeton University. Luckily, developing a distaste for foul food doesn’t take much practice — one ill-fated encounter with an undercooked enchilada or contaminated hamburger is enough, even if it takes hours or days for symptoms to set in. The same is true for other animals, making food poisoning one of the best ways to study how our brains connect events separated in time, says neuroscientist Richard Palmiter of the University of Washington in Seattle. Mice usually need an immediate reward or punishment to learn something, Palmiter says; even just a minute’s delay between cause (say, pulling a lever) and effect (getting a treat) is enough to prevent mice from learning. Not so for food poisoning. Despite substantial delays, their brains have no trouble associating an unfamiliar food in the past with tummy torment in the present. Researchers knew that a brain region called the amygdala represents flavors and decides whether or not they’re gross. Palmiter’s group had also shown that the gut tells the brain it’s feeling icky by activating specific “alarm” neurons, called CGRP neurons. “They respond to everything that’s bad,” Palmiter says. © Society for Science & the Public 2000–2025.
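One standard textbook way to model learning across a delay (offered here only as an illustration, not as the circuit mechanism the study describes) is an eligibility trace: the flavor leaves a slowly decaying memory trace, and a later sickness signal converts whatever remains of that trace into an aversion. A minimal sketch, with arbitrary parameter values:

```python
# Toy eligibility-trace model of delayed taste-aversion learning.
# A flavor experience leaves a slowly decaying "trace"; when a delayed
# sickness signal arrives, the remaining trace is converted into aversion.
# Illustrative only -- the parameter values and the trace mechanism are
# assumptions, not taken from the study described above.

import math

def aversion_after_delay(delay_hours, tau_hours=6.0, learning_rate=1.0):
    """Aversion learned when sickness follows a novel flavor by `delay_hours`."""
    trace = math.exp(-delay_hours / tau_hours)  # decaying memory of the flavor
    return learning_rate * trace                # sickness converts trace to aversion

# A slowly decaying trace still supports learning hours later, while a
# fast-decaying one (as in ordinary lever-press learning) does not.
slow = aversion_after_delay(3.0, tau_hours=6.0)    # taste-aversion-like timescale
fast = aversion_after_delay(3.0, tau_hours=0.01)   # ordinary reward-learning timescale
print(f"slow trace: {slow:.2f}, fast trace: {fast:.2g}")
```

With a six-hour trace, a three-hour delay still leaves most of the memory available to be stamped in; with a seconds-scale trace, essentially nothing survives.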
Keyword: Learning & Memory; Emotions
Link ID: 29756 - Posted: 04.23.2025
By Bruce Rosen The past two decades—and particularly the past 10 years, with the tool-focused efforts of the BRAIN Initiative—have delivered remarkable advances in our ability to study and manipulate the brain, both in exquisite cellular detail and across increasing swaths of brain territory. These advances resulted from improvements in tools such as optical imaging, chemogenetics and multiprobe electrodes, to name a few. Powerful as these technologies are, though, their invasive nature makes them ill-suited for widespread adoption in human brain research. Fortunately, our fundamental understanding of the physics and engineering behind noninvasive modalities—based largely on recording, generating and manipulating electromagnetic and acoustic fields in the human brain—has also progressed over the past decade. These advances are on the threshold of providing much more detailed recordings of electromagnetic activity, not only across the human cortex but at depth. And these same principles can improve our ability to precisely and noninvasively stimulate the human brain. Though these tools have limitations compared with their invasive counterparts, their noninvasive nature makes them suitable for wide-scale investigation of the links between human behavior and action, as well as for individually understanding and treating an array of brain disorders. The most common method to assess brain electrophysiology is the electroencephalogram (EEG), first developed in the 1920s and now routinely used for both basic neuroscience and the clinical diagnosis of conditions ranging from epilepsy to sleep disorders to traumatic brain injury. It’s widely used, given its simplicity and low cost, but it has drawbacks. 
Understanding exactly where the EEG signals arise from in the brain is often difficult, for example; electric current from the brain must pass through multiple tissue layers (including overlying brain itself) before it can be detected with electrodes on the scalp surface, blurring the spatial resolution. Advanced computational methods combined with imaging data from MRI can partially mitigate these issues, but the analysis is complex, and results are imperfect. Still, because EEG can be readily combined with behavioral assessments and other
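The spatial blurring described above can be illustrated with a toy one-dimensional model: two nearby cortical sources are smoothed by an intervening-tissue kernel and arrive at the "scalp" as a single broad bump. The kernel width and geometry here are arbitrary choices for illustration, not a real head model:

```python
# Toy illustration of why scalp EEG blurs nearby sources: two distinct
# cortical "sources" pass through a smoothing kernel (standing in for
# the overlying tissue layers) and merge into one broad bump at the sensors.

import math

def gaussian_kernel(width, sigma):
    ks = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-width, width + 1)]
    s = sum(ks)
    return [k / s for k in ks]  # normalized so total signal is conserved

def smooth(signal, kernel):
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += k * signal[idx]
        out.append(acc)
    return out

def count_peaks(signal, floor=1e-3):
    """Count local maxima that rise above a small noise floor."""
    return sum(
        1 for i in range(1, len(signal) - 1)
        if signal[i] > floor and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
    )

cortex = [0.0] * 60
cortex[25] = 1.0   # source A
cortex[32] = 1.0   # source B, a short distance away
scalp = smooth(cortex, gaussian_kernel(15, 5.0))

print(count_peaks(cortex), "cortical sources ->", count_peaks(scalp), "scalp bump(s)")
```

Because the two sources sit closer together than the blur width, the scalp recording cannot distinguish them; this is the resolution loss that MRI-informed source modeling only partially recovers.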
Keyword: Brain imaging
Link ID: 29755 - Posted: 04.23.2025
William Wright & Takaki Komiyama

Every day, people are constantly learning and forming new memories. When you pick up a new hobby, try a recipe a friend recommended or read the latest world news, your brain stores many of these memories for years or decades. But how does your brain achieve this incredible feat? In our newly published research in the journal Science, we have identified some of the “rules” the brain uses to learn.

Learning in the brain

The human brain is made up of billions of nerve cells. These neurons conduct electrical pulses that carry information, much like how computers use binary code to carry data. These electrical pulses are communicated with other neurons through connections between them called synapses. Individual neurons have branching extensions known as dendrites that can receive thousands of electrical inputs from other cells. Dendrites transmit these inputs to the main body of the neuron, which then integrates all these signals to generate its own electrical pulses. It is the collective activity of these electrical pulses across specific groups of neurons that forms the representations of different information and experiences within the brain.

For decades, neuroscientists have thought that the brain learns by changing how neurons are connected to one another. As new information and experiences alter how neurons communicate with each other and change their collective activity patterns, some synaptic connections are made stronger while others are made weaker. This process of synaptic plasticity is what produces representations of new information and experiences within your brain. In order for your brain to produce the correct representations during learning, however, the right synaptic connections must undergo the right changes at the right time. The “rules” that your brain uses to select which synapses to change during learning – what neuroscientists call the credit assignment problem – have remained largely unclear.
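For a concrete picture of "some synaptic connections are made stronger while others are made weaker," here is a minimal sketch of the classic Hebbian rule, a textbook formalization that long predates (and is distinct from) the rule identified in the new Science paper. All numbers are arbitrary:

```python
# A minimal version of the textbook idea that synapses active together
# with the neuron's output get stronger, while unused synapses slowly decay.
# This is NOT the plasticity rule reported in the paper described above;
# it is only an illustration of synaptic strengthening and weakening.

def hebbian_step(weights, pre, post, lr=0.1, decay=0.02):
    """One plasticity step: strengthen co-active synapses, slowly decay the rest."""
    return [
        w + lr * x * post - decay * w   # Hebbian growth term minus passive decay
        for w, x in zip(weights, pre)
    ]

weights = [0.5, 0.5, 0.5]
# Synapse 0 is repeatedly active together with the neuron's output (post = 1);
# synapse 2 is active only when the neuron stays silent (post = 0).
for _ in range(20):
    weights = hebbian_step(weights, pre=[1.0, 0.0, 0.0], post=1.0)
    weights = hebbian_step(weights, pre=[0.0, 0.0, 1.0], post=0.0)

# Co-active synapse strengthened; the others decay.
print([round(w, 2) for w in weights])
```

The credit assignment problem the authors describe is precisely the question of which rule of this general kind (and applied to which synapses, at which times) the brain actually uses.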
© 2010–2025, The Conversation US, Inc.
Keyword: Learning & Memory
Link ID: 29754 - Posted: 04.23.2025
By Michael Erard In many Western societies, parents eagerly await their children’s first words, then celebrate their arrival. There’s also vast scientific and popular attention to early child language. Yet there is (and was) surprisingly little hullabaloo sparked by the first words and hand signs displayed by great apes.

WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, author and linguist Michael Erard shares a story that didn’t make it into his recent book “Bye Bye I Love You: The Story of Our First and Last Words” (MIT Press, 344 pages).

As far back as 1916, scientists have been exploring the linguistic abilities of humans’ closest relatives by raising them in language-rich environments. But the first moments in which these animals did cross a communication threshold created relatively little fuss in both the scientific literature and the media. Why? Consider, for example, the first sign by Washoe, a young chimpanzee that was captured in the wild and transported in 1966 to a laboratory at the University of Nevada, where she was studied by two researchers, Allen Gardner and Beatrice Gardner. Washoe was taught American Sign Language in family-like settings that would be conducive to communicative situations. “Her human companions,” wrote the Gardners in 1969, “were to be friends and playmates as well as providers and protectors, and they were to introduce a great many games and activities that would be likely to result in maximum interaction.” When the Gardners wrote about the experiments, they did note her first uses of specific signs, such as “toothbrush,” that didn’t seem to echo a sign a human had just used. These moments weren’t ignored, yet you have to pay very close attention to their writings to find the slightest awe or enthusiasm. Fireworks it is not.
Keyword: Language; Evolution
Link ID: 29753 - Posted: 04.23.2025
By Jacek Krywko edited by Allison Parshall There are only so many colors that the typical human eye can see; estimates put the number just below 10 million. But now, for the first time, scientists say they’ve broken out of that familiar spectrum and into a new world of color. In a paper published on Friday in Science Advances, researchers detail how they used a precise laser setup to stimulate the retinas of five participants, making them the first humans to see a color beyond our visual range: an impossibly saturated bluish green. Our retinas contain three types of cone cells, photoreceptors that detect the wavelengths of light. S cones pick up relatively short wavelengths, which we see as blue. M cones react to medium wavelengths, which we see as green. And L cones are triggered by long wavelengths, which we see as red. These red, green and blue signals travel to the brain, where they’re combined into the full-color vision we experience. But these three cone types handle overlapping ranges of light: the light that activates M cones will also activate either S cones or L cones. “There’s no light in the world that can activate only the M cone cells because, if they are being activated, for sure one or both other types get activated as well,” says Ren Ng, a professor of electrical engineering and computer science at the University of California, Berkeley. Ng and his research team wanted to try getting around that fundamental limitation, so they developed a technicolor technique they call “Oz.” “The name comes from the Wizard of Oz, where there’s a journey to the Emerald City, where things look the most dazzling green you’ve ever seen,” Ng explains. On their own expedition, the researchers used lasers to precisely deliver tiny doses of light to select cone cells in the human eye. First, they mapped a portion of the retina to identify each cone cell as either an S, M or L cone. Then, using the laser, they delivered light only to M cone cells. 
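Ng's point about overlap can be made concrete with a toy model. Treating each cone class's sensitivity as a Gaussian (an idealization; real cone fundamentals are not Gaussian, and the peak wavelengths below are approximate), a sweep of the visible spectrum finds no wavelength that clears an activation threshold for M cones alone:

```python
# Toy model of cone spectral overlap. Cone sensitivities are idealized here
# as Gaussians (real cone fundamentals are not Gaussian, and the peaks and
# widths below are rough assumptions), but the conclusion survives: no single
# wavelength of light activates M cones without also activating S or L.

import math

def cone_response(wavelength_nm, peak_nm, sigma_nm=45.0):
    return math.exp(-0.5 * ((wavelength_nm - peak_nm) / sigma_nm) ** 2)

PEAKS = {"S": 445.0, "M": 535.0, "L": 565.0}   # approximate peak wavelengths
THRESHOLD = 0.05  # treat anything above 5% of peak sensitivity as "activated"

def m_only(wavelength_nm):
    """True if this wavelength would activate M cones but neither S nor L."""
    r = {name: cone_response(wavelength_nm, peak) for name, peak in PEAKS.items()}
    return r["M"] > THRESHOLD and r["S"] <= THRESHOLD and r["L"] <= THRESHOLD

# Sweep the visible spectrum: no wavelength stimulates only the M cones.
print(any(m_only(wl) for wl in range(380, 751)))
```

This is why the Oz technique resorts to aiming light at individually mapped M cones rather than searching for a special wavelength: in this toy sweep, every wavelength that drives M also drives S or L.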
© 2025 SCIENTIFIC AMERICAN
Keyword: Vision
Link ID: 29752 - Posted: 04.19.2025
Smriti Mallapaty Two hotly anticipated clinical trials using stem cells to treat people with Parkinson’s disease have published encouraging results. The early-stage trials demonstrate that injecting stem-cell-derived neurons into the brain is safe [1,2]. They also show hints of benefit: the transplanted cells can replace the dopamine-producing cells that die off in people with the disease, and survive long enough to produce the crucial neurotransmitter. Some participants experienced visible reductions in tremors. The studies, published by two groups in Nature today, are “a big leap in the field”, says Malin Parmar, a stem-cell biologist at Lund University, Sweden. “These cell products are safe and show signs of cell survival.” The trials were mainly designed to test safety and were small, involving 19 individuals in total, which is not enough to indicate whether the intervention is effective, says Parmar. “Some people got slightly better and others didn’t get worse,” says Jeanne Loring, a stem-cell researcher at Scripps Research in La Jolla, California. This could be due to the relatively small number of cells transplanted in these first early-stage trials. Parkinson’s is a progressive neurological condition driven by the loss of dopamine-producing neurons, which causes tremors, stiffness and slowness in movement. There is currently no cure for the condition, which is projected to affect 25 million people globally by 2050. Cell therapies are designed to replace damaged neurons, but previous trials using fetal tissue transplants have had mixed results. The latest findings are the first among a handful of global trials testing more-advanced cell therapies. © 2025 Springer Nature Limited
Keyword: Parkinsons; Stem Cells
Link ID: 29751 - Posted: 04.19.2025
By Jan Hoffman Fentanyl overdoses have finally begun to decline over the past year, but that good news has obscured a troubling shift in illicit drug use: a nationwide surge in methamphetamine, a powerful, highly addictive stimulant. This isn’t the ’90s club drug or even the blue-white tinged crystals cooked up in “Breaking Bad.” As cartels keep revising lab formulas to make their product more addictive and potent, often using hazardous chemicals, many experts on addiction think that today’s meth is more dangerous than older versions. Here is what to know.

What is meth?

Meth, short for methamphetamine, is a stimulant, a category of drugs that includes cocaine. Meth is far stronger than coke, with effects that last many hours longer. It comes in pill, powder or paste form and is smoked, snorted, swallowed or injected. Meth jolts the central nervous system and prompts the brain to release exorbitant amounts of reinforcing, feel-good neurotransmitters such as dopamine, which help people experience euphoria and drive them to keep seeking it. Meth is an amphetamine, a category of stimulant drugs perhaps best known to the public as the A.D.H.D. prescription medications Adderall and Vyvanse. Those stimulants are milder and shorter-lasting than meth but, if misused, they too can be addictive.

What are meth’s negative side effects?

They vary, depending on the tolerance of the person taking it and the means of ingestion. After the drug’s rush has abated, many users keep bingeing it. They forget to drink water and are usually unable to sleep or eat for days. In this phase, known as “tweaking,” users can become hyper-focused on activities such as taking apart bicycles — which they forget to reassemble — or spending hours collecting things like pebbles and shiny gum wrappers. They may become agitated and aggressive. Paranoia, hallucinations and psychosis can set in. © 2025 The New York Times Company
Keyword: Drug Abuse
Link ID: 29750 - Posted: 04.19.2025
By Erin Blakemore Consuming more than eight alcoholic drinks a week is associated with brain injuries linked to Alzheimer’s disease and cognitive decline, a recent study in the journal Neurology suggests. The analysis looked for links between heavy drinking and brain health. Researchers used autopsy data from the Biobank for Aging Studies at the University of São Paulo Medical School in Brazil collected between 2004 and 2024. The team analyzed data from 1,781 people ages 50 or older at death. The average age at death was 74.9. With the help of surveys of the deceased’s next of kin, researchers gathered information about the deceased’s cognitive function and alcohol consumption in the three months before their death. Among participants, 965 never drank, 319 drank up to seven drinks per week (moderate drinking), and 129 had eight or more drinks per week (heavy drinking). Another 368 were former heavy drinkers who had stopped drinking before their last three months of life. The analysis showed that heavy drinkers and former heavy drinkers, respectively, had 41 percent and 31 percent higher odds of neurofibrillary tangles — clumps of the protein tau that accumulate inside brain neurons and have been associated with Alzheimer’s disease. Moderate, heavy and former heavy drinkers also had a higher risk of hyaline arteriolosclerosis, which thickens the walls of small blood vessels in the brain, impeding blood flow and causing brain damage over time. Though 40 percent of those who never drank had vascular brain lesions, they were more common in moderate (44.6 percent), heavy (44.1 percent) and former heavy drinkers (50.2 percent), the study found.
Keyword: Drug Abuse; Alzheimers
Link ID: 29749 - Posted: 04.19.2025
By Frieda Klotz Up until a couple years ago, an attorney in his late 30s used to repeatedly check his vehicle for signs that he might have injured a pedestrian. The man had no reason to think he had actually hit someone, but his obsessive-compulsive disorder made him fearful. “I spent hours examining the car,” he said. He’d feel the body for dents, take photos, and was never quite done. At its worst, the condition consumed up to 17 hours of his day. “My mind was hijacked for 25 years by a devil that was OCD,” said the man, who asked that his name not be used due to the stigma surrounding mental health disorders and the treatment he’s undergone. He was first diagnosed with the disorder, which is characterized by obsessive preoccupations that interfere with daily life, when he was 15, shortly after his mother died. In the intervening years, he tried numerous forms of therapy, medication, brain stimulation, and residential treatments — all of which, he estimated, cost him hundreds of thousands of dollars. None of them helped long-term. In 2022, his father heard about a brain surgery intended to relieve OCD symptoms and found it was offered by two hospitals affiliated with Brown University. In December 2023, a neurosurgeon created a small hole in the man’s skull and deployed heat to burn away brain tissue. The resulting lesion is thought to disrupt the interaction between parts of the brain associated with OCD symptoms. “I didn’t think it would work at all, because nothing had worked on me,” he told Undark on a Zoom call with his neuropsychologist at Brown, Nicole McLaughlin, and a communications officer from the health system where the attorney had his surgery. “It was a complete miracle.” He added that he was still aware of his repeating thoughts after the surgery, but they no longer bothered him: “It was unbelievable.”
Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 29748 - Posted: 04.16.2025
The devastating stimulant has been hitting Portland, Maine, hard, even competing with fentanyl as the street drug of choice. Although a fentanyl overdose can be reversed with Narcan, no medicine can reverse a meth overdose. Nor has any been approved to treat meth addiction. Unlike fentanyl, which sedates users, meth can make people anxious and violent. Its effects can overwhelm not just users but community residents and emergency responders. John once fielded customer complaints for a telecommunications company. Now he usually hangs out with friends in the courtyard of a center offering services to help people who use drugs, hitting his pipe, or as he calls it, “getting methicated.” He usually lives outdoors, though he can sometimes handle a few days at a shelter. By noon, he tries to stop smoking meth, so he can get to sleep later that night. Quitting is not on his radar: meth rules his life. “We cannot ride on the railroad, the railroad rides upon us,” he said, with a nod to Henry David Thoreau. Most weekdays, Bill Burns, an addiction and mental health specialist with the Portland police, walks the Bayside neighborhood, checking in on folks. On Thursdays, he rewards the regulars he drives to addiction treatment clinics with his own homemade jolts of dopamine: sugar-dense, Rice Krispie-style treats. Recently, he encountered a young man in full meth psychosis, wild-eyed, bare-chested and bleeding, flinging himself against concrete barriers in an alley. Mr. Burns slipped between the man and a brick wall and wrapped his arms protectively around him. Even as the man flailed uncontrollably, smacking Mr. Burns and smearing blood on his shirt, he managed to stammer, “Sorry!” Speaking softly, Mr. Burns kept repeating, “You’re going to be safe. You’re OK. We’re here because we just want to make sure you’re safe. No, you’re not in trouble. Nobody wants to hurt you.” © 2025 The New York Times Company
Keyword: Drug Abuse
Link ID: 29747 - Posted: 04.16.2025
By Rachel Brazil Drugs that mimic glucagonlike peptide-1 (GLP-1), such as semaglutide—marketed as Ozempic or Wegovy—have revolutionized the treatment of obesity and type 2 diabetes, but they have major drawbacks. “[They] are expensive to manufacture, they have to be refrigerated, and they often have to be injected because they cannot go through the gastrointestinal tract without being degraded,” explains Alejandra Tomas, a cell biologist at Imperial College London who studies the cellular receptor GLP-1 drugs target. That’s all because they consist of peptides, or long chains of amino acids. A small-molecule version of the therapy, on the other hand, could be given as a daily pill and would be much cheaper to produce. Companies including Eli Lilly, Pfizer, and Roche have launched clinical trials of such compounds. Results from Lilly’s first phase 3 trial of its oral drug are expected later this year. But Pfizer announced this week it was halting development of its candidate after signs of liver injury in a trial participant. The candidates furthest along in development activate the same receptors as peptide drugs do, in much the same way. But several firms are exploring more innovative small molecules that target different sites on those receptors—and could lead to even more effective treatments with fewer side effects. “In the next 4 or 5 years, this field will mature and more patients ultimately should be able to get these medicines,” says Kyle Sloop, a molecular biologist at Lilly Research Laboratories. By mimicking a natural hormone, semaglutide and other drugs in its class help regulate blood sugar by increasing insulin secretion from the pancreas in response to glucose, and suppress appetite by slowing down digestion. The first generation of peptide drugs were essentially copies of GLP-1, with modifications to prevent the peptide from quickly degrading once in the body. Novo Nordisk first won U.S. approval for semaglutide to treat type 2 diabetes in 2017. 
It needed to be injected, but in 2019 the company added a pill form, which includes an absorption-enhancing ingredient that allows the peptide to penetrate the stomach wall. However, it requires a high dose and has to be taken while fasting, with minimal liquid.
Keyword: Obesity
Link ID: 29746 - Posted: 04.16.2025
By Roni Caryn Rabin Middle-aged and older adults who sought hospital or emergency room care because of cannabis use were almost twice as likely to develop dementia over the next five years, compared with similar people in the general population, a large Canadian study reported on Monday. When compared with adults who sought care for other reasons, the risk of developing dementia was still 23 percent higher among users of cannabis, the study also found. The study included the medical records of six million people in Ontario from 2008 to 2021. The authors accounted for health and sociodemographic differences between comparison groups, some of which play a role in cognitive decline. The data do not reveal how much cannabis the subjects had been using, and the study does not prove that regular or heavy cannabis use plays a causal role in dementia. But the finding does raise serious concerns that require further exploration, said Dr. Daniel T. Myran, the first author of the study, which was published in JAMA Neurology. “Figuring out whether or not cannabis use or heavy regular chronic use causes dementia is a challenging and complicated question that you don’t answer in one study,” said Dr. Myran, an assistant professor of family medicine at University of Ottawa. “This contributes to the literature and to a sign, or signal, of concern.” Dr. Myran’s previous research has found that patients with cannabis use disorder died at almost three times the rate of individuals without the disorder over a five-year period. He has also reported that more cases of schizophrenia and psychosis in Canada have been linked to cannabis use disorder since the drug was legalized. © 2025 The New York Times Company
Keyword: Alzheimers; Drug Abuse
Link ID: 29745 - Posted: 04.16.2025
By Azeen Ghorayshi The percentage of American children estimated to have autism spectrum disorder increased in 2022, continuing a long-running trend, according to data released on Tuesday by the Centers for Disease Control and Prevention. Among 8-year-olds, 1 in 31 were found to have autism in 2022, compared with 1 in 36 in 2020. That rate is nearly five times as high as the figure in 2000, when the agency first began collecting data. The health agency noted that the increase was most likely being driven by better awareness and screening, not necessarily because autism itself was becoming more common. That diverged sharply from the rhetoric of the nation’s health secretary, Robert F. Kennedy Jr., who on Tuesday said, “The autism epidemic is running rampant.” Mr. Kennedy has repeatedly tried to connect rising autism rates with vaccines, despite dozens of studies over decades that failed to establish such a link. The health secretary nonetheless has initiated a federal study that will revisit the possibility and has hired a well-known vaccine skeptic to oversee the effort. Mr. Kennedy recently announced an effort by the Department of Health and Human Services to pinpoint the “origins of the epidemic” by September, an initiative that was greeted with skepticism by many autism experts. “It seems very unlikely that it is an epidemic, in the way that people define epidemics,” said Catherine Lord, a psychologist and autism researcher at the David Geffen School of Medicine at the University of California, Los Angeles. A significant part of the increase instead can be attributed to the expansion of the diagnosis over the years to capture milder cases, Dr. Lord said, as well as decreased stigma and greater awareness of support services. Still, she left open the possibility that other factors are contributing to more children developing autism. “We can account for a lot of the increase but perhaps not all of it,” Dr. Lord said.
“But whatever it is, it’s not vaccines,” she added. © 2025 The New York Times Company
Keyword: Autism
Link ID: 29744 - Posted: 04.16.2025
By Carl Zimmer The human brain is so complex that scientific brains have a hard time making sense of it. A piece of neural tissue the size of a grain of sand might be packed with hundreds of thousands of cells linked together by miles of wiring. In 1979, Francis Crick, the Nobel-prize-winning scientist, concluded that the anatomy and activity in just a cubic millimeter of brain matter would forever exceed our understanding. “It is no use asking for the impossible,” Dr. Crick wrote. Forty-six years later, a team of more than 100 scientists has achieved that impossible feat, by recording the cellular activity and mapping the structure in a cubic millimeter of a mouse’s brain — less than one percent of its full volume. In accomplishing this feat, they amassed 1.6 petabytes of data — the equivalent of 22 years of nonstop high-definition video. “This is a milestone,” said Davi Bock, a neuroscientist at the University of Vermont who was not involved in the study, which was published Wednesday in the journal Nature. Dr. Bock said that the advances that made it possible to chart a cubic millimeter of brain boded well for a new goal: mapping the wiring of the entire brain of a mouse. “It’s totally doable, and I think it’s worth doing,” he said. More than 130 years have passed since the Spanish neuroscientist Santiago Ramón y Cajal first spied individual neurons under a microscope, making out their peculiar branched shapes. Later generations of scientists worked out many of the details of how a neuron sends a spike of voltage down a long arm, called an axon. Each axon makes contact with tiny branches, or dendrites, of neighboring neurons. Some neurons excite their neighbors into firing voltage spikes of their own. Some quiet other neurons. Human thought somehow emerges from this mix of excitation and inhibition. But how that happens has remained a tremendous mystery, largely because scientists have been able to study only a few neurons at a time. 
In recent decades, technological advances have allowed scientists to start mapping brains in their entirety. In 1986, British researchers published the circuitry of a tiny worm, made up of 302 neurons. In subsequent years, researchers charted bigger brains, such as the 140,000 neurons in the brain of a fly. © 2025 The New York Times Company
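The "22 years of nonstop high-definition video" comparison checks out as a back-of-envelope calculation, provided one assumes a bitrate of roughly 18 megabits per second for the video (a plausible figure for high-quality 1080p; the article does not state the bitrate, so this is an inferred assumption):

```python
# Back-of-envelope check of "1.6 petabytes = 22 years of nonstop HD video."
# The bitrate is an assumption (the article doesn't give one); at roughly
# 18 Mbit/s, a plausible figure for high-quality 1080p, the numbers agree.

dataset_bytes = 1.6e15          # 1.6 petabytes of connectome data
bitrate_bits_per_s = 18e6       # assumed HD video bitrate
seconds = dataset_bytes * 8 / bitrate_bits_per_s
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.1f} years of video")
```

At lower streaming-grade bitrates the equivalent duration would be longer still, so the article's figure is on the conservative end.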
Keyword: Brain imaging; Development of the Brain
Link ID: 29743 - Posted: 04.12.2025
Dyani Lewis Human brain cells engineered to evade detection by the immune system have successfully restored muscle control in a rat model of Parkinson’s disease [1]. The study is a step towards the development of a ‘universal’ cell line that can be transplanted into anyone, to cure a raft of diseases without the need for anti-rejection drugs. “It’s a one-cell-fits-all proposal,” says Clare Parish, a stem-cell biologist at the Florey Institute of Neuroscience and Mental Health in Melbourne, Australia, and a co-author of the study. The work, published today in Cell Stem Cell, builds on earlier efforts to ‘cloak’ cells from the immune system. Cloaking is a key goal for cell-replacement therapies being tested for conditions ranging from type 2 diabetes and Parkinson’s disease to heart failure and blindness. It would eliminate the need for immunosuppressant drugs, which increase the risk of infection and cancer, and cause tissue damage that ultimately shortens the life of a recipient. To help cells to evade the immune system, the researchers created a cell line with eight genes altered to increase their activity so they acted as an immune invisibility cloak. All of the genes have been shown to assist the placenta and cancer cells in naturally evading immune surveillance. For example, mouse embryonic stem cells engineered with the same set of genes were able to evade detection when transplanted into mice [2]. Instead of mouse embryonic cells, Parish and her team used human pluripotent stem cells, which can develop into most types of cell found in the body. After being engineered with the cloaking genes, the cells differentiated into nerve cells suitable for treating Parkinson’s disease. The researchers injected the neurons into mice whose immune systems had been replaced with human immune cells, and the neurons were not rejected, suggesting that they were able to evade detection. © 2025 Springer Nature Limited
Keyword: Parkinsons
Link ID: 29742 - Posted: 04.12.2025
By Gayoung Lee, edited by Allison Parshall

Crows sometimes have a bad rap: they’re said to be loud and disruptive, and myths surrounding the birds tend to link them to death or misfortune. But crows deserve more love and charity, says Andreas Nieder, a neurophysiologist at the University of Tübingen in Germany. They not only can be incredibly cute, cuddly and social but also are extremely smart—especially when it comes to geometry, as Nieder has found.

In a paper published on Friday in Science Advances, Nieder and his colleagues report that crows display an impressive aptitude for distinguishing shapes by using geometric irregularities as a cognitive cue. The crows could discern even quite subtle differences.

For the experiment, the crows perched in front of a digital screen that, almost like a video game, displayed progressively more complex combinations of shapes. First, the crows were taught to peck at a certain shape for a reward. Then they were presented with that same shape among five others—for example, one star shape placed among five moon shapes—and were rewarded if they correctly picked the "outlier."

“Initially [the outlier] was very obvious,” Nieder says. But once the crows appeared to have familiarized themselves with how the “game” worked, Nieder and his team introduced more similar quadrilateral shapes to see whether the crows would still be able to identify outliers. “And they could tell us, for instance, if they saw a figure that was just not a square, slightly skewed, among all the other squares,” Nieder says. “They really could do this spontaneously [and] discriminate the outlier shapes based on the geometric differences without us needing to train them additionally.” Even when the researchers stopped rewarding them with treats, the crows continued to peck the outliers. © 2024 Scientific American
Keyword: Evolution; Intelligence
Link ID: 29741 - Posted: 04.12.2025
By Michael Schulson

Two years ago, at a Stop & Shop in Rhode Island, the Danish neuroscientist and physician Henriette Edemann-Callesen visited an aisle stocked with sleep aids containing melatonin. She looked around in amazement. Then she took out her phone and snapped a photo to send to colleagues back home. “It was really pretty astonishing,” she recalled recently.

In Denmark, as in many countries, the hormone melatonin is a prescription drug for treating sleep problems, mostly in adults. Doctors are supposed to prescribe it to children only if they have certain developmental disorders that make it difficult to sleep — and only after the family has tried other methods to address the problem. But at the Rhode Island Stop & Shop, melatonin was available over the counter, as a dietary supplement, meaning it receives slightly less regulatory scrutiny, in some respects, than a package of Skittles. Many of the products were marketed for children, in colorful bottles filled with liquid drops and chewable tablets and bright gummies that look and taste like candy.

A quiet but profound shift is underway in American parenting, as more and more caregivers turn to pharmacological solutions to help children sleep. What makes that shift unusual is that it’s largely taking place outside the traditional boundaries of health care. Instead, it’s driven by the country’s sprawling dietary supplements industry, which critics have long said has little regulatory oversight — and which may get a boost from Secretary of Health and Human Services Robert F. Kennedy Jr., who is widely seen as an ally to supplement makers.

Thirty years ago, few people were giving melatonin to children, outside of a handful of controlled experiments. Even as melatonin supplements grew in popularity among adults in the late 1990s in the United States and Canada, some of those products carried strict warnings not to give them to younger people. But with time, the age floor dropped, and by the mid-2000s, news reports and academic surveys suggest some early adopters were doing just that. (Try it for ages 11-and-up only, one CNN report warned at the time.) By 2013, according to a Wall Street Journal article, a handful of companies were marketing melatonin products specifically for kids.
Keyword: Biological Rhythms; Development of the Brain
Link ID: 29740 - Posted: 04.12.2025
Jon Hamilton

Researchers created an assembloid by integrating four organoids that represent the four components of the human sensory pathway, along which pain signals are conveyed to the brain. Stimulation of the sensory organoid (top) by pain-inducing substances, such as capsaicin, triggers neuronal activity in that organoid, which is then transmitted to the adjacent spinal-cord organoid, the thalamic organoid and, finally, the cortical organoid (bottom). Pasca lab/Stanford Medicine

Scientists have re-created a pain pathway in the brain by growing four key clusters of human nerve cells in a dish. This laboratory model could be used to help explain certain pain syndromes, and offer a new way to test potential analgesic drugs, a Stanford team reports in the journal Nature. "It's exciting," says Dr. Stephen Waxman, a professor at Yale School of Medicine who was not involved in the research. © 2025 npr
Keyword: Pain & Touch; Development of the Brain
Link ID: 29739 - Posted: 04.12.2025
By Yasemin Saplakoglu

Humans tend to put our own intelligence on a pedestal. Our brains can do math, employ logic, explore abstractions and think critically. But we can’t claim a monopoly on thought. Among a variety of nonhuman species known to display intelligent behavior, birds have been shown time and again to have advanced cognitive abilities. Ravens plan for the future, crows count and use tools, cockatoos open and pillage booby-trapped garbage cans, and chickadees keep track of tens of thousands of seeds cached across a landscape. Notably, birds achieve such feats with brains that look completely different from ours: they’re smaller and lack the highly organized structures that scientists associate with mammalian intelligence.

“A bird with a 10-gram brain is doing pretty much the same as a chimp with a 400-gram brain,” said Onur Güntürkün, who studies brain structures at Ruhr University Bochum in Germany. “How is it possible?”

Researchers have long debated the relationship between avian and mammalian intelligence. One possibility is that intelligence in vertebrates — animals with backbones, including mammals and birds — evolved once. In that case, both groups would have inherited the complex neural pathways that support cognition from a common ancestor: a lizardlike creature that lived 320 million years ago, when Earth’s continents were squished into one landmass. The other possibility is that the kinds of neural circuits that support vertebrate intelligence evolved independently in birds and mammals.

It’s hard to track down which path evolution took, given that any trace of the ancient ancestor’s actual brain vanished in a geological blink. So biologists have taken other approaches — such as comparing brain structures in adult and developing animals today — to piece together how this kind of neurobiological complexity might have emerged. © 2025 Simons Foundation
Keyword: Intelligence; Evolution
Link ID: 29738 - Posted: 04.09.2025