Most Recent Links



By Veronique Greenwood In the dappled sunlit waters of Caribbean mangrove forests, tiny box jellyfish bob in and out of the shade. Box jellies are distinguished from true jellyfish in part by their complex visual system — the grape-size predators have 24 eyes. But like other jellyfish, they are brainless, controlling their cube-shaped bodies with a distributed network of neurons. That network, it turns out, is more sophisticated than you might assume. On Friday, researchers published a report in the journal Current Biology indicating that the box jellyfish species Tripedalia cystophora has the ability to learn. Because box jellyfish diverged from our part of the animal kingdom long ago, understanding their cognitive abilities could help scientists trace the evolution of learning. The tricky part about studying learning in box jellies was finding an everyday behavior that scientists could train the creatures to perform in the lab. Anders Garm, a biologist at the University of Copenhagen and an author of the new paper, said his team decided to focus on a swift about-face that box jellies execute when they are about to hit a mangrove root. These roots rise through the water like black towers, while the water around them appears pale by comparison. But the contrast between the two can change from day to day, as silt clouds the water and makes it more difficult to tell how far away a root is. How do box jellies tell when they are getting too close? “The hypothesis was, they need to learn this,” Dr. Garm said. “When they come back to these habitats, they have to learn, how is today’s water quality? How is the contrast changing today?” In the lab, researchers produced images of alternating dark and light stripes, representing the mangrove roots and water, and used them to line the insides of buckets about six inches wide. When the stripes were a stark black and white, representing optimum water clarity, box jellies never got close to the bucket walls. 
With less contrast between the stripes, however, box jellies immediately began to run into them. This was the scientists’ chance to see if they would learn. © 2023 The New York Times Company

Keyword: Learning & Memory; Evolution
Link ID: 28925 - Posted: 09.23.2023

By Laura Sanders On a hot, sunny Sunday afternoon in Manhattan, time froze for Jon Nelson. He stood on the sidewalk and said good-bye to his three kids, whose grandfather had come into the city from Long Island to pick them up. Like any parent, Jon is deeply attuned to his children’s quirks. His oldest? Sometimes quiet but bitingly funny. His middle kid? Rates dad a 10 out of 10 on the embarrassment scale and doesn’t need a hug. His 10-year-old son, the baby of the family, is the emotional one. “My youngest son would climb back up into my wife’s womb if he could,” Jon says. “He’s that kid.” An unexpected parade had snarled traffic, so Jon parked illegally along a yellow curb on 36th Street, near where his father-in-law was waiting. It was time to go. His youngest gave the last hug. “He looked up, scared and sad,” Jon says, and asked, “Dad, am I going to see you again?” That question stopped the clock. “I was like, ‘Oh man,’” Jon says. “It was one of those moments where I was living it through his eyes. And I got scared for the first time.” Until that good-bye, Jon hadn’t wanted to live. For years, he had a constant yearning to die — he talks about it like it was an addiction — as he fought deep, debilitating depression. But his son’s question pierced through that heaviness and reached something inside him. “That was the first time I really thought about it. I was like, ‘I kind of hope I don’t die.’ I hadn’t had that feeling in so long.” That hug happened around 5 p.m. on August 21, 2022. Twelve hours later, Jon was wheeled into a surgical suite. There, at Mount Sinai’s hospital just southwest of Central Park, surgery team members screwed Jon’s head into a frame to hold it still. Then they numbed him and drilled two small holes through the top of his skull, one on each side. Through each hole, a surgeon plunged a long, thin wire dotted at the end with electrodes deep into his brain. 
The wiring, threaded under his skin, snaked around the outside of Jon’s skull and sank down behind his ear. From there, a wire wrapped around to the front, meeting a battery-powered control box that surgeons implanted in his chest, just below his collarbone. © Society for Science & the Public 2000–2023.

Keyword: Depression
Link ID: 28924 - Posted: 09.23.2023

By Till Hein Human couples could learn a lot from seahorses. The marine marvels spend only quality time together. They flirt, swim together, and mate. The rest of the time they go their own way, drifting in ocean currents, leisurely eating their fill. But they do look forward to getting together again. Right after sunrise, male and female seahorses approach one another, gently rubbing their noses together, and then begin to circle each other. Many of them make seductive clicking noises. The partners gracefully rock back and forth, as though to the beat of underwater music. They dance and cuddle together dreamily, as though they’ve lost track of time. However, love can be dangerous for seahorses. During partner dancing, hormones are released that can make their camouflage fade. This causes changes in color, so their bodies begin to glow, and the contrasts in the patterns of their skin become more pronounced. Researchers hypothesize this is how seahorses signal their willingness to mate. The partner dances also serve as a means of seduction. Before mating, courtship can take many hours. Finally, the female signals that she’s ready. She swims up toward the water surface, pointing her snout toward the sky, and stretches her body out straight as a stick—a pose that is irresistible to the male. The stallion of the sea presses his chin against his chest and makes his prehensile tail open and close like a switchblade. This enables him to pump water into his brood pouch to show his beloved mare of the sea how roomy it is. Soon afterward, the mare and stallion of the sea snuggle up together closely and let themselves drift upward. They press their bodies together so that their snouts and abdomens are touching. On account of the curves in their body posture, the space between them looks like the shape of a heart. Then, something amazing takes place. A tubular rod appears in the middle of the female seahorse’s belly: the so-called ovipositor, which looks a little like a penis. 
At the climax of the love scene, both partners lift their heads as though in ecstasy, curving their backs, and the female seahorse transfers her eggs into the male’s brood pouch, while her partner fertilizes them with his sperm. © 2023 NautilusNext Inc., All rights reserved.

Keyword: Sexual Behavior; Evolution
Link ID: 28923 - Posted: 09.23.2023

By Taylor Majewski Rachel Nuwer’s “I Feel Love: MDMA and the Quest for Connection in a Fractured World” is clearly aimed at a broad audience. It will resonate with readers who have experienced MDMA recreationally, probably at a rave, or therapeutically, probably to heal the emotional aftereffects of deep-seated trauma. Or both. But it’s also intended for readers who have never touched the drug, colloquially known as ecstasy or molly. Perhaps it’s especially for them. “I Feel Love” belongs to a growing family of nonfiction accounts of the fraught history of psychedelics and why, through compelling anecdotes and the latest science, we should reconsider them. Nuwer, a science journalist, chronicles the hopeful story of something both small and large — MDMA, the compound, and MDMA, the drug that’s repeatedly brought humans together across decades, continents, politics, and moral panics. The book is a natural successor to Michael Pollan’s 2018 bestseller “How to Change Your Mind,” which covered the mystical and medical benefits of LSD and psilocybin and paved the way for a psychedelic renaissance of sorts. Yet, as Nuwer writes in the introduction, “no such modern telling exists for MDMA.” Now, it does. “I Feel Love” is, above all, a time capsule. Nuwer begins with a crucial asterisk: “MDMA, also known as Ecstasy or Molly, is currently an illegal drug.” Today, most journalism about psychedelics comes with this simple stipulation. Despite their potential to heal, drugs like psilocybin, LSD, and MDMA are still classified as Schedule I, the Drug Enforcement Administration’s highest category, reserved for controlled substances with no accepted medical use and a high potential for abuse. For MDMA specifically, that might be about to change.

Keyword: Drug Abuse
Link ID: 28922 - Posted: 09.23.2023

By Sonia Shah Can a mouse learn a new song? Such a question might seem whimsical. Though humans have lived alongside mice for at least 15,000 years, few of us have ever heard mice sing, because they do so in frequencies beyond the range detectable by human hearing. As pups, their high-pitched songs alert their mothers to their whereabouts; as adults, they sing in ultrasound to woo one another. For decades, researchers considered mouse songs instinctual, the fixed tunes of a windup music box, rather than the mutable expressions of individual minds. But no one had tested whether that was really true. In 2012, a team of neurobiologists at Duke University, led by Erich Jarvis, a neuroscientist who studies vocal learning, designed an experiment to find out. The team surgically deafened five mice and recorded their songs in a mouse-size sound studio, tricked out with infrared cameras and microphones. They then compared sonograms of the songs of deafened mice with those of hearing mice. If the mouse songs were innate, as long presumed, the surgical alteration would make no difference at all. Jarvis and his researchers slowed down the tempo and shifted the pitch of the recordings, so that they could hear the songs with their own ears. Those of the intact mice sounded “remarkably similar to some bird songs,” Jarvis wrote in a 2013 paper that described the experiment, with whistlelike syllables similar to those in the songs of canaries and the trills of dolphins. Not so the songs of the deafened mice: Deprived of auditory feedback, their songs became degraded, rendering them nearly unrecognizable. They sounded, the scientists noted, like “squawks and screams.” Not only did the tunes of a mouse depend on its ability to hear itself and others, but also, as the team found in another experiment, a male mouse could alter the pitch of its song to compete with other male mice for female attention. 
Inside these murine skills lay clues to a puzzle many have called “the hardest problem in science”: the origins of language. In humans, “vocal learning” is understood as a skill critical to spoken language. Researchers had already discovered the capacity for vocal learning in species other than humans, including in songbirds, hummingbirds, parrots, cetaceans such as dolphins and whales, pinnipeds such as seals, elephants and bats. But given the centuries-old idea that a deep chasm separated human language from animal communications, most scientists understood the vocal learning abilities of other species as unrelated to our own — as evolutionarily divergent as the wing of a bat is to that of a bee. The apparent absence of intermediate forms of language — say, a talking animal — left the question of how language evolved resistant to empirical inquiry. © 2023 The New York Times Company

Keyword: Language; Animal Communication
Link ID: 28921 - Posted: 09.21.2023

COMIC: When, why and how did neurons first evolve? Scientists are piecing together the ancient story. By Tim Vernimmen, illustrated by Maki Naro. 09.14.2023 © 2023 Annual Reviews

Keyword: Evolution; Development of the Brain
Link ID: 28920 - Posted: 09.21.2023

Hannah Devlin, science correspondent. The brain circuit that causes the sound of a newborn crying to trigger the release of breast milk in mothers has been uncovered by scientists. The study, in mice, gives fresh insights into the sophisticated changes that occur in the brain during pregnancy and parenthood. It found that 30 seconds of continuous crying by mouse pups triggered the release of oxytocin, the brain chemical that controls the breast-milk release response in mothers. “Our findings uncover how a crying infant primes its mother’s brain to ready her body for nursing,” said Habon Issa, a graduate student at NYU Langone Health and co-author of the study. “Without such preparation, there can be a delay of several minutes between suckling and milk flow, potentially leading to a frustrated baby and stressed parent.” The study showed that once prompted, the surge of hormones continued for roughly five minutes before tapering off, enabling mouse mothers to feed their young until they were sated or began crying again. The observation that a mother’s breasts can leak milk when she hears a crying baby is not new. But the latest research is the first to identify the brain mechanisms behind what the scientists described as the “wail-to-milk pipeline”, and could pave the way for a better understanding of the challenges of breastfeeding for many women. The findings, published in Nature, showed that when a mouse pup starts crying, sound information travels to an area of its mother’s brain called the posterior intralaminar nucleus of the thalamus (PIL). This sensory hub then sends signals to oxytocin-releasing brain cells (neurons) in another region called the hypothalamus. Most of the time these hypothalamus neurons are “locked down” to prevent false alarms and wasted milk. However, after 30 seconds of continuous crying, signals from the PIL build up and overpower the in-built inhibitory mechanism, setting off oxytocin release. © 2023 Guardian News & Media Limited

Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 28919 - Posted: 09.21.2023

Mariana Lenharo A letter, signed by 124 scholars and posted online last week1, has caused an uproar in the consciousness research community. It claims that a prominent theory describing what makes someone or something conscious — called the integrated information theory (IIT) — should be labelled “pseudoscience”. Since its publication on 15 September in the preprint repository PsyArXiv, the letter has some researchers arguing over the label and others worried it will increase polarization in a field that has grappled with issues of credibility in the past. “I think it’s inflammatory to describe IIT as pseudoscience,” says neuroscientist Anil Seth, director of the Centre for Consciousness Science at the University of Sussex near Brighton, UK, adding that he disagrees with the label. “IIT is a theory, of course, and therefore may be empirically wrong,” says neuroscientist Christof Koch, a meritorious investigator at the Allen Institute for Brain Science in Seattle, Washington, and a proponent of the theory. But he says that it makes its assumptions — for example, that consciousness has a physical basis and can be mathematically measured — very clear. There are dozens of theories that seek to understand consciousness — everything that a human or non-human experiences, including what they feel, see and hear — as well as its underlying neural foundations. IIT has often been described as one of the central theories, alongside others, such as global neuronal workspace theory (GNW), higher-order thought theory and recurrent processing theory. It proposes that consciousness emerges from the way information is processed within a ‘system’ (for instance, networks of neurons or computer circuits), and that systems that are more interconnected, or integrated, have higher levels of consciousness. 
Hakwan Lau, a neuroscientist at Riken Center for Brain Science in Wako, Japan, and one of the authors of the letter, says that some researchers in the consciousness field are uncomfortable with what they perceive as a discrepancy between IIT’s scientific merit and the considerable attention it receives from the popular media because of how it is promoted by advocates. “Has IIT become a leading theory because of academic acceptance first, or is it because of the popular noise that kind of forced the academics to give it acknowledgement?” Lau asks. © 2023 Springer Nature Limited

Keyword: Consciousness
Link ID: 28918 - Posted: 09.21.2023

Kimberlee D'Ardenne Dopamine seems to be having a moment in the zeitgeist. You may have read about it in the news, seen viral social media posts about “dopamine hacking” or listened to podcasts about how to harness what this molecule is doing in your brain to improve your mood and productivity. But recent neuroscience research suggests that popular strategies to control dopamine are based on an overly narrow view of how it functions. Dopamine is one of the brain’s neurotransmitters – tiny molecules that act as messengers between neurons. It is known for its role in tracking your reaction to rewards such as food, sex, money or answering a question correctly. There are many kinds of dopamine neurons located in the uppermost region of the brainstem that manufacture and release dopamine throughout the brain. Whether neuron type affects the function of the dopamine it produces has been an open question. Recently published research reports a relationship between neuron type and dopamine function, and one type of dopamine neuron has an unexpected function that will likely reshape how scientists, clinicians and the public understand this neurotransmitter. Dopamine is involved with more than just pleasure. Dopamine is famous for the role it plays in reward processing, an idea that dates back at least 50 years. Dopamine neurons monitor the difference between the rewards you thought you would get from a behavior and what you actually got. Neuroscientists call this difference a reward prediction error. Eating dinner at a restaurant that just opened and looks likely to be nothing special shows reward prediction errors in action. If your meal is very good, that results in a positive reward prediction error, and you are likely to return and order the same meal in the future. 
Each time you return, the reward prediction error shrinks until it eventually reaches zero when you fully expect a delicious dinner. But if your first meal was terrible, that results in a negative reward prediction error, and you probably won’t go back to the restaurant. Dopamine neurons communicate reward prediction errors to the brain through their firing rates and patterns of dopamine release, which the brain uses for learning. They fire in two ways. © 2010–2023, The Conversation US, Inc.
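The restaurant example can be sketched in a few lines of Python using a simple Rescorla-Wagner-style update. This is a toy illustration of the reward prediction error idea described above, not the model from the research discussed; the reward value and learning rate are invented for illustration:

```python
# Toy reward prediction error (RPE) update. The reward (1.0 for a very
# good meal) and the learning rate (0.5) are illustrative values only.

def visit_restaurant(expected, reward, learning_rate=0.5):
    """Return (rpe, new_expectation) after one meal."""
    rpe = reward - expected               # positive if better than expected
    new_expected = expected + learning_rate * rpe
    return rpe, new_expected

expected = 0.0  # a new restaurant that "looks likely to be nothing special"
for visit in range(5):
    rpe, expected = visit_restaurant(expected, reward=1.0)
    print(f"visit {visit + 1}: RPE = {rpe:.3f}, expectation = {expected:.3f}")
# The RPE shrinks toward zero as the delicious dinner becomes fully expected.
```

A terrible first meal works the same way in reverse: `visit_restaurant(0.0, reward=0.0)` after a good reputation, or any reward below expectation, yields a negative RPE, and the expectation is revised downward.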

Keyword: Drug Abuse; Learning & Memory
Link ID: 28917 - Posted: 09.21.2023

By Jim Crotty The opioid crisis continues to rage across the U.S., but there are some positive, if modest, signs that it may be slowing. Overdose deaths due to opioids are flattening in many places and dropping in others, awareness of the dangers of opioid abuse continues to increase, and more than $50 billion in opioid settlement funds are finally making their way to state and local governments after years of delay. There is still much work to be done, but all public health emergencies eventually subside. Then what? First, it’s important to realize that synthetic opioids like fentanyl will never fully disappear from the drug supply. They are too potent, too addictive, and perhaps most importantly, too lucrative. Opioids, like Covid-19, are here to stay, consistently circulating in the community but at more manageable levels. More alarming is what may take its place. Since 2010, overdoses involving both stimulants and fentanyl have increased 50-fold. Experts suggest this dramatic rise in polysubstance use represents a “fourth wave” in the opioid crisis, but what if it is really the start of a new wave of an emerging stimulant crisis? Substance abuse tends to move in cycles. Periods with high rates of depressant drug use (like opioids) are almost always followed by ones with high rates of stimulant drug use (like methamphetamine and cocaine), and vice versa. The heroin crisis of the 1960s and 1970s was followed by the crack epidemic of the 1980s and 1990s, which gave way to the current opioid epidemic. As the think tank scholar Charles Fain Lehman quipped, “As with fashion, so with drugs — whatever the last generation did, the next generation tends to abhor.” The difference now is the primacy of synthetic drugs — that is, illicit substances created in a lab that are designed to mimic the effects of naturally occurring drugs.

Keyword: Drug Abuse
Link ID: 28916 - Posted: 09.21.2023

By Janet Lee Doing puzzles, playing memory-boosting games, taking classes and reading are activities that we often turn to for help keeping our brains sharp. But research is showing that what you eat, how often you exercise and the type of exercise you do can help lower your risk of dementia to a greater extent than previously thought. Although more studies are needed, “there’s a lot of data that suggests exercise and diet are good for the brain and can prevent or help slow down” cognitive changes, says Jeffrey Burns, co-director of the University of Kansas Alzheimer’s Disease Research Center in Fairway. And living a healthy lifestyle can produce brain benefits no matter what your age. If you’re already eating in a way that protects your heart — plenty of whole grains, vegetables, and fruit, and little saturated fat, sodium and ultra-processed “junk” foods — there’s good news: You’re also protecting your brain. A healthy cardiovascular system keeps blood vessels open, allowing good blood flow to the brain and reducing the risk of high blood pressure, stroke and dementia. Research suggests that two specific dietary approaches — the Mediterranean diet and the MIND diet (the Mediterranean-DASH Intervention for Neurodegenerative Delay, essentially a combo of two heart-healthy eating plans) — may help stave off cognitive decline. Both diets rely on eating mostly plant foods (fruits, vegetables, whole grains, beans, nuts), olive oil, fish and poultry. The main difference between the two is that the MIND diet emphasizes specific fruits and vegetables, such as berries and leafy greens. Studies show that people who most closely follow either diet have a reduced risk of dementia compared with those who don’t. 
For example, people eating the Mediterranean way had a 23 percent lower risk of dementia in a nine-year study of more than 60,000 men and women published this year in BMC Medicine.

Keyword: Alzheimers
Link ID: 28915 - Posted: 09.21.2023

By Gina Kolata Tucker Marr’s life changed forever last October. He was on his way to a wedding reception when he fell down a steep flight of metal stairs, banging the right side of his head so hard he went into a coma. He’d fractured his skull, and a large blood clot formed on the left side of his head. Surgeons had to remove a large chunk of his skull to relieve pressure on his brain and to remove the clot. “Getting a piece of my skull taken out was crazy to me,” Mr. Marr said. “I almost felt like I’d lost a piece of me.” But what seemed even crazier to him was the way that piece was restored. Mr. Marr, a 27-year-old analyst at Deloitte, became part of a new development in neurosurgery. Instead of remaining without a piece of skull or getting the old bone put back, a procedure that is expensive and has a high rate of infection, he got a prosthetic piece of skull made with a 3-D printer. But it is not the typical prosthesis used in such cases. His prosthesis, which is covered by his skin, is embedded with an acrylic window that would let doctors peer into his brain with ultrasound. A few medical centers are offering such acrylic windows to patients who had to have a piece of skull removed to treat conditions like a brain injury, a tumor, a brain bleed or hydrocephalus. “It’s very cool,” Dr. Michael Lev, director of emergency radiology at Massachusetts General Hospital, said. But, “it is still early days,” he added. Advocates of the technique say that if a patient with such a window has a headache or a seizure or needs a scan to see if a tumor is growing, a doctor can slide an ultrasound probe on the patient’s head and look at the brain in the office. © 2023 The New York Times Company

Keyword: Brain imaging; Brain Injury/Concussion
Link ID: 28914 - Posted: 09.16.2023

By Kenneth S. Kosik Before our evolutionary ancestors had a brain—before they had any organs—18 different cell types got together to make a sea sponge. Remarkably, some of these cells had many of the genes needed to make a brain, even though the sponge has neither neurons nor a brain. In my neuroscience lab at the University of California, Santa Barbara, my colleagues and collaborators discovered this large repository of brain genes in the sponge. Ever since, we have asked ourselves why this ancient, porous blob of cells would contain a set of neural genes in the absence of a nervous system. What was evolution up to? Sea sponges first show up in the fossil record about 600 million years ago. They live at the bottom of the ocean and are immobile, passive feeders. In fact, early biologists thought they were plants. Often encased by a hard exterior, a row of cells borders a watery center. Each cell has a tiny cilium that gently circulates a rich flow of microorganisms on which they feed. This seemingly simple organization belies a giant step in evolution. For the previous 3 billion years, single-celled creatures inhabited the planet. In one of evolution’s most creative acts, independent cells joined together, first into a colony and later into a truly inseparable multicellular organism. Colonies of single cells offered the first inkling that not every cell in the colony had to be identical. Cells in the interior might differ subtly from those on the periphery that are subject to the whims of the environment. Colonies offered the advantages of cooperation among many nearly identical cells. The next evolutionary innovation, multicellularity, broke radically from the past. © 2023 NautilusNext Inc., All rights reserved.

Keyword: Evolution
Link ID: 28913 - Posted: 09.16.2023

By Darren Incorvaia By now, it’s no secret that the phrase “bird brain” should be a compliment, not an insult. Some of our feathered friends are capable of complex cognitive tasks, including tool use (SN: 2/10/23). Among the brainiest feats that birds are capable of is vocal learning, or the ability to learn to mimic sounds and use them to communicate. In birds, this leads to beautiful calls and songs; in humans, it leads to language. The best avian vocal learners, such as crows and parrots, also tend to be considered the most intelligent birds. So it’s natural to think that the two traits could be linked. But studies with smart birds have found conflicting evidence. Although vocal learning may be linked with greater cognitive capacity in some species, the opposite relationship seems to hold true in others. Now, a massive analysis of 214 birds from 23 species shows that there is indeed a link between vocal learning and at least one advanced cognitive ability — problem-solving. The study, described in the Sept. 15 Science, is the first to analyze multiple bird species instead of just one. The birds were given different cognitive tests to gauge their intelligence; one of the problem-solving tasks asked them to pull a cork lid off a glass flask to access a tasty treat. Comparing these tests with the birds’ ability to learn songs and calls showed that the better vocal learners are also better at problem-solving. To compare species, biologist Jean-Nicolas Audet of the Rockefeller University in New York City and colleagues had to devise a way to assess all the birds’ vocal learning and cognitive abilities. © Society for Science & the Public 2000–2023.

Keyword: Intelligence; Evolution
Link ID: 28912 - Posted: 09.16.2023

Sara Reardon The psychedelic drug MDMA, also known as ecstasy or molly, has passed another key hurdle on its way to regulatory approval as a treatment for mental illness. A second large clinical trial has found that the drug — in combination with psychotherapy — is effective at treating post-traumatic stress disorder (PTSD). The results allow the trial’s sponsor to now seek approval from the US Food and Drug Administration (FDA) for MDMA’s use as a PTSD treatment for the general public, which might come as soon as next year. “It’s an important study,” says Matthias Liechti, a psychopharmacologist who studies MDMA at the University of Basel in Switzerland, but who was not involved with the trial or its sponsor. “It confirms MDMA works.” In June, Australia became the first country to allow physicians to prescribe MDMA for treating psychiatric conditions. MDMA is illegal in the United States and other countries because of the potential for its misuse. But the Multidisciplinary Association for Psychedelic Studies (MAPS), a non-profit organization in San Jose, California, has long been developing a proprietary protocol for using MDMA as a treatment for PTSD and other disorders. MAPS has been campaigning for its legalization — a move that could encourage other countries to follow suit. In 2021, researchers sponsored by MAPS reported the results of a study1 in which 90 people received a form of psychotherapy developed by the organization alongside either MDMA or a placebo. After three treatment sessions, 67% of those who received MDMA with therapy no longer qualified for a PTSD diagnosis, compared with 32% of those who received therapy and a placebo. The results were widely hailed as promising, but the FDA typically requires two placebo-controlled trials before a drug can be approved. 
The results of a second trial, involving 104 further individuals with PTSD and published on 14 September in Nature Medicine2, were similar to those of the original: 71% of people who received MDMA alongside therapy lost their PTSD diagnosis, compared with 48% of those who received a placebo and therapy. © 2023 Springer Nature Limited

Keyword: Drug Abuse; Stress
Link ID: 28911 - Posted: 09.16.2023

By Jim Davies Think of what you want to eat for dinner this weekend. What popped into mind? Pizza? Sushi? Clam chowder? Why did those foods (or whatever foods you imagined) appear in your consciousness and not something else? Psychologists have long held that when we are making a decision about a particular category of thing, we tend to bring to mind items that are typical or common in our culture or everyday lives, or ones we value the most. On this view, whatever foods you conjured up are likely ones that you eat often, or love to eat. Sounds intuitive. But a recent paper published in Cognition suggests it’s more complicated than that. Tracey Mills, a research assistant working at MIT, led the study along with Jonathan Phillips, a cognitive scientist and philosopher at Dartmouth College. They put over 2,000 subjects, recruited online, through a series of seven experiments that allowed them to test a novel approach for understanding which ideas within a category will pop into our consciousness—and which won’t. In this case, they had subjects think about zoo animals, holidays, jobs, kitchen appliances, chain restaurants, sports, and vegetables. What they found is that what makes a particular thing come to mind—such as a lion when one is considering zoo animals—is determined not by how valuable or familiar it is, but by where it lies in a multidimensional idea grid that could be said to resemble a kind of word cloud. “Under the hypothesis we argue for,” Mills and Phillips write, “the process of calling members of a category to mind might be modeled as a search through feature space, weighted toward certain features that are relevant for that category.” Historical “value” just happens to be one dimension that is particularly relevant when one is talking about dinner, but is less relevant for categories such as zoo animals or, say, crimes, they write. © 2023 NautilusNext Inc., All rights reserved.
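The "search through feature space, weighted toward certain features" idea quoted above can be illustrated with a toy sketch. Everything below — the items, the feature dimensions, and the weights — is invented for illustration and is not taken from the paper:

```python
# Toy sketch of retrieval as a weighted search through feature space:
# each candidate is a point in feature space, and the category supplies
# weights saying which dimensions matter. All numbers are invented.

zoo_animals = {
    "lion":     {"size": 0.9, "exotic": 0.8, "danger": 0.9},
    "elephant": {"size": 1.0, "exotic": 0.9, "danger": 0.5},
    "pigeon":   {"size": 0.1, "exotic": 0.1, "danger": 0.0},
}

# For the category "zoo animal", suppose size and exoticness matter most.
weights = {"size": 1.0, "exotic": 1.5, "danger": 0.5}

def salience(features, weights):
    """Weighted sum over the feature dimensions relevant to the category."""
    return sum(weights[dim] * value for dim, value in features.items())

ranked = sorted(zoo_animals,
                key=lambda a: salience(zoo_animals[a], weights),
                reverse=True)
print(ranked)  # candidates in the order they would "come to mind"
```

The point of the sketch is that changing the weights — say, making "danger" dominant when the category is "crimes" rather than "dinner" — reorders what surfaces first, without any single dimension such as value or familiarity doing all the work.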

Keyword: Attention; Learning & Memory
Link ID: 28910 - Posted: 09.16.2023

By Molly Rains Over the past 50 years, worldwide obesity rates have tripled, creating a public health crisis so widespread and damaging that it is sometimes referred to as an epidemic. Most accounts put the roots of the problem firmly in the modern age. But could it have been brewing since before World War II? That’s one provocative conclusion of a study published today in Science Advances that purports to push the obesity epidemic’s origin back to as early as the 1930s. Historical measurements from hundreds of thousands of Danish youth show that in the decades before the problem was officially recognized, the heaviest members of society were already getting steadily bigger. The findings raise questions about the accepted narrative of the obesity epidemic, says Lindsey Haynes-Maslow, an obesity expert at the University of North Carolina at Chapel Hill who was not involved in the study. “This paper is an opportunity … to say maybe we’ve been looking at this wrong, maybe we should go back to the beginning—or, when was the beginning?” she says. Most epidemiologists trace that beginning to the 1970s, when health officials first observed an uptick in the prevalence of obesity—defined as a body mass index (BMI) above 30—in many Western nations. The crisis is usually blamed on the increased postwar availability of cheap, highly processed, and calorie-rich foods, as well as increasingly sedentary lifestyles and growing portion sizes. But University of Copenhagen epidemiologist Thorkild Sørensen was skeptical of that story. Years of slowly increasing body size typically precede obesity, and might show up in historical data, he suspected. And Sørensen wasn’t convinced that the so-called obesogenic diet and lifestyle were the only factors at play. Historical data, he hoped, could reveal whether other, yet-unknown factors had contributed to the crisis.

Keyword: Obesity
Link ID: 28909 - Posted: 09.16.2023

By Amber Dance We’ve all heard of the five tastes our tongues can detect — sweet, sour, bitter, savory-umami and salty. But the real number is actually six, because we have two separate salt-taste systems. One of them detects the attractive, relatively low levels of salt that make potato chips taste delicious. The other one registers high levels of salt — enough to make overly salted food offensive and deter overconsumption. Exactly how our taste buds sense the two kinds of saltiness is a mystery that’s taken some 40 years of scientific inquiry to unravel, and researchers haven’t solved all the details yet. In fact, the more they look at salt sensation, the weirder it gets. Many other details of taste have been worked out over the past 25 years. For sweet, bitter and umami, it’s known that molecular receptors on certain taste bud cells recognize the food molecules and, when activated, kick off a series of events that ultimately sends signals to the brain. Sour is slightly different: It is detected by taste bud cells that respond to acidity, researchers recently learned. In the case of salt, scientists understand many details about the low-salt receptor, but a complete description of the high-salt receptor has lagged, as has an understanding of which taste bud cells host each detector. “There are a lot of gaps still in our knowledge — especially salt taste. I would call it one of the biggest gaps,” says Maik Behrens, a taste researcher at the Leibniz Institute for Food Systems Biology in Freising, Germany. “There are always missing pieces in the puzzle.”

A fine balance

Our dual perception of saltiness helps us to walk a tightrope between the two faces of sodium, an element that’s crucial for the function of muscles and nerves but dangerous in high quantities. To tightly control salt levels, the body manages the amount of sodium it lets out in urine, and controls how much comes in through the mouth. © 2023 Annual Reviews

Keyword: Chemical Senses (Smell & Taste)
Link ID: 28908 - Posted: 09.16.2023

By Sarah Lyall The author Cat Bohannon was a preteen in Atlanta in the 1980s when she saw the film “2001: A Space Odyssey” for the first time. As she took in its famous opening scene, in which a group of apes picks up a pile of bones and quickly begins using them to hit each other, Bohannon was struck by the sheer maleness of the moment. “I thought, ‘Where are the females in this story?’” Bohannon said recently, imagining what those absent females might have been up to at that particular time. “It’s like, ‘Oh, sorry, I see you’re doing something really important with a rock. I’m just going to go over there behind that hill and quietly build the future of the species in my womb.’” That realization was just one of what Bohannon, 44, calls “a constellation of moments” that led her to write her new book, “Eve: How the Female Body Drove 200 Million Years of Human Evolution.” A page-turning whistle-stop tour of mammalian development that begins in the Jurassic Era, “Eve” recasts the traditional story of evolutionary biology by placing women at its center. By examining how women evolved differently from men, Bohannon argues, we can “provide the latest answers to women’s most basic questions about their bodies.” These include, she says: Why do women menstruate? Why do they live longer? And what is the point of menopause? These are timely questions. Thanks to regulations established in the 1970s, clinical trials in the United States have typically used mostly male subjects, from mice to humans. (This is known as “the male norm.”) Though that changed somewhat in 1994, when the National Institutes of Health updated its rules, even the new protocols are replete with loopholes. For example: “From 1996 to 2006, more than 79 percent of animal studies published in the scientific journal Pain included only male subjects,” she writes. © 2023 The New York Times Company

Keyword: Sexual Behavior; Evolution
Link ID: 28907 - Posted: 09.13.2023

Nicola Davis Science correspondent Whether it’s seeing Jesus in burnt toast, a goofy grin in the grooves of a cheese grater, or simply the man in the moon, humans have long perceived faces in unlikely places. Now researchers say the tendency may not be fixed in adults, and appears to be enhanced in women who have just given birth. The scientists suggest the finding could be down to postpartum women having higher levels of oxytocin, colloquially referred to as the “love” or “trust” hormone because of its role in social bonding. “These data, collected online, suggest that our sensitivity to face-like patterns is not fixed and may change throughout adulthood,” the team write. Writing in the journal Biology Letters, researchers from Australia’s University of Queensland and the University of the Sunshine Coast describe how they set out to investigate whether the propensity to see faces in inanimate objects – a phenomenon known as face pareidolia – changes during life. Previous research has suggested that when humans are given oxytocin, their ability to recognise certain emotions in faces increases. As a result, the team wanted to explore whether the hormone could play a role in how sensitive individuals are to seeing faces in inanimate objects. The researchers used an online platform to recruit women, with participants asked if they were pregnant or had just given birth – the latter being a period when oxytocin levels are generally increased. The women were each shown 320 images in a random order online and asked to rate on an 11-point scale how easily they could see a face. While 32 of the images were of human faces, 256 were of inanimate objects with patterns that could be said to resemble a face, and 32 depicted inanimate objects with no such facial patterns. The team gathered data from 84 pregnant women, 79 women who had given birth in the past year, and 216 women who did not report being pregnant or having recently had a baby.
© 2023 Guardian News & Media Limited

Keyword: Sexual Behavior; Attention
Link ID: 28906 - Posted: 09.13.2023