Chapter 15. Language and Lateralization


James Doubek Researchers have some new evidence about why birds make so much noise early in the morning, and it's not for some of the reasons previously thought. For decades, a dominant theory about why birds sing at dawn — a phenomenon known as the "dawn chorus" — has been that they can be heard farther and more clearly at that time. Sound travels faster in humid air, and it's more humid early in the morning; it's less windy, too, which is thought to lessen distortion of their vocalizations. But scientists from the Cornell Lab of Ornithology's K. Lisa Yang Center for Conservation Bioacoustics and Project Dhvani in India combed through audio recordings of birds in the rainforest and say they didn't find evidence to back up this "acoustic transmission hypothesis." It is one of several hypotheses involving environmental factors; another is that birds spend their time singing at dawn because light is low and it's a poor time to look for food. "We basically didn't find much support for some of these environmental cues which have been purported in literature as hypotheses" for why birds sing more at dawn, says Vijay Ramesh, a postdoctoral research associate at Cornell and the study's lead author. The study, called "Why is the early bird early? An evaluation of hypotheses for avian dawn-biased vocal activity," was published this month in the peer-reviewed journal Philosophical Transactions of the Royal Society B. The researchers didn't definitively point to one cause of the dawn chorus, but they found support for the ideas that the early morning racket relates to birds marking their territory after being inactive overnight and communicating about finding food. © 2025 npr

Keyword: Animal Communication; Evolution
Link ID: 29839 - Posted: 06.21.2025

Associated Press Prairie dogs bark to alert each other to the presence of predators, with different cries depending on whether the threat is airborne or approaching by land. But their warnings also seem to help a vulnerable grassland bird. Curlews have figured out that if they eavesdrop on alarms from US prairie dog colonies they may get a jump on predators coming for them, too, according to research published on Thursday in the journal Animal Behaviour. “Prairie dogs are on the menu for just about every predator you can think of – golden eagles, red-tailed hawks, foxes, badgers, even large snakes,” said Andy Boyce, a research ecologist in Montana at the Smithsonian’s National Zoo and Conservation Biology Institute. Such animals also gladly snack on grassland nesting birds such as the long-billed curlew, so the birds have adapted. Previous research has shown birds frequently eavesdrop on other bird species to glean information about food sources or danger, said Georgetown University ornithologist Emily Williams, who was not involved in the study. But, so far, scientists have documented only a few instances of birds eavesdropping on mammals. “That doesn’t necessarily mean it’s rare in the wild,” she said, “it just means we haven’t studied it yet.” Prairie dogs, a type of ground squirrel, live in large colonies with a series of burrows that may stretch for miles underground, especially on the vast US plains. When they hear each other’s barks, they either stand alert watching or dive into their burrows. “Those little barks are very loud; they can carry quite a long way,” said research co-author Andrew Dreelin, who also works for the Smithsonian. © 2025 Guardian News & Media Limited

Keyword: Animal Communication; Language
Link ID: 29832 - Posted: 06.18.2025

David Farrier Charles Darwin suggested that humans learned to speak by mimicking birdsong: our ancestors’ first words may have been a kind of interspecies exchange. Perhaps it won’t be long before we join the conversation once again. The race to translate what animals are saying is heating up, with riches as well as a place in history at stake. The Jeremy Coller Foundation has promised $10m to whichever researchers can crack the code. This is a race fuelled by generative AI; large language models can sort through millions of recorded animal vocalisations to find their hidden grammars. Most projects focus on cetaceans because, like us, they learn through vocal imitation and, also like us, they communicate via complex arrangements of sound that appear to have structure and hierarchy. Sperm whales communicate in codas – rapid sequences of clicks, each as brief as a thousandth of a second. Project Ceti (the Cetacean Translation Initiative) is using AI to analyse codas in order to reveal the mysteries of sperm whale speech. There is evidence the animals take turns, use specific clicks to refer to one another, and even have distinct dialects. Ceti has already isolated a click that may be a form of punctuation, and they hope to speak whaleish as soon as 2026. The linguistic barrier between species is already looking porous. Last month, Google released DolphinGemma, an AI program to translate dolphins, trained on 40 years of data. In 2013, scientists using an AI algorithm to sort dolphin communication identified a new click in the animals’ interactions with one another, which they recognised as a sound they had previously trained the pod to associate with sargassum seaweed – the first recorded instance of a word passing from one species into another’s native vocabulary. The prospect of speaking dolphin or whale is irresistible. And it seems that they are just as enthusiastic. In November last year, scientists in Alaska recorded an acoustic “conversation” with a humpback whale called Twain, in which they exchanged a call-and-response form known as “whup/throp” with the animal over a 20-minute period. In Florida, a dolphin named Zeus was found to have learned to mimic the vowel sounds A, E, O and U. © 2025 Guardian News & Media Limited

Keyword: Language; Evolution
Link ID: 29821 - Posted: 06.04.2025

Danielle Wilhour Cerebrospinal fluid, or CSF, is a clear, colorless liquid that plays a crucial role in maintaining the health and function of your central nervous system. It cushions the brain and spinal cord, provides nutrients and removes waste products. Despite its importance, problems related to CSF often go unnoticed until something goes wrong. Recently, cerebrospinal fluid disorders drew public attention with the announcement that musician Billy Joel had been diagnosed with normal pressure hydrocephalus. In this condition, excess CSF accumulates in the brain’s cavities, enlarging them and putting pressure on surrounding brain tissue even though measured CSF pressure appears normal. Because normal pressure hydrocephalus typically develops gradually and can mimic symptoms of other neurodegenerative diseases, such as Alzheimer’s or Parkinson’s disease, it is often misdiagnosed. I am a neurologist and headache specialist. In my work treating patients with CSF pressure disorders, I have seen these conditions present in many different ways. Here’s what happens when your cerebrospinal fluid stops working. What is cerebrospinal fluid? CSF is made of water, proteins, sugars, ions and neurotransmitters. It is primarily produced by a network of cells called the choroid plexus, which is located in the brain’s ventricles, or cavities. The choroid plexus produces approximately 500 milliliters (17 ounces) of CSF daily, but only about 150 milliliters (5 ounces) are present within the central nervous system at any given time due to constant absorption and replenishment in the brain. This fluid circulates through the ventricles of the brain, the central canal of the spinal cord and the subarachnoid space surrounding the brain and spinal cord. © 2010–2025, The Conversation US, Inc.
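For readers who want to see how those two figures fit together, here is a rough back-of-the-envelope turnover calculation using only the numbers quoted above (illustrative only; actual production and absorption rates vary between individuals):

```python
# Rough CSF turnover estimate from the figures quoted in the article above.
# Approximate values only; real production and absorption vary person to person.
daily_production_ml = 500   # CSF produced by the choroid plexus per day (~17 oz)
standing_volume_ml = 150    # CSF present in the CNS at any one time (~5 oz)

turnovers_per_day = daily_production_ml / standing_volume_ml   # ~3.3
hours_per_turnover = 24 / turnovers_per_day                    # ~7.2

print(f"CSF volume is replaced roughly {turnovers_per_day:.1f} times per day,")
print(f"i.e. about once every {hours_per_turnover:.1f} hours.")
```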

Keyword: Biomechanics; Stroke
Link ID: 29812 - Posted: 05.31.2025

By Paula Span & KFF Health News Kristin Kramer woke up early on a Tuesday morning 10 years ago because one of her dogs needed to go out. Then, a couple of odd things happened. When she tried to call her other dog, “I couldn’t speak,” she said. As she walked downstairs to let them into the yard, “I noticed that my right hand wasn’t working.” But she went back to bed, “which was totally stupid,” said Kramer, now 54, an office manager in Muncie, Indiana. “It didn’t register that something major was happening,” especially because, reawakening an hour later, “I was perfectly fine.” So she “just kind of blew it off” and went to work. It’s a common response to the neurological symptoms that signal a TIA, a transient ischemic attack or ministroke. At least 240,000 Americans experience one each year, with the incidence increasing sharply with age. Because the symptoms disappear quickly, usually within minutes, people don’t seek immediate treatment, putting them at high risk for a bigger stroke. Kramer felt some arm tingling over the next couple of days and saw her doctor, who found nothing alarming on a CT scan. But then she started “jumbling” her words and finally had a relative drive her to an emergency room. By then, she could not sign her name. After an MRI, she recalled, “my doctor came in and said, ‘You’ve had a small stroke.’” Did those early-morning aberrations constitute a TIA? Might a 911 call and an earlier start on anticlotting drugs have prevented her stroke? “We don’t know,” Kramer said. She’s doing well now, but faced with such symptoms again, “I would seek medical attention.” © 2025 SCIENTIFIC AMERICAN,

Keyword: Stroke
Link ID: 29808 - Posted: 05.28.2025

Sofia Marie Haley I approach a flock of mountain chickadees feasting on pine nuts. A cacophony of sounds, coming from the many different bird species that rely on the Sierra Nevada’s diverse pine cone crop, fills the crisp mountain air. The strong “chick-a-dee” call sticks out among the bird vocalizations. The chickadees are communicating to each other about food sources – and my approach. Mountain chickadees are a member of the family Paridae, which is known for its complex vocal communication systems and cognitive abilities. Along with my advisers, behavioral ecologists Vladimir Pravosudov and Carrie Branch, I’m studying mountain chickadees at our study site in Sagehen Experimental Forest, outside of Truckee, California, for my doctoral research. I am focusing on how these birds convey a variety of information with their calls. The chilly autumn air on top of the mountain reminds me that it will soon be winter. It is time for the mountain chickadees to leave the socially monogamous partnerships they kept while raising their chicks and form larger flocks. Forming social groups is not always simple; young chickadees are joining new flocks, and social dynamics need to be established before the winter storms arrive. I can hear them working this out vocally. There’s an unusual variety of complex calls, with melodic “gargle calls” at the forefront, coming from individuals announcing their dominance over other flock members. Examining and decoding bird calls is becoming an increasingly popular field of study, as scientists like me are discovering that many birds – including mountain chickadees – follow systematic rules to share important information, stringing together syllables like words in a sentence. © 2010–2025, The Conversation US, Inc.

Keyword: Language; Evolution
Link ID: 29807 - Posted: 05.28.2025

By Maggie Astor Billy Joel has canceled his upcoming concerts because of a brain disorder affecting his hearing, vision and balance, the singer-songwriter announced on Friday. The condition, called normal pressure hydrocephalus, or N.P.H., is estimated to affect hundreds of thousands of older Americans. Here’s what to know about it. What is normal pressure hydrocephalus? N.P.H. occurs when excess cerebrospinal fluid accumulates in the brain, causing difficulty walking, trouble controlling one’s bladder and memory problems. Those symptoms together suggest the disorder. The bladder symptoms can include incontinence and waking up at night to urinate with increasing frequency, said Dr. Charles Matouk, a neurosurgeon at Yale University and director of the university’s Normal Pressure Hydrocephalus Program. A statement posted to Mr. Joel’s social media accounts on Friday said his condition had been “exacerbated by recent concert performances.” N.P.H. is rare, but risk increases with age. Dr. Matouk estimated that it might affect less than 1 percent of the population ages 65 to 80, but likely 5 percent or more of people over 80. Experts say the condition is likely underdiagnosed because its symptoms can easily be dismissed as normal effects of aging. Dr. Matouk urged people to see a doctor if they experienced trouble walking, controlling their bladder and remembering things. How is it diagnosed? When a patient shows up with gait, bladder and memory problems, the first test may be a CT scan or M.R.I. In patients with N.P.H., that imaging will show enlargement of the brain’s fluid-filled ventricles. But the conclusive test is a spinal tap: Because that procedure removes cerebrospinal fluid, patients with N.P.H. experience a temporary alleviation of symptoms, confirming the diagnosis, Dr. Matouk said. © 2025 The New York Times Company

Keyword: Brain imaging
Link ID: 29801 - Posted: 05.24.2025

By Sara Novak Just a few weeks after they hatch, baby male zebra finches begin to babble, spending much of the day testing their vocal cords. Dad helps out, singing to his hatchlings during feedings, so that the babies can internalize his tune, the same mating refrain shared by all male zebra finches. Soon, these tiny Australian birds begin to rehearse the song itself, repeating it up to 10,000 times a day, without any clear reward other than their increasing perfection of the melody. The baby birds’ painstaking devotion to mastering their song led Duke University neuroscientist Richard Mooney and his Duke colleague John Pearson to wonder whether the birds could help us better understand the nature of self-directed learning. In humans, language and musical expression are thought to be self-directed—spontaneous, adaptive and intrinsically reinforced. In a study recently published in Nature, the scientists tracked the brain signals and levels of dopamine, a neurotransmitter involved in reward and movement, in the brains of five male baby zebra finches while they were singing. They also measured song quality for each rendition the birds sang, in terms of both pitch and vigor, as well as the quality of song performance relative to the bird’s age. What they found is that dopamine levels in the baby birds’ brains closely matched the birds’ performance of the song, suggesting that dopamine plays a central role in the learning process. Scientists have long known that learning that is powered by external rewards, such as grades, praise or sugary treats, is driven by dopamine—which is thought to chart the differences between expected and experienced rewards. But while they have suspected that self-directed learning is likewise guided by dopamine, it had been difficult to test that hypothesis until now. © 2025 NautilusNext Inc.,
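The "differences between expected and experienced rewards" mentioned above is what computational neuroscientists call a reward prediction error. The sketch below shows that idea in its most minimal form, with an internally generated song-quality score standing in for an external reward; the variable names and numbers are invented for illustration and are not taken from the study.

```python
# Minimal reward-prediction-error sketch (illustrative; not the study's model).
# The "reward" is an internal score of how well a rendition matched the tutor
# song; dopamine is thought to track the prediction error (delta) on each trial.
learning_rate = 0.1
expected_quality = 0.0   # the bird's running expectation of its own performance


def rehearse(song_quality: float) -> float:
    """Update the expectation after one rendition; return the prediction error."""
    global expected_quality
    delta = song_quality - expected_quality     # better than expected -> positive
    expected_quality += learning_rate * delta   # expectation tracks recent output
    return delta


# A run of rehearsals in which renditions gradually improve:
for quality in [0.20, 0.30, 0.35, 0.50, 0.55, 0.60]:
    print(f"quality={quality:.2f}  prediction_error={rehearse(quality):+.3f}")
```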

Keyword: Attention; Sexual Behavior
Link ID: 29800 - Posted: 05.24.2025

By Gina Kolata Dr. Geoffrey Manley, a neurosurgeon at the University of California, San Francisco, wants the medical establishment to change the way it deals with brain injuries. His work is motivated in part by what happened to a police officer he treated in 2002, just after completing his medical training. The man arrived at the emergency room unconscious, in a coma. He had been in a terrible car crash while pursuing a criminal. Two days later, Dr. Manley’s mentor said it was time to tell the man’s family there was no hope. His life support should be withdrawn. He should be allowed to die. Dr. Manley resisted. The patient’s brain oxygen levels were encouraging. Seven days later the policeman was still in a coma. Dr. Manley’s mentor again pressed him to talk to the man’s family about withdrawing life support. Again, Dr. Manley resisted. Ten days after the accident, the policeman began to come out of his coma. Three years later he was back at work and was named San Francisco Police Officer of the Month. In 2010, he was Police Officer of the Year. “That case, and another like it,” Dr. Manley said, “changed my practice.” But little has changed in the world of traumatic brain injuries since Dr. Manley’s patient woke up. Assessments of who will recover and how severely patients are injured are pretty much the same, which results in patients being told they “just” have a concussion and then having trouble getting care for recurring symptoms like memory lapses or headaches. And it leaves some patients in the position of that police officer, having their life support withdrawn when they might have recovered. Now, though, Dr. Manley and 93 others from 14 countries are proposing a new way to evaluate patients. They published their classification system Tuesday in the journal Lancet Neurology. © 2025 The New York Times Company

Keyword: Brain Injury/Concussion; Consciousness
Link ID: 29798 - Posted: 05.21.2025

By Erin Wayman Barbara J. King remembers the first time she met Kanzi the bonobo. It was the late 1990s, and the ape was living in a research center in Georgia. King walked in and told Kanzi she had a present. A small, round object created a visible outline in the front pocket of her jeans. Kanzi picked up a board checkered with colorful symbols and pointed to the one meaning “egg” and then to “question.” An egg? No, not an egg. A ball. But “he asked an on-point question, and even an extremely simple conversation was just amazing,” says King, a biological anthropologist at William & Mary in Williamsburg, Va. Born in 1980, Kanzi began learning to communicate with symbols as an infant. He ultimately mastered more than 300 symbols, combined them in novel ways and understood spoken English. Kanzi was arguably the most accomplished among a cohort of “talking” apes that scientists intensely studied to understand the origins of language and to probe the ape mind. He was also the last of his kind. In March, Kanzi died. “It’s not just Kanzi that is gone; it’s this whole field of inquiry,” says comparative psychologist Heidi Lyn of the University of South Alabama in Mobile. Lyn had worked with Kanzi on and off for 30 years. Kanzi’s death offers an opportunity to reflect on what decades of ape-language experiments taught us — and at what cost. A history of ape-language experiments. Language — communication marked by using symbols, grammar and syntax — has long been considered among the abilities that make humans unique. And when it comes to delineating the exact boundary separating us from other animals, scientists often turn to our closest living relatives, the great apes. © Society for Science & the Public 2000–2025.

Keyword: Language; Evolution
Link ID: 29797 - Posted: 05.21.2025

By Paula Span Kristin Kramer woke up early on a Tuesday morning 10 years ago because one of her dogs needed to go out. Then, a couple of odd things happened. When she tried to call her other dog, “I couldn’t speak,” she said. As she walked downstairs to let them into the yard, “I noticed that my right hand wasn’t working.” But she went back to bed, “which was totally stupid,” said Ms. Kramer, now 54, an office manager in Muncie, Ind. “It didn’t register that something major was happening,” especially because, reawakening an hour later, “I was perfectly fine.” So she “just kind of blew it off” and went to work. It’s a common response to the neurological symptoms that signal a T.I.A., a transient ischemic attack or ministroke. At least 240,000 Americans experience one each year, with the incidence increasing sharply with age. Because the symptoms disappear quickly, usually within minutes, people don’t seek immediate treatment, putting them at high risk for a bigger stroke. Ms. Kramer felt some arm tingling over the next couple of days and saw her doctor, who found nothing alarming on a CT scan. But then she started “jumbling” her words and finally had a relative drive her to an emergency room. By then, she could not sign her name. After an M.R.I., she recalled, “my doctor came in and said, ‘You’ve had a small stroke.’” Did those early-morning aberrations constitute a T.I.A.? Might a 911 call and an earlier start on anti-clotting drugs have prevented her stroke? “We don’t know,” Ms. Kramer said. She’s doing well now, but faced with such symptoms again, “I would seek medical attention.” Now, a large epidemiological study by researchers at the University of Alabama at Birmingham, published in JAMA Neurology, points to another reason to take T.I.A.s seriously: Over five years, study participants’ performance on cognitive tests after a T.I.A. drops as steeply as it does among victims of a full-on stroke. © 2025 The New York Times Company

Keyword: Stroke
Link ID: 29796 - Posted: 05.21.2025

By Mikael Angelo Francisco. A comic explains the highs and lows of birdsong. Mikael Angelo Francisco is a science journalist and illustrator from the Philippines who enjoys writing about paleontology, biodiversity, environmental conservation, and science in pop culture. He has written and edited books about media literacy, Filipino scientists, and science trivia. © 2025 NautilusNext Inc.

Keyword: Language; Evolution
Link ID: 29794 - Posted: 05.21.2025

By Sheila Hale On the night before the accident, John and I and our son Jay, who was then 26, lingered in the garden drinking wine and enjoying the mid-summer scent of jasmine and lilies. We talked about the Manet exhibition we had just seen at the National Gallery. We probably talked about how the end of the cold war might affect the chances of Bill Clinton winning the presidential election against George HW Bush in November. I know what John thought about that. I only wish I could recall his words. The next morning, 30 July 1992, John got up before me as he always did. In the kitchen I found the contents of the dishwasher – knives, forks, spoons, plates, mugs – jumbled together on the table. This was odd because unloading the dishwasher was the one domestic ritual he willingly performed. It would be years before I learned the reason. At the time I put it down to absent-mindedness. It was a month since he had delivered a book to the publisher and he was already preoccupied by the next one, about art in the European Renaissance. Before I had time to be annoyed, I heard a crash from his study at the top of the house. I ran upstairs and found him lying on the floor next to his desk. He looked up at me with the radiant, witless smile of a baby. And he said: “Da walls.” The ambulance took us to the local hospital where they said that my husband had had a cerebral accident – a stroke. The cause was probably years of uncontrolled high blood pressure, about which no doctor had warned him. They said he needed rest and reassurance. Unfortunately, because of the so-called efficiency savings introduced by John Major’s government, there was a shortage of beds and of nurses in all London hospitals. I was so grateful when they found a bed for him in a geriatric ward later in the day that I didn’t at first notice how filthy it was and how hot. The floor was covered in urine, blood and dust balls. (Later I brought in a mop to clean around John’s bed.) The plateglass window could not be opened: to prevent suicides, a passing nurse told me. It was a week before I managed to track down the doctor whose name was printed on a grimy card at the head of John’s bed. The doctor informed me that my husband’s case was hopeless. He would never walk again and must never be allowed to try to stand because the hospital insurance wouldn’t cover a fall. Physiotherapy, which the doctor considered “about as useful as peanut butter”, was out of the question. © 2025 Guardian News & Media Limited

Keyword: Stroke
Link ID: 29792 - Posted: 05.17.2025

By Christa Lesté-Lasserre Can a robot arm wave hello to a cuttlefish—and get a hello back? Could a dolphin’s whistle actually mean “Where are you?” And are monkeys quietly naming each other while we fail to notice? These are just a few of the questions tackled by the finalists for this year’s Dolittle prize, a $100,000 award recognizing early breakthroughs in artificial intelligence (AI)-powered interspecies communication. The winning project—announced today—explores how dolphins use shared, learned whistles that may carry specific meanings—possibly even warning each other about danger, or just expressing confusion. The other contending teams—working with marmosets, cuttlefish, and nightingales—are also pushing the boundaries of what human-animal communication might look like. The prize marks an important milestone in the Coller Dolittle Challenge, a 5-year competition offering up to $10 million to the first team that can achieve genuine two-way communication with animals. “Part of how this initiative was born came from my skepticism,” says Yossi Yovel, a neuroecologist at Tel Aviv University and one of the prize’s organizers. “But we really have much better tools now. So this is the time to revisit a lot of our previous assumptions about two-way communication within the animal’s own world.” Science caught up with the four finalists to hear how close we really are to cracking the animal code. This interview has been edited for clarity and length. Cuttlefish (Sepia officinalis and S. bandensis) lack ears and voices, but they apparently make up for this with a kind of sign language. When shown videos of comrades waving their arms, they wave back.

Keyword: Language; Evolution
Link ID: 29788 - Posted: 05.17.2025

By Jake Buehler Grunts, barks, screams and pants ring through Taï National Park in Côte d’Ivoire. Chimpanzees there combine these different calls like linguistic Legos to relay complex meanings when communicating, researchers report May 9 in Science Advances. Chimps can combine and flexibly rearrange pairs of sounds to convey different ideas or meanings, an ability that investigators have not documented in other nonhuman animals. This system may represent a key evolutionary transition between vocal communication strategies of other animals and the syntax rules that structure human languages. “The difference between human language and how other animals communicate is really about how we combine sounds to form words, and how we combine words to form sentences,” says Cédric Girard-Buttoz, an evolutionary biologist at CNRS in Lyon, France. Chimpanzees (Pan troglodytes) were known to have a particularly complicated vocal repertoire, with about a dozen single sounds that they can combine into hundreds of sequences. But it was unclear if the apes used multiple approaches when combining sounds to make new meanings, like in human language. In 2019 and 2020, Girard-Buttoz and his colleagues recorded 53 different adult chimpanzees living in the Taï forest. In all, the team analyzed over 4,300 sounds and described 16 different “bigrams” — short sequences of two sounds, like a grunt followed by a bark, or a panted hoo followed by a scream. The team then used statistical analyses to map those bigrams to behaviors to reveal some of the bigrams’ meanings. The result? Chimpanzees don’t combine sounds in a single, consistent way. They have at least four different methods — a first seen outside of humans. © Society for Science & the Public 2000–2025
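To make the term "bigram" concrete, the sketch below shows the kind of counting step such an analysis begins with: extracting two-call sequences from recorded call streams and tallying the behavioural contexts they occur in. The call types, sequences and contexts are invented for illustration and are not the study's data or methods.

```python
# Toy bigram tally for call sequences (invented data; not the study's).
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Each recording: (sequence of call types, behavioural context observed at the time)
recordings = [
    (["grunt", "bark"], "feeding"),
    (["panted hoo", "scream"], "aggression"),
    (["grunt", "bark"], "travel"),
    (["grunt", "bark", "scream"], "aggression"),
]

bigram_by_context = Counter()
for calls, context in recordings:
    for first, second in pairwise(calls):        # consecutive pairs of calls
        bigram_by_context[(first, second, context)] += 1

for (first, second, context), count in bigram_by_context.most_common():
    print(f"{first} + {second:12s} during {context:10s}: {count}")
```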

Keyword: Language; Evolution
Link ID: 29781 - Posted: 05.11.2025

By Rachel Lehmann-Haupt On a brisk January evening this year, I was speeding down I-295 in northeast Florida, under a full moon, to visit my dad’s brain. As I drove past shadowy cypress swamps, sinewy river estuaries, and gaudy-hued billboards of condominiums with waterslides and red umbrellas boasting, “Best place to live in Florida,” I was aware of the strangeness of my visit. Most people pay respects to their loved ones at memorials and grave sites, but I was intensely driven to check in on the last remaining physical part of my dad, immortalized in what seemed like the world’s most macabre library. Michael DeTure, a professor of neuroscience, stepped out of a golf cart to meet me. “Welcome to the bunker. Just 8,000 of your quietest friends in here,” he said in a melodic southern drawl, grinning in a way that told me he’s made this joke before. The bunker is a nondescript warehouse, part of the Mayo Clinic’s Jacksonville, Florida campus that houses its brain bank. DeTure opened the warehouse door, and I was met with a blast of cold air. In the back of the warehouse sat rows of buzzing white freezers. DeTure pointed to the freezer where my dad’s brain sat in a drawer in a plastic bag with his name written on it in black Sharpie pen. I welled up with tears and a feeling of intense fear. The room suddenly felt too cold, too sterile, too bright, and my head started to spin. I wanted to run away from this place. And then my brain escaped for me. I saw my dad on a beach on Cape Cod in 1977. He was in a bathing suit, shirtless, lying on a towel. I was 7 years old and snuggled up to him to protect myself from the wind. He was reading aloud to my mom and me from Evelyn Waugh’s novel, A Handful of Dust, whose title is from T.S. Eliot’s poem, “The Waste Land”: “I will show you fear in a handful of dust.” He was reading the part about Tony Last, an English gentleman, being imprisoned by an eccentric recluse who forces him to read Dickens endlessly. © 2025 NautilusNext Inc.,

Keyword: Language; Learning & Memory
Link ID: 29776 - Posted: 05.07.2025

By Michael Erard In many Western societies, parents eagerly await their children’s first words, then celebrate their arrival. There’s also vast scientific and popular attention to early child language. Yet there is (and was) surprisingly little hullabaloo sparked by the first words and hand signs displayed by great apes. WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, author and linguist Michael Erard shares a story that didn’t make it into his recent book “Bye Bye I Love You: The Story of Our First and Last Words” (MIT Press, 344 pages). Since as far back as 1916, scientists have been exploring the linguistic abilities of humans’ closest relatives by raising them in language-rich environments. But the first moments in which these animals did cross a communication threshold created relatively little fuss in both the scientific literature and the media. Why? Consider, for example, the first sign by Washoe, a young chimpanzee that was captured in the wild and transported in 1966 to a laboratory at the University of Nevada, where she was studied by two researchers, Allen Gardner and Beatrice Gardner. Washoe was taught American Sign Language in family-like settings that would be conducive to communicative situations. “Her human companions,” wrote the Gardners in 1969, “were to be friends and playmates as well as providers and protectors, and they were to introduce a great many games and activities that would be likely to result in maximum interaction.” When the Gardners wrote about the experiments, they did note her first uses of specific signs, such as “toothbrush,” that didn’t seem to echo a sign a human had just used. These moments weren’t ignored, yet you have to pay very close attention to their writings to find the slightest awe or enthusiasm. Fireworks it is not.

Keyword: Language; Evolution
Link ID: 29753 - Posted: 04.23.2025

By Erin Blakemore Consuming more than eight alcoholic drinks a week is associated with brain injuries linked to Alzheimer’s disease and cognitive decline, a recent study in the journal Neurology suggests. The analysis looked for links between heavy drinking and brain health. Researchers used autopsy data from the Biobank for Aging Studies at the University of São Paulo Medical School in Brazil collected between 2004 and 2024. The team analyzed data from 1,781 people ages 50 or older at death. The average age at death was 74.9. With the help of surveys of the deceased’s next of kin, researchers gathered information about the deceased’s cognitive function and alcohol consumption in the three months before their death. Among participants, 965 never drank, 319 drank up to seven drinks per week (moderate drinking), and 129 had eight or more drinks per week (heavy drinking). Another 368 were former heavy drinkers who had stopped drinking before their last three months of life. The analysis showed that heavy drinkers and former heavy drinkers, respectively, had 41 percent and 31 percent higher odds of neurofibrillary tangles — clumps of the protein tau that accumulate inside brain neurons and have been associated with Alzheimer’s disease. Moderate, heavy and former heavy drinkers also had a higher risk of hyaline arteriolosclerosis, which thickens the walls of small blood vessels in the brain, impeding blood flow and causing brain damage over time. Though 40 percent of those who never drank had vascular brain lesions, they were more common in moderate (44.6 percent), heavy (44.1 percent) and former heavy drinkers (50.2 percent), the study found.

Keyword: Drug Abuse; Alzheimers
Link ID: 29749 - Posted: 04.19.2025

By Carl Zimmer After listening to hundreds of hours of ape calls, a team of scientists say they have detected a hallmark of human language: the ability to put together strings of sounds to create new meanings. The provocative finding, published Thursday in the journal Science, drew praise from some scholars and skepticism from others. Federica Amici, a primatologist at the University of Leipzig in Germany, said that the study helped place the roots of language even further back in time, to millions of years before the emergence of our species. “Differences between humans and other primates, including in communication, are far less distinct and well-defined than we have long assumed,” Dr. Amici said. But other researchers said that the study, which had been conducted on bonobos, close relatives of chimpanzees, had little to reveal about how we use words. “The present findings don’t tell us anything about the evolution of language,” said Johan Bolhuis, a neurobiologist at Utrecht University in the Netherlands. Many species can communicate with sounds. But when an animal makes a sound, it typically means just one thing. Monkeys, for instance, can make one warning call in reference to a leopard and a different one for an incoming eagle. In contrast, we humans can string words together in ways that combine their individual meanings into something new. Suppose I say, “I am a bad dancer.” When I combine the words “bad” and “dancer,” I no longer mean them independently; I’m not saying, “I am a bad person who also happens to dance.” Instead, I mean that I don’t dance well. Linguists call this compositionality, and have long considered it an essential ingredient of language. “It’s the force behind language’s creativity and productivity,” said Simon Townsend, a comparative psychologist at the University of Zurich in Switzerland. “Theoretically, you can come up with any phrase that has never been uttered before.” © 2025 The New York Times Company

Keyword: Language; Evolution
Link ID: 29730 - Posted: 04.05.2025

Miryam Naddaf A brain-reading implant that translates neural signals into audible speech has allowed a woman with paralysis to hear what she intends to say nearly instantly. Researchers enhanced the device — known as a brain–computer interface (BCI) — with artificial intelligence (AI) algorithms that decoded sentences as the woman thought of them, and then spoke them out loud using a synthetic voice. Unlike previous efforts, which could produce sounds only after users finished an entire sentence, the current approach can simultaneously detect words and turn them into speech within 3 seconds. The findings, published in Nature Neuroscience on 31 March, represent a big step towards BCIs that are of practical use. Older speech-generating BCIs are similar to “a WhatsApp conversation”, says Christian Herff, a computational neuroscientist at Maastricht University, the Netherlands, who was not involved with the work. “I write a sentence, you write a sentence and you need some time to write a sentence again,” he says. “It just doesn’t flow like a normal conversation.” BCIs that stream speech in real time are “the next level” in research because they allow users to convey the tone and emphasis that are characteristic of natural speech, he adds. The study participant, Ann, lost her ability to speak after a stroke in her brainstem in 2005. Some 18 years later, she underwent surgery to place a paper-thin rectangle containing 253 electrodes on the surface of her brain’s cortex. The implant can record the combined activity of thousands of neurons at the same time. Researchers personalized the synthetic voice to sound like Ann’s own voice from before her injury by training AI algorithms on recordings from her wedding video. During the latest study, Ann silently mouthed 100 sentences from a set of 1,024 words and 50 phrases that appeared on a screen. The BCI device captured her neural signals every 80 milliseconds, starting 500 milliseconds before Ann started to silently say the sentences. It produced between 47 and 90 words per minute (natural conversation happens at around 160 words per minute).
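To illustrate what "streaming" decoding means in practice — emitting words as neural data arrive rather than waiting for a finished sentence — here is a highly simplified sketch of such a loop. The window size and channel count come from the article; every function here is a placeholder, and none of this is the published system's code or API.

```python
# Simplified sketch of a streaming speech-BCI loop (placeholders throughout;
# not the published system). Neural activity arrives in short windows and is
# decoded incrementally instead of after the whole sentence has been attempted.
import time
from typing import Iterator, List, Optional

WINDOW_MS = 80      # the article reports signals captured every 80 milliseconds
N_CHANNELS = 253    # electrodes in the implanted array


def neural_windows() -> Iterator[List[float]]:
    """Placeholder: yield one window of multi-electrode features every WINDOW_MS."""
    for _ in range(20):
        yield [0.0] * N_CHANNELS            # dummy feature values
        time.sleep(WINDOW_MS / 1000)


def decode_window(window: List[float], history: List[str]) -> Optional[str]:
    """Placeholder decoder: map the latest window plus context to a word, or None."""
    return None     # a real system would run a trained neural-network decoder here


def speak(word: str) -> None:
    """Placeholder: send the decoded word to a personalized speech synthesizer."""
    print(word, end=" ", flush=True)


decoded: List[str] = []
for window in neural_windows():
    word = decode_window(window, decoded)
    if word is not None:                    # emit words as soon as they are decoded
        decoded.append(word)
        speak(word)
```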

Keyword: Language; Robotics
Link ID: 29726 - Posted: 04.02.2025