Chapter 15. Language and Lateralization




By Amanda Heidt For the first time, a brain implant has helped a bilingual person who is unable to articulate words to communicate in both of his languages. An artificial-intelligence (AI) system coupled to the brain implant decodes, in real time, what the individual is trying to say in either Spanish or English. The findings, published on 20 May in Nature Biomedical Engineering, provide insights into how our brains process language, and could one day lead to long-lasting devices capable of restoring multilingual speech to people who can’t communicate verbally. “This new study is an important contribution for the emerging field of speech-restoration neuroprostheses,” says Sergey Stavisky, a neuroscientist at the University of California, Davis, who was not involved in the study. Even though the study included only one participant and more work remains to be done, “there’s every reason to think that this strategy will work with higher accuracy in the future when combined with other recent advances”, Stavisky says. The person at the heart of the study, who goes by the nickname Pancho, had a stroke at age 20 that paralysed much of his body. As a result, he can moan and grunt but cannot speak clearly. In his thirties, Pancho partnered with Edward Chang, a neurosurgeon at the University of California, San Francisco, to investigate the stroke’s lasting effects on his brain. In a groundbreaking study published in 2021, Chang’s team surgically implanted electrodes on Pancho’s cortex to record neural activity, which was translated into words on a screen. Pancho’s first sentence — ‘My family is outside’ — was interpreted in English. But Pancho is a native Spanish speaker who learnt English only after his stroke. It’s Spanish that still evokes in him feelings of familiarity and belonging. “What languages someone speaks are actually very linked to their identity,” Chang says. 
“And so our long-term goal has never been just about replacing words, but about restoring connection for people.” © 2024 Springer Nature Limited

Keyword: Language; Robotics
Link ID: 29321 - Posted: 05.23.2024

By Emily Anthes Half a century ago, one of the hottest questions in science was whether humans could teach animals to talk. Scientists tried using sign language to converse with apes and trained parrots to deploy growing English vocabularies. The work quickly attracted media attention — and controversy. The research lacked rigor, critics argued, and what seemed like animal communication could simply have been wishful thinking, with researchers unconsciously cuing their animals to respond in certain ways. In the late 1970s and early 1980s, the research fell out of favor. “The whole field completely disintegrated,” said Irene Pepperberg, a comparative cognition researcher at Boston University, who became known for her work with an African gray parrot named Alex. Today, advances in technology and a growing appreciation for the sophistication of animal minds have renewed interest in finding ways to bridge the species divide. Pet owners are teaching their dogs to press “talking buttons” and zoos are training their apes to use touch screens. In a cautious new paper, a team of scientists outlines a framework for evaluating whether such tools might give animals new ways to express themselves. The research is designed “to rise above some of the things that have been controversial in the past,” said Jennifer Cunha, a visiting research associate at Indiana University. The paper, which is being presented at a science conference on Tuesday, focuses on Ms. Cunha’s parrot, an 11-year-old Goffin’s cockatoo named Ellie. Since 2019, Ms. Cunha has been teaching Ellie to use an interactive “speech board,” a tablet-based app that contains more than 200 illustrated icons, corresponding to words and phrases including “sunflower seeds,” “happy” and “I feel hot.” When Ellie presses on an icon with her tongue, a computerized voice speaks the word or phrase aloud. In the new study, Ms. 
Cunha and her colleagues did not set out to determine whether Ellie’s use of the speech board amounted to communication. Instead, they used quantitative, computational methods to analyze Ellie’s icon presses to learn more about whether the speech board had what they called “expressive and enrichment potential.” © 2024 The New York Times Company

Keyword: Language; Epilepsy
Link ID: 29306 - Posted: 05.14.2024

By Miryam Naddaf Scientists have developed brain implants that can decode internal speech — identifying words that two people spoke in their minds without moving their lips or making a sound. Although the technology is at an early stage — it was shown to work with only a handful of words, and not phrases or sentences — it could have clinical applications in future. Similar brain–computer interface (BCI) devices, which translate signals in the brain into text, have reached speeds of 62–78 words per minute for some people. But these technologies were trained to interpret speech that is at least partly vocalized or mimed. The latest study — published in Nature Human Behaviour on 13 May — is the first to decode words spoken entirely internally, by recording signals from individual neurons in the brain in real time. “It's probably the most advanced study so far on decoding imagined speech,” says Silvia Marchesotti, a neuroengineer at the University of Geneva, Switzerland. “This technology would be particularly useful for people that have no means of movement any more,” says study co-author Sarah Wandelt, a neural engineer who was at the California Institute of Technology in Pasadena at the time the research was done. “For instance, we can think about a condition like locked-in syndrome.” The researchers implanted arrays of tiny electrodes in the brains of two people with spinal-cord injuries. They placed the devices in the supramarginal gyrus (SMG), a region of the brain that had not been previously explored in speech-decoding BCIs. © 2024 Springer Nature Limited

Keyword: Brain imaging; Language
Link ID: 29302 - Posted: 05.14.2024

By Elizabeth Anne Brown The beluga whale wears its heart on its sleeve — or rather, its forehead. Researchers have created a visual encyclopedia of the different expressions that belugas (Delphinapterus leucas) in captivity seem to make with their highly mobile “melon,” a squishy deposit of fat on the forehead that helps direct sound waves for echolocation. Using muscles and connective tissue, belugas can extend the melon forward until it juts over their lips like the bill of a cap; mush it down until it’s flattened against their skull; lift it vertically to create an impressive fleshy top hat; and shake it with such force that it jiggles like Jell-O. “If that doesn’t scream ‘pay attention to me,’ I don’t know what does,” says animal behaviorist Justin Richard of the University of Rhode Island in Kingston. “It’s like watching a peacock spread their feathers.” Before Richard became a scientist, he spent a decade as a beluga trainer at the Mystic Aquarium in Connecticut, working closely with the enigmatic animals. “Even as a trainer, I knew the shapes meant something,” Richard says. “But nobody had been able to put together enough observations to make sense of it.” Over the course of a year, from 2014 to 2015, Richard and colleagues recorded interactions between four belugas at the Mystic Aquarium. Analyzing the footage revealed that the belugas make five distinct melon shapes the scientists dubbed flat, lift, press, push and shake. The belugas sported an average of nearly two shapes per minute during social interaction, the team reports March 2 in Animal Cognition. © Society for Science & the Public 2000–2024

Keyword: Animal Communication; Evolution
Link ID: 29291 - Posted: 05.03.2024

By Claire Cameron On Aug. 19, 2021, a humpback whale named Twain whupped back. Specifically, Twain made a series of humpback whale calls known as “whups” in response to playback recordings of whups from a boat of researchers off the coast of Alaska. The whale and the playback exchanged calls 36 times. On the boat was naturalist Fred Sharpe of the Alaska Whale Foundation, who has been studying humpbacks for over two decades, and animal behavior researcher Brenda McCowan, a professor at the University of California, Davis. The exchange was groundbreaking, Sharpe says, because it brought two linguistic beings—humans and humpback whales—together. “You start getting the sense that there’s this mutual sense of being heard.” In their 2023 published results, McCowan, Sharpe, and their coauthors are careful not to characterize their exchange with Twain as a conversation. They write, “Twain was actively engaged in a type of vocal coordination” with the playback recordings. To the paper’s authors, the interspecies exchange could be a model for perhaps something even more remarkable: an exchange with an extraterrestrial intelligence. Sharpe and McCowan are members of Whale SETI, a team of scientists at the SETI Institute, which has been scanning the skies for decades, listening for signals that may be indicative of extraterrestrial life. The Whale SETI team seeks to show that animal communication, and particularly, complex animal vocalizations like those of humpback whales, can provide scientists with a model to help detect and decipher a message from an extraterrestrial intelligence. And, while they’ve been trying to communicate with whales for years, this latest reported encounter was the first time the whales talked back. It all might sound far-fetched. But then again, Laurance Doyle, an astrophysicist who founded the Whale SETI team and has been part of the SETI Institute since 1987, is accustomed to being doubted by the mainstream science community. 
© 2024 NautilusNext Inc.

Keyword: Animal Communication; Language
Link ID: 29276 - Posted: 04.30.2024

By Saima May Sidik In 2010, Theresa Chaklos was diagnosed with chronic lymphocytic leukaemia — the first in a series of ailments that she has had to deal with since. She’d always been an independent person, living alone and supporting herself as a family-law facilitator in the Washington DC court system. But after illness hit, her independence turned into loneliness. Loneliness, in turn, exacerbated Chaklos’s physical condition. “I dropped 15 pounds in less than a week because I wasn’t eating,” she says. “I was so miserable, I just would not get up.” Fortunately a co-worker convinced her to ask her friends to help out, and her mood began to lift. “It’s a great feeling” to know that other people are willing to show up, she says. Many people can’t break out of a bout of loneliness so easily. And when acute loneliness becomes chronic, the health effects can be far-reaching. Chronic loneliness can be as detrimental as obesity, physical inactivity and smoking, according to a report by Vivek Murthy, the US surgeon general. Depression, dementia, cardiovascular disease and even early death have all been linked to the condition. Worldwide, around one-quarter of adults feel very or fairly lonely, according to a 2023 poll conducted by the social-media firm Meta, the polling company Gallup and a group of academic advisers (see go.nature.com/48xhu3p). That same year, the World Health Organization launched a campaign to address loneliness, which it called a “pressing health threat”. But why does feeling alone lead to poor health? Over the past few years, scientists have begun to reveal the neural mechanisms that cause the human body to unravel when social needs go unmet. The field “seems to be expanding quite significantly”, says cognitive neuroscientist Nathan Spreng at McGill University in Montreal, Canada. 
And although the picture is far from complete, early results suggest that loneliness might alter many aspects of the brain, from its volume to the connections between neurons.

Keyword: Stress
Link ID: 29245 - Posted: 04.06.2024

By Emily Makowski I spend my days surrounded by thousands of written words, and sometimes I feel as though there’s no escape. That may not seem particularly unusual. Plenty of people have similar feelings. But no, I’m not just talking about my job as a copy editor here at Scientific American, where I edit and fact-check an endless stream of science writing. This constant flow of text is all in my head. My brain automatically translates spoken words into written ones in my mind’s eye. I “see” subtitles that I can’t turn off whenever I talk or hear someone else talking. This same speech-to-text conversion even happens for the inner dialogue of my thoughts. This mental closed-captioning has accompanied me since late toddlerhood, almost as far back as my earliest childhood memories. And for a long time, I thought that everyone could “read” spoken words in their head the way I do. What I experience goes by the name of ticker-tape synesthesia. It is not a medical condition—it’s just a distinctive way of perceiving the surrounding world that relatively few people share. Not much is known about the neurophysiology or psychology of this phenomenon, sometimes called “ticker taping,” even though a reference to it first appeared in the scientific literature in the late 19th century. Ticker taping is considered a form of synesthesia, an experience in which the brain reroutes one kind of incoming sensory information so that it is processed as another. For example, sounds might be perceived as touch, allowing the affected person to “feel” them as tactile sensations. As synesthesia goes, ticker taping is relatively uncommon. “There are varieties of synesthesia which really have just been completely under the radar..., and ticker tape is really one of those,” says Mark Price, a cognitive psychologist at the University of Bergen in Norway. The name “ticker-tape synesthesia” itself evokes the concept’s late 19th-century origins. 
At that time stock prices transmitted by telegraph were printed on long paper strips, which would be torn into tiny bits and thrown from building windows during parades. © 2024 SCIENTIFIC AMERICAN

Keyword: Attention; Language
Link ID: 29238 - Posted: 04.04.2024

Ian Sample Science editor Dogs understand what certain words stand for, according to researchers who monitored the brain activity of willing pooches while they were shown balls, slippers, leashes and other highlights of the domestic canine world. The finding suggests that the dog brain can reach beyond commands such as “sit” and “fetch”, and the frenzy-inducing “walkies”, to grasp the essence of nouns, or at least those that refer to items the animals care about. “I think the capacity is there in all dogs,” said Marianna Boros, who helped arrange the experiments at Eötvös Loránd University in Hungary. “This changes our understanding of language evolution and our sense of what is uniquely human.” Scientists have long been fascinated by whether dogs can truly learn the meanings of words and have built up some evidence to back the suspicion. A survey in 2022 found that dog owners believed their furry companions responded to between 15 and 215 words. More direct evidence for canine cognitive prowess came in 2011 when psychologists in South Carolina reported that after three years of intensive training, a border collie called Chaser had learned the names of more than 1,000 objects, including 800 cloth toys, 116 balls and 26 Frisbees. However, studies have said little about what is happening in the canine brain when it processes words. To delve into the mystery, Boros and her colleagues invited 18 dog owners to bring their pets to the laboratory along with five objects the animals knew well. These included balls, slippers, Frisbees, rubber toys, leads and other items. At the lab, the owners were instructed to say words for objects before showing their dog either the correct item or a different one. For example, an owner might say “Look, here’s the ball”, but hold up a Frisbee instead. The experiments were repeated multiple times with matching and non-matching objects. © 2024 Guardian News & Media Limited

Keyword: Language; Learning & Memory
Link ID: 29214 - Posted: 03.26.2024

By Darren Incorvaia Be it an arched eyebrow, a shaken head or a raised finger, humans wordlessly communicate complex ideas through gestures every day. This ability is rare in the animal kingdom, having been observed only in primates (SN: 8/10/10). Scientists now might be able to add a feathered friend to the club. Researchers have observed Japanese tits making what they call an “after you” gesture: A bird flutters its wings, cuing its mate to enter the nest first. The finding, reported in the March 25 Current Biology, “shows that Japanese tits not only use wing fluttering as a symbolic gesture, but also in a complex social context involving a sender, receiver and a specific goal, much like how humans communicate,” says biologist Toshitaka Suzuki of the University of Tokyo. Suzuki has been listening in on the calls of Japanese tits (Parus minor) for more than 17 years. During his extensive time in the field, he noticed that Japanese tits bringing food to the nest would sometimes perch on a branch and flutter their wings. At that point, their partners would enter the nest with the flutterer close behind. “This led me to investigate whether this behavior fulfills the criteria of gestures,” Suzuki says. Suzuki and Norimasa Sugita, a researcher at Tokyo’s National Museum of Nature and Science, observed eight mated pairs make 321 trips to their nests. A pattern quickly emerged: Females fluttered their wings far more often than males, with six females shaking it up while only one male did. Females almost always entered the nest first — unless they fluttered their wings. Then the males went first. © Society for Science & the Public 2000–2024.

Keyword: Animal Communication; Evolution
Link ID: 29213 - Posted: 03.26.2024


By Anthony Ham What is the meaning of a cat’s meow that grows louder and louder? Or your pet’s sudden flip from softly purring as you stroke its back to biting your hand? It turns out these misunderstood moments with your cat may be more common than not. A new study by French researchers, published last month in the journal Applied Animal Behaviour Science, found that people were significantly worse at reading the cues of an unhappy cat (nearly one third got it wrong) than those of a contented cat (closer to 10 percent). The study also suggested that a cat’s meows and other vocalizations are greatly misinterpreted and that people should consider both vocal and visual cues to try to determine what’s going on with their pets. The researchers drew these findings from the answers of 630 online participants; respondents were volunteers recruited through advertisements on social media. Each watched 24 videos of differing cat behaviors. One third depicted only vocal communication, another third just visual cues, and the remainder involved both. “Some studies have focused on how humans understand cat vocalizations,” said Charlotte de Mouzon, lead author of the study and a cat behavior expert at the Université Paris Nanterre. “Other studies studied how people understand cats’ visual cues. But studying both has never before been studied in human-cat communication.” Cats display a wide range of visual signals: tails swishing side to side, or raised high in the air; rubbing and curling around our legs; crouching; flattening ears or widening eyes. Their vocals can range from seductive to threatening: meowing, purring, growling, hissing and caterwauling. At last count, kittens were known to use nine different forms of vocalization, while adult cats uttered 16. That we could better understand what a cat wants by using visual and vocal cues may seem obvious. But we know far less than we think we do. © 2024 The New York Times Company

Keyword: Animal Communication; Evolution
Link ID: 29169 - Posted: 02.29.2024

Rob Stein Benjamin Franklin famously wrote: "In this world nothing can be said to be certain, except death and taxes." While that may still be true, there's a controversy simmering today about one of the ways doctors declare people to be dead. The debate is focused on the Uniform Determination of Death Act, a law that was adopted by most states in the 1980s. The law says that death can be declared if someone has experienced "irreversible cessation of all functions of the entire brain." But some parts of the brain can continue to function in people who have been declared brain dead, prompting calls to revise the statute. Many experts say the discrepancy needs to be resolved to protect patients and their families, maintain public trust and reconcile what some see as a troubling disconnect between the law and medical practice. The debate became so contentious, however, that the Uniform Law Commission, the group charged with rewriting model laws for states, paused its process last summer because participants couldn't reach a consensus. "I'm worried," says Thaddeus Pope, a bioethicist and lawyer at Mitchell Hamline School of Law in St. Paul, Minnesota. "There's a lot of conflict at the bedside over this at hospitals across the United States. Let's get in front of it and fix it before it becomes a crisis. It's such an important question that everyone needs to be on the same page." The law recognizes two ways of determining death. The first is the irreversible cessation of circulatory and respiratory functions. The second method, brain death, can be declared for people who have sustained catastrophic brain injury causing the permanent cessation of all brain function, such as from a massive traumatic brain injury or massive stroke, but whose hearts are still pumping through the use of ventilators or other artificial forms of life support. © 2024 npr

Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 29147 - Posted: 02.13.2024

By Fletcher Reveley One afternoon in May 2020, Jerry Tang, a Ph.D. student in computer science at the University of Texas at Austin, sat staring at a cryptic string of words scrawled across his computer screen: “I am not finished yet to start my career at twenty without having gotten my license I never have to pull out and run back to my parents to take me home.” The sentence was jumbled and agrammatical. But to Tang, it represented a remarkable feat: A computer pulling a thought, however disjointed, from a person’s mind. For weeks, ever since the pandemic had shuttered his university and forced his lab work online, Tang had been at home tweaking a semantic decoder — a brain-computer interface, or BCI, that generates text from brain scans. Prior to the university’s closure, study participants had been providing data to train the decoder for months, listening to hours of storytelling podcasts while a functional magnetic resonance imaging (fMRI) machine logged their brain responses. Then, the participants had listened to a new story — one that had not been used to train the algorithm — and those fMRI scans were fed into the decoder, which used GPT-1, a predecessor to the ubiquitous AI chatbot ChatGPT, to spit out a text prediction of what it thought the participant had heard. For this snippet, Tang compared it to the original story: “Although I’m twenty-three years old I don’t have my driver’s license yet and I just jumped out right when I needed to and she says well why don’t you come back to my house and I’ll give you a ride.” The decoder was not only capturing the gist of the original, but also producing exact matches of specific words — twenty, license. When Tang shared the results with his adviser, a UT Austin neuroscientist named Alexander Huth who had been working towards building such a decoder for nearly a decade, Huth was floored. “Holy shit,” Huth recalled saying. “This is actually working.”

Keyword: Brain imaging; Language
Link ID: 29073 - Posted: 01.03.2024

By Gary Stix This year was full of roiling debate and speculation about the prospect of machines with superhuman capabilities that might, sooner than expected, leave the human brain in the dust. Growing familiarity with ChatGPT and other so-called large language models (LLMs) dramatically expanded public awareness of artificial intelligence. In tandem, it raised the question of whether the human brain can keep up with the relentless pace of AI advances. The most benevolent answer posits that humans and machines need not be cutthroat competitors. Researchers found one example of potential cooperation by getting AI to probe the infinite complexity of the ancient game of Go—which, like chess, has seen a computer topple the highest-level human players. A study published in March showed how people might learn from machines with such superhuman skills. And understanding ChatGPT’s prodigious abilities offers some inkling as to why an equivalence between the deep neural networks that underlie the famed chatbot and the trillions of connections in the human brain is constantly invoked. Importantly, the machine learning incorporated into AI has not totally distracted mainstream neuroscience from avidly pursuing better insights into what has been called “the most complicated object in the known universe”: the brain. One of the grand challenges in science—understanding the nature of consciousness—received its due in June with the prominent showcasing of experiments that tested the validity of two competing theories, both of which purport to explain the underpinnings of the conscious self. The past 12 months provided lots of examples of impressive advances for you to store in your working memory. Now here’s a closer look at some of the standout mind and brain stories we covered in Scientific American in 2023. © 2023 SCIENTIFIC AMERICAN

Keyword: Brain imaging; Consciousness
Link ID: 29069 - Posted: 12.31.2023

By Esther Landhuis When Frank Lin was in junior high, his grandma started wearing hearing aids. During dinner conversations, she was often painfully silent, and communicating by phone was nearly impossible. As a kid, Lin imagined “what her life would be like if she wasn’t always struggling to communicate.” It was around that time that Lin became interested in otolaryngology, the study of the ears, nose, and throat. He would go on to study to be an ENT physician, which, he hoped, could equip him to help patients with similar age-related hardships. Those aspirations sharpened during his residency at Johns Hopkins University School of Medicine in the late 2000s. Administering hearing tests in the clinic, Lin noticed that his colleagues had vastly different reactions to the same results in young versus old patients. If mild deficits showed up in a kid, “it would be like, ‘Oh, that hearing is critically important,’” said Lin, who today is the director of the Cochlear Center for Hearing and Public Health at Hopkins. But when they saw that same mild to moderate hearing loss in a 70-something patient, many would downplay the findings. Yet today, research increasingly suggests that untreated hearing loss puts people at higher risk for cognitive decline and dementia. And, unlike during Lin’s early training, many patients can now do something about it: They can assess their own hearing using online tests or mobile phone apps, and purchase over-the-counter hearing aids, which are generally more affordable than their predecessors and came under regulation by the Food and Drug Administration in October 2022. Despite this expanded accessibility, interest in direct-to-consumer hearing devices has lagged thus far — in part, experts suggest, due to physician inattention to adult hearing health, inadequate insurance coverage for hearing aids, and lingering stigma around the issue. 
(As Lin put it: “There’s always been this notion that everyone has it as you get older, how can it be important?”) Even now, hearing tests aren’t necessarily recommended for individuals unless they report a problem.

Keyword: Hearing; Alzheimers
Link ID: 29064 - Posted: 12.27.2023

By Cathleen O’Grady Why do some children learn to talk earlier than others? Linguists have pointed to everything from socioeconomic status to gender to the number of languages their parents speak. But a new study finds a simpler explanation. An analysis of nearly 40,000 hours of audio recordings from children around the world suggests kids speak more when the adults around them are more talkative, which may also give them a larger vocabulary early in life. Factors such as social class appear to make no difference, researchers report this month in the Proceedings of the National Academy of Sciences. The paper is a “wonderful, impactful, and much needed contribution to the literature,” says Ece Demir-Lira, a developmental scientist at the University of Iowa who was not involved in the work. By looking at real-life language samples from six different continents, she says, the study provides a global view of language development sorely lacking from the literature. Most studies on language learning have focused on children in Western, industrialized nations. To build a more representative data set, Harvard University developmental psychologist Elika Bergelson and her collaborators scoured the literature for studies that had used LENA devices: small audio recorders that babies can wear—tucked into a pocket on a specially made vest—for days at a time. These devices function as a kind of “talk pedometer,” with an algorithm that estimates how much its wearer speaks, as well as how much language they hear in their environment—from parents, other adults, and even siblings. The team asked 18 research groups across 12 countries whether they would share their data from the devices, leaving them with a whopping 2865 days of recordings from 1001 children. 
Many of the kids, who ranged from 2 months to 4 years old, were from English-speaking families, but the data also included speakers of Dutch, Spanish, Vietnamese, and Finnish, as well as Yélî Dnye (Papua New Guinea), Wolof (Senegal), and Tsimané (Bolivia). Combining these smaller data sets gave the researchers a more powerful, diverse sample.

Keyword: Language; Development of the Brain
Link ID: 29061 - Posted: 12.22.2023

A new study shows male zebra finches must sing every day to keep their vocal muscles in shape. Females prefer the songs of males that did their daily vocal workout. ARI SHAPIRO, HOST: Why do songbirds sing so much? Well, a new study suggests they have to, to stay in shape. Here's NPR's Ari Daniel. ARI DANIEL, BYLINE: A few years ago, I was out at dawn in South Carolina low country, a mix of swamp and trees draped in Spanish moss. (SOUNDBITE OF BIRDS CHIRPING) DANIEL: The sound of birdsong filled the air. It's the same in lots of places. Once the light of day switches on, songbirds launch their serenade. IRIS ADAM: I mean, why birds sing is relatively well-answered. DANIEL: Iris Adam is a behavioral neuroscientist at the University of Southern Denmark. ADAM: For many songbirds, males sing to impress a female and attract them as mate. And also, birds sing to defend their territory. DANIEL: But Adam says these reasons don't explain why songbirds sing so darn much. ADAM: There's an insane drive to sing. DANIEL: For some, it's hours every day. That's a lot of energy. Plus, singing can be dangerous. ADAM: As soon as you sing, you reveal yourself - like, where you are, that you even exist, where your territory is. All of that immediately is out in the open for predators, for everybody. DANIEL: Why take that risk? Adam wondered whether the answer might lie in the muscles that produce birdsong and if those muscles require regular exercise. So she designed a series of experiments on zebra finches, little Australian songbirds with striped heads and a bloom of orange on their cheeks. One of Adam's first experiments involved taking males and severing the connection between their brains and their singing muscles. ADAM: Already after two days, they had lost some of their performance. And after three weeks, they were back to the same level when they were juveniles and never had sung before. 
DANIEL: Next, she left the finches intact but prevented them from singing for a week by keeping them in the dark almost around the clock. ADAM: The first two or three days, it's quite easy. But the longer the experiment goes, the more they are like, I need to sing. And so then you need to tell them, like, stop. You can't sing. DANIEL: After a week, the birds' singing muscles lost half their strength. But does that impact what the resulting song sounds like? Here's a male before the seven days of darkness. © 2023 npr

Keyword: Animal Communication; Language
Link ID: 29042 - Posted: 12.13.2023

By Carl Zimmer Traumatic brain injuries have left more than five million Americans permanently disabled. They have trouble focusing on even simple tasks and often have to quit jobs or drop out of school. A study published on Monday has offered them a glimpse of hope. Five people with moderate to severe brain injuries had electrodes implanted in their heads. As the electrodes stimulated their brains, their performance on cognitive tests improved. If the results hold up in larger clinical trials, the implants could become the first effective therapy for chronic brain injuries, the researchers said. “This is the first evidence that you can move the dial for this problem,” said Dr. Nicholas Schiff, a neurologist at Weill Cornell Medicine in New York who led the study. Gina Arata, one of the volunteers who received the implant, was 22 when a car crash left her with fatigue, memory problems and uncontrollable emotions. She abandoned her plans for law school and lived with her parents in Modesto, Calif., unable to hold down a job. In 2018, 18 years after the crash, Ms. Arata received the implant. Her life has changed profoundly, she said. “I can be a normal human being and have a conversation,” she said. “It’s kind of amazing how I’ve seen myself improve.” Dr. Schiff and his colleagues designed the trial based on years of research on the structure of the brain. Those studies suggested that our ability to focus on tasks depends on a network of brain regions that are linked to each other by long branches of neurons. The regions send signals to each other, creating a feedback loop that keeps the whole network active. Sudden jostling of the brain — in a car crash or a fall, for example — can break some of the long-distance connections in the network and lead people to fall into a coma, Dr. Schiff and his colleagues have hypothesized. During recovery, the network may be able to power itself back up. But if the brain is severely damaged, it may not fully rebound. Dr. Schiff and his colleagues pinpointed a structure deep inside the brain as a crucial hub in the network. Known as the central lateral nucleus, it is a thin sheet of neurons about the size and shape of an almond shell. © 2023 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 29033 - Posted: 12.06.2023

By Anil Oza Researchers have long known that areas of songbird brains that are responsible for singing grow during mating season and then shrink when the season is over. But one species, Gambel’s white-crowned sparrow (Zonotrichia leucophrys gambelii), does this on a scale that scientists are struggling to understand. A part of the male sparrow’s brain called the HVC grows from around 100,000 neurons to about 170,000, an increase of roughly 70 percent, during the bird’s mating season. Although how the bird pulls off this feat is still a mystery, scientists who presented data at the annual Society for Neuroscience meeting in Washington DC on 11–15 November are closing in on answers. They hope their findings might one day point to ways of treating anomalies in the human brain. In most animals, when a brain region grows and shrinks, “frequently, it’s pretty detrimental on behaviour and function of the brain”, says Tracy Larson, a neuroscientist at the University of Virginia in Charlottesville who led the work. In particular, growth on this scale in mammals would cause inflammation and increase the pressure inside their skulls. But when it comes to the sparrows, “there’s something really fascinating about these birds that they can manage to do this and not have detrimental impacts”, Larson adds. Larson’s research has so far hinted that the sparrow’s brain is using a slew of tactics to quickly form and then kill a large number of neurons. One question that Larson wanted to answer is how the sparrow’s brain shrinks dramatically at the end of mating season. So she and her colleagues tagged cells in and around the HVCs of male sparrows with a molecule called bromodeoxyuridine (BrdU), which can become incorporated into the DNA of dividing cells. They also used hormone supplements to simulate breeding season in the birds. © 2023 Springer Nature Limited

Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 29029 - Posted: 12.02.2023

By Claudia López Lloreda Genetic tweaks in kingfishers might help cushion the blow when the diving birds plunge beak first into the water to catch fish. Analysis of the genetic instruction book of some diving kingfishers identified changes in genes related to brain function as well as retina and blood vessel development, which might protect against damage during dives, researchers report October 24 in Communications Biology. The results suggest the different species of diving kingfishers may have adapted to survive their dives unscathed in some of the same ways, but it’s still unclear how the genetic changes protect the birds. Hitting speeds of up to 40 kilometers per hour, kingfisher dives put huge amounts of potentially damaging pressure on the birds’ heads, beaks and brains. The birds dive repeatedly, smacking their heads into the water in ways that could cause concussions in humans, says Shannon Hackett, an evolutionary biologist and curator at the Field Museum in Chicago. “So there has to be something that protects them from the terrible consequences of repeatedly hitting their heads against a hard substrate.” Hackett first became interested in how the birds protect their brains while she worked with her son’s hockey team and started worrying about the effect of repeated hits on the human brain. Around the same time, evolutionary biologist Chad Eliason joined the museum to study kingfishers and their plunge diving behavior. In the new study, Hackett, Eliason and colleagues analyzed the complete genome of 30 kingfisher species, some that plunge dive and others that don’t, from specimens frozen and stored at the museum. The preserved birds came from all over the world; some of the diving species came from mainland areas and others from islands and had evolved to dive independently rather than from the same plunge-diving ancestor. The team wanted to know if the different diving species had evolved similar genetic changes to arrive at the same behaviors. 
Many kingfisher species have developed this behavior, but it was unclear whether it arose through genetic convergence, similar to how many bird species have independently lost the ability to fly, or how bats and dolphins independently evolved echolocation (SN: 9/6/2013). © Society for Science & the Public 2000–2023.

Keyword: Brain Injury/Concussion; Evolution
Link ID: 28991 - Posted: 11.08.2023