Most Recent Links



Links 2201 - 2220 of 29565

By Katherine Ellison Jessica McCabe crashed and burned at 30, when she got divorced, dropped out of community college and moved in with her mother. Eric Tivers had 21 jobs before age 21. Both have been diagnosed with attention-deficit/hyperactivity disorder, and both today are entrepreneurs who wear their diagnoses — and rare resilience — on their sleeves. With YouTube videos, podcasts and tweets, they’ve built online communities aimed at ending the shame that so often makes having ADHD so much harder. Now they’re going even further, asking: Why not demand more than mere compassion? Why not seek deeper changes to create a more ADHD-friendly world? “I’ve spent the last five or six years trying to understand how my brain works so that I could conform, but now I’m starting to evolve,” says McCabe, 38, whose chipper, NASCAR-speed delivery has garnered 742,000 subscribers — and counting — to her YouTube channel, “How to ADHD.” “I think we no longer have to accept that we live in a world that is not built for our brains.” With Tivers, she is planning a virtual summit on the topic for next May. As a first step, with the help of Canadian cognitive scientist Deirdre Kelly, she says she’ll soon release new guidelines to assess products and services for their ADHD friendliness. Computer programs that help restless users meditate and a chair that accommodates a variety of seated positions are high on the list to promote, while error-prone apps or devices will be flagged. Kelly also envisions redesigning refrigerator vegetable drawers, so that the most nutritious food will no longer be out of sight and mind. In the past two decades, the world has become much kinder to the estimated 6.1 million children and approximately 10 million adults with ADHD, whose hallmark symptoms are distraction, forgetfulness and impulsivity. Social media has made all the difference.

Keyword: ADHD
Link ID: 27960 - Posted: 08.25.2021

By Paula Span Learning your odds of eventually developing dementia — a pressing concern for many, especially those with a family history of it — requires medical testing and counseling. But what if everyday behavior, like overlooking a couple of credit card payments or braking abruptly while driving, could foretell your risk? A spate of experiments is underway to explore that possibility, reflecting the growing awareness that the pathologies underlying dementia can begin years or even decades before symptoms emerge. “Early detection is key for intervention, at the stage when that would be most effective,” said Sayeh Bayat, the lead author of a driving study funded by the National Institutes of Health and conducted at Washington University in St. Louis. Such efforts could help identify potential volunteers for clinical trials, researchers say, and help protect older people against financial abuse and other dangers. In recent years, many once-promising dementia drugs, particularly for Alzheimer’s disease, have failed in trials. One possible reason, researchers say, is that the drugs are administered too late to be helpful. Identifying risks earlier, when the brain has sustained less damage, could create a pool of potential participants with “preclinical” Alzheimer’s disease, who could then test preventive measures or treatments. It could also bring improvements in daily life. “We could support people’s ability to drive longer, and have safer streets for everyone,” Ms. Bayat offered as an example. © 2021 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 27959 - Posted: 08.25.2021

By Carolyn Wilke Frog and toad pupils come in quite the array, from slits to circles. But overall, there are seven main shapes of these animals’ peepholes, researchers report in the Aug. 25 Proceedings of the Royal Society B. Eyes are “among the most charismatic features of frogs and toads,” says herpetologist Julián Faivovich of the Museo Argentino de Ciencias Naturales “Bernardino Rivadavia” in Buenos Aires. People have long marveled at the animals’ many iris colors and pupil shapes. Yet “there’s almost nothing known about the anatomical basis of that diversity.” Faivovich and colleagues catalogued pupil shapes from photos of 3,261 species, representing 44 percent of known frogs and toads. The team identified seven main shapes: vertical slits, horizontal slits, diamonds, circles, triangles, fans and inverted fans. The most common shape, horizontal slits, appeared in 78 percent of studied species. Mapping pupil shapes onto a tree of evolutionary relationships allowed the scientists to infer how these seven shapes emerged. Though uncommon in other vertebrates, horizontal pupils seem to have given rise to most of the other shapes in frogs and toads. Altogether, these seven shapes have evolved at least 116 times, the researchers say. Pupil shape affects the amount of light that reaches the retina and its light-receiving cells, says Nadia Cervino, a herpetologist also at the Argentine museum. But how the shape influences what animals actually see isn’t well-known. © Society for Science & the Public 2000–2021.

Keyword: Vision; Evolution
Link ID: 27958 - Posted: 08.25.2021

By Jonathan Lambert At least 65 million years of evolution separate humans and greater sac-winged bats, but these two mammals share a key feature of learning how to speak: babbling. Just as human infants babble their way from “da-da-da-da” to “Dad,” wild bat pups (Saccopteryx bilineata) learn the mating and territorial songs of adults by first babbling out the fundamental syllables of the vocalizations, researchers report in the Aug. 20 Science. These bats now join humans as the only clear examples of mammals who learn to make complex vocalizations through babbling. “This is a hugely important step forward in the study of vocal learning,” says Tecumseh Fitch, an evolutionary biologist at the University of Vienna not involved in the new study. “These findings suggest that there are deep parallels between how humans and young bats learn to control their vocal apparatus,” he says. The work could enable future studies that might allow researchers to peer deeper into the brain activity that underpins vocal learning. Before complex vocalizations, whether words or mating songs, can be spoken or sung, vocalizers must learn to articulate the syllables that make up a species’s vocabulary, says Ahana Fernandez, an animal behavior biologist at the Museum für Naturkunde in Berlin. “Babbling is a way of practicing,” and honing those vocalizations, she says. The rhythmic, repetitive “ba-ba-ba’s” and “ga-ga-ga’s” of human infants may sound like gibberish, but they are necessary exploratory steps toward learning how to talk. Seeing whether babbling is required for any animal that learns complex vocalizations necessitates looking in other species. © Society for Science & the Public 2000–2021.

Keyword: Language; Hearing
Link ID: 27957 - Posted: 08.21.2021

By Priyanka Runwal Brain tissue is innately squishy. Unlike bones, shells or teeth, it is rich in fat and rots quickly, seldom making an appearance in the fossil record. So when Russell Bicknell, an invertebrate paleontologist at the University of New England in Australia, noticed a pop of white near the front of a fossilized horseshoe crab body where the animal’s brain would have been, he was surprised. A closer look revealed an exceptional imprint of the brain along with other bits of the creature’s nervous system. Unearthed from the Mazon Creek deposit in northeastern Illinois, and dating back 310 million years, it’s the first fossilized horseshoe crab brain ever found. Dr. Bicknell and his colleagues reported the find last month in the journal Geology. “These kinds of fossils are so rare that if you happen to stumble upon one, you’d generally be in shock,” he said. “We’re talking a needle-in-a-haystack level of wow.” The find helps fill a gap in the evolution of arthropod brains and also shows how little they have changed over hundreds of millions of years. Soft-tissue preservation requires special conditions. Scientists have found brains encased in fossilized tree resin, better known as amber, that were less than 66 million years old. They have also found brains preserved as flattened carbon films, sometimes replaced or overlaid by minerals in shale deposits that are more than 500 million years old. Such deposits include corpses of ocean-dwelling arthropods that sank to the seafloor, were rapidly buried in mud and remained shielded from immediate decay in the low-oxygen environment. However, the fossilized brain of Euproops danae, which is kept in a collection at the Yale Peabody Museum of Natural History, required a different set of conditions to be preserved. © 2021 The New York Times Company

Keyword: Evolution
Link ID: 27956 - Posted: 08.21.2021

By Christiane Gelitz and Maddie Bender To a chef, the sounds of lip smacking, slurping and swallowing are the highest form of flattery. But to someone with a certain type of misophonia, these same sounds can be torturous. Brain scans are now helping scientists start to understand why. People with misophonia experience strong discomfort, annoyance or disgust when they hear particular triggers. These can include chewing, swallowing, slurping, throat clearing, coughing and even audible breathing. Researchers previously thought this reaction might be caused by the brain overactively processing certain sounds. Now, however, a new study published in the Journal of Neuroscience has linked some forms of misophonia to heightened “mirroring” behavior in the brain: those affected feel distress while their brains act as if they are mimicking the triggering mouth movements. “This is the first breakthrough in misophonia research in 25 years,” says psychologist Jennifer J. Brout, who directs the International Misophonia Research Network and was not involved in the new study. The research team, led by Newcastle University neuroscientist Sukhbinder Kumar, analyzed brain activity in people with and without misophonia when they were at rest and while they listened to sounds. These included misophonia triggers (such as chewing), generally unpleasant sounds (like a crying baby), and neutral sounds. The brain's auditory cortex, which processes sound, reacted similarly in subjects with and without misophonia. But in both the resting state and listening trials, people with misophonia showed stronger connections between the auditory cortex and brain regions that control movements of the face, mouth and throat. Kumar found this connection became most active in participants with misophonia when they heard triggers specific to the condition. © 2021 Scientific American,

Keyword: Hearing; Attention
Link ID: 27955 - Posted: 08.21.2021

by Peter Hess Children born to mothers who take antipsychotic medications during pregnancy do not have elevated odds of autism or attention deficit hyperactivity disorder (ADHD), nor are they more likely to be born preterm or underweight, according to a study released this past Monday in JAMA Internal Medicine. Some women with schizophrenia, Tourette syndrome or bipolar disorder take antipsychotic drugs, such as aripiprazole, haloperidol or risperidone. Clinicians have long debated whether women should discontinue these medications during pregnancy out of concern for the drugs’ effects on the developing fetus. But children born to mothers who take antipsychotics during pregnancy and to those who do not take them have similar outcomes, the new work shows. “Our findings do not support a recommendation for women to discontinue their regular antipsychotic treatment during pregnancy,” says senior investigator Kenneth Man, research fellow at the University College London School of Pharmacy in the United Kingdom. Prescribing antipsychotics during pregnancy can help prevent potentially dangerous psychotic episodes and ensure that an expectant mother can take care of herself, says Mady Hornig, associate professor of epidemiology at Columbia University, who was not involved in the study. “We certainly don’t want to be cavalier about the use of any medication during pregnancy, but one also wants to balance out the implications of not treating.” © 2021 Simons Foundation

Keyword: Schizophrenia; Development of the Brain
Link ID: 27954 - Posted: 08.21.2021

By Jillian Kramer Mice are at their best at night. But a new analysis suggests researchers often test the nocturnal creatures during the day—which could alter results and create variability across studies—if they record time-of-day information at all. Of the 200 papers examined in the new study, more than half either failed to report the timing of behavioral testing or did so ambiguously. Only 20 percent reported nighttime testing. The analysis was published in Neuroscience & Biobehavioral Reviews. West Virginia University neuroscientist Randy Nelson, the study's lead author, says this is likely a matter of human convenience. “It is easier to get students and techs to work during the day than [at] night,” Nelson says. But that convenience comes at a cost. “Time of day not only impacts the intensity of many variables, including locomotor activity, aggressive behavior, and plasma hormone levels,” but changes in those variables can only be observed during certain parts of the diurnal cycle, says University of Wyoming behavioral neuroscientist William D. Todd. This means that “failing to report time of day of data collection and tests makes interpretation of results extremely difficult,” adds Beth Israel Deaconess Medical Center staff scientist Natalia Machado. Neither Todd nor Machado was involved in the new study. The study researchers say it is critical that scientists report the timing of their work and consider the fact that animals' behavioral and physiological responses can vary with the hour. As a first step, Nelson says, “taking care of time-of-day considerations seems like low-hanging fruit in terms of increasing behavioral neuroscience research reliability, reproducibility and rigor.” © 2021 Scientific American

Keyword: Biological Rhythms
Link ID: 27953 - Posted: 08.21.2021

By John Horgan In my 20s, I had a friend who was brilliant, charming, Ivy-educated and rich, heir to a family fortune. I’ll call him Gallagher. He could do anything he wanted. He experimented, dabbling in neuroscience, law, philosophy and other fields. But he was so critical, so picky, that he never settled on a career. Nothing was good enough for him. He never found love for the same reason. He also disparaged his friends’ choices, so much so that he alienated us. He ended up bitter and alone. At least that’s my guess. I haven’t spoken to Gallagher in decades. There is such a thing as being too picky, especially when it comes to things like work, love and nourishment (even the pickiest eater has to eat something). That’s the lesson I gleaned from Gallagher. But when it comes to answers to big mysteries, most of us aren’t picky enough. We settle on answers for bad reasons, for example, because our parents, priests or professors believe it. We think we need to believe something, but actually we don’t. We can, and should, decide that no answers are good enough. We should be agnostics. Some people confuse agnosticism (not knowing) with apathy (not caring). Take Francis Collins, a geneticist who directs the National Institutes of Health. He is a devout Christian, who believes that Jesus performed miracles, died for our sins and rose from the dead. In his 2006 bestseller The Language of God, Collins calls agnosticism a “cop-out.” When I interviewed him, I told him I am an agnostic and objected to “cop-out.” © 2021 Scientific American

Keyword: Consciousness
Link ID: 27952 - Posted: 08.18.2021

Natalie Grover Cuttlefish have one of the largest brains among invertebrates and can remember what, where, and when specific things happened right up to their final days of life, according to new research. The cephalopods – which have three hearts, eight arms, blue-green blood, regenerating limbs, and the ability to camouflage and exert self-control – only live for roughly two years. As they get older, they show signs of declining muscle function and appetite, but it appears that no matter their age they can remember what they ate, where and when, and use this to guide their future feeding decisions, said the lead study author, Dr Alexandra Schnell from the University of Cambridge. This is in contrast to humans, who with age gradually lose the ability to remember experiences that occurred at a particular time and place – for instance, what you ate for lunch last Wednesday. This “episodic memory” and its deterioration is linked to the hippocampus, a seahorse-shaped structure in the part of the brain near our ears. Cuttlefish, meanwhile, do not have a hippocampus, but a “vertical lobe” associated with learning and memory. In the study, Schnell and her colleagues conducted memory tests in 24 cuttlefish. Half were 10-12 months old (not quite adults) while the rest were 22-24 months old (the equivalent of a human in their 90s), according to the paper, published in the journal Proceedings of the Royal Society B. In one experiment, both groups of cuttlefish were first trained to approach a specific location in their tank, marked with a flag, and learn that two different foods would be provided at different times. At one spot, the flag was waved and the less-preferred king prawn was provided every hour. Grass shrimp, which they like more, was provided at a different spot where another flag was waved – but only every three hours. This was done for about four weeks, until they learned that waiting for longer meant that they could get their preferred food.
© 2021 Guardian News & Media Limited

Keyword: Learning & Memory; Evolution
Link ID: 27951 - Posted: 08.18.2021

Ruth Williams In the days before a newborn mouse opens its peepers, nerve impulses that have been sweeping randomly across the retina since birth start flowing consistently in one direction, according to a paper published in Science today (July 22). This specific pattern has a critical purpose, the authors say, helping to establish the brain circuitry to be used later in motion detection. “I love this paper. It blew my mind,” says David Berson, who studies the visual system at Brown University and was not involved in the research. “What it implies is that evolution has built a visual system that can simulate the patterns of activity that it will see later when it’s fully mature and the eyes are open, and that [the simulated pattern] in turn shapes the development of the nervous system in a way that makes it better adapted to seeing those patterns. . . . That’s staggering.” The thread of this concept may be looped, but to unravel it, Berson says, it helps to think of the mammalian visual system, or really any neuronal circuitry, as being formed by a combination of evolution and life experiences—in short, nature and nurture. We might expect that life’s visual experiences, the nurture part, would begin when the eyes open. But, much like a human baby in the womb practices breathing and sucking without ever having experienced air or breastfeeding, the eyes of newborn mice appear to practice seeing before they can actually see. Motion detection is important enough to mouse survival that evolution has selected for gene variants that set up this prevision training, says Berson. © 1986–2021 The Scientist.

Keyword: Vision; Development of the Brain
Link ID: 27950 - Posted: 08.18.2021

By Gina Kolata Everyone knows conventional wisdom about metabolism: People put pounds on year after year from their 20s onward because their metabolisms slow down, especially around middle age. Women have slower metabolisms than men. That’s why they have a harder time controlling their weight. Menopause only makes things worse, slowing women’s metabolisms even more. All wrong, according to a paper published Thursday in Science. Using data from nearly 6,500 people, ranging in age from 8 days to 95 years, researchers discovered that there are four distinct periods of life, as far as metabolism goes. They also found that there are no real differences between the metabolic rates of men and women after controlling for other factors. The findings from the research are likely to reshape the science of human physiology and could also have implications for some medical practices, like determining appropriate drug doses for children and older people. “It will be in textbooks,” predicted Leanne Redman, an energy balance physiologist at Pennington Biomedical Research Center in Baton Rouge, La., who also called it “a pivotal paper.” Rozalyn Anderson, a professor of medicine at the University of Wisconsin-Madison, who studies aging, wrote a perspective accompanying the paper. In an interview, she said she was “blown away” by its findings. “We will have to revise some of our ideas,” she added. But the findings’ implications for public health, diet and nutrition are limited for the moment because the study gives “a 30,000-foot view of energy metabolism,” said Dr. Samuel Klein, who was not involved in the study and is director of the Center for Human Nutrition at the Washington University School of Medicine in St. Louis. He added, “I don’t think you can make any new clinical statements” for an individual. When it comes to weight gain, he says, the issue is the same as it has always been: People are eating more calories than they are burning.
Metabolic research is expensive, and so most published studies have had very few participants. But the new study’s principal investigator, Herman Pontzer, an evolutionary anthropologist at Duke University, said that the project’s participating researchers agreed to share their data. There are more than 80 co-authors on the study. By combining efforts from a half dozen labs collected over 40 years, they had sufficient information to ask general questions about changes in metabolism over a lifetime. © 2021 The New York Times Company

Keyword: Obesity
Link ID: 27949 - Posted: 08.14.2021

By Teresa Carr In the fall of 2016, sex therapist and researcher Leonore Tiefer shuttered the New View Campaign, an organization she had founded to combat what she refers to as “the medicalization of sex” — essentially, the pharmaceutical industry’s efforts to define variations in sexuality and sexual problems as medical issues requiring a drug fix. For 16 years, the group had fought against industry’s involvement in sex research, including its push for a drug to boost women’s sex drives. New View hosted conferences and its members penned papers and testified before the United States Food and Drug Administration. The campaign was prominently featured in an 80-minute documentary called Orgasm Inc, and promoted a clever (if off-pitch) video advising women to “throw that pink pill away,” a reference to the female-libido drug flibanserin (Addyi), which was seeking FDA approval at the time. New View counted some successes: The FDA didn’t approve an allegedly libido-boosting testosterone patch for women, on the grounds that the patch’s slim benefits didn’t outweigh its risks, and the FDA twice rejected flibanserin for the same reason. But in August 2015, the agency reversed itself and approved the so-called pink Viagra. “I felt we’d said everything we had to say,” said Tiefer of ending the campaign. Advocates predicted FDA approval would be sought for additional women’s libido drugs, but the group felt there was nothing they could do to stop it. “However many more drugs were going to come down the pike,” said Tiefer, “it was just going to be more of the same.”

Keyword: Sexual Behavior
Link ID: 27948 - Posted: 08.14.2021

By Virginia Hughes In the 1960s, the drug was given to women during childbirth to dampen their consciousness. In the 1990s, an illicit version made headlines as a “date rape” drug, linked to dozens of deaths and sexual assaults. And for the last two decades, a pharmaceutical-grade slurry of gamma-hydroxybutyrate, or GHB, has been tightly regulated as a treatment for narcolepsy, a disorder known for its sudden sleep attacks. Now, the Food and Drug Administration has approved the drug for a new use: treating “idiopathic hypersomnia,” a mysterious condition in which people sleep nine or more hours a day, yet never feel rested. Branded as Xywav, the medication is thought to work by giving some patients restorative sleep at night, allowing their brains to be more alert when they wake up. It is the first approved treatment for the illness. But some experts say the publicly available evidence to support the new approval is weak. And they worry about the dangers of the medication, which acts so swiftly that its label advises users to take it while in bed. Xywav and an older, high-salt version called Xyrem have a host of serious side effects, including breathing problems, anxiety, depression, sleepwalking, hallucinations and suicidal thoughts. GHB “has serious safety concerns, both in terms of its abuse liability and its addictive potential,” said Dr. Lewis S. Nelson, the director of medical toxicology at Rutgers New Jersey Medical School. An estimated 40,000 people in the United States have been diagnosed with idiopathic hypersomnia, but Dr. Nelson said that many more people with daytime drowsiness might wind up with this diagnosis now that it has an F.D.A.-approved treatment. The disorder’s hallmark symptoms — sleep cravings, long naps and brain fog — overlap with many other conditions. The more people who take the drug, the more opportunity for abuse. “The potential for the scope of use to expand is very real,” Dr. Nelson said. 
“So that is concerning to me.” © 2021 The New York Times Company

Keyword: Sleep; Drug Abuse
Link ID: 27947 - Posted: 08.14.2021

By Katherine Ellison ADHD — the most common psychiatric disorder of childhood — lasts longer for more people than has been widely assumed, according to new research. “Only 10 percent of people really appear to grow out of ADHD,” says the lead author, psychologist Margaret Sibley, associate professor of psychiatry and behavioral sciences at the University of Washington School of Medicine. “Ninety percent still struggle with at least mild symptoms as adults — even if they have periods when they are symptom free.” The study challenges a widely persistent perception of a time-limited condition occurring mostly in childhood. Indeed, one of the earliest names for attention-deficit/hyperactivity disorder was “a hyperkinetic disease of infancy,” while its most common poster child has long been a young, White, disruptive male. Previous research has suggested the condition essentially vanishes in about half of those who receive diagnoses. But in recent years, increasing numbers of women, people of color and especially adults have been seeking help in managing the hallmark symptoms of distraction, forgetfulness and impulsivity. By the most recent estimates, 9.6 percent of children ages 3 to 17 have been diagnosed with ADHD. Yet researchers report that only 4.4 percent of young adults ages 18 to 44 have the disorder, suggesting that if the new estimates are valid, there may be some catching up to do. Sibley’s paper paints a picture of an on-again, off-again condition, with symptoms fluctuating depending on life circumstances. © 1996-2021 The Washington Post

Keyword: ADHD
Link ID: 27946 - Posted: 08.14.2021

By Cara Giaimo Giraffes seem above it all. They float over the savanna like two-story ascetics, peering down at the fray from behind those long lashes. For decades, many biologists thought giraffes extended this treatment to their peers as well, with one popular wildlife guide calling them “aloof” and capable of only “the most casual” associations. But more recently, as experts have paid closer attention to these lanky icons, a different social picture has begun to emerge. Female giraffes are now known to enjoy yearslong bonds. They have lunch buddies, stand guard over dead calves and stay close with their mothers and grandmothers. Females even form shared day care-like arrangements, called crèches, in which they take turns babysitting and feeding each other’s young. Observations like these have reached a critical mass, said Zoe Muller, a wildlife biologist who completed her Ph.D. at the University of Bristol in England. She and Stephen Harris, also at Bristol, recently reviewed hundreds of giraffe studies to look for broader patterns. Their analysis, published on Tuesday in the journal Mammal Review, suggests that giraffes are not loners, but socially complex creatures, akin to elephants or chimpanzees. They’re just a little more subtle about it. Dr. Muller’s sense of giraffes as secret socialites began in 2005, when she was researching her master’s thesis in Laikipia, Kenya. There to collect data on antelopes, she found herself drawn to the ganglier ungulates. “They are so weird to look at,” she said. “If somebody described them to you, you wouldn’t believe they even really existed.” After noticing that the same giraffes tended to spend time together — they looked “like teenagers hanging out,” she said — Dr. Muller started to read up on their lifestyles.
“I was really surprised to see that all the scientific books said that they were completely non-sociable,” she said. “I thought, ‘Well, hang on. That’s not what I see at all.’” © 2021 The New York Times Company

Keyword: Evolution; Emotions
Link ID: 27945 - Posted: 08.11.2021

Jordana Cepelewicz An understanding of numbers is often viewed as a distinctly human faculty — a hallmark of our intelligence that, along with language, sets us apart from all other animals. But that couldn’t be further from the truth. Honeybees count landmarks when navigating toward sources of nectar. Lionesses tally the number of roars they hear from an intruding pride before deciding whether to attack or retreat. Some ants keep track of their steps; some spiders keep track of how many prey are caught in their web. One species of frog bases its entire mating ritual on number: If a male calls out — a whining pew followed by a brief pulsing note called a chuck — his rival responds by placing two chucks at the end of his own call. The first frog then responds with three, the other with four, and so on up to around six, when they run out of breath. Practically every animal that scientists have studied — insects and cephalopods, amphibians and reptiles, birds and mammals — can distinguish between different numbers of objects in a set or sounds in a sequence. They don’t just have a sense of “greater than” or “less than,” but an approximate sense of quantity: that two is distinct from three, that 15 is distinct from 20. This mental representation of set size, called numerosity, seems to be “a general ability,” and an ancient one, said Giorgio Vallortigara, a neuroscientist at the University of Trento in Italy. Now, researchers are uncovering increasingly complex numerical abilities in their animal subjects. Many species have displayed a capacity for abstraction that extends to performing simple arithmetic, while a select few have even demonstrated a grasp of the quantitative concept of “zero” — an idea so paradoxical that very young children sometimes struggle with it. All Rights Reserved © 2021

Keyword: Intelligence; Evolution
Link ID: 27944 - Posted: 08.11.2021

Max G. Levy Agony is contagious. If you drop a thick textbook on your toes, circuits in your brain’s pain center come alive. If you pick it up and accidentally drop it on my toes, hurting me, an overlapping neural neighborhood will light up in your brain again. “There's a physiological mechanism for emotional contagion of negative responses like stress and pain and fear,” says Inbal Ben-Ami Bartal, a neuroscientist at Tel-Aviv University in Israel. That's empathy. Researchers debate to this day whether empathy is a uniquely human ability. But more scientists are finding evidence suggesting it exists widely, particularly in social mammals like rats. For the past decade, Bartal has studied whether—and why—lab rodents might act on that commiseration to help pals in need. Picture two rats in a cage. One roams freely, while the other is constrained in a vented plexiglass tunnel with a small door that only opens from the outside. Bartal, along with teams at UC Berkeley and the University of Chicago, has shown that the free rat may feel their trapped fellow’s distress and learn to open the door. This empathic pull is so strong that rats will rescue their roommates instead of feasting on piles of chocolate chips. (Disclosure: I have three pet rats. My sources confirm that chocolate chips are borderline irresistible.) But there's been a catch: Bartal’s experiments over the years have shown that rats only help others they perceive as members of their social group—specific pals or entire genetic strains they recognize. So does this mean they can't empathize with strangers? In new results appearing in the journal eLife in July, Bartal and her adviser from Berkeley, Daniela Kaufer, uncovered a surprise. Rats do show the neural signatures of empathy for trapped strangers, but that alone isn’t enough to make them help. 
While seeing a trapped stranger lights up parts of the brain associated with empathy, only seeing a familiar rat — a known individual or a member of a familiar strain — elicits a rush of activity in the brain’s so-called reward center, the nucleus accumbens. Only those rats get rescued. © 2021 Condé Nast

Keyword: Emotions; Evolution
Link ID: 27943 - Posted: 08.11.2021

Lydia Denworth Lee Reeves always wanted to be a veterinarian. When he was in high school in the Washington, D.C., suburbs, he went to an animal hospital near his house on a busy Saturday morning to apply for a job. The receptionist said the doctor was too busy to talk. But Reeves was determined and waited. Three and a half hours later, after all the dogs and cats had been seen, the veterinarian emerged and asked Reeves what he could do for him. Reeves, who has stuttered since he was three years old, had trouble answering. “I somehow struggled out the fact that I wanted the job and he asked me what my name was,” he says. “I couldn’t get my name out to save my life.” The vet finally reached for a piece of paper and had Reeves write down his name and add his phone number, but he said there was no job available. “I remember walking out of that clinic that morning thinking that essentially my life was over,” Reeves says. “Not only was I never going to become a veterinarian, but I couldn’t even get a job cleaning cages.” More than 50 years have passed. Reeves, who is now 72, has gone on to become an effective national advocate for people with speech impairments, but the frustration and embarrassment of that day are still vivid. They are also emblematic of the complicated experience that is stuttering. Technically, stuttering is a disruption in the easy flow of speech, but the physical struggle and the emotional effects that often go with it have led observers to wrongly attribute the condition to defects of the tongue or voice box, problems with cognition, emotional trauma or nervousness, forcing left-handed children to become right-handed, and, most unfortunately, poor parenting. Freudian psychiatrists thought stuttering represented “oral-sadistic conflict,” whereas behaviorists argued that labeling a child a stutterer would exacerbate the problem. Reeves’s parents were told to call no attention to his stutter — wait it out, and it would go away.
© 2021 Scientific American

Keyword: Language
Link ID: 27942 - Posted: 08.11.2021

Nicola Davis Science correspondent It’s been used to detect eye diseases, make medical diagnoses, and spot early signs of oesophageal cancer. Now researchers have claimed artificial intelligence may be able to diagnose dementia from a single brain scan, and they are starting a trial to test the approach. The team behind the AI tool say they hope it will lead to earlier diagnoses, which could improve outcomes for patients, and that it may also help to shed light on patients’ prognoses. Dr Timothy Rittman, a senior clinical research associate and consultant neurologist at the University of Cambridge, who is leading the study, told the BBC the AI system is a “fantastic development”. “These set of diseases are really devastating for people,” he said. “So when I am delivering this information to a patient, anything I can do to be more confident about the diagnosis, to give them more information about the likely progression of the disease to help them plan their lives is a great thing to be able to do.” In the first year of the trial, the AI system, which uses algorithms to detect patterns in brain scans, is expected to be tested in a “real-world” clinical setting on about 500 patients at Addenbrooke’s hospital in Cambridge and other memory clinics across the country. “If we intervene early, the treatments can kick in early and slow down the progression of the disease and at the same time avoid more damage,” Prof Zoe Kourtzi, of Cambridge University and a fellow of the Alan Turing Institute, the national centre for AI and data science, told the BBC. “And it’s likely that symptoms occur much later in life or may never occur.” © 2021 Guardian News & Media Limited

Keyword: Alzheimers; Development of the Brain
Link ID: 27941 - Posted: 08.11.2021