Links for Keyword: Attention


By Olivia Gieger Three pioneers in face-perception research have won the 2024 Kavli Prize in Neuroscience. Nancy Kanwisher, professor of cognitive neuroscience at the Massachusetts Institute of Technology; Winrich Freiwald, professor of neurosciences and behavior at Rockefeller University; and Doris Tsao, professor of neurobiology at the University of California, Berkeley, will share the $1 million Kavli Prize for their discoveries of the regions—in both the human and monkey brains—responsible for identifying and recognizing faces. “This is work that’s very classic and very elegant, not only in face-processing and face-recognition work, but the impact it’s had on how we think about brain organization in general is huge,” says Alexander Cohen, assistant professor of neurology at Harvard Medical School, who studies face recognition in autistic people. The Norwegian Academy of Science and Letters awards the prize every two years. Kanwisher says she long suspected that something special happens in the brain when we look at faces, because people with prosopagnosia—the inability to recognize faces—maintain the ability to recognize nearly all other objects. What’s more, it is harder to recognize an upside-down face than most other inverted objects, studies have shown. To get to the root of face processing, Kanwisher spent hours as a young researcher lying still in an MRI machine as images of faces and objects flashed before her. A spot in the bottom right of the cerebral cortex lit up when she and others looked at faces, according to functional MRI (fMRI) scans, she and her colleagues reported in a seminal 1997 paper. They called the region the fusiform face area. © 2024 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29356 - Posted: 06.13.2024

By Betsy Mason To help pay for his undergraduate education, Elias Garcia-Pelegrin had an unusual summer job: cruise ship magician. “I was that guy who comes out at dinnertime and does random magic for you,” he says. But his latest magic gig is even more unusual: performing for Eurasian jays at Cambridge University’s Comparative Cognition Lab. Birds can be harder to fool than tourists. And to do magic for the jays, he had to learn to do sleight-of-hand tricks with a live, wriggling waxworm instead of the customary coin or ball. But performing in an aviary does have at least one advantage over performing on a cruise ship: The birds aren’t expecting to be entertained. “You don’t have to worry about impressing anybody, or tell a joke,” Garcia-Pelegrin says. “So you just do the magic.” In just the last few years, researchers have become interested in what they can learn about animal minds by studying what does and doesn’t fool them. “Magic effects can reveal blind spots in seeing and roadblocks in thinking,” says Nicky Clayton, who heads the Cambridge lab and, with Garcia-Pelegrin and others, cowrote an overview of the science of magic in the Annual Review of Psychology. What we visually perceive about the world is a product of how our brains interpret what our eyes see. Humans and other animals have evolved to handle the immense amount of visual information we’re exposed to by prioritizing some types of information, filtering out things that are usually less relevant and filling in gaps with assumptions. Many magic effects exploit these cognitive shortcuts in humans, and comparing how well these same tricks work on other species may reveal something about how their minds operate. Clayton and her colleagues have used magic tricks with both jays and monkeys to reveal differences in how these animals experience the world. Now they are hoping to expand to more species and inspire other researchers to try magic to explore big questions about complex mental abilities and how they evolved.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29345 - Posted: 06.06.2024

By Mariana Lenharo Crows know their numbers. An experiment has revealed that these birds can count their own calls, showcasing a numerical skill previously only seen in people. Investigating how animals understand numbers can help scientists to explore the biological origins of humanity’s numerical abilities, says Giorgio Vallortigara, a neuroscientist at the University of Trento in Rovereto, Italy. Being able to produce a deliberate number of vocalizations on cue, as the birds in the experiment did, “is actually a very impressive achievement”, he notes. Andreas Nieder, an animal physiologist at the University of Tübingen in Germany and a co-author of the study published 23 May in Science, says it was amazing to see how cognitively flexible these corvids are. “They have a reputation of being very smart and intelligent, and they proved this once again.” The researchers worked with three carrion crows (Corvus corone) that had already been trained to caw on command. Over the next several months, the birds were taught to associate visual cues — a screen showing the digits 1, 2, 3 or 4 — with the number of calls they were supposed to produce. They were later also introduced to four auditory cues that were each associated with a distinct number. During the experiment, the birds stood in front of the screen and were presented with a visual or auditory cue. They were expected to produce the number of vocalizations associated with the cue and to peck at an ‘enter key’ on the touchscreen monitor when they were done. If they got it right, an automated feeder delivered bird-seed pellets and mealworms as a reward. They were correct most of the time. “Their performance was way beyond chance and highly significant,” says Nieder. © 2024 Springer Nature Limited

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29326 - Posted: 05.25.2024

By Meghan Willcoxon In the summer of 1991, the neuroscientist Vittorio Gallese was studying how movement is represented in the brain when he noticed something odd. He and his research adviser, Giacomo Rizzolatti, at the University of Parma were tracking which neurons became active when monkeys interacted with certain objects. As the scientists had observed before, the same neurons fired when the monkeys either noticed the objects or picked them up. But then the neurons did something the researchers didn’t expect. Before the formal start of the experiment, Gallese grasped the objects to show them to a monkey. At that moment, the activity spiked in the same neurons that had fired when the monkey grasped the objects. It was the first time anyone had observed neurons encode information for both an action and another individual performing that action. Those neurons reminded the researchers of a mirror: Actions the monkeys observed were reflected in their brains through these peculiar motor cells. In 1992, Gallese and Rizzolatti first described the cells in the journal Experimental Brain Research and then in 1996 named them “mirror neurons” in Brain. The researchers knew they had found something interesting, but nothing could have prepared them for how the rest of the world would respond. Within 10 years of the discovery, the idea of a mirror neuron had become the rare neuroscience concept to capture the public imagination. From 2002 to 2009, scientists across disciplines joined science popularizers in sensationalizing these cells, attributing more properties to them to explain such complex human behaviors as empathy, altruism, learning, imitation, autism, and speech. Then, nearly as quickly as mirror neurons caught on, scientific doubts about their explanatory power crept in. Within a few years, these celebrity cells were filed away in the drawer of over-promised, under-delivered discoveries. © 2024 NautilusNext Inc.,

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 29316 - Posted: 05.21.2024

By Lilly Tozer How the brain processes visual information — and its perception of time — is heavily influenced by what we’re looking at, a study has found. In the experiment, participants perceived the amount of time they had spent looking at an image differently depending on how large, cluttered or memorable the contents of the picture were. They were also more likely to remember images that they thought they had viewed for longer. The findings, published on 22 April in Nature Human Behaviour, could offer fresh insights into how people experience and keep track of time. “For over 50 years, we’ve known that objectively longer-presented things on a screen are better remembered,” says study co-author Martin Wiener, a cognitive neuroscientist at George Mason University in Fairfax, Virginia. “This is showing for the first time, a subjectively experienced longer interval is also better remembered.” Research has shown that humans’ perception of time is intrinsically linked to our senses. “Because we do not have a sensory organ dedicated to encoding time, all sensory organs are in fact conveying temporal information,” says Virginie van Wassenhove, a cognitive neuroscientist at the University of Paris–Saclay in Essonne, France. Previous studies found that basic features of an image, such as its colours and contrast, can alter people’s perceptions of time spent viewing the image. In the latest study, researchers set out to investigate whether higher-level semantic features, such as memorability, can have the same effect. © 2024 Springer Nature Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 29269 - Posted: 04.24.2024

By Emily Makowski I spend my days surrounded by thousands of written words, and sometimes I feel as though there’s no escape. That may not seem particularly unusual. Plenty of people have similar feelings. But no, I’m not just talking about my job as a copy editor here at Scientific American, where I edit and fact-check an endless stream of science writing. This constant flow of text is all in my head. My brain automatically translates spoken words into written ones in my mind’s eye. I “see” subtitles that I can’t turn off whenever I talk or hear someone else talking. This same speech-to-text conversion even happens for the inner dialogue of my thoughts. This mental closed-captioning has accompanied me since late toddlerhood, almost as far back as my earliest childhood memories. And for a long time, I thought that everyone could “read” spoken words in their head the way I do. What I experience goes by the name of ticker-tape synesthesia. It is not a medical condition—it’s just a distinctive way of perceiving the surrounding world that relatively few people share. Not much is known about the neurophysiology or psychology of this phenomenon, sometimes called “ticker taping,” even though a reference to it first appeared in the scientific literature in the late 19th century. Ticker taping is considered a form of synesthesia, an experience in which the brain reroutes one kind of incoming sensory information so that it is processed as another. For example, sounds might be perceived as touch, allowing the affected person to “feel” them as tactile sensations. As synesthesia goes, ticker taping is relatively uncommon. “There are varieties of synesthesia which really have just been completely under the radar..., and ticker tape is really one of those,” says Mark Price, a cognitive psychologist at the University of Bergen in Norway. The name “ticker-tape synesthesia” itself evokes the concept’s late 19th-century origins. At that time stock prices transmitted by telegraph were printed on long paper strips, which would be torn into tiny bits and thrown from building windows during parades. © 2024 SCIENTIFIC AMERICAN,

Related chapters from BN: Chapter 8: General Principles of Sensory Processing, Touch, and Pain; Chapter 19: Language and Lateralization
Related chapters from MM:Chapter 5: The Sensorimotor System; Chapter 15: Language and Lateralization
Link ID: 29238 - Posted: 04.04.2024

By Marta Zaraska The renowned Polish piano duo Marek and Wacek didn’t use sheet music when playing live concerts. And yet onstage the pair appeared perfectly in sync. On adjacent pianos, they playfully picked up various musical themes, blended classical music with jazz and improvised in real time. “We went with the flow,” said Marek Tomaszewski, who performed with Wacek Kisielewski until Wacek’s death in 1986. “It was pure fun.” The pianists seemed to read each other’s minds by exchanging looks. It was, Marek said, as if they were on the same wavelength. A growing body of research suggests that might have been literally true. Dozens of recent experiments studying the brain activity of people performing and working together — duetting pianists, card players, teachers and students, jigsaw puzzlers and others — show that their brain waves can align in a phenomenon known as interpersonal neural synchronization, also known as interbrain synchrony. “There’s now a lot of research that shows that people interacting together display coordinated neural activities,” said Giacomo Novembre, a cognitive neuroscientist at the Italian Institute of Technology in Rome, who published a key paper on interpersonal neural synchronization last summer. The studies have come out at an increasing clip over the past few years — one as recently as last week — as new tools and improved techniques have honed the science and theory. They’re finding that synchrony between brains has benefits. It’s linked to better problem-solving, learning and cooperation, and even with behaviors that help others at a personal cost. What’s more, recent studies in which brains were stimulated with an electric current hint that synchrony itself might cause the improved performance observed by scientists. © 2024 the Simons Foundation.

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 13: Memory and Learning; Chapter 5: The Sensorimotor System
Link ID: 29229 - Posted: 03.30.2024

By Ingrid Wickelgren You see a woman on the street who looks familiar—but you can’t remember how you know her. Your brain cannot attach any previous experiences to this person. Hours later, you suddenly recall the party at a friend’s house where you met her, and you realize who she is. In a new study in mice, researchers have discovered the place in the brain that is responsible for both types of familiarity—vague recognition and complete recollection. Both, moreover, are represented by two distinct neural codes. The findings, which appeared on February 20 in Neuron, showcase the use of advanced computer algorithms to understand how the brain encodes concepts such as social novelty and individual identity, says study co-author Steven Siegelbaum, a neuroscientist at the Mortimer B. Zuckerman Mind Brain Behavior Institute at Columbia University. The brain’s signature for strangers turns out to be simpler than the one used for old friends—which makes sense, Siegelbaum says, given the vastly different memory requirements for the two relationships. “Where you were, what you were doing, when you were doing it, who else [was there]—the memory of a familiar individual is a much richer memory,” Siegelbaum says. “If you’re meeting a stranger, there’s nothing to recollect.” The action occurs in a small sliver of a brain region called the hippocampus, known for its importance in forming memories. The sliver in question, known as CA2, seems to specialize in a certain kind of memory used to recall relationships. “[The new work] really emphasizes the importance of this brain area to social processing,” at least in mice, says Serena Dudek, a neuroscientist at the National Institute of Environmental Health Sciences, who was not involved in the study. © 2024 SCIENTIFIC AMERICAN,

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 29222 - Posted: 03.28.2024

By Robert D. Hershey Jr. Daniel Kahneman, who never took an economics course but who pioneered a psychologically based branch of that field that led to a Nobel in economic science in 2002, died on Wednesday. He was 90. His death was confirmed by his partner, Barbara Tversky. She declined to say where he died. Professor Kahneman, who was long associated with Princeton University and lived in Manhattan, employed his training as a psychologist to advance what came to be called behavioral economics. The work, done largely in the 1970s, led to a rethinking of issues as far-flung as medical malpractice, international political negotiations and the evaluation of baseball talent, all of which he analyzed, mostly in collaboration with Amos Tversky, a Stanford cognitive psychologist who did groundbreaking work on human judgment and decision-making. (Ms. Tversky, also a professor of psychology at Stanford, had been married to Professor Tversky, who died in 1996. She and Professor Kahneman became partners several years ago.) As opposed to traditional economics, which assumes that human beings generally act in fully rational ways and that any exceptions tend to disappear as the stakes are raised, the behavioral school is based on exposing hard-wired mental biases that can warp judgment, often with counterintuitive results. “His central message could not be more important,” the Harvard psychologist and author Steven Pinker told The Guardian in 2014, “namely, that human reason left to its own devices is apt to engage in a number of fallacies and systematic errors, so if we want to make better decisions in our personal lives and as a society, we ought to be aware of these biases and seek workarounds. That’s a powerful and important discovery.” © 2024 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29218 - Posted: 03.28.2024

By Anna Gibbs Imagine a person’s face. Now imagine that whenever you looked at that face, there was a chance it would appear distorted. That’s what life is like for a person with prosopometamorphopsia, or PMO. Now, thanks to a new study, you can see through the eyes of someone with this rare condition. Relying on feedback from a 58-year-old man who has had PMO for nearly three years, researchers at Dartmouth College altered photos of faces to mimic the “demonic” distortions he experienced. This is believed to be the first time that images have been created to so closely replicate what a patient with the condition is seeing, psychologist Antônio Mello and colleagues report in the March 23 Lancet. “We hope this has a big impact in the way people think about PMO, especially for them to be able to understand how severe PMO can be,” Mello says. For instance, he says, this particular patient didn’t like to go to the store because fellow shoppers looked like “an army of demons.” PMO is poorly understood, with fewer than 100 cases cited since 1904. Patients report a wide variety of facial distortions. While the patient in this study sees extremely stretched features with deep grooves on the face, others may see distortions that cause features to move position or change size. Because of that, this visualization is patient-specific and wouldn’t apply for everyone with PMO, says Jason Barton, a neurologist at the University of British Columbia in Vancouver who has worked with the researchers before but was not involved in this study. Still, “I think it’s helpful for people to understand the kinds of distortions people can see.” © Society for Science & the Public 2000–2024.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29211 - Posted: 03.23.2024

By Meghan Rosen Leakiness in the brain could explain the memory and concentration problems linked to long COVID. In patients with brain fog, MRI scans revealed signs of damaged blood vessels in their brains, researchers reported February 22 in Nature Neuroscience. In these people, dye injected into the bloodstream leaked into their brains and pooled in regions that play roles in language, memory, mood and vision. It’s the first time anyone’s shown that long COVID patients can have leaky blood-brain barriers, says study coauthor Matthew Campbell, a geneticist at Trinity College Dublin in Ireland. That barrier, tightly knit cells lining blood vessels, typically keeps riffraff out of the brain, like bouncers guarding a nightclub. If the barrier breaks down, bloodborne viruses, cells and other interlopers can sneak into the brain’s tissues and wreak havoc, says Avindra Nath, a neurologist at the National Institutes of Health in Bethesda, Md. It’s too early to say definitively whether that’s happening in people with long COVID, but the new study provides evidence that “brain fog has a biological basis,” says Nath, who wasn’t involved with the work. That alone is important for patients, he says, because their symptoms may be otherwise discounted by physicians. For some people, brain fog can feel like a slowdown in thinking or difficulty recalling short-term memories, Campbell says. For example, “patients will go for a drive, and forget where they’re driving to.” That might sound trivial, he says, but it actually pushes people into panic mode. © Society for Science & the Public 2000–2024.

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 29192 - Posted: 03.16.2024

By Pam Belluck Long Covid may lead to measurable cognitive decline, especially in the ability to remember, reason and plan, a large new study suggests. Cognitive testing of nearly 113,000 people in England found that those with persistent post-Covid symptoms scored the equivalent of 6 I.Q. points lower than people who had never been infected with the coronavirus, according to the study, published Wednesday in The New England Journal of Medicine. People who had been infected and no longer had symptoms also scored slightly lower than people who had never been infected, by the equivalent of 3 I.Q. points, even if they were ill for only a short time. The differences in cognitive scores were relatively small, and neurological experts cautioned that the results did not imply that being infected with the coronavirus or developing long Covid caused profound deficits in thinking and function. But the experts said the findings are important because they provide numerical evidence for the brain fog, focus and memory problems that afflict many people with long Covid. “These emerging and coalescing findings are generally highlighting that yes, there is cognitive impairment in long Covid survivors — it’s a real phenomenon,” said James C. Jackson, a neuropsychologist at Vanderbilt Medical Center, who was not involved in the study. He and other experts noted that the results were consistent with smaller studies that have found signals of cognitive impairment. The new study also found reasons for optimism, suggesting that if people’s long Covid symptoms ease, the related cognitive impairment might, too: People who had experienced long Covid symptoms for months and eventually recovered had cognitive scores similar to those who had experienced a quick recovery, the study found. © 2024 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 29171 - Posted: 02.29.2024

By Nora Bradford Whenever you’re actively performing a task — say, lifting weights at the gym or taking a hard exam — the parts of your brain required to carry it out become “active” when neurons step up their electrical activity. But is your brain active even when you’re zoning out on the couch? The answer, researchers have found, is yes. Over the past two decades they’ve defined what’s known as the default mode network, a collection of seemingly unrelated areas of the brain that activate when you’re not doing much at all. Its discovery has offered insights into how the brain functions outside of well-defined tasks and has also prompted research into the role of brain networks — not just brain regions — in managing our internal experience. In the late 20th century, neuroscientists began using new techniques to take images of people’s brains as they performed tasks in scanning machines. As expected, activity in certain brain areas increased during tasks — and to the researchers’ surprise, activity in other brain areas declined simultaneously. The neuroscientists were intrigued that during a wide variety of tasks, the very same brain areas consistently dialed back their activity. It was as if these areas had been active when the person wasn’t doing anything, and then turned off when the mind had to concentrate on something external. Researchers called these areas “task negative.” When they were first identified, Marcus Raichle, a neurologist at the Washington University School of Medicine in St. Louis, suspected that these task-negative areas play an important role in the resting mind. “This raised the question of ‘What’s baseline brain activity?’” Raichle recalled. In an experiment, he asked people in scanners to close their eyes and simply let their minds wander while he measured their brain activity. All Rights Reserved © 2024

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29135 - Posted: 02.06.2024

By Conor Feehly A decade ago, when I was starting my first year of university in New Zealand, I attended a stage hypnosis show. It was one of a number of events the university offered to incoming students during orientation week. From the stage of a campus auditorium, the hypnotist-for-hire asked an audience of some 200 students to close their eyes and listen to his voice. Then he directed us to clasp our hands tightly together, and to imagine an invisible thread wrapping around them—over and under, over and under—until it was impossible to pull them apart. After a few minutes of this, he told us to try to separate our hands. Those who could not, he said, should come on down to the stage. I instantly pulled my hands apart, but to my surprise, a close friend sitting next to me made his way to the front of the auditorium with roughly 20 others from the audience. Once on stage, the hypnotist tried to bring them deeper into a hypnotic trance, directing them to focus on his calm, authoritative voice. He then asked a few of them to role-play scenarios for our entertainment: a supermarket checkout clerk ringing up shopping items, a lifeguard scanning for lives to save. After a short time, I saw the hypnotist whisper something into the ear of my friend. He sheepishly made his way back to the seat next to me. “What did he say to you?” I asked. He replied, “I can tell you’re acting, mate, get off the stage.” In the more than 200 years since the practice of contemporary hypnosis was described by German physician Franz Mesmer, public perception of it has see-sawed between skepticism and credulity. Today hypnotherapy is used to provide therapeutic remedy for depression, pain, substance use disorders, and certain traumas, uses that are supported to a certain extent by research evidence. But many still consider hypnosis more of a cheap magician’s trick than legitimate clinical medicine. © 2024 NautilusNext Inc.,

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29094 - Posted: 01.13.2024

By Regina G. Barber Human brains aren't built to comprehend large numbers, like the national debt or how much to save for retirement. But with a few tools — analogies, metaphors and visualizations — we can get better at it. Imagine a horizontal line. The very left is marked one thousand and the very right is marked one billion. On this line, where would you add a marker to represent one million? If you said somewhere in the middle, you answered the same as the roughly 50 percent of people who have done this exercise in a number line study. But the answer is actually much closer to one thousand since there are one thousand millions in one billion. This error makes sense because "our human brains are pretty bad at comprehending large numbers," says Elizabeth Toomarian, an educational neuroscientist at Stanford University. She studies how the brain makes sense of numbers. Or doesn't. "Our brains are evolutionarily very old and we are pushing them to do things that we've only just recently conceptualized," says Toomarian. Instead, the human brain is built to understand how much of something is in its environment. For example, which bush has more berries or how many predators are in that clearing? But comprehending the national debt or imagining the size of our universe? "We certainly can use our brains in that way, but we're recycling these sort of evolutionarily old brain architectures to do something really new," she says. In other words, it's not our fault that we have trouble wrapping our heads around big numbers. © 2024 npr
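The arithmetic behind that answer is easy to check. The short Python sketch below (not part of the NPR piece; the variable names are our own) computes where one million lands on a linear number line running from one thousand to one billion:

    # Position of 1,000,000 on a linear scale from 1,000 to 1,000,000,000
    low, high, target = 1_000, 1_000_000_000, 1_000_000
    fraction = (target - low) / (high - low)  # fraction of the line's length, measured from the left end
    print(f"{fraction:.4%} of the way along")  # about 0.0999%, essentially at the one-thousand mark

On a true linear scale, the marker for one million sits almost on top of the one-thousand label rather than near the middle, which is exactly the intuition the number-line exercise shows most people lack.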

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29074 - Posted: 01.03.2024

By Ann Gibbons Louise hadn’t seen her sister or nephew for 26 years. Yet the moment she spotted them on a computer screen, she recognized them, staring hard at their faces. The feat might have been impressive enough for a human, but Louise is a bonobo—one who had spent most of her life at a separate sanctuary from these relatives. The discovery, published today in the Proceedings of the National Academy of Sciences, reveals that our closest primate cousins can remember the faces of friends and family for years, and sometimes even decades. The study, experts say, shows that the capability for long-term social memory is not unique to people, as was long believed. “It’s a remarkable finding,” says Frans de Waal, a primatologist at Emory University who was not involved with the work. “I’m not even sure we humans remember most individuals we haven’t seen for 2 decades.” The research, he says, raises the possibility that other animals can also do this and may remember far more than we give them credit for. Trying to figure out whether nonhuman primates remember a face isn’t simple. You can’t just ask them. So in the new study, comparative psychologist Christopher Krupenye at Johns Hopkins University and colleagues used eye trackers, infrared cameras that noninvasively map a subject’s gaze as they look at images of people or objects. The scientists worked with 26 chimpanzees and bonobos living in three zoos or sanctuaries in Europe and Japan. The team showed the animals photos of the faces of two apes placed side by side on the screen at the same time for 3 seconds. Some images were of complete strangers; some were of close friends, foes, or family members who had once lived in their same social groups, but whom they hadn’t seen in years.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29058 - Posted: 12.19.2023

By Jaimie Seaton It’s not uncommon for Veronica Smith to be looking at her partner’s face when suddenly she sees his features changing—his eyes moving closer together and then farther apart, his jawline getting wider and narrower, and his skin moving and shimmering. Smith, age 32, has experienced this phenomenon when looking at faces since she was four or five years old, and while it’s intermittent when she’s viewing another person’s face, it’s more constant when she views her own. “I almost always experience it when I look at my own face in the mirror, which makes it really hard to get ready because I’ll think that I look weird,” Smith explains. “I can more easily tell that I’m experiencing distortions when I’m looking at other people because I know what they look like.” Smith has a rare condition called prosopometamorphopsia (PMO), in which faces appear distorted in shape, texture, position or color. (PMO is related to Alice in Wonderland syndrome, or AIWS, which distorts the size perception of objects or one’s own body.) PMO has fascinated many scientists. The late neurologist and writer Oliver Sacks co-wrote a paper on the condition that was published in 2014, the year before he died. Brad Duchaine, a professor of psychological and brain sciences at Dartmouth College, explains that some people with it see distortions that affect the whole face (bilateral PMO) while others see only the left or right half of a face as distorted (hemi-PMO). “Not surprisingly, people with PMO find the distortions extremely distressing. Over the last century, approximately 75 cases have been reported in the literature. However, little is known about the condition because cases with face distortions have usually been documented by neurologists who don’t have expertise in visual neuroscience or the time to study the cases in depth,” Duchaine says. For 25 years Duchaine’s work has focused on prosopagnosia (face blindness), but after co-authoring a study on hemi-PMO that was published in 2020, Duchaine shifted much of his lab’s work to PMO. © 2023 SCIENTIFIC AMERICAN,

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM:Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 29051 - Posted: 12.16.2023

By Francesca Paris There are more Americans who say they have serious cognitive problems — with remembering, concentrating or making decisions — than at any time in the last 15 years, data from the Census Bureau shows. The increase started with the pandemic: The number of working-age adults reporting “serious difficulty” thinking has climbed by an estimated one million people. About as many adults ages 18 to 64 now report severe cognitive issues as report trouble walking or taking the stairs, for the first time since the bureau started asking the questions each month in the 2000s. The sharp increase captures the effects of long Covid for a small but significant portion of younger adults, researchers say, most likely in addition to other effects of the pandemic, including psychological distress. But they also say it’s not yet possible to fully dissect all the reasons behind the increase. Richard Deitz, an economist at the Federal Reserve Bank of New York, analyzed the data and attributed much of the increase to long Covid. “These numbers don’t do this — they don’t just start suddenly increasing sharply like this,” he said. In its monthly Current Population Survey, the census asks a sample of Americans whether they have serious problems with their memory and concentration. It defines them as disabled if they answer yes to that question or one of five others about limitations on their daily activities. The questions are unrelated to disability applications, so respondents don’t have a financial incentive to answer one way or another. At the start of 2020, the survey estimated there were fewer than 15 million Americans ages 18 to 64 with any kind of disability. That rose to about 16.5 million by September 2023. Nearly two-thirds of that increase was made up of people who had newly reported limitations on their thinking. There were also increases in census estimates of the number of adults with a vision disability or serious difficulty doing basic errands. For older working-age Americans, the pandemic ended a yearslong decline in reported rates of disability. © 2023 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29003 - Posted: 11.13.2023

By Yasemin Saplakoglu More than 150 years ago, the economist and philosopher William Stanley Jevons discovered something curious about the number 4. While musing about how the mind conceives of numbers, he tossed a handful of black beans into a cardboard box. Then, after a fleeting glance, he guessed how many there were, before counting them to record the true value. After more than 1,000 trials, he saw a clear pattern. When there were four or fewer beans in the box, he always guessed the right number. But for five beans or more, his quick estimations were often incorrect. Jevons’ description of his self-experiment, published in Nature in 1871, set the “foundation of how we think about numbers,” said Steven Piantadosi, a professor of psychology and neuroscience at the University of California, Berkeley. It sparked a long-lasting and ongoing debate about why there seems to be a limit on the number of items we can accurately judge to be present in a set. Now, a new study in Nature Human Behaviour has edged closer to an answer by taking an unprecedented look at how human brain cells fire when presented with certain quantities. Its findings suggest that the brain uses a combination of two mechanisms to judge how many objects it sees. One estimates quantities. The second sharpens the accuracy of those estimates — but only for small numbers. It’s “very exciting” that the findings connect long-debated ideas to their neural underpinnings, said Piantadosi, who was not involved in the study. “There’s not many things in cognition where people have been able to pinpoint very plausible biological foundations.” Although the new study does not end the debate, the findings start to untangle the biological basis for how the brain judges quantities, which could inform bigger questions about memory, attention and even mathematics. All Rights Reserved © 2023

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Higher Cognition
Link ID: 29000 - Posted: 11.11.2023