Chapter 14. Attention and Higher Cognition


By Nicola Davis If the sound of someone chewing gum or slurping their tea gets on your nerves, you are not alone. Researchers say almost one in five people in the UK has strong negative reactions to such noises. Misophonia is a disorder in which people have strong emotional responses to certain sounds, becoming angry, distressed or even unable to function in social or work settings as a result. But just how common the condition is has been a matter of debate. Now researchers say they have found that 18.4% of the UK population have significant symptoms of misophonia. “This is the very first study where we have a representative sample of the UK population,” said Dr Silia Vitoratou, first author of the study at King’s College London. “Most people with misophonia think they are alone, but they are not. This is something we need to know [about] and make adjustments if we can.” Writing in the journal Plos One, the team report how they gathered responses from 768 people using metrics including the selective sound sensitivity syndrome scale. This included one questionnaire probing the sounds that individuals found triggering, such as chewing or snoring, and another exploring the impact of such sounds – including whether they affected participants’ social life and whether the participant blamed the noise-maker – as well as the type of emotional response participants felt to the sounds and the intensity of their emotions. From these responses, each participant was given an overall score. The results reveal that more than 80% of participants had no particular feelings towards sounds such as “normal breathing” or “yawning”, but this proportion plummeted to less than 25% for sounds including “slurping”, “chewing gum” and “sniffing”. © 2023 Guardian News & Media Limited

Keyword: Hearing; Attention
Link ID: 28712 - Posted: 03.23.2023

By Ellen Barry It is a truism that time seems to expand or contract depending on our circumstances: In a state of terror, seconds can stretch. A day spent in solitude can drag. When we’re trying to meet a deadline, hours race by. A study published this month in the journal Psychophysiology by psychologists at Cornell University found that, when observed at the level of milliseconds, some of these distortions could be driven by heartbeats, whose length varies from moment to moment. The psychologists fitted undergraduates with electrocardiograms to measure the length of each heartbeat precisely, and then asked them to estimate the length of brief audio tones. They discovered that after a longer heartbeat interval, subjects tended to perceive the tone as longer; after shorter intervals, subjects assessed the tone as shorter. After each tone, the subjects’ heartbeat intervals lengthened. A lower heart rate appeared to assist with perception, said Saeedeh Sadeghi, a doctoral candidate at Cornell and the study’s lead author. “When we need to perceive things from the outside world, the beats of the heart are noise to the cortex,” she said. “You can sample the world more — it’s easier to get things in — when the heart is silent.” The study provides more evidence, after an era of research focused on the brain, that “there is no single part of the brain or body that keeps time — it’s all a network,” she said, adding, “The brain controls the heart, and the heart, in turn, impacts the brain.” Interest in the perception of time has exploded since the Covid pandemic, when activity outside the home came to an abrupt halt for many and people around the world found themselves facing stretches of undifferentiated time. A study of time perception conducted during the first year of the lockdown in Britain found that 80 percent of participants reported distortions in time, in different directions. On average, older, more socially isolated people reported that time slowed, and younger, more active people reported that it sped up. © 2023 The New York Times Company

Keyword: Attention
Link ID: 28704 - Posted: 03.15.2023

By Marta Zaraska The Neumayer III polar station sits near the edge of Antarctica’s unforgiving Ekström Ice Shelf. During the winter, when temperatures can plunge below minus 50 degrees Celsius and the winds can climb to more than 100 kilometers per hour, no one can come or go from the station. Its isolation is essential to the meteorological, atmospheric and geophysical science experiments conducted there by the mere handful of scientists who staff the station during the winter months and endure its frigid loneliness. But a few years ago, the station also became the site for a study of loneliness itself. A team of scientists in Germany wanted to see whether the social isolation and environmental monotony marked the brains of people making long Antarctic stays. Eight expeditioners working at the Neumayer III station for 14 months agreed to have their brains scanned before and after their mission and to have their brain chemistry and cognitive performance monitored during their stay. (A ninth crew member also participated but could not have their brain scanned for medical reasons.) As the researchers described in 2019, in comparison to a control group, the socially isolated team lost volume in their prefrontal cortex — the region at the front of the brain, just behind the forehead, that is chiefly responsible for decision-making and problem-solving. They also had lower levels of brain-derived neurotrophic factor, a protein that nurtures the development and survival of nerve cells in the brain. The reduction persisted for at least a month and a half after the team’s return from Antarctica. It’s uncertain how much of this was due purely to the social isolation of the experience. But the results are consistent with evidence from more recent studies that chronic loneliness significantly alters the brain in ways that only worsen the problem. Neuroscience suggests that loneliness doesn’t necessarily result from a lack of opportunity to meet others or a fear of social interactions. Instead, circuits in our brain and changes in our behavior can trap us in a catch-22 situation: While we desire connection with others, we view them as unreliable, judgmental and unfriendly. Consequently, we keep our distance, consciously or unconsciously spurning potential opportunities for connections. Simons Foundation All Rights Reserved © 2023

Keyword: Stress; Attention
Link ID: 28689 - Posted: 03.04.2023

By Stephani Sutherland Tara Ghormley has always been an overachiever. She finished at the top of her class in high school, graduated summa cum laude from college and earned top honors in veterinary school. She went on to complete a rigorous training program and build a successful career as a veterinary internal medicine specialist. But in March 2020 she got infected with the SARS-CoV-2 virus—just the 24th case in the small, coastal central California town she lived in at the time, near the site of an early outbreak in the COVID pandemic. “I could have done without being first at this,” she says. Almost three years after apparently clearing the virus from her body, Ghormley is still suffering. She gets exhausted quickly, her heartbeat suddenly races, and she goes through periods where she can't concentrate or think clearly. Ghormley and her husband, who have relocated to a Los Angeles suburb, once spent their free time visiting their “happiest place on Earth”—Disneyland—but her health prevented that for more than a year. She still spends most of her days off resting in the dark or going to her many doctors' appointments. Her early infection and ongoing symptoms make her one of the first people in the country with “long COVID,” a condition where symptoms persist for at least three months after the infection and can last for years. The syndrome is known by medical professionals as postacute sequelae of COVID-19, or PASC. People with long COVID have symptoms such as pain, extreme fatigue and “brain fog,” or difficulty concentrating or remembering things. As of February 2022, the syndrome was estimated to affect about 16 million adults in the U.S. and had forced between two million and four million Americans out of the workforce, many of whom have yet to return. Long COVID often arises in otherwise healthy young people, and it can follow even a mild initial infection. The risk appears at least slightly higher in people who were hospitalized for COVID and in older adults (who end up in the hospital more often). Women and those at socioeconomic disadvantage also face higher risk, as do people who smoke, are obese, or have any of an array of health conditions, particularly autoimmune disease. Vaccination appears to reduce the danger but does not entirely prevent long COVID.

Keyword: Attention; Learning & Memory
Link ID: 28667 - Posted: 02.15.2023

By Betsy Mason Some fish can recognize their own faces in photos and mirrors, an ability usually attributed to humans and other animals considered particularly brainy, such as chimpanzees, scientists report. Finding the ability in fish suggests that self-awareness may be far more widespread among animals than scientists once thought. “It is believed widely that the animals that have larger brains will be more intelligent than animals of the small brain,” such as fish, says animal sociologist Masanori Kohda of Osaka Metropolitan University in Japan. It may be time to rethink that assumption, Kohda says. Kohda’s previous research showed that bluestreak cleaner wrasses can pass the mirror test, a controversial cognitive assessment that purportedly reveals self-awareness, or the ability to be the object of one’s own thoughts. The test involves exposing an animal to a mirror and then surreptitiously putting a mark on the animal’s face or body to see if they will notice it on their reflection and try to touch it on their body. Previously only a handful of large-brained species, including chimpanzees and other great apes, dolphins, elephants and magpies, have passed the test. In a new study, cleaner fish that passed the mirror test were then able to distinguish their own faces from those of other cleaner fish in still photographs. This suggests that the fish identify themselves the same way humans are thought to — by forming a mental image of one’s face, Kohda and colleagues report February 6 in the Proceedings of the National Academy of Sciences. “I think it’s truly remarkable that they can do this,” says primatologist Frans de Waal of Emory University in Atlanta who was not involved in the research. “I think it’s an incredible study.” © Society for Science & the Public 2000–2023.

Keyword: Attention; Evolution
Link ID: 28659 - Posted: 02.08.2023

By John M. Beggs Over the last few decades, an idea called the critical brain hypothesis has been helping neuroscientists understand how the human brain operates as an information-processing powerhouse. It posits that the brain is always teetering between two phases, or modes, of activity: a random phase, where it is mostly inactive, and an ordered phase, where it is overactive and on the verge of a seizure. The hypothesis predicts that between these phases, at a sweet spot known as the critical point, the brain has a perfect balance of variety and structure and can produce the most complex and information-rich activity patterns. This state allows the brain to optimize multiple information processing tasks, from carrying out computations to transmitting and storing information, all at the same time. To illustrate how phases of activity in the brain — or, more precisely, activity in a neural network such as the brain — might affect information transmission through it, we can play a simple guessing game. Imagine that we have a network with 10 layers and 40 neurons in each layer. Neurons in the first layer will only activate neurons in the second layer, and those in the second layer will only activate those in the third layer, and so on. Now, I will activate some number of neurons in the first layer, but you will only be able to observe the number of neurons active in the last layer. Let’s see how well you can guess the number of neurons I activated under three different strengths of network connections. First, let’s consider weak connections. In this case, neurons typically activate independently of each other, and the pattern of network activity is random. No matter how many neurons I activate in the first layer, the number of neurons activated in the last layer will tend toward zero because the weak connections dampen the spread of activity. This makes our guessing game incredibly difficult. The amount of information about the first layer that you can learn from the last layer is practically nothing. All Rights Reserved © 2023

Keyword: Attention; Learning & Memory
Link ID: 28652 - Posted: 02.01.2023
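To make the guessing-game intuition in the excerpt above concrete, here is a minimal simulation sketch. It is not the author's model: it assumes each active neuron independently activates each neuron in the next layer with probability `strength`, so the network's branching ratio is `width * strength`. With weak connections activity decays toward zero, with strong connections it saturates, and only near a branching ratio of 1 (the critical point) does the last layer still track the first.

```python
import random

def run_network(n_input_active, strength, n_layers=10, width=40):
    """Propagate activity through a feedforward network of `n_layers`
    layers with `width` neurons each. Each active neuron activates each
    neuron in the next layer independently with probability `strength`."""
    active = n_input_active
    for _ in range(n_layers - 1):
        next_active = 0
        for _ in range(width):
            # Chance that none of the currently active neurons triggers this one
            p_silent = (1 - strength) ** active
            if random.random() > p_silent:
                next_active += 1
        active = next_active
    return active

# Branching ratio = width * strength: 0.4 (subcritical), 1.0 (critical), 2.4 (supercritical)
for strength in (0.01, 0.025, 0.06):
    print(f"strength={strength} (branching ratio {40 * strength:.1f})")
    for n_in in (5, 20, 35):
        outs = [run_network(n_in, strength) for _ in range(200)]
        mean_out = sum(outs) / len(outs)
        print(f"  {n_in:2d} active in layer 1 -> mean {mean_out:5.1f} active in layer 10")
```

Running this shows the pattern the article describes: in the subcritical regime the last layer is nearly silent whatever the input, in the supercritical regime it is nearly saturated whatever the input, and only near the critical point does the last layer carry usable information about how many first-layer neurons were activated.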

By Kristen French George Church looks like he needs a nap. I’m talking to him on Zoom, and his eyelids have grown heavy, inclining toward slumber. Or maybe my mind is playing tricks on me. He assures me he is wide awake. But sleeping and waking life are often blurred for Church. One of the world’s most imaginative scientists, Church is a narcoleptic. A rare disorder, narcolepsy causes sudden attacks of sleep, and Church has fallen asleep in some unfortunate circumstances—at the World Economic Forum, just a few feet away from Microsoft founder Bill Gates, for instance. He also had to give up driving due to the risk that a bout of sleepiness would strike while he was behind the wheel. But Church, a Harvard geneticist known for his pathbreaking contributions to numerous fields—from genetics to astrobiology to biomedicine—says the benefits of his condition outweigh the inconveniences. Many of his wildest and most prescient ideas come from his narcoleptic naps. “The fact is, I fall asleep several times a day, and so almost everything comes from there,” Church says. His idea for a quick and simple way to “read” DNA—which resulted in the first commercial genome sequence, of the human pathogen H. pylori—came from a narcoleptic nap. He also conceived of editing genomes with CRISPR and building new genomes with off-the-shelf molecules during narcoleptic naps. More recently, in December, a wild idea for a space probe that could reach distant stars within just 20 years, at one-fifth the speed of light, came to him after a narcoleptic nap. He proposed that these lightning-speed interstellar missions could be launched by microbes and powered by laser sails. The ideas that come to him are often the result of collisions of unexpected images in his head. “I try to turn science fiction into science fact,” Church tells me. © 2023 NautilusNext Inc.

Keyword: Sleep; Attention
Link ID: 28648 - Posted: 02.01.2023

By Dana G. Smith Do you: Cut the tags out of your clothes? Relive (and regret) past conversations? Have episodes of burnout and fatigue? Zone out while someone is talking? Become hyper-focused while working on a project? Take on dozens of hobbies? Daydream? Forget things? According to TikTok, you might have attention deficit hyperactivity disorder. Videos about the psychiatric condition are all over the social media app, with the #adhd hashtag receiving more than 17 billion views to date. Many feature young people describing their specific (and sometimes surprising) symptoms, like sensitivity to small sensory annoyances (such as clothing tags) or A.D.H.D. paralysis, a type of extreme procrastination. After viewing these videos, many people who were not diagnosed with A.D.H.D. as children may question whether they would qualify as adults. As with most psychiatric conditions, A.D.H.D. symptoms can range in type and severity. And many of them “are behaviors everyone experiences at some point or another,” said Joel Nigg, a professor of psychiatry at Oregon Health & Science University. The key to diagnosing the condition, however, requires “determining that it’s serious, it’s extreme” and it’s interfering with people’s lives, he said. It’s also critical that the symptoms have been present since childhood. Those nuances can be lost on social media, experts say. In fact, one study published earlier this year found that more than half of the A.D.H.D. videos on TikTok were misleading. If a video (or article) has you thinking you may have undiagnosed A.D.H.D., here’s what to consider. Approximately 4 percent of adults in the United States have enough symptoms to qualify for A.D.H.D., but only an estimated one in 10 of them is diagnosed and treated. For comparison, roughly 9 percent of children in the United States have been diagnosed with the condition, and three-quarters have received medication or behavioral therapy for it. One reason for the lack of diagnoses in adults is that when people think of A.D.H.D., they often imagine a boy who can’t sit still and is disruptive in class, said Dr. Deepti Anbarasan, a clinical associate professor of psychiatry at the NYU Grossman School of Medicine. But those stereotypical hyperactive symptoms are present in just 5 percent of adult cases, she said. © 2023 The New York Times Company

Keyword: ADHD
Link ID: 28646 - Posted: 01.27.2023

By Jennifer Szalai “‘R’s’ are hard,” John Hendrickson writes in his new memoir, “Life on Delay: Making Peace With a Stutter,” committing to paper a string of words that would have caused him trouble had he tried to say them out loud. In November 2019, Hendrickson, an editor at The Atlantic, published an article about then-presidential candidate Joe Biden, who talked frequently about “beating” his childhood stutter — a bit of hyperbole that the article finally laid to rest. Biden insisted on his redemptive narrative, even though Hendrickson, who has stuttered since he was 4, could tell when Biden repeated (“I-I-I-I-I”) or blocked (“…”) on certain sounds. The article went viral, putting Hendrickson in the position of being invited to go on television — a “nightmare,” he said on MSNBC at the time, though it did lead to a flood of letters from fellow stutterers, a number of whom he interviewed for this book. “Life on Delay” traces an arc from frustration and isolation to acceptance and community, recounting a lifetime of bullying and well-meaning but ineffectual interventions and what Hendrickson calls “hundreds of awful first impressions.” When he depicts scenes from his childhood it’s often in a real-time present tense, putting us in the room with the boy he was, more than two decades before. Hendrickson also interviews people: experts, therapists, stutterers, his own parents. He calls up his kindergarten teacher, his childhood best friend and the actress Emily Blunt. He reaches out to others who have published personal accounts of stuttering, including The New Yorker’s Nathan Heller and Katharine Preston, the author of a memoir titled “Out With It.” We learn that it’s only been since the turn of the millennium or so that stuttering has been understood as a neurological disorder; that for 75 percent of children who stutter, “the issue won’t follow them to adulthood”; that there’s still disagreement over whether “disfluency” is a matter of language or motor control, because “the research is still a bit of a mess.” © 2023 The New York Times Company

Keyword: Language; Attention
Link ID: 28643 - Posted: 01.27.2023

By Alessandra Buccella, Tomáš Dominik Imagine you are shopping online for a new pair of headphones. There is an array of colors, brands and features to look at. You feel that you can pick any model that you like and are in complete control of your decision. When you finally click the “add to shopping cart” button, you believe that you are doing so out of your own free will. But what if we told you that while you thought that you were still browsing, your brain activity had already highlighted the headphones you would pick? That idea may not be so far-fetched. Though neuroscientists likely could not predict your choice with 100 percent accuracy, research has demonstrated that some information about your upcoming action is present in brain activity several seconds before you even become conscious of your decision. As early as the 1960s, studies found that when people perform a simple, spontaneous movement, their brain exhibits a buildup in neural activity—what neuroscientists call a “readiness potential”—before they move. In the 1980s, neuroscientist Benjamin Libet reported this readiness potential even preceded a person’s reported intention to move, not just their movement. In 2008 a group of researchers found that some information about an upcoming decision is present in the brain up to 10 seconds in advance, long before people reported making the decision of when or how to act. These studies have sparked questions and debates. To many observers, these findings debunked the intuitive concept of free will. After all, if neuroscientists can infer the timing or choice of your movements long before you are consciously aware of your decision, perhaps people are merely puppets, pushed around by neural processes unfolding below the threshold of consciousness. But as researchers who study volition from both a neuroscientific and philosophical perspective, we believe that there’s still much more to this story. We work with a collaboration of philosophers and scientists to provide more nuanced interpretations—including a better understanding of the readiness potential—and a more fruitful theoretical framework in which to place them. The conclusions suggest “free will” remains a useful concept, although people may need to reexamine how they define it. © 2023 Scientific American

Keyword: Consciousness
Link ID: 28635 - Posted: 01.18.2023

By Dennis Overbye If you could change the laws of nature, what would you change? Maybe it’s that pesky speed-of-light limit on cosmic travel — not to mention war, pestilence and the eventual asteroid that has Earth’s name on it. Maybe you would like the ability to go back in time — to tell your teenage self how to deal with your parents, or to buy Google stock. Couldn’t the universe use a few improvements? That was the question that David Anderson, a computer scientist, enthusiast of the Search for Extraterrestrial Intelligence (SETI), musician and mathematician at the University of California, Berkeley, recently asked his colleagues and friends. In recent years the idea that our universe, including ourselves and all of our innermost thoughts, is a computer simulation, running on a thinking machine of cosmic capacity, has permeated culture high and low. In an influential essay in 2003, Nick Bostrom, a philosopher at the University of Oxford and director of the Future of Humanity Institute, proposed the idea, adding that it was probably an easy accomplishment for “technologically mature” civilizations wanting to explore their histories or entertain their offspring. Elon Musk, who, for all we know, is the star of this simulation, seemed to echo this idea when he once declared that there was only a one-in-a-billion chance that we lived in “base reality.” It’s hard to prove, and not everyone agrees that such a drastic extrapolation of our computing power is possible or inevitable, or that civilization will last long enough to see it through. But we can’t disprove the idea either, so thinkers like Dr. Bostrom contend that we must take the possibility seriously. In some respects, the notion of a Great Simulator is redolent of a recent theory among cosmologists that the universe is a hologram, its margins lined with quantum codes that determine what is going on inside. A couple of years ago, pinned down by the coronavirus pandemic, Dr. Anderson began discussing the implications of this idea with his teenage son. If indeed everything was a simulation, then making improvements would simply be a matter of altering whatever software program was running everything. “Being a programmer, I thought about exactly what these changes might involve,” he said in an email. © 2023 The New York Times Company

Keyword: Consciousness
Link ID: 28634 - Posted: 01.18.2023

By Oliver Whang Hod Lipson, a mechanical engineer who directs the Creative Machines Lab at Columbia University, has shaped most of his career around what some people in his industry have called the c-word. On a sunny morning this past October, the Israeli-born roboticist sat behind a table in his lab and explained himself. “This topic was taboo,” he said, a grin exposing a slight gap between his front teeth. “We were almost forbidden from talking about it — ‘Don’t talk about the c-word; you won’t get tenure’ — so in the beginning I had to disguise it, like it was something else.” That was back in the early 2000s, when Dr. Lipson was an assistant professor at Cornell University. He was working to create machines that could note when something was wrong with their own hardware — a broken part, or faulty wiring — and then change their behavior to compensate for that impairment without the guiding hand of a programmer, much as a dog that loses a leg in an accident can teach itself to walk again in a different way. This sort of built-in adaptability, Dr. Lipson argued, would become more important as we became more reliant on machines. Robots were being used for surgical procedures, food manufacturing and transportation; the applications for machines seemed pretty much endless, and any error in their functioning, as they became more integrated with our lives, could spell disaster. “We’re literally going to surrender our life to a robot,” he said. “You want these machines to be resilient.” One way to do this was to take inspiration from nature. Animals, and particularly humans, are good at adapting to changes. This ability might be a result of millions of years of evolution, as resilience in response to injury and changing environments typically increases the chances that an animal will survive and reproduce. Dr. Lipson wondered whether he could replicate this kind of natural selection in his code, creating a generalizable form of intelligence that could learn about its body and function no matter what that body looked like, and no matter what that function was. © 2023 The New York Times Company

Keyword: Consciousness; Robotics
Link ID: 28625 - Posted: 01.07.2023

By Tom Siegfried Survival of the fittest often means survival of the fastest. But fastest doesn’t necessarily mean the fastest moving. It might mean the fastest thinking. When faced with the approach of a powerful predator, for instance, a quick brain can be just as important as quick feet. After all, it is the brain that tells the feet what to do — when to move, in what direction, how fast and for how long. And various additional mental acrobatics are needed to evade an attacker and avoid being eaten. A would-be meal’s brain must decide whether to run or freeze, outrun or outwit, whether to keep going or find a place to hide. It also helps if the brain remembers where the best hiding spots are and recalls past encounters with similar predators. All in all, a complex network of brain circuitry must be engaged, and neural commands executed efficiently, to avert a predatory threat. And scientists have spent a lot of mental effort themselves trying to figure out how the brains of prey enact their successful escape strategies. Studies in animals as diverse as mice and crabs, fruit flies and cockroaches are discovering the complex neural activity — in both the primitive parts of the brain and in more cognitively advanced regions — that underlies the physical behavior guiding escape from danger and the search for safety. Lessons learned from such studies might not only illuminate the neurobiology of escape, but also provide insights into how evolution has shaped other brain-controlled behaviors. This research “highlights an aspect of neuroscience that is really gaining traction these days,” says Gina G. Turrigiano of Brandeis University, past president of the Society for Neuroscience. “And that is the idea of using ethological behaviors — behaviors that really matter for the biology of the animal that’s being studied — to unravel brain function.” © 2022 Annual Reviews

Keyword: Aggression; Attention
Link ID: 28609 - Posted: 12.24.2022

Jon Hamilton Time is woven into our personal memories. Recall a childhood fall from a bike and the brain replays the entire episode in excruciating detail: the glimpse of wet leaves on the road ahead, the moment of weightless dread, and then the painful impact. This exact sequence has been embedded in the memory, thanks to some special neurons known as time cells. When the brain detects a notable event, time cells begin a highly orchestrated performance, says Marc Howard, who directs the Brain, Behavior, and Cognition program at Boston University. "What we find is that the cells fire in a sequence," he says. "So cell one might fire immediately, but cell two waits a little bit, followed by cell three, cell four, and so on." As each cell fires, it places a sort of time stamp on an unfolding experience. And the same cells fire in the same order when we retrieve a memory of the experience, even something mundane. "If I remember being in my kitchen and making a cup of coffee," Howard says, "the time cells that were active at that moment are re-activated." They recreate the grinder's growl, the scent of Arabica, the curl of steam rising from a fresh mug – and your neurons replay these moments in sequence every time you summon the memory. This system appears to explain how we are able to virtually travel back in time, and play mental movies of our life experiences. There are also hints that time cells play a critical role in imagining future events. Without time cells, our memories would lack order. In an experiment at the University of California, San Diego, scientists gave several groups of people a tour of the campus. The tour included 11 planned events, including finding change in a vending machine and drinking from a water fountain. © 2022 npr

Keyword: Attention; Learning & Memory
Link ID: 28608 - Posted: 12.21.2022
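Howard's description of time cells firing in a fixed order can be illustrated with a toy population model. This is only a sketch under assumed numbers (ten cells with Gaussian tuning, preferred latencies spaced half a second apart), not a model drawn from the research itself; it shows how staggered firing fields can time-stamp moments within an episode.

```python
import numpy as np

n_cells, width = 10, 0.3                 # tuning width in seconds (assumed)
peaks = np.arange(n_cells) * 0.5         # preferred latencies: 0.0, 0.5, ..., 4.5 s

def population_response(t):
    """Firing rates of all time cells at elapsed time t (Gaussian tuning)."""
    return np.exp(-0.5 * ((t - peaks) / width) ** 2)

# Because the cells fire in a fixed order, the identity of the most active
# cell acts as a time stamp: reading the population recovers when within
# the episode each moment occurred, both at encoding and at replay.
for t in (0.2, 1.1, 2.6, 4.0):
    rates = population_response(t)
    print(f"t = {t:.1f} s -> cell {rates.argmax()} most active")
```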

By Gary Stix Can the human brain ever really understand itself? The problem of gaining a deep knowledge of the subjective depths of the conscious mind is such a hard problem that it has in fact been named the hard problem. The human brain is impressively powerful. Its 100 billion neurons are connected by 100 trillion wirelike fibers, all squeezed into three pounds of squishy flesh lodged below a helmet of skull. Yet we still don’t know whether this organ will ever be able to muster the requisite smarts to hack the physical processes that underlie the ineffable “quality of deep blue” or “the sensation of middle C,” as philosopher David Chalmers put it when giving examples of the “hard problem” of consciousness, a term he invented, in a 1995 paper. This past year did not uncover a solution to the hard problem, and one may not be forthcoming for decades, if ever. But 2022 did witness plenty of surprises and advances in understanding the brain that do not require a complete explanation of consciousness. Such incrementalism could be seen in mid-November, when a crowd of more than 24,000 attendees of the annual Society for Neuroscience meeting gathered in San Diego, Calif. The event was a tribute of sorts to reductionism—the breaking down of hard problems into simpler knowable entities. At the event, there were reports of an animal study of a brain circuit that encodes social trauma and a brain-computer interface that lets a severely paralyzed person mentally spell out letters to form words.

Your Brain Has a Thumbs-Up–Thumbs-Down Switch

When neuroscientist Kay Tye was pursuing her Ph.D., she was told a chapter on emotion was inappropriate for her thesis. Emotion just wasn’t accepted as an integral, intrinsic part of behavioral neuroscience, her field of study. That didn’t make any sense to Tye. She decided to go her own way to become a leading researcher on feelings. This year Tye co-authored a Nature paper that reported on a kind of molecular switch in rodents that flags an experience as either good or bad. If human brains operate the same way as the brains of the mice in her lab, a malfunctioning thumbs-up–thumbs-down switch might explain some cases of depression, anxiety and addiction.

Keyword: Consciousness
Link ID: 28601 - Posted: 12.17.2022

By Yasemin Saplakoglu Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it? “The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.” Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level. When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing. Simons Foundation © 2022

Keyword: Attention; Vision
Link ID: 28597 - Posted: 12.15.2022

Researchers at the National Institutes of Health have successfully identified differences in gene activity in the brains of people with attention deficit hyperactivity disorder (ADHD). The study, led by scientists at the National Human Genome Research Institute (NHGRI), part of NIH, found that individuals diagnosed with ADHD had differences in genes that code for chemical messengers that brain cells use to communicate. The findings, published in Molecular Psychiatry, show how genomic differences might contribute to symptoms. To date, this is the first study to use postmortem human brain tissue to investigate ADHD. Other approaches to studying mental health conditions include non-invasively scanning the brain, which allows researchers to examine the structure and activation of brain areas. However, these studies lack information at the level of genes and how they might influence cell function and give rise to symptoms. The researchers used a genomic technique called RNA sequencing to probe how specific genes are turned on or off, also known as gene expression. They studied two connected brain regions associated with ADHD: the caudate and the frontal cortex. These regions are known to be critical in controlling a person’s attention. Previous research found differences in the structure and activity of these brain regions in individuals with ADHD. As one of the most common mental health conditions, ADHD affects about 1 in 10 children in the United States. Diagnosis often occurs during childhood, and symptoms may persist into adulthood. Individuals with ADHD may be hyperactive and have difficulty concentrating and controlling impulses, which may affect their ability to complete daily tasks and to focus at school or work. With technological advances, researchers have been able to identify genes associated with ADHD, but they had not been able to determine how genomic differences in these genes act in the brain to contribute to symptoms until now.

Keyword: ADHD; Genes & Behavior
Link ID: 28559 - Posted: 11.19.2022

By Jim Davies Living for the moment gets a bad rap. If you’re smart, people say, you should work toward a good future, sacrificing fun and pleasure in the present. Yet there are good reasons to discount the future, which is why economists tend to do it when making predictions. Would you rather find $5 when you’re in elementary school, or in your second marriage? People tend to get richer as they age. Five dollars simply means more to you when you’re 9 than when you’re 49. Also, the future is uncertain. We can’t always trust there’ll be one. It’s likely some kids in Walter Mischel’s famous “marshmallow experiment”—which asked kids to wait to eat a marshmallow to get another one—didn’t actually believe that the experimenter would come through with the second marshmallow, and so ate the first marshmallow right away. Saving for retirement makes no sense if in five years a massive meteor cuts human civilization short. Economists call this the “catastrophe” or “hazard” rate. For Sangil “Arthur” Lee, a psychologist at the University of California, Berkeley, where he’s a postdoc, a hazard rate makes sense from an evolutionary perspective. “You might not survive until next winter, so there is some inherent trade off that you need to make, which is not only specific for humans, but also for animals,” he said. While an undergraduate, Lee experimented with delay-discounting tasks using pigeons. The pigeons would peck one button to get a small amount of pellets now, or peck a different button to get large amounts of pellets later. “What we know,” Lee said, “is that across pigeons, monkeys, rats, and various animals, they also discount future rewards in pretty much a similar way that humans do, which is this sort of hyperbolic fashion.” We discount future rewards steeply at first, more steeply than exponential discounting would predict, but the hyperbolic discount rate eases as the delay grows. What makes us discount the future? Lee, in a new study with his colleagues, pins it at least partly on our powers of imagination.1 When we think about what hasn’t yet happened, it tends to be abstract. Things right now, on the other hand, we think of in more tangible terms. Several behavioral studies have supported the idea that what we cannot clearly imagine, we value less. © 2022 NautilusThink Inc. All rights reserved.

Keyword: Attention
Link ID: 28552 - Posted: 11.16.2022
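The hyperbolic shape Lee refers to is conventionally written V = A / (1 + kD) for a reward of amount A at delay D, in contrast to exponential discounting, V = A * exp(-r * D). The sketch below compares the two curves; the parameter values k and r are illustrative choices (not estimates from Lee's study), picked so the characteristic pattern is visible: the hyperbolic curve falls faster at short delays but levels off at long ones.

```python
import math

A = 100.0           # reward amount
k, r = 0.5, 0.05    # discount parameters, assumed for illustration only

print(f"{'delay':>6} {'hyperbolic':>11} {'exponential':>12}")
for delay in (0, 1, 5, 10, 30, 90, 365):
    hyperbolic = A / (1 + k * delay)        # steep early, shallow late
    exponential = A * math.exp(-r * delay)  # constant proportional decay
    print(f"{delay:6d} {hyperbolic:11.2f} {exponential:12.2f}")
```

With these parameters, the hyperbolic value collapses from 100 to about 67 after a single time step while the exponential value barely moves, yet at a delay of 365 the hyperbolic value (about 0.55) still exceeds the essentially zero exponential value: discount a lot very quickly, then ease off.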

By Jan Claassen, Brian L. Edlow A medical team surrounded Maria Mazurkevich’s hospital bed, all eyes on her as she did … nothing. Mazurkevich was 30 years old and had been admitted to New York–Presbyterian Hospital at Columbia University on a blisteringly hot July day in New York City. A few days earlier, at home, she had suddenly fallen unconscious. She had suffered a ruptured blood vessel in her brain, and the bleeding area was putting tremendous pressure on critical brain regions. The team of nurses and physicians at the hospital’s neurological intensive care unit was looking for any sign that Mazurkevich could hear them. She was on a mechanical ventilator to help her breathe, and her vital signs were stable. But she showed no signs of consciousness. Mazurkevich’s parents, also at her bed, asked, “Can we talk to our daughter? Does she hear us?” She didn’t appear to be aware of anything. One of us (Claassen) was on her medical team, and when he asked Mazurkevich to open her eyes, hold up two fingers or wiggle her toes, she remained motionless. Her eyes did not follow visual cues. Yet her loved ones still thought she was “in there.” She was. The medical team gave her an EEG—placing sensors on her head to monitor her brain’s electrical activity—while they asked her to “keep opening and closing your right hand.” Then they asked her to “stop opening and closing your right hand.” Even though her hands themselves didn’t move, her brain’s activity patterns differed between the two commands. These brain reactions clearly indicated that she was aware of the requests and that those requests were different. And after about a week, her body began to follow her brain. Slowly, with minuscule responses, Mazurkevich started to wake up. Within a year she recovered fully without major limitations to her physical or cognitive abilities. She is now working as a pharmacist. © 2022 Scientific American,

Keyword: Consciousness; Brain imaging
Link ID: 28527 - Posted: 10.26.2022
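The bedside test in this story (EEG activity patterns that differed between "keep opening and closing your right hand" and "stop") rests on classifying brain activity recorded under the two commands. The sketch below shows the general shape of such an analysis on synthetic data. It is a hedged illustration, not the Columbia team's actual pipeline: the single channel, band choices, and use of scikit-learn are all assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def band_power(epochs, fs, lo, hi):
    """Mean spectral power in [lo, hi] Hz for each epoch (trials x samples)."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[:, band].mean(axis=1)

# Synthetic stand-in data: 40 two-second epochs of one EEG channel at 250 Hz,
# half recorded during "keep moving" commands (label 0), half during "stop" (1).
fs, n_epochs, n_samples = 250, 40, 500
rng = np.random.default_rng(0)
epochs = rng.normal(size=(n_epochs, n_samples))
labels = np.repeat([0, 1], n_epochs // 2)

# Features: power in conventional EEG bands (delta, theta, alpha, beta).
bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
X = np.column_stack([band_power(epochs, fs, lo, hi) for lo, hi in bands])

# If cross-validated accuracy reliably beats chance, the EEG carries
# information distinguishing the two commands even though the patient
# never visibly moves (so-called cognitive motor dissociation).
scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```

On this random stand-in data the accuracy hovers near chance (0.5); in a patient who is covertly following commands, the same analysis would score reliably above chance, which is the signature the clinical studies look for.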

By Fenit Nirappil A national shortage of Adderall has left patients who rely on the pills for attention-deficit/hyperactivity disorder scrambling to find alternative treatments and uncertain whether they will be able to refill their medication. The Food and Drug Administration announced the shortage last week, saying that one of the largest producers is experiencing “intermittent manufacturing delays” and that other makers cannot keep up with demand. Some patients say the announcement was a belated acknowledgment of a reality they have faced for months — pharmacies unable to fill their orders and anxiety about whether they will run out of a medication needed to manage their daily lives. Experts say it is often difficult for patients to access Adderall, a stimulant that is tightly regulated as a controlled substance because of high potential for abuse. Medication management generally requires monthly doctor visits. There have been other shortages in recent years. “This one is more sustained,” said Timothy Wilens, an ADHD expert and chief of child and adolescent psychiatry at Massachusetts General Hospital who said access issues stretch back to spring. “It’s putting pressure on patients, and it’s putting pressure on institutions that support the patients.” Erik Gude, a 28-year-old chef who lives in Atlanta, experiences regular challenges filling his Adderall prescription, whether it’s pharmacies not carrying generic versions or disputes with insurers. He has been off the medication for a month after his local pharmacy ran out.

Keyword: ADHD; Drug Abuse
Link ID: 28520 - Posted: 10.22.2022