Chapter 18. Attention and Higher Cognition


By Emily Cataneo If you could upload your consciousness to the cloud and live forever as a mind in the metaverse, would you do it? Think carefully before answering. In “Feeling & Knowing: Making Minds Conscious,” neuroscientist Antonio Damasio argues that consciousness is far more than an algorithmic process. Uploading your consciousness to the cloud, he says, would be like experiencing a meal by reading a recipe rather than by eating. So then what is consciousness? That’s the question at the heart of this book. Damasio is a professor of neuroscience, philosophy, and psychology and the director of the Brain and Creativity Institute at the University of Southern California, Los Angeles, as well as the author of the 2018 book “The Strange Order of Things,” in which he extols the power of homeostasis, the force that keeps all living beings in equilibrium and therefore alive. Consciousness is such a slippery and ephemeral concept that it doesn’t even have its own word in many Romance languages, but nevertheless it’s a hot topic these days. “Feeling & Knowing” is the result of Damasio’s editor’s request to weigh in on the subject by writing a very short, very focused book. Over 200 pages, Damasio ponders profound questions: How did we get here? How did we develop minds with mental maps, a constant stream of images, and memories — mechanisms that exist symbiotically with the feelings and sensations in our bodies that we then, crucially, relate back to ourselves and associate with a sense of personhood?

Keyword: Consciousness
Link ID: 28096 - Posted: 12.04.2021

To eavesdrop on a brain, one of the best tools neuroscientists have is the fMRI scan, which helps map blood flow, and therefore the spikes in oxygen that occur whenever a particular brain region is being used. It reveals a noisy world. Blood oxygen levels vary from moment to moment, but those spikes never totally flatten out. “Your brain, even resting, is not going to be completely silent,” says Poortata Lalwani, a PhD student in cognitive neuroscience at the University of Michigan. She imagines the brain, even at its most tranquil, as kind of like a tennis player waiting to return a serve: “He’s not going to be standing still. He’s going to be pacing a little bit, getting ready to hit the backhand.” Many fMRI studies filter out that noise to find the particular spikes researchers want to scrutinize. But for Lalwani, that noise is the most telling signal of all. To her, it’s a signal of cognitive flexibility. Young, healthy brains tend to have signals with a lot of variability in blood oxygen levels from moment to moment. Older ones vary less, at least in certain regions of the brain. About a decade ago, scientists first showed the link between low neural signal variability and the kind of cognitive decline that accompanies healthy aging, rather than specific dementias. A brain’s noisiness is a solid proxy for details that are more abstract, Lalwani says: “How efficient information transfer is, how well-connected the neural networks are, in general how well-functioning the underlying neural network is.” But why that change happens with age has been a mystery. So has the question of whether it’s reversible. © 2021 Condé Nast.

Keyword: Attention; Alzheimers
Link ID: 28091 - Posted: 11.24.2021

David Robson Michelle Carr is frequently plagued by tidal waves in her dreams. What should be a terrifying nightmare, however, can quickly turn into a whimsical adventure – thanks to her ability to control her dreams. She can transform herself into a dolphin and swim into the water. Once, she transformed the wave itself, turning it into a giant snail with a huge shell. “It came right up to me – it was a really beautiful moment.” There’s a thriving online community of people who are now trying to learn how to lucid dream. (A single subreddit devoted to the phenomenon has more than 400,000 members.) Many are simply looking for entertainment. “It’s just so exciting and unbelievable to be in a lucid dream and to witness your mind creating this completely vivid simulation,” says Carr, who is a sleep researcher at the University of Rochester in New York state. Others hope that exercising skills in their dreams will increase their real-life abilities. “A lot of elite athletes use lucid dreams to practise their sport.” And there are more profound reasons to exploit this sleep state, besides personal improvement. By identifying the brain activity that gives rise to the heightened awareness and sense of agency in lucid dreams, neuroscientists and psychologists hope to answer fundamental questions about the nature of human consciousness, including our apparently unique capacity for self-awareness. “More and more researchers, from many different fields, have started to incorporate lucid dreams in their research,” says Carr. This interest in lucid dreaming has been growing in fits and starts for more than a century. Despite his fascination with the interaction between the conscious and subconscious minds, Sigmund Freud barely mentioned lucid dreams in his writings. Instead, it was an English aristocrat and writer, Mary Arnold-Forster, who provided one of the earliest and most detailed descriptions in the English language in her book Studies in Dreams. 
© 2021 Guardian News & Media Limited

Keyword: Sleep; Consciousness
Link ID: 28079 - Posted: 11.17.2021

Sirin Kale Claudia*, a sailor from Lichfield in her late 30s, is not Italian. She has never been to Italy. She has no Italian family or friends. And she has no idea why a belligerent Italian couple have taken over her inner voice, duking it out in Claudia’s brain while she sits back and listens. “I have no idea where this has come from,” says Claudia, apologetically. “It’s probably offensive to Italians.” The couple are like the family in the Dolmio pasta sauce adverts: flamboyant, portly, prone to waving their hands and shouting. If Claudia has a big decision to make in her life, the Italians take over. “They passionately argue either side,” Claudia says. “It’s really useful because I let them do the work, so I don’t get stressed out by it.” These disagreements always take place in a kitchen, surrounded by food. Claudia hasn’t given the Italians names – yet. But they did help Claudia make a major life decision, encouraging her to quit her job as a scientist two years ago and fulfil a lifelong dream of running away to sea. “They were chatting non-stop before I handed in my notice,” Claudia sighs. “I’d wake up and they’d be arguing. I’d be driving to work and they’d be arguing. It was exhausting, to be honest.” The woman was in favour of Claudia going, but her husband was wary. “He’d be saying: ‘It’s a stable job!’ And she’d go: ‘Let her enjoy life!’” The woman prevailed, and Claudia left to work on a flotilla in Greece (although she’s now back in the UK temporarily, due to Covid). She’s much happier, even if she did have to have neurolinguistic programming to get the shouting to calm down. “They’re quieter now,” Claudia says with relief. “Less shouting. They just bicker.” Most of us have an inner voice: that constant presence that tells you to “Watch out” or “Buy shampoo” or “Urgh, this guy’s a creep”. For many of us, this voice sounds much like our own, or at least how we think we sound. 
But for some people, their inner voice isn’t a straightforward monologue that reproaches, counsels and reminds. Their inner voice is a squabbling Italian couple, say, or a calm-faced interviewer with their hands folded on their lap. Or it’s a taste, feeling, sensation or colour. In some cases, there isn’t a voice at all, just silence. © 2021 Guardian News & Media Limited

Keyword: Consciousness; Schizophrenia
Link ID: 28053 - Posted: 10.27.2021

Catherine Offord Earlier this year, Brian Butterworth decided to figure out how many numbers the average person encounters in a day. He picked a Saturday for his self-experiment—as a cognitive neuroscientist and professor emeritus at University College London, Butterworth works with numbers, so a typical weekday wouldn’t have been fair. He went about his day as usual, but kept track of how frequently he saw or heard a number, whether that was a symbol, such as 4 or 5, or a word such as “four” or “five.” He flicked through the newspaper, listened to the radio, popped out for a bit of shopping (taking special note of price tags and car license plates), and then, at last, sat down to calculate a grand total. “Would you like to take a guess?” he asks me when we speak over Zoom a couple of weeks later. I hazard that it’s well into the hundreds, but admit I’ve never thought about it before. He says: “I reckoned that I experienced about a thousand numbers an hour. A thousand numbers an hour is sixteen thousand numbers a day, is about five or six million a year. . . . That’s an awful lot of numbers.” Butterworth didn’t conduct his thought experiment just to satisfy his own curiosity. He’s including the calculation in an upcoming book, Can Fish Count?, slated for publication next year. In it, he argues that humans and other animals are constantly exposed to and make use of numbers—not just in the form of symbols and words, but as quantities of objects, of events, and of abstract concepts. Butterworth is one of several researchers who believe that the human brain can be thought of as having a “sense” for number, and that we, like our evolutionary ancestors, are neurologically hardwired to perceive all sorts of quantities in our environments, whether that serves for selecting the bush with more fruit on it, recognizing when a few predators on the horizon become too many, or telling from a show of hands when a consensus has been reached. © 1986–2021 The Scientist.

Keyword: Attention
Link ID: 28051 - Posted: 10.27.2021

By Kate Conger, Kellen Browning and Erin Woo A 27-year-old YouTube star, prodded by her millions of followers with concerns about her health. A 19-year-old TikTok creator who features posts about being skinny. Teen communities throughout the internet, cleverly naming and culling their discussions to avoid detection. They present a nearly intractable problem for social media companies under pressure to do something about material on their services that many people believe is causing harm, particularly to teenagers. Those concerns came into sharp focus in recent weeks in a pair of Senate subcommittee hearings: the first featuring a Facebook executive defending her company, and the second featuring a former Facebook employee turned whistle-blower who bluntly argued that her former employer’s products drove some young people toward eating disorders. The hearings were prompted in part by a Wall Street Journal article that detailed how internal Facebook research showed Instagram, which is owned by Facebook, can make body image issues worse for some young people. On Tuesday, executives from YouTube, TikTok and Snapchat are scheduled to testify before a Senate subcommittee about the effects of their products on children. They are expected to face questions about how they moderate content that might encourage disordered eating, and how their algorithms might promote such content. “Big Tech’s exploiting these powerful algorithms and design features is reckless and heedless, and needs to change,” Senator Richard Blumenthal, a Democrat of Connecticut and the chair of the subcommittee, said in a statement. “They seize on the insecurities of children, including eating disorders, simply to make more money.” But what exactly can be done about that content — and why people create it in the first place — may defy easy answers. If creators say they don’t intend to glamorize eating disorders, should their claims be taken at face value? 
Or should the companies listen to users complaining about them? © 2021 The New York Times Company

Keyword: Anorexia & Bulimia; Attention
Link ID: 28049 - Posted: 10.23.2021

By Jamie Friedlander Serrano My dad was planning a trip to Cannon Beach, a small coastal town in Oregon that I love. Yet when I sat down to email him some recommendations, I drew a blank. I couldn’t remember the name of the state park we visited or the breakfast spot we adored. Even the name of the hotel we stayed at eluded me. Since giving birth to my year-old daughter, I’ve had countless moments like this. I have trouble recalling words, forget to respond to text messages, and even missed an appointment. What I’m experiencing is often called “mommy brain” — the forgetful, foggy and scatterbrained feeling many pregnant women and new mothers experience. But is mommy brain real? Anecdotally, yes. Ask any new mom if she has felt the above, and she'll likely say she has — as many as 80 percent of new moms report feelings of mommy brain. Scientifically, it also appears the answer is yes: A growing body of research supports the argument that moms' brains change during pregnancy and after giving birth. A clear explanation for the phenomenon still remains somewhat elusive, however. There are countless variables that experts say contribute to mommy brain, such as fluctuating hormones postpartum, sleep deprivation in dealing with a new baby, anxiety over new parenthood, elevated stress levels, and the general upending of lives that having a baby forces. Put together, it’s only natural that changes in mental processing would occur, says Moriah Thomason, Barakett associate professor of child and adolescent psychiatry at New York University School of Medicine. When our brain needs to make space for a new priority — keeping a baby alive — remembering a grocery list takes a back seat. “Does it mean that you literally cannot do those things that you used to do as well? Probably not,” she says. “It’s just not the most important thing for you to be accessing.” © 1996-2021 The Washington Post

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 28033 - Posted: 10.13.2021

Annie Melchor After finishing his PhD in neuroscience in 2016, Thomas Andrillon spent a year road-tripping around Africa and South America with his wife. One evening, on a particularly difficult road in Patagonia, his mind began to wander and he ended up accidentally flipping the car. Luckily, no one was hurt. As locals rushed in to help, they asked Andrillon what had happened. Was there an animal on the road? Had he fallen asleep at the wheel? “I had difficulty explaining that I was just thinking about something else,” he remembers. This experience made him think. What had happened? What was going on in his brain when his mind began to wander? In 2017, Andrillon started his postdoctoral research with neuroscientists Naotsugu Tsuchiya and Joel Pearson at Monash University in Melbourne. Shortly after, Tsuchiya and Andrillon teamed up with philosopher Jennifer Windt, also at Monash, to dive into the neural basis of mind wandering. Initially, Andrillon says, they wanted to know if they could detect mind wandering from facial expressions, recalling how teachers claim to be very good at knowing when their students are not paying attention. So they did a pilot experiment in which they filmed their test subjects performing a tedious, repetitive task. After reviewing the videos, one of Andrillon’s students came to him, concerned. “I think we have a problem,” said the student. “[The subjects] look exhausted.” Sure enough, even though all the study participants were awake, they were obviously struggling to not fall asleep, says Andrillon. It was this observation that gave them the idea to broaden their focus, and start looking at the connection between wavering attention and sleep. © 1986–2021 The Scientist.

Keyword: Attention; Sleep
Link ID: 28016 - Posted: 10.02.2021

Jordana Cepelewicz Neuroscientists are the cartographers of the brain’s diverse domains and territories — the features and activities that define them, the roads and highways that connect them, and the boundaries that delineate them. Toward the front of the brain, just behind the forehead, is the prefrontal cortex, celebrated as the seat of judgment. Behind it lies the motor cortex, responsible for planning and coordinating movement. To the sides: the temporal lobes, crucial for memory and the processing of emotion. Above them, the somatosensory cortex; behind them, the visual cortex. Not only do researchers often depict the brain and its functions much as mapmakers might draw nations on continents, but they do so “the way old-fashioned mapmakers” did, according to Lisa Feldman Barrett, a psychologist at Northeastern University. “They parse the brain in terms of what they’re interested in psychologically or mentally or behaviorally,” and then they assign the functions to different networks of neurons “as if they’re Lego blocks, as if there are firm boundaries there.” But a brain map with neat borders is not just oversimplified — it’s misleading. “Scientists for over 100 years have searched fruitlessly for brain boundaries between thinking, feeling, deciding, remembering, moving and other everyday experiences,” Barrett said. A host of recent neurological studies further confirm that these mental categories “are poor guides for understanding how brains are structured or how they work.” Neuroscientists generally agree about how the physical tissue of the brain is organized: into particular regions, networks, cell types. But when it comes to relating those to the task the brain might be performing — perception, memory, attention, emotion or action — “things get a lot more dodgy,” said David Poeppel, a neuroscientist at New York University. All Rights Reserved © 2021

Keyword: Brain imaging; Attention
Link ID: 27963 - Posted: 08.25.2021

Tim Adams For centuries, philosophers have theorised about the mind-body question, debating the relationship between the physical matter of the brain and the conscious mental activity it somehow creates. Even with advances in neuroscience and brain imaging techniques, large parts of that fundamental relationship remain stubbornly mysterious. It was with good reason that, in 1995, the cognitive scientist David Chalmers coined the term “the hard problem” to describe the question of exactly how our brains conjure subjective conscious experience. Some philosophers continue to insist that mind is inherently distinct from matter. Advances in understanding how the brain functions undermine those ideas of dualism, however. Anil Seth, professor of cognitive and computational neuroscience at the University of Sussex, is at the leading edge of that latter research. His Ted talk on consciousness has been viewed more than 11m times. His new book, Being You, proposes an idea of the human mind as a “highly evolved prediction machine”, rooted in the functions of the body and “constantly hallucinating the world and the self” to create reality. One of the things that I liked about your approach in the book was the way that many of the phenomena you investigate arise out of your experience. For example, the feeling of returning to consciousness after anaesthesia or how your mother, experiencing delirium, was no longer recognisably herself. Do you think it’s always important to keep that real-world framework in mind? The reason I’m interested in consciousness is intrinsically personal. I want to understand myself and, by extension, others. But I’m also super-interested for example in developing statistical models and mathematical methods for characterising things such as emergence [behaviour of the mind as a whole that exceeds the capability of its individual parts] and there is no personal component in that. © 2021 Guardian News & Media Limited

Keyword: Consciousness; Attention
Link ID: 27962 - Posted: 08.25.2021

By Katherine Ellison Jessica McCabe crashed and burned at 30, when she got divorced, dropped out of community college and moved in with her mother. Eric Tivers had 21 jobs before age 21. Both have been diagnosed with attention-deficit/hyperactivity disorder, and both today are entrepreneurs who wear their diagnoses — and rare resilience — on their sleeves. With YouTube videos, podcasts and tweets, they’ve built online communities aimed at ending the shame that so often makes having ADHD so much harder. Now they’re going even further, asking: Why not demand more than mere compassion? Why not seek deeper changes to create a more ADHD-friendly world? “I’ve spent the last five or six years trying to understand how my brain works so that I could conform, but now I’m starting to evolve,” says McCabe, 38, whose chipper, NASCAR-speed delivery has garnered 742,000 subscribers — and counting — to her YouTube channel, “How to ADHD.” “I think we no longer have to accept that we live in a world that is not built for our brains.” With Tivers, she is planning a virtual summit on the topic for next May. As a first step, with the help of Canadian cognitive scientist Deirdre Kelly, she says she’ll soon release new guidelines to assess products and services for their ADHD friendliness. Computer programs that help restless users meditate and a chair that accommodates a variety of seated positions are high on the list to promote, while error-prone apps or devices will be flagged. Kelly also envisions redesigning refrigerator vegetable drawers, so that the most nutritious food will no longer be out of sight and mind. In the past two decades, the world has become much kinder to the estimated 6.1 million children and approximately 10 million adults with ADHD, whose hallmark symptoms are distraction, forgetfulness and impulsivity. Social media has made all the difference.

Keyword: ADHD
Link ID: 27960 - Posted: 08.25.2021

By Christiane Gelitz, Maddie Bender To a chef, the sounds of lip smacking, slurping and swallowing are the highest form of flattery. But to someone with a certain type of misophonia, these same sounds can be torturous. Brain scans are now helping scientists start to understand why. People with misophonia experience strong discomfort, annoyance or disgust when they hear particular triggers. These can include chewing, swallowing, slurping, throat clearing, coughing and even audible breathing. Researchers previously thought this reaction might be caused by the brain overactively processing certain sounds. Now, however, a new study published in the Journal of Neuroscience has linked some forms of misophonia to heightened “mirroring” behavior in the brain: those affected feel distress while their brains act as if they are mimicking the triggering mouth movements. “This is the first breakthrough in misophonia research in 25 years,” says psychologist Jennifer J. Brout, who directs the International Misophonia Research Network and was not involved in the new study. The research team, led by Newcastle University neuroscientist Sukhbinder Kumar, analyzed brain activity in people with and without misophonia when they were at rest and while they listened to sounds. These included misophonia triggers (such as chewing), generally unpleasant sounds (like a crying baby), and neutral sounds. The brain's auditory cortex, which processes sound, reacted similarly in subjects with and without misophonia. But in both the resting state and listening trials, people with misophonia showed stronger connections between the auditory cortex and brain regions that control movements of the face, mouth and throat. Kumar found this connection became most active in participants with misophonia when they heard triggers specific to the condition. © 2021 Scientific American,

Keyword: Hearing; Attention
Link ID: 27955 - Posted: 08.21.2021

By John Horgan In my 20s, I had a friend who was brilliant, charming, Ivy-educated and rich, heir to a family fortune. I’ll call him Gallagher. He could do anything he wanted. He experimented, dabbling in neuroscience, law, philosophy and other fields. But he was so critical, so picky, that he never settled on a career. Nothing was good enough for him. He never found love for the same reason. He also disparaged his friends’ choices, so much so that he alienated us. He ended up bitter and alone. At least that’s my guess. I haven’t spoken to Gallagher in decades. There is such a thing as being too picky, especially when it comes to things like work, love and nourishment (even the pickiest eater has to eat something). That’s the lesson I gleaned from Gallagher. But when it comes to answers to big mysteries, most of us aren’t picky enough. We settle on answers for bad reasons, for example, because our parents, priests or professors believe it. We think we need to believe something, but actually we don’t. We can, and should, decide that no answers are good enough. We should be agnostics. Some people confuse agnosticism (not knowing) with apathy (not caring). Take Francis Collins, a geneticist who directs the National Institutes of Health. He is a devout Christian, who believes that Jesus performed miracles, died for our sins and rose from the dead. In his 2006 bestseller The Language of God, Collins calls agnosticism a “cop-out.” When I interviewed him, I told him I am an agnostic and objected to “cop-out.” © 2021 Scientific American

Keyword: Consciousness
Link ID: 27952 - Posted: 08.18.2021

By Katherine Ellison ADHD — the most common psychiatric disorder of childhood —  lasts longer for more people than has been widely assumed, according to new research. “Only 10 percent of people really appear to grow out of ADHD,” says the lead author, psychologist Margaret Sibley, associate professor of psychiatry and behavioral sciences at the University of Washington School of Medicine. “Ninety percent still struggle with at least mild symptoms as adults — even if they have periods when they are symptom free.” The study challenges a widely persistent perception of a time-limited condition occurring mostly in childhood. Indeed, one of the earliest names for attention deficit/hyperactivity disorder was “a hyperkinetic disease of infancy,” while its most common poster child has long been a young, White, disruptive male. Previous research has suggested the condition essentially vanishes in about half of those who receive diagnoses. But in recent years, increasing numbers of women, people of color and especially adults have been seeking help in managing the hallmark symptoms of distraction, forgetfulness and impulsivity. By the most recent estimates, 9.6 percent of children ages 3 to 17 have been diagnosed with ADHD. Yet researchers report that only 4.4 percent of young adults ages 18 to 44 have the disorder, suggesting that if the new estimates are valid, there may be some catching up to do. Sibley’s paper paints a picture of an on-again, off-again condition, with symptoms fluctuating depending on life circumstances. © 1996-2021 The Washington Post

Keyword: ADHD
Link ID: 27946 - Posted: 08.14.2021

By Christina Caron Q: How common is adult A.D.H.D.? What are the symptoms and is it possible for someone who was not diagnosed with it as a child to be diagnosed as an adult? A: Attention deficit hyperactivity disorder, or A.D.H.D., is a neurodevelopmental disorder often characterized by inattention, disorganization, hyperactivity and impulsivity. It is one of the most common mental health disorders. According to the World Federation of A.D.H.D., it is thought to occur in nearly 6 percent of children and 2.5 percent of adults. In the United States, 5.4 million children, or about 8 percent of all U.S. children ages 3 to 17, were estimated to have A.D.H.D. in 2016, the Centers for Disease Control and Prevention reported. For decades, experts believed that A.D.H.D. occurred only among children and ended after adolescence. But a number of studies in the ’90s showed that A.D.H.D. can continue into adulthood. Experts now say that at least 60 percent of children with A.D.H.D. will also have symptoms as adults. It’s not surprising that so many people are now wondering whether they might have the disorder, especially if their symptoms were exacerbated by the pandemic. The Attention Deficit Disorder Association, an organization founded in 1990 for adults with A.D.H.D, saw its membership nearly double between 2019 and 2021. In addition, Children and Adults With Attention-Deficit/Hyperactivity Disorder, or CHADD, reported that the highest proportion of people who call their A.D.H.D. help line are adults seeking guidance and resources for themselves. © 2021 The New York Times Company

Keyword: ADHD
Link ID: 27933 - Posted: 08.07.2021

By Christof Koch Consider the following experiences: • You're headed toward a storm that's a couple of miles away, and you've got to get across a hill. You ask yourself: “How am I going to get over that, through that?” • You see little white dots on a black background, as if looking up at the stars at night. • You look down at yourself lying in bed from above but see only your legs and lower trunk. These may seem like idiosyncratic events drawn from the vast universe of perceptions, sensations, memories, thoughts and dreams that make up our daily stream of consciousness. In fact, each one was evoked by directly stimulating the brain with an electrode. As American poet Walt Whitman intuited in his poem “I Sing the Body Electric,” these anecdotes illustrate the intimate relationship between the body and its animating soul. The brain and the conscious mind are as inexorably linked as the two sides of a coin. Recent clinical studies have uncovered some of the laws and regularities of conscious activity, findings that have occasionally proved to be paradoxical. They show that brain areas involved in conscious perception have little to do with thinking, planning and other higher cognitive functions. Neuroengineers are now working to turn these insights into technologies to replace lost cognitive function and, in the more distant future, to enhance sensory, cognitive or memory capacities. For example, a recent brain-machine interface provides completely blind people with limited abilities to perceive light. These tools, however, also reveal the difficulties of fully restoring sight or hearing. They underline even more the snags that stand in the way of sci-fi-like enhancements that would enable access to the brain as if it were a computer storage drive. © 2021 Scientific American,

Keyword: Consciousness
Link ID: 27865 - Posted: 06.19.2021

Christopher M. Filley One of the most enduring themes in human neuroscience is the association of higher brain functions with gray matter. In particular, the cerebral cortex—the gray matter of the brain's surface—has been the primary focus of decades of work aiming to understand the neurobiological basis of cognition and emotion. Yet, the cerebral cortex is only a few millimeters thick, so the relative neglect of the rest of the brain below the cortex has prompted the term “corticocentric myopia” (1). Other regions relevant to behavior include the deep gray matter of the basal ganglia and thalamus, the brainstem and cerebellum, and the white matter that interconnects all of these structures. On page 1304 of this issue, Zhao et al. (2) present compelling evidence for the importance of white matter by demonstrating genetic influences on structural connectivity that invoke a host of provocative clinical implications. Insight into the importance of white matter in human behavior begins with its anatomy (3–5) (see the figure). White matter occupies about half of the adult human brain, and some 135,000 km of myelinated axons course through a wide array of tracts to link gray matter regions into distributed neural networks that serve cognitive and emotional functions (3). The human brain is particularly well interconnected because white matter has expanded more in evolution than gray matter, which has endowed the brain of Homo sapiens with extensive structural connectivity (6). The myelin sheath, white matter's characteristic feature, appeared late in vertebrate evolution and greatly increased axonal conduction velocity. This development enhanced the efficiency of distributed neural networks, expanding the transfer of information throughout the brain (5). Information transfer serves to complement the information processing of gray matter, where neuronal cell bodies, synapses, and a variety of neurotransmitters are located (5). The result is a brain with prodigious numbers of both neurons and myelinated axons, which have evolved to subserve the domains of attention, memory, emotion, language, perception, visuospatial processing, executive function (5), and social cognition (7). © 2021 American Association for the Advancement of Science.

Keyword: Development of the Brain; Attention
Link ID: 27862 - Posted: 06.19.2021

Laura Sanders A new view of the human brain shows its cellular residents in all their wild and weird glory. The map, drawn from a tiny piece of a woman’s brain, charts the varied shapes of 50,000 cells and 130 million connections between them. This intricate map, named H01 for “human sample 1,” represents a milestone in scientists’ quest to provide ever more detailed descriptions of a brain (SN: 2/7/14). “It’s absolutely beautiful,” says neuroscientist Clay Reid at the Allen Institute for Brain Science in Seattle. “In the best possible way, it’s the beginning of something very exciting.” Scientists at Harvard University, Google and elsewhere prepared and analyzed the brain tissue sample. Smaller than a sesame seed, the bit of brain was about a millionth of an entire brain’s volume. It came from the cortex — the brain’s outer layer responsible for complex thought — of a 45-year-old woman undergoing surgery for epilepsy. After it was removed, the brain sample was quickly preserved and stained with heavy metals that revealed cellular structures. The sample was then sliced into more than 5,000 wafer-thin pieces and imaged with powerful electron microscopes. Computational programs stitched the resulting images back together and artificial intelligence programs helped scientists analyze them. A short description of the resulting view was published as a preprint May 30 to bioRxiv.org. The full dataset is freely available online. [Image caption: These two neurons are mirror symmetrical; it’s unclear why these cells take these shapes. Lichtman Lab/Harvard University, Connectomics Team/Google] For now, researchers are just beginning to see what’s there. “We have really just dipped our toe into this dataset,” says study coauthor Jeff Lichtman, a developmental neurobiologist at Harvard University. Lichtman compares the brain map to Google Earth: “There are gems in there to find, but no one can say they’ve looked at the whole thing.” © Society for Science & the Public 2000–2021.

Keyword: Brain imaging
Link ID: 27858 - Posted: 06.16.2021

By Carl Zimmer Dr. Adam Zeman didn’t give much thought to the mind’s eye until he met someone who didn’t have one. In 2005, the British neurologist saw a patient who said that a minor surgical procedure had taken away his ability to conjure images. Over the 16 years since that first patient, Dr. Zeman and his colleagues have heard from more than 12,000 people who say they don’t have any such mental camera. The scientists estimate that tens of millions of people share the condition, which they’ve named aphantasia, and millions more experience extraordinarily strong mental imagery, called hyperphantasia. In their latest research, Dr. Zeman and his colleagues are gathering clues about how these two conditions arise through changes in the wiring of the brain that join the visual centers to other regions. And they’re beginning to explore how some of that circuitry may conjure other senses, such as sound, in the mind. Eventually, that research might even make it possible to strengthen the mind’s eye — or ear — with magnetic pulses. “This is not a disorder as far as I can see,” said Dr. Zeman, a cognitive scientist at the University of Exeter in Britain. “It’s an intriguing variation in human experience.” The patient who first made Dr. Zeman aware of aphantasia was a retired building surveyor who lost his mind’s eye after minor heart surgery. To protect the patient’s privacy, Dr. Zeman refers to him as M.X. When M.X. thought of people or objects, he did not see them. And yet his visual memories were intact. M.X. could answer factual questions such as whether former Prime Minister Tony Blair has light-colored eyes. (He does.) M.X. could even solve problems that required mentally rotating shapes, even though he could not see them. I came across M.X.’s case study in 2010 and wrote a column about it for Discover magazine. Afterward, I got emails from readers who had the same experience but who differed from M.X. in a remarkable way: They had never had a mind’s eye to begin with. © 2021 The New York Times Company

Keyword: Attention; Vision
Link ID: 27851 - Posted: 06.11.2021

By Ben Guarino and Frances Stead Sellers In the coronavirus pandemic’s early weeks, in neuropathology departments around the world, scientists wrestled with a question: Should they cut open the skulls of patients who died of covid-19 and extract their brains? Autopsy staff at Columbia University in New York were hesitant. Sawing into bone creates dust, and the Centers for Disease Control and Prevention had issued a warning about the bodies of covid patients — airborne debris from autopsies could be an infectious hazard. But as more patients were admitted and more began to die, researchers decided to “make all the efforts we could to start collecting the brain tissue,” Columbia neuropathologist Peter D. Canoll said. In March 2020, in an isolation room, the Columbia team extracted a brain from a patient who had died of severe covid-19, the illness caused by the coronavirus. During the next months, they would examine dozens more. Saw met skull elsewhere, too. In Germany, scientists autopsied brains — even though medical authorities recommended against doing that. Researchers were searching the brain for damage — and for the virus itself. At the pandemic’s start, understanding how the virus affected the nervous system was largely a mystery. S. Andrew Josephson, chair of neurology at the University of California at San Francisco and editor in chief of the academic journal JAMA Neurology, said, “We had hundreds of submissions of ‘I saw one case of X.’” It was difficult to understand whether single cases had any relationship to covid at all. Patients reported visual and auditory disturbances, vertigo and tingling sensations, among other perplexing symptoms. Some lost their sense of smell, or their vision became distorted. Weeks or months after the initial onset of symptoms, some who had only a mild bout of the coronavirus remain convinced they have persistent “brain fog.”

Keyword: Learning & Memory; Attention
Link ID: 27845 - Posted: 06.08.2021