Chapter 14. Attention and Higher Cognition
By Iris Berent

Seeing the striking magenta of bougainvillea. Tasting a rich morning latte. Feeling the sharp pain of a needle prick going into your arm. These subjective experiences are the stuff of the mind. The thing doing the experiencing, the 3-pound chunk of meat in our head, is a tangible object that works on electrochemical signals—physics, essentially. How do the two—our mental experiences and physical brains—interact?

The puzzle of consciousness seems to be giving science a run for its money. The problem, to be clear, isn’t merely to pinpoint “where it all happens” in the brain (although this, too, is far from trivial). The real mystery is how to bridge the gap between the mental, first-person stuff of consciousness and the physical lump of matter inside the cranium.

Some think the gap is unbridgeable. The philosopher David Chalmers, for instance, has argued that consciousness is something special and distinct from the physical world. If so, it may never be possible to explain consciousness in terms of physical brain processes. For Chalmers, no matter how deeply scientists come to understand the brain, that understanding would never explain how our neurons produce consciousness. Why should a hunk of flesh, teeming with chemical signals and electrical charges, experience a point of view? There seems to be no conceivable reason why meaty matter would have this light of subjectivity “on the inside.” Consciousness, then, is a “hard problem”—as Chalmers has labeled it—indeed.

The possibility that consciousness itself isn’t anything physical raises burning questions about whether, for example, an AI can fall in love with its programmer. And since consciousness is a natural phenomenon, much like gravity or genes, these questions carry huge implications. Science explains the natural world by physical principles only. So if it turns out that one natural phenomenon transcends the laws of physics, then it is not only the science of consciousness that is in trouble—our entire understanding of the natural world would require serious revision.

© 2024 NautilusNext Inc.
Keyword: Consciousness
Link ID: 29581 - Posted: 11.30.2024
By Joanne Silberner

To describe the destructive effects of intense health anxiety to his young doctors in training at Columbia University Irving Medical Center in New York City, psychiatrist Brian Fallon likes to quote 19th-century English psychiatrist Henry Maudsley: “The sorrow which has no vent in tears may make other organs weep.”

That weeping from other parts of the body may come in the form of a headache that, in the mind of its sufferer, is flagging a brain tumor. It may be a rapid heartbeat a person wrongly interprets as a brewing heart attack. The fast beats may be driven by overwhelming, incapacitating anxiety.

Hal Rosenbluth, a businessman in the Philadelphia area, says he used to seek medical care for the slightest symptom. In his recent book Hypochondria, he describes chest pains, breathing difficulties and vertigo that came on after he switched from a daily diabetes drug to a weekly one. He ended up going to the hospital by ambulance for blood tests, multiple electrocardiograms, a chest x-ray, a cardiac catheterization and an endoscopy, all of which were normal. Rosenbluth’s worries about glucose levels had led him to push for the new diabetes drug, and its side effects were responsible for many of his cardiac symptoms. His own extreme anxiety had induced doctors to order the extra care.

Hypochondria can, in extreme cases, leave people unable to hold down a job or make it impossible for them to leave the house, cook meals, or care for themselves and their families. Recent medical research has shown that hypochondria is as much a real illness as depression and post-traumatic stress disorder. This work, scientists hope, will convince doctors who believed the disorder was some kind of character flaw that their patients are truly ill—and in danger. A study published just last year showed that people with hypochondria have higher death rates than similar but nonafflicted people, and the leading nonnatural cause of death was suicide. It was relatively rare, but the heightened risk was clear.
Keyword: Stress; Attention
Link ID: 29567 - Posted: 11.20.2024
By Laura Sanders

Growing up, Roberto S. Luciani had hints that his brain worked differently from most people’s. He didn’t relate when people complained about a movie character looking different than what they’d pictured from the book, for instance. But it wasn’t until he was a teenager that things finally clicked. His mother had just woken up and was telling him about a dream she had. “Movielike” is how she described it. “Up until then, I assumed that cartoon depictions of imagination were exaggerated,” Luciani says. “I asked her what she meant and quickly realized my visual imagery was not functioning like hers.”

That’s because Luciani has a condition called aphantasia — an inability to picture objects, people and scenes in his mind. When he was growing up, the term didn’t even exist. But now, Luciani, a cognitive scientist at the University of Glasgow in Scotland, and other scientists are getting a clearer picture of how some brains work, including those with a blind mind’s eye.

In a recent study, Luciani and colleagues explored the connections between the senses, in this case, hearing and seeing. In most of our brains, these two senses collaborate. Auditory information influences activity in brain areas that handle vision. But in people with aphantasia, this connection isn’t as strong, researchers report November 4 in Current Biology.

While in a brain scanner, blindfolded people listened to three sound scenes: a forest full of birds, a crowd of people, and a street bustling with traffic. In 10 people without aphantasia, these auditory scenes created reliable neural hallmarks in parts of the visual cortex. But in 23 people with aphantasia, these hallmarks were weaker.

© Society for Science & the Public 2000–2024.
Keyword: Attention
Link ID: 29566 - Posted: 11.20.2024
By Christina Caron

In high school, Sophie Didier started falling behind. She found it difficult to concentrate on her schoolwork, felt restless in class and often got in trouble for talking too much. “I had a teacher that used to give me suckers so that I would shut up,” she said. At 15, a doctor diagnosed her with attention deficit hyperactivity disorder.

Medication helped, but she discovered that having a demanding schedule was also important. In both high school and college, her grades improved when she was juggling lacrosse and other extracurricular activities with her classes. Being so busy forced her to stick to a routine. “I felt more organized then,” recalled Ms. Didier, now 24 and living in Kansas City, Mo. “Like I had a better handle on things.”

Research has shown that A.D.H.D. symptoms can change over time, improving and then worsening again or vice versa. And according to a recently published study, having additional responsibilities and obligations is associated with periods of milder A.D.H.D. This might mean that staying busy had been beneficial, researchers said. It could also just mean that people with milder symptoms had been able to handle more demands, they added. Oftentimes, people with A.D.H.D. “seem to do best when there’s an urgent deadline or when the stakes are high,” said Margaret Sibley, who is a professor of psychiatry and behavioral sciences at the University of Washington School of Medicine in Seattle and who led the study.

The study, published online in October in the Journal of Clinical Psychiatry, tracked 483 patients in the United States and Canada who each had a combination of inattentive and hyperactive-impulsive A.D.H.D. symptoms. The researchers followed the participants for 16 years, starting at an average age of 8. They found that about three-quarters of the patients experienced fluctuations in their symptoms, generally beginning around age 12, which included either a full or partial remission of symptoms.

© 2024 The New York Times Company
Keyword: ADHD
Link ID: 29553 - Posted: 11.13.2024
By Tamlyn Hunt

The neuron, the specialized cell type that makes up much of our brains, is at the center of today’s neuroscience. Neuroscientists explain perception, memory, cognition and even consciousness itself as products of billions of these tiny neurons busily firing their tiny “spikes” of voltage inside our brain. These energetic spikes not only convey things like pain and other sensory information to our conscious mind; they are also, at least in principle, able to explain every detail of our complex consciousness. The details of this “neural code” have yet to be worked out.

While neuroscientists have long focused on spikes travelling throughout brain cells, “ephaptic” field effects may really be the primary mechanism for consciousness and cognition. These effects, resulting from the electric fields produced by neurons rather than their synaptic firings, may play a leading role in our mind’s workings.

In 1943 American scientists first described what is known today as the neural code, or spike code. They fleshed out a detailed map of how logical operations can be completed with the “all or none” nature of neural firing—similar to how today’s computers work. Since then neuroscientists around the world have engaged in a vast endeavor to crack the neural code in order to understand the specifics of cognition and consciousness. To little avail.

“The most obvious chasm in our understanding is in all the things we did not meet on our journey from your eye to your hand,” confessed neuroscientist Mark Humphries in 2020’s The Spike, a deep dive into this journey: “All the things of the mind I’ve not been able to tell you about, because we know so little of what spikes do to make them.”

© 2024 SCIENTIFIC AMERICAN
Keyword: Consciousness
Link ID: 29546 - Posted: 11.09.2024
By Angie Voyles Askham

Keeping track of social hierarchies is crucial for any animal. Primates in particular must adapt their behaviors based on the status of those around them, or risk losing their own rank. “True, smart social behavior in humans and in monkeys is dependent on a full adjustment to the context,” says Katalin Gothard, professor of physiology and neuroscience at the University of Arizona.

Multiple brain areas keep track of social information. Among them, the amygdala—known for processing emotions—responds to faces, facial expressions and social status, and activates as people learn social hierarchies. But the brain adapts this information for different social settings, a new study reveals: Neurons in the macaque amygdala encode knowledge about social status in a context-specific way, Gothard and her colleagues discovered. Just like people, macaques can infer social standing from videos, and the activity of amygdala cells captures information about both the identity of the individual they are watching and how that animal relates to others in the scene.

These findings help explain how primates process information about social position, says Ralph Adolphs, professor of psychology and neuroscience at California Institute of Technology, who was not involved in the work. And because the monkeys could successfully learn this information from videos, the results open up a new avenue for studying how the primate brain encodes these relationships in a complex and dynamic way, he adds. “That’s a big step forward.”

Like people, macaques have no physical traits that directly convey dominance, Gothard says. “The status of these individuals is inferred.” So she and her colleagues tested two macaques’ ability to understand a hierarchy that the team invented among four unfamiliar monkeys in a series of videos. Each clip simulated status-appropriate interactions between two of the four monkeys on a split screen to convey those two animals’ relative positions: a scene of a higher-ranked animal acting aggressive juxtaposed with one of a lower-ranked monkey smacking its lips in appeasement, for example.

© 2024 Simons Foundation
Keyword: Emotions; Attention
Link ID: 29544 - Posted: 11.06.2024
By Talia Barrington

Growing up in England, Caragh McMurtry wasn’t your typical future Olympic rower. Born to parents who worked in a local factory, raised in low-income housing, frequently in trouble for being a “terror,” she didn’t exactly fit the mold of a sport known for a certain elitism. But when an after-school program funded by British Rowing was offered at her school, McMurtry gave it a try.

With rowing, unlike at school, where she struggled to connect with peers, rules were clear. Everyone had a defined job, it was always the same, and because the rowers sat in a single row, she didn’t feel that people were looking at her. “It was cathartic,” she told me. “Pushing hard gave me that sensory feedback,” and the repetitive action was “calming.”

At first, everything seemed to go well. She made it to the World Rowing Junior Championships and the under 23s and senior championships, and the medals started rolling in. But then things went a little haywire. Her coaches labeled her difficult and told her she asked too many questions and was too blunt and honest with her peers. She was diagnosed with bipolar disorder, which is known for its extreme mood swings, and she was put on lithium and a cocktail of other drugs that did not work. It would take five more years before a doctor would figure out why she struggled to connect with her teammates and others around her but was so focused, so “regulated” when it came to extreme and continued physical exertion: She had a form of autism.

Experts call a key aspect of what McMurtry experiences when engaged in physical activity “hyperfocus,” and it’s an overlapping hallmark of both autism and ADHD. “People with ADHD and autism have an incredibly high ability to focus on tasks that they find interesting or stimulating,” said Laura Huckins, an associate professor of psychiatry at the Yale Center for Genomic Health. “They tend to be drawn towards professions that require or include novelty, that include regular challenges, and that require high performance under stress and pressure.”
Keyword: Autism; Attention
Link ID: 29535 - Posted: 11.02.2024
Nicola Davis, Science correspondent

Whether it is news headlines or WhatsApp messages, modern humans are inundated with short pieces of text. Now researchers say they have unpicked how we get their gist in a single glance.

Prof Liina Pylkkanen, co-author of the study from New York University, said most theories of language processing assume words are understood one by one, in sequence, before being combined to yield the meaning of the whole sentence. “From this perspective, at-a-glance language processing really shouldn’t work since there’s just not enough time for all the sequential processing of words and their combination into a larger representation,” she said.

However, the research offers fresh insights, revealing we can detect certain sentence structures in as little as 125 milliseconds (ms) – a timeframe similar to the blink of an eye.

Pylkkanen said: “We don’t yet know exactly how this ultrafast structure detection is possible, but the general hypothesis is that when something you perceive fits really well with what you know about – in this case, we’re talking about knowledge of the grammar – this top-down knowledge can help you identify the stimulus really fast.

“So just like your own car is quickly identifiable in a parking lot, certain language structures are quickly identifiable and can then give rise to a rapid effect of syntax in the brain.”

The team say the findings suggest parallels with the way in which we perceive visual scenes, with Pylkkanen noting the results could have practical uses for the designers of digital media, as well as advertisers and designers of road signs. Writing in the journal Science Advances, Pylkkanen and colleagues report how they used a non-invasive scanning device to measure the brain activity of 36 participants.

© 2024 Guardian News & Media Limited
Keyword: Language; Attention
Link ID: 29527 - Posted: 10.26.2024
By Yasemin Saplakoglu

Around 2,500 years ago, Babylonian traders in Mesopotamia impressed two slanted wedges into clay tablets. The shapes represented a placeholder digit, squeezed between others, to distinguish numbers such as 50, 505 and 5,005. An elementary version of the concept of zero was born.

Hundreds of years later, in seventh-century India, zero took on a new identity. No longer a placeholder, the digit acquired a value and found its place on the number line, before 1. Its invention went on to spark historic advances in science and technology. From zero sprang the laws of the universe, number theory and modern mathematics.

“Zero is, by many mathematicians, definitely considered one of the greatest — or maybe the greatest — achievement of mankind,” said the neuroscientist Andreas Nieder, who studies animal and human intelligence at the University of Tübingen in Germany. “It took an eternity until mathematicians finally invented zero as a number.”

Perhaps that’s no surprise given that the concept can be difficult for the brain to grasp. It takes children longer to understand and use zero than other numbers, and it takes adults longer to read it than other small numbers. That’s because to understand zero, our mind must create something out of nothing. It must recognize absence as a mathematical object.

“It’s like an extra level of abstraction away from the world around you,” said Benjy Barnett, who is completing graduate work on consciousness at University College London. Nonzero numbers map onto countable objects in the environment: three chairs, each with four legs, at one table. With zero, he said, “we have to go one step further and say, ‘OK, there wasn’t anything there. Therefore, there must be zero of them.’”

© 2024 Simons Foundation
Keyword: Attention
Link ID: 29523 - Posted: 10.19.2024
By Phil Plait

I remember watching the full moon rise one early evening a while back. It was when I still lived in Colorado, and I was standing outside in my yard. I first noticed a glow to the east lighting up the flat horizon in the darkening sky, and within moments the moon was cresting above it, yellow and swollen—like, really swollen. As it cleared the horizon, the moon looked huge! It also seemed so close that I could reach out and touch it; it was so “in my face” that I felt I could fall in. I gawped at it for a moment and then smiled. I knew what I was actually seeing: the moon illusion.

Anyone who is capable of seeing the moon (or the sun) near the horizon has experienced this effect. The moon looks enormous there, far larger than it does when it’s overhead. I’m an astronomer, and I know the moon is no bigger on the horizon than at the zenith, yet I can’t not see it that way. It’s an overwhelming effect. But it’s not real. Simple measurements of the moon show it’s essentially the same size on the horizon as when it’s overhead. This really is an illusion.

It’s been around awhile, too: the illusion is shown in cuneiform on a clay tablet from the ancient Assyrian city Nineveh that has been dated to the seventh century B.C.E. Attempts to explain it are as old as the illusion itself, and most come up short. Aristotle wrote about it, for example, attributing it to the effects of mist. This isn’t correct, obviously; the illusion manifests even in perfectly clear weather.

A related idea, still common today, is that Earth’s air acts like a lens, refracting (bending) the light from the moon and magnifying it. But we know that’s not right because the moon is measurably the same size no matter where it is in the sky. Also, examining the physics of that explanation shows that it falls short as well. In fact, while the air near the horizon does indeed act like a lens, its actual effect is to make the sun and moon look squished, like flat ovals, not to simply magnify them. So that can’t be the cause either.
Keyword: Vision; Attention
Link ID: 29522 - Posted: 10.19.2024
By Giorgia Guglielmi

As the famed tale “Hansel and Gretel” makes clear, hunger can change behavior. The two lost and starving siblings give in to the temptation of a gingerbread cottage and ignore the danger lurking within—a wicked witch who has created the delicious house as a trap. Hunger is such a powerful driver that animals often pursue food at the expense of other survival needs, such as avoiding predators or recovering from injury. Hungry vicuñas, for example, will sometimes increase their risk of predation by pumas to get something to eat, behavioral ecologists have shown.

Scientists know many of the key cells and circuits behind these competing drives—such as the hypothalamic “hunger neurons” that regulate food intake. But how the brain juggles the need to eat amidst other urges has remained mysterious, says Henning Fenselau, who leads the Synaptic Transmission in Energy Homeostasis group at the Max Planck Institute for Metabolism Research in Köln, Germany. “This is still a huge question [in neuroscience],” he says.

In recent years, however, new clues about where and how hunger collides with rival motivations have come from technology to manipulate and monitor individual neurons across multiple brain regions at once. Those findings suggest that hunger neuron activity can override some brain signals, such as fear and pain. Exploring the brain’s ability to handle multiple needs simultaneously may offer insights into decision-making, anxiety and other neuropsychiatric conditions—helping to explain why people sometimes make maladaptive choices, says Nicholas Betley, associate professor of biology at the University of Pennsylvania.

© 2024 Simons Foundation
Keyword: Obesity; Attention
Link ID: 29515 - Posted: 10.12.2024
By Miryam Naddaf

[Image caption: Neurons in the hippocampus help to pick out patterns in the flood of information pouring through the brain. Credit: Arthur Chien/Science Photo Library]

The human brain is constantly picking up patterns in everyday experiences — and can do so without conscious thought, finds a study of neuronal activity in people who had electrodes implanted in their brain tissue for medical reasons. The study shows that neurons in key brain regions combine information on what occurs and when, allowing the brain to pick out the patterns in events as they unfold over time. That helps the brain to predict coming events, the authors say. The work was published today in Nature.

“The brain does a lot of things that we are not consciously aware of,” says Edvard Moser, a neuroscientist at the Norwegian University of Science and Technology in Trondheim. “This is no exception.”

To make sense of the world around us, the brain must process an onslaught of information on what happens, where it happens and when it happens. The study’s authors wanted to explore how the brain organizes this information over time — a crucial step in learning and memory.

The team studied 17 people who had epilepsy and had electrodes implanted in their brains in preparation for surgical treatment. These electrodes allowed the authors to directly capture the activity of individual neurons in multiple brain regions. Among those regions were the hippocampus and entorhinal cortex, which are involved in memory and navigation. These areas contain time and place cells that act as the body’s internal clock and GPS system, encoding time and locations. “All the external world coming into our brain has to be filtered through that system,” says study co-author Itzhak Fried, a neurosurgeon and neuroscientist at the University of California, Los Angeles.

© 2024 Springer Nature Limited
Keyword: Attention; Learning & Memory
Link ID: 29497 - Posted: 09.28.2024
By Angie Voyles Askham

Most people visit the Minnesota State Fair for a fun-filled day of fried food, farm animals and carnival rides. Not Ka Ip. He saw the annual event as the perfect setting for a new experiment.

Ip, assistant professor of child development at the University of Minnesota, is particularly interested in executive function: the set of skills, such as organization and impulse control, people need to plan and achieve goals. Children from lower socioeconomic backgrounds tend to perform worse on tests of these skills than do their more privileged peers, past research shows. But that gap may reflect where those skills are typically tested: a quiet lab, in which some children may feel out of their element, Ip says. “That may not actually mimic their actual day-to-day environment.”

Which is why Ip started to devise a series of experiments to conduct at the less-than-serene state fair. “We really want to understand how, for example, unpredictability in the home environment is related to executive function development,” he says. The fair also offered a way to recruit children from a wider swath of society than researchers can often find at a university, he adds.

Last month, after a year of planning, Ip and his team lugged a trolley full of equipment to the fairgrounds outside Minneapolis. There, they collected functional near-infrared spectroscopy (fNIRS) data on 75 children aged 3 to 7 as they played a computer game that tests impulse control. The team aims to evaluate whether the bustling surroundings affect participants’ performances and neural activity differently based on their background.

© 2024 Simons Foundation
Keyword: Brain imaging; Attention
Link ID: 29490 - Posted: 09.25.2024
By Angie Voyles Askham

Nathaniel Daw has never touched a mouse. As professor of computational and theoretical neuroscience at Princeton University, he mainly works with other people’s data to construct models of the brain’s decision-making process. So when a collaborator came to him a few years ago with confusing data from mice that had performed a complex decision-making task in the lab, Daw says his best advice was just to fit the findings to the tried-and-true model of reward prediction error (RPE). That model relies on the idea that dopaminergic activity in the midbrain reflects discrepancies between expected and received rewards.

Daw’s collaborator, Ben Engelhard, had measured the activity of dopamine neurons in the ventral tegmental area (VTA) of mice as they were deciding how to navigate a virtual environment. And although the virtual environment was more complex than what a mouse usually experiences in the real world, an RPE-based model should have held, Daw assumed. “It was obvious to me that there was this very simple story that was going to explain his data,” Daw says.

But it didn’t. The neurons exhibited a wide range of responses, with some activated by visual cues and others by movement or cognitive tasks. The classic RPE model, it turned out, could not explain such heterogeneity. Daw, Engelhard and their colleagues published the findings in 2019.

That was a wake-up call, Daw says, particularly after he watched videos of what the mice actually experienced in the maze. “It’s just so much more complicated, and high dimensional, and richer” than expected, he says. The idea that this richness could be reduced to such a simple model seems ludicrous now, he adds. “I was just so blinded.” A minimal sketch of the RPE idea follows below.

© 2024 Simons Foundation
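The reward-prediction-error model mentioned above can be made concrete with a small temporal-difference learning sketch: a value estimate is nudged by the gap between the reward an animal receives and the reward it expected. This is a generic textbook toy, not the model Daw or Engelhard actually fit to their recordings; the toy track, reward placement and parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the classic reward-prediction-error (RPE) account:
# value estimates are updated by the mismatch between received and
# expected reward. A generic temporal-difference toy, not the published
# model; states, rewards and parameters are invented for illustration.

n_states = 5             # positions along a toy linear track
V = np.zeros(n_states)   # learned value (expected future reward) per position
alpha, gamma = 0.1, 0.9  # learning rate and temporal discount factor

def td_update(state, next_state, reward):
    """Compute the prediction error (delta) and nudge the value estimate."""
    delta = reward + gamma * V[next_state] - V[state]  # RPE: got vs. expected
    V[state] += alpha * delta
    return delta

# Repeatedly traverse the track; the only reward sits at the final position.
for trial in range(200):
    for s in range(n_states - 1):
        r = 1.0 if s + 1 == n_states - 1 else 0.0
        td_update(s, s + 1, r)

print(np.round(V, 2))  # value spreads backward to earlier, predictive states
```

In this framing, the dopamine signal is modeled as the single scalar delta: large and positive when reward arrives unexpectedly, near zero once it is fully predicted. The 2019 finding described above is precisely that many VTA neurons carried visual, motor and cognitive signals that one scalar of this kind cannot capture.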
Keyword: Attention; Drug Abuse
Link ID: 29479 - Posted: 09.14.2024
By Christina Caron

Julianna McLeod, 26, had her first psychotic episode while taking Vyvanse for attention deficit hyperactivity disorder last year. Ms. McLeod, who lives in Ontario, Canada, had taken the drug before but paused while pregnant with her first child and didn’t start taking it again until six months postpartum. Although the dose was 40 milligrams, she often forgot when she had last taken a pill. So she took one whenever she remembered — and may have ended up taking more than her prescribed daily dose.

The delusions that she experienced made her feel euphoric and highly energetic. “I felt like my brain was exploding with connections,” she said. In her mind, she was a “super detective” who was uncovering the people and organizations that were secretly engaging in child sex trafficking. She even began to believe that someone was drugging her and her baby.

Psychosis and mania are each known side effects of stimulant medications, and the Food and Drug Administration has added warnings to the medications’ labels saying that they may cause symptoms like hallucinations, delusional thinking or mania. But these side effects are considered rare — experienced by an estimated 1 in 1,000 patients — and have not been extensively researched. It can take months for someone to fully recover.

A new study published on Thursday in The American Journal of Psychiatry suggests that dosage may play a role. It found that among people who took high doses of prescription amphetamines such as Vyvanse and Adderall, there was a fivefold increased risk of developing psychosis or mania for the first time compared with those who weren’t taking stimulants. The researchers defined a high dose as more than 40 milligrams of Adderall, 100 milligrams of Vyvanse or 30 milligrams of dextroamphetamine. The medium dosage (20 to 40 milligrams of Adderall, 50 to 100 milligrams of Vyvanse or 16 to 30 milligrams of dextroamphetamine) was associated with a 3.5 times higher risk of psychosis or mania. There was no increased risk of psychosis or mania among those who used methylphenidate drugs, like Concerta or Ritalin, regardless of the dose.

© 2024 The New York Times Company
Keyword: ADHD; Schizophrenia
Link ID: 29478 - Posted: 09.14.2024
By Christina Caron

Patients and caregivers have struggled for two years to find stimulant medications like Adderall, Vyvanse and Concerta to treat attention deficit hyperactivity disorder. Some spend hours each month going from pharmacy to pharmacy to find a drug, while others are forced to switch to a different brand or formulation, or go without medication for weeks.

This week the Drug Enforcement Administration announced a potential solution: It is raising the amount of lisdexamfetamine (Vyvanse) that can be produced by U.S. manufacturers this year by nearly 24 percent to meet demand in the United States and abroad. Vyvanse is an amphetamine that has been approved for use in children and adults with A.D.H.D. and has become commonly prescribed after the generic version was introduced last year. According to the D.E.A., the latest data shows that demand for the drug has been rising globally. But right now every manufacturer of generic Vyvanse listed on the Food and Drug Administration website is experiencing a shortage.

Many health care providers who specialize in treating patients with A.D.H.D. said that the D.E.A.’s decision was a positive development but that it was unclear just how much of an effect it might have on the shortage. “Obviously it’s not going to solve the problem completely,” said Ami Norris-Brilliant, clinical director of the Division of A.D.H.D., Learning Disorders, and Related Disorders at the Icahn School of Medicine at Mount Sinai in New York City. “But I think anything that helps increase drug availability is a good thing.”

It is not the first time that the D.E.A. has increased production quotas for A.D.H.D. drugs. Last year it announced a new 2023 limit for methylphenidate, which is used to make drugs like Ritalin and Concerta, raising the allotted amount by 27 percent for 2023. The drug remains in shortage, however, in the extended release formulation.

© 2024 The New York Times Company
Keyword: ADHD
Link ID: 29473 - Posted: 09.11.2024
By Steve Paulson

Oliver Sacks wasn’t always the beloved neurologist we remember today, sleuthing around the backwaters of the mind in search of mysterious mental disorders. For a few years in the 1960s, he was a committed psychonaut, often spending entire weekends blitzed out of his mind on weed, LSD, morning glory seeds, or mescaline. Once, after injecting himself with a large dose of morphine, he found himself hovering over an enormous battlefield, watching the armies of England and France drawn up for battle, and then realized he was witnessing the 1415 Battle of Agincourt. “I completely lost the sense that I was lying on my bed stoned,” he told me in 2012, a few years before he died. “I felt like a historian, seeing Agincourt from a celestial viewpoint. This was not ordinary imagination. It was absolutely real.” The vision seemed to last only a few minutes, but later, he discovered he’d been tripping for 13 hours.

These early experiences with hallucinogens gave Sacks an appreciation for the strange turns the mind can take. He had a craving for direct experience of the numinous, but he believed his visions were nothing more than hallucinations. “At the physiological level, everything is electricity and chemistry, but it was a wonderful feeling,” he said. When I asked if he ever thought he’d crossed over into some transpersonal dimension of reality, he said, “I’m an old Jewish atheist. I have no belief in heaven or anything supernatural or paranormal, but there’s a mystical feeling of oneness and of beauty, which is not explicitly religious, but goes far beyond the aesthetic.”

I’ve often thought about this conversation as I’ve watched today’s psychedelic renaissance. Clinical trials with psychedelic-assisted therapy show great promise for treating depression, addiction, and PTSD, and a handful of leading universities have recently created their own heavily endowed psychedelic centers.

© 2024 NautilusNext Inc.
Keyword: Drug Abuse; Consciousness
Link ID: 29453 - Posted: 08.28.2024
By Rachel Nuwer

One person felt a sensation of “slowly floating into the air” as images flashed around. Another recalled “the most profound sense of love and peace,” unlike anything experienced before. Consciousness became a “foreign entity” to another whose “whole sense of reality disappeared.”

These were some of the firsthand accounts shared in a small survey of people who belonged to an unusual cohort: They had all undergone a near-death experience and tried psychedelic drugs. The survey participants described their near-death and psychedelic experiences as being distinct, yet they also reported significant overlap. In a paper published on Thursday, researchers used these accounts to provide a comparison of the two phenomena.

“For the first time, we have a quantitative study with personal testimony from people who have had both of these experiences,” said Charlotte Martial, a neuroscientist at the University of Liège in Belgium and an author of the findings, which were published in the journal Neuroscience of Consciousness. “Now we can say for sure that psychedelics can be a kind of window through which people can enter a rich, subjective state resembling a near-death experience.”

Near-death experiences are surprisingly common — an estimated 5 to 10 percent of the general population has reported having one. For decades, scientists largely dismissed the fantastical stories of people who returned from the brink of death. But some researchers have started to take these accounts seriously. “In recent times, the science of consciousness has become interested in nonordinary states,” said Christopher Timmermann, a research fellow at the Center for Psychedelic Research at Imperial College London and an author of the article. “To get a comprehensive account of what it means to be a human being requires incorporating these experiences.”

© 2024 The New York Times Company
Keyword: Consciousness; Drug Abuse
Link ID: 29450 - Posted: 08.22.2024
By Carl Zimmer

When people suffer severe brain damage — as a result of car crashes, for example, or falls or aneurysms — they may slip into a coma for weeks, their eyes closed, their bodies unresponsive. Some recover, but others enter a mysterious state: eyes open, yet without clear signs of consciousness. Hundreds of thousands of such patients in the United States alone are diagnosed in a vegetative state or as minimally conscious. They may survive for decades without regaining a connection to the outside world.

These patients pose an agonizing mystery both for their families and for the medical professionals who care for them. Even if they can’t communicate, might they still be aware? A large study published on Wednesday suggests that a quarter of them are.

Teams of neurologists at six research centers asked 241 unresponsive patients to spend several minutes at a time doing complex cognitive tasks, such as imagining themselves playing tennis. Twenty-five percent of them responded with the same patterns of brain activity seen in healthy people, suggesting that they were able to think and were at least somewhat aware.

Dr. Nicholas Schiff, a neurologist at Weill Cornell Medicine and an author of the study, said the study shows that up to 100,000 patients in the United States alone might have some level of consciousness despite their devastating injuries. The results should lead to more sophisticated exams of people with so-called disorders of consciousness, and to more research into how these patients might communicate with the outside world, he said: “It’s not OK to know this and to do nothing.”

When people lose consciousness after a brain injury, neurologists traditionally diagnose them with a bedside exam. They may ask patients to say something, to look to their left or right, or to give a thumbs-up.

© 2024 The New York Times Company
Keyword: Consciousness
Link ID: 29436 - Posted: 08.15.2024
By Greg Donahue

In late 2018, after an otherwise-normal Christmas holiday, Laurie Beatty started acting strange. An 81-year-old retired contractor, he grew unnaturally quiet and began poring over old accounting logs from a construction business he sold decades earlier, convinced that he had been bilked in the deal.

Over the course of several days, Beatty slipped further into unreality. He told his wife the year was 1992 and wondered aloud why his hair had turned white. Then he started having seizures. His arms began to move in uncontrollable jerks and twitches. By the end of May, he was dead.

Doctors at the Georges-L.-Dumont University Hospital Center in Moncton, the largest city in the province of New Brunswick, Canada, zeroed in on an exceedingly rare condition — Creutzfeldt-Jakob disease, caused by prions, misfolding proteins in the brain — as the most likely culprit. The doctors explained this to Beatty’s children, Tim and Jill, and said they would run additional tests to confirm the post-mortem diagnosis. Three months later, when the siblings returned to the office of their father’s neurologist, Dr. Alier Marrero, that’s what they were expecting to hear.

Instead, Marrero told them that Laurie’s Creutzfeldt-Jakob test had come back negative. “We were all looking at one another,” Tim says, “because we were all very confused.” If Creutzfeldt-Jakob hadn’t killed their father, then what had? What Marrero said next was even more unsettling. “There’s something going on,” they recall him saying. “And I don’t know what it is.”

It turned out that Laurie Beatty was just one of many local residents who had gone to Marrero’s office exhibiting similar, inexplicable symptoms of neurological decline — more than 20 in the previous four years. The first signs were often behavioral. One patient fell asleep for nearly 20 hours straight before a friend took her to the hospital; another found himself afraid to disturb the stranger who had sat down in his living room, only to realize hours later that the stranger was his wife.

© 2024 The New York Times Company
Keyword: Alzheimers; Learning & Memory
Link ID: 29434 - Posted: 08.15.2024