Chapter 10. Vision: From Eye to Brain



By Angie Voyles Askham The primary visual cortex carries, well, visual information — or so scientists thought until early 2010. That’s when a team at the University of California, San Francisco first described movement-related activity in the brain area, called V1, in mice. When the animals started to run on a treadmill, some neurons more than doubled their firing rate.

The finding “was kind of mysterious,” because V1 was thought to represent only visual signals transmitted from the retina, says Anne Churchland, professor of neurobiology at the University of California, Los Angeles, who was not involved in that work. “The idea that running modulated neural activity suggested that maybe those visual signals were corrupted in a way that, at the time, felt like it would be really problematic.”

The mystery grew over the next decade, as a flurry of mouse studies from Churchland and others built on the 2010 results. Both arousal and locomotion could shape the firing of primary visual neurons, those newer findings showed, and even subtle movements such as nose scratches contribute to variance in population activity, all without compromising the sensory information. A consensus started to form around the idea that sensory cortical regions encode broader information about an animal’s physiological state than previously thought.

At least until last year, when two studies threw a wrench into that storyline: Neither marmosets nor macaque monkeys show any movement-related increase in V1 signaling. Instead, running seems to slightly suppress V1 activity in marmosets, and spontaneous movements have no effect on the same cells in macaques. The apparent differences across species raise new questions about whether mice are a suitable model for studying the primate visual system, says Michael Stryker, professor of physiology at the University of California, San Francisco, who led the 2010 work. “Maybe the primate’s V1 is not working the same as in the mouse,” he says. 
“As I see it, it’s still a big unanswered question.” © 2024 Simons Foundation

Keyword: Vision
Link ID: 29153 - Posted: 02.20.2024

By Kevin Mitchell It is often said that “the mind is what the brain does.” Modern neuroscience has indeed shown us that mental goings-on rely on and are in some sense entailed by neural goings-on. But the truth is that we have a poor handle on the nature of that relationship. One way to bridge that divide is to try to define the relationship between neural and mental representations. The basic premise of neuroscience is that patterns of neural activity carry some information — they are about something. But not all such patterns need be thought of as representations; many of them are just signals. Simple circuits such as the muscle stretch reflex or the eye-blink reflex, for example, are configured to respond to stimuli such as the lengthening of a muscle or a sudden bright light. But they don’t need to internally represent this information — or make that information available to other parts of the nervous system. They just need to respond to it. More complex information processing, by contrast, such as in our image-forming visual system, requires internal neural representation. By integrating signals from multiple photoreceptors, retinal ganglion cells carry information about patterns of light in the visual stimulus — particularly edges where the illumination changes from light to dark. This information is then made available to the thalamus and the cortical hierarchy, where additional processing goes on to extract higher- and higher-order features of the entire visual scene. Scientists have elucidated the logic of these hierarchical systems by studying the types of stimuli to which neurons are most sensitively tuned, known as “receptive fields.” If some neuron in an early cortical area responds selectively to, say, a vertical line in a certain part of the visual field, the inference is that when such a neuron is active, that is the information that it is representing. 
In this case, it is making that information available to the next level of the visual system — itself just a subsystem of the brain. © 2024 Simons Foundation
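The selective tuning described above can be illustrated with a toy model, in which a "vertical line" receptive field is a small convolution kernel whose rectified output is strong only for stimuli matching its preferred orientation. All names and values below are invented for illustration, not drawn from the article.

```python
import numpy as np

# Toy "V1 neuron": a receptive field tuned to a vertical line,
# modeled as a 3x3 kernel (values invented for illustration).
vertical_edge_kernel = np.array([
    [-1.0, 2.0, -1.0],
    [-1.0, 2.0, -1.0],
    [-1.0, 2.0, -1.0],
])

def unit_response(patch, kernel):
    """Rectified response of the model neuron to a 3x3 image patch."""
    return max(0.0, float(np.sum(patch * kernel)))

# A bright vertical line drives the unit strongly...
vertical_line = np.array([[0, 1, 0]] * 3, dtype=float)
# ...while a horizontal line of equal total brightness does not.
horizontal_line = np.array([[0, 0, 0], [1, 1, 1], [0, 0, 0]], dtype=float)

print(unit_response(vertical_line, vertical_edge_kernel))    # 6.0
print(unit_response(horizontal_line, vertical_edge_kernel))  # 0.0
```

In this caricature, "making the information available" just means the unit's output becomes the input to the next stage of the hierarchy.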

Keyword: Consciousness; Vision
Link ID: 29148 - Posted: 02.13.2024

By Shruti Ravindran When preparing to become a butterfly, the Eastern Black Swallowtail caterpillar wraps its bright striped body within a leaf. This leaf is its sanctuary, where it will weave its chrysalis. So when the leaf is disturbed by a would-be predator—a bird or insect—the caterpillar stirs into motion, briefly darting out a pair of fleshy, smelly horns. To humans, these horns might appear yellow—a color known to attract birds and many insects—but from a predator’s-eye-view, they appear a livid, almost neon violet, a color of warning and poison for some birds and insects. “It’s like a jump scare,” says Daniel Hanley, an assistant professor of biology at George Mason University. “Startle them enough, and all you need is a second to get away.” Hanley is part of a team that has developed a new technique to depict on video how the natural world looks to non-human species. The method is meant to capture how animals use color in unique—and often fleeting—behaviors like the caterpillar’s anti-predator display. Most animals, birds, and insects possess their own ways of seeing, shaped by the light receptors in their eyes. Human retinas, for example, are sensitive to three bands of wavelengths—roughly blue, green, and red—which enables us to see approximately 1 million different hues in our environment. By contrast, many mammals, including dogs, cats, and cows, sense only two of these bands. But birds, fish, amphibians, and some insects and reptiles typically can sense four—including ultraviolet light. Their worlds are drenched in a kaleidoscope of color—they can often see 100 times as many shades as humans do. Hanley’s team, which includes not just biologists but multiple mathematicians, a physicist, an engineer, and a filmmaker, claims that their method can translate the colors and gradations of light perceived by hundreds of animals to a range of frequencies that human eyes can comprehend with an accuracy of roughly 90 percent. 
That is, they can simulate the way a scene in a natural environment might look to a particular species of animal, what shifting shapes and objects might stand out most. The team uses commercially available cameras to record video in four color channels—blue, green, red, and ultraviolet—and then applies open source software to translate the picture according to the mix of light receptor sensitivities a given animal may have. © 2024 NautilusNext Inc.,
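The translation step described above can be sketched as simple linear algebra. This is a minimal, invented example: the sensitivity matrix, pixel values, and RGB fold-down below are made up for illustration, and real pipelines are calibrated per species.

```python
import numpy as np

# One pixel recorded by the four-channel camera: UV, blue, green, red.
camera_pixel_ubgr = np.array([0.8, 0.2, 0.3, 0.1])   # strong UV signal

# Hypothetical receptor sensitivities for a tetrachromatic bird.
# Rows: receptor types; columns: camera channels (UV, B, G, R).
bird_sensitivity = np.array([
    [0.9, 0.1, 0.0, 0.0],   # UV-sensitive cone
    [0.1, 0.8, 0.1, 0.0],   # short-wavelength cone
    [0.0, 0.1, 0.8, 0.1],   # medium-wavelength cone
    [0.0, 0.0, 0.1, 0.9],   # long-wavelength cone
])

# What the bird's four receptor types "catch" from this pixel.
receptor_catches = bird_sensitivity @ camera_pixel_ubgr

# Fold the four catches into three display channels so a human can see
# them; here the UV catch is simply added into the blue channel.
display_rgb = np.clip(np.array([
    receptor_catches[3],                         # red   <- long
    receptor_catches[2],                         # green <- medium
    receptor_catches[1] + receptor_catches[0],   # blue  <- short + UV
]), 0.0, 1.0)

print(receptor_catches)
print(display_rgb)   # the UV-rich pixel renders as a saturated blue
```

The design choice in the last step is the crux of such methods: four receptor signals must be squeezed into three display channels, so some mapping convention (here, UV into blue) is unavoidable.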

Keyword: Vision; Evolution
Link ID: 29133 - Posted: 02.06.2024

By Mark Johnson There had been early clues, but it was a family game of dominoes around Christmas 2021 that convinced Susan Stewart that something was wrong with her husband. Charlie Stewart, then 75 and retired, struggled to match the dots on different domino tiles. Susan assumed it was a vision problem. Charlie’s memory was fine, and he had no family history of dementia. But months later the Marin County, Calif., couple were shocked to learn that his domino confusion was a sign he had a lesser-known variant of Alzheimer’s disease. For patients with this variant, called posterior cortical atrophy, the disease begins with problems affecting vision rather than memory. The unusual early symptoms mean that thousands of people may go years before receiving the correct diagnosis, experts said. That may change with the first large-scale international study of the condition, published Monday in the journal Lancet Neurology. An international team led by researchers at the University of California at San Francisco studied records of 1,092 PCA patients from 16 countries and found that, on average, the syndrome begins affecting patients at age 59 ― about five to six years earlier than most patients with the more common form of Alzheimer’s. Although the number of patients with PCA has not been established, researchers say that the variant may account for as many as 10 percent of all Alzheimer’s cases; that would put the number of Americans with the condition close to 700,000. “We have a lot of work to do to raise awareness about the syndrome,” said Gil D. Rabinovici, one of the study’s authors and director of the UCSF Alzheimer’s Disease Research Center. 
“One thing that we found in our large study is that by the time people are diagnosed, they’ve had [the disease] for quite a few years.” The study authors said they hope greater awareness of the syndrome will help doctors diagnose it earlier and will encourage researchers to include patients with PCA in future Alzheimer’s clinical trials. Unusual symptoms delay diagnosis

Keyword: Alzheimers; Vision
Link ID: 29107 - Posted: 01.23.2024

By Jaimie Seaton It’s not uncommon for Veronica Smith to be looking at her partner’s face when suddenly she sees his features changing—his eyes moving closer together and then farther apart, his jawline getting wider and narrower, and his skin moving and shimmering. Smith, age 32, has experienced this phenomenon when looking at faces since she was four or five years old, and while it’s intermittent when she’s viewing another person’s face, it’s more constant when she views her own. “I almost always experience it when I look at my own face in the mirror, which makes it really hard to get ready because I’ll think that I look weird,” Smith explains. “I can more easily tell that I’m experiencing distortions when I’m looking at other people because I know what they look like.” Smith has a rare condition called prosopometamorphopsia (PMO), in which faces appear distorted in shape, texture, position or color. (PMO is related to Alice in Wonderland syndrome, or AIWS, which distorts the size perception of objects or one’s own body.) PMO has fascinated many scientists. The late neurologist and writer Oliver Sacks co-wrote a paper on the condition that was published in 2014, the year before he died. Brad Duchaine, a professor of psychological and brain sciences at Dartmouth College, explains that some people with it see distortions that affect the whole face (bilateral PMO) while others see only the left or right half of a face as distorted (hemi-PMO). “Not surprisingly, people with PMO find the distortions extremely distressing. Over the last century, approximately 75 cases have been reported in the literature. However, little is known about the condition because cases with face distortions have usually been documented by neurologists who don’t have expertise in visual neuroscience or the time to study the cases in depth,” Duchaine says. 
For 25 years Duchaine’s work has focused on prosopagnosia (face blindness), but after co-authoring a study on hemi-PMO that was published in 2020, Duchaine shifted much of his lab’s work to PMO. © 2023 SCIENTIFIC AMERICAN,

Keyword: Attention; Vision
Link ID: 29051 - Posted: 12.16.2023

By Roberta McLain Dreams have fascinated people for millennia, yet we struggle to understand their purpose. Some theories suggest dreams help us deal with emotions, solve problems or manage hidden desires. Others postulate that they clean up brain waste, make memories stronger or deduce the meaning of random brain activity. A more recent theory suggests nighttime dreams protect visual areas of the brain from being co-opted during sleep by other sensory functions, such as hearing or touch. David Eagleman, a neuroscientist at Stanford University, has proposed the idea that dreaming is necessary to safeguard the visual cortex—the part of the brain responsible for processing vision. Eagleman’s theory takes into account that the human brain is highly adaptive, with certain areas able to take on new tasks, an ability called neuroplasticity. He argues that neurons compete for survival. The brain, Eagleman explains, distributes its resources by “implementing a do-or-die competition” for brain territory in which sensory areas “gain or lose neural territory when inputs slow, stop or shift.” Experiences over a lifetime reshape the map of the brain. “Just like neighboring nations, neurons stake out their territory and chronically defend them,” he says. Eagleman points to children who have had half their brain removed because of severe health problems and then regain normal function. The remaining brain reorganizes itself and takes over the roles of the missing sections. Similarly, people who lose sight or hearing show heightened sensitivity in the remaining senses because the region of the brain normally used by the lost sense is taken over by other senses. Reorganization can happen fast. Studies published in 2007 and 2008 by Lotfi Merabet of Harvard Medical School and his colleagues showed just how quickly this takeover can happen. The 2008 study, in which subjects were blindfolded, revealed that the seizing of an idle area by other senses begins in as little as 90 minutes. 
And other studies found that this can occur within 45 minutes. When we sleep, we can smell, hear and feel, but visual information is absent—except during REM sleep. © 2023 SCIENTIFIC AMERICAN,

Keyword: Sleep; Vision
Link ID: 29045 - Posted: 12.13.2023

By Meghan Rosen In endurance athletes, some brain power may come from an unexpected source. Marathon runners appear to rely on myelin, the fatty tissue bundled around nerve fibers, for energy during a race, scientists report in a paper posted online October 10. In the day or two following a marathon, this tissue seems to dwindle drastically, brain scans of runners reveal. Two weeks after the race, the brain fat bounces back to nearly prerace levels. The find suggests that the athletes burn so much energy running that they need to tap into a new fuel supply to keep the brain operating smoothly. “This is definitely an intriguing observation,” says Mustapha Bouhrara, a neuroimaging scientist at the National Institute on Aging in Baltimore. “It is quite plausible that myelin lipids are used as fuel in extended exercise.” If what the study authors are seeing is real, he says, the work could have therapeutic implications. Understanding how runners’ myelin recovers so rapidly might offer clues for developing potential treatments — like for people who’ve lost myelin due to aging or neurodegenerative disease. Much of the human brain contains myelin, tissue that sheathes nerve fibers and acts as an insulator, like rubber coating an electrical wire. That insulation lets electrical messages zip from nerve cell to nerve cell, allowing high-speed communication that’s crucial for brain function. The fatty tissue seems to be a straightforward material with a straightforward job, but there’s likely more to it than that, says Klaus-Armin Nave, a neurobiologist at the Max Planck Institute for Multidisciplinary Sciences in Göttingen, Germany. “For the longest time, it was thought that myelin sheathes were assembled, inert structures of insulation that don’t change much after they’re made,” he says. Today, there’s evidence that myelin is a dynamic structure, growing and shrinking in size and abundance depending on cellular conditions. The idea is called myelin plasticity. 
“It’s hotly researched,” Nave says. © Society for Science & the Public 2000–2023.

Keyword: Glia; Multiple Sclerosis
Link ID: 28983 - Posted: 11.01.2023

By Jacqueline Howard and Deidre McPhillips Most families of children with autism may face long wait times to diagnose their child with the disorder, and once a diagnosis is made, it sometimes may not be definitive. But now, two studies released Tuesday suggest that a recently developed eye-tracking tool could help clinicians diagnose children as young as 16 months with autism – and with more certainty. “This is not a tool to replace expert clinicians,” said Warren Jones, director of research at the Marcus Autism Center at Children’s Healthcare of Atlanta and Nien Distinguished Chair in Autism at Emory University School of Medicine, who was an author on both studies. Rather, he said, the hope with this eye-tracking technology is that “by providing objective measurements that objectively measure the same thing in each child,” it can help inform the diagnostic process. The tool, called EarliPoint Evaluation, is cleared by the US Food and Drug Administration to help clinicians diagnose and assess autism, according to the researchers. Traditionally, children are diagnosed with autism based on a clinician’s assessment of their developmental history, behaviors and parents’ reports. Evaluations can take hours, and some subtle behaviors associated with autism may be missed, especially among younger children. “Typically, the way we diagnose autism is by rating our impressions,” said Whitney Guthrie, a clinical psychologist and scientist at the Children’s Hospital of Philadelphia’s Center for Autism Research. She was not involved in the new studies, but her research focuses on early diagnosis of autism.

Keyword: Autism; Schizophrenia
Link ID: 28904 - Posted: 09.13.2023

Jean Bennett Gene therapy is a set of techniques that harness DNA or RNA to treat or prevent disease. Gene therapy treats disease in three primary ways: by substituting a disease-causing gene with a healthy new or modified copy of that gene; turning genes on or off; and injecting a new or modified gene into the body. How has gene therapy changed how doctors treat genetic eye diseases and blindness? In the past, many doctors did not think it necessary to identify the genetic basis of eye disease because treatment was not yet available. However, a few specialists, including me and my collaborators, identified these defects in our research, convinced that someday treatment would be made possible. Over time, we were able to create a treatment designed for individuals with particular gene defects that lead to congenital blindness. This development of gene therapy for inherited disease has inspired other groups around the world to initiate clinical trials targeting other genetic forms of blindness, such as choroideremia, achromatopsia, retinitis pigmentosa and even age-related macular degeneration, all of which lead to vision loss. There are at least 40 clinical trials enrolling patients with other genetic forms of blinding disease. Gene therapy is even being used to restore vision to people whose photoreceptors – the cells in the retina that respond to light – have completely degenerated. This approach uses optogenetic therapy, which aims to revive those degenerated photoreceptors by adding light-sensing molecules to cells, thereby drastically improving a person’s vision. © 2010–2023, The Conversation US, Inc.

Keyword: Vision; Genes & Behavior
Link ID: 28781 - Posted: 05.13.2023

A National Institutes of Health team has identified a compound already approved by the U.S. Food and Drug Administration that keeps light-sensitive photoreceptors alive in three models of Leber congenital amaurosis type 10 (LCA 10), an inherited retinal ciliopathy disease that often results in severe visual impairment or blindness in early childhood. LCA 10 is caused by mutations of the cilia-centrosomal gene (CEP290). Such mutations account for 20% to 25% of all LCA – more than any other gene. In addition to LCA, CEP290 mutations can cause multiple syndromic diseases involving a range of organ systems. Using a mouse model of LCA10 and two types of lab-created tissues from stem cells known as organoids, the team screened more than 6,000 FDA-approved compounds to identify ones that promoted survival of photoreceptors, the types of cells that die in LCA, leading to vision loss. The high-throughput screening identified five potential drug candidates, including Reserpine, an old medication previously used to treat high blood pressure. Observation of the LCA models treated with Reserpine shed light on the underlying biology of retinal ciliopathies, suggesting new targets for future exploration. Specifically, the models showed a dysregulation of autophagy, the process by which cells break down old or abnormal proteins, which in this case resulted in abnormal primary cilia, a microtubule organelle that protrudes from the surface of most cell types. In LCA10, CEP290 gene mutations cause dysfunction of the primary cilium in retinal cells. Reserpine appeared to partially restore autophagy, resulting in improved primary cilium assembly.

Keyword: Vision
Link ID: 28720 - Posted: 03.29.2023

By Jack Tamisiea Even a fisher’s yarn would sell a whale shark short. These fish—the biggest on the planet—stretch up to 18 meters long and weigh as much as two elephants. The superlatives don’t end there: Whale sharks also have one of the longest vertical ranges of any sea creature, filter feeding from the surface of the ocean to nearly 2000 meters down into the inky abyss. Swimming between bright surface waters and the pitch black deep sea should strain the shark’s eyes, making their lifestyle impossible. But researchers have now uncovered the genetic wiring that prevents this from happening. The study, published this week in the Proceedings of the National Academy of Sciences, pinpoints a genetic mutation that makes a visual pigment in the whale shark’s retina more sensitive to temperature changes. As a result, the pigments—which sense blue light in dark environments—are activated in the chilly deep sea and deactivated when the sharks return to the balmy surface to feed, allowing them to prioritize different parts of their vision at different depths. Ironically, the genetic alteration is surprisingly similar to one that degrades pigments in human retinas, causing night blindness. It remains unclear why whale sharks dive so deep. Because prey is scarce at these depths, the behavior may be linked to mating. But whatever they do, the sharks rely on a light-sensing pigment in their retinas called rhodopsin to navigate the dark waters. Although the pigments are less useful in sunny habitats, they help many vertebrates, including humans, detect light in dim environments. In the deep sea, the rhodopsin pigments in whale shark eyes are specifically calibrated to see blue light—the only color that reaches these depths. Previous research has revealed bottom-dwelling cloudy catsharks (Scyliorhinus torazame) have similarly calibrated pigments in their eyes to spot blue light. 
But these small sharks are content in the deep, making whale sharks the only known sharks to sport these pigments in the shallows. In lighter waters, these blue light–sensing pigments could act as a hindrance to seeing other kinds of light, but whale sharks are still able to maneuver with ease as they vacuum up seafood.

Keyword: Vision; Genes & Behavior
Link ID: 28719 - Posted: 03.25.2023

By Allison Parshall Functional magnetic resonance imaging, or fMRI, is one of the most advanced tools for understanding how we think. As a person in an fMRI scanner completes various mental tasks, the machine produces mesmerizing and colorful images of their brain in action. Looking at someone’s brain activity this way can tell neuroscientists which brain areas a person is using but not what that individual is thinking, seeing or feeling. Researchers have been trying to crack that code for decades—and now, using artificial intelligence to crunch the numbers, they’ve been making serious progress. Two scientists in Japan recently combined fMRI data with advanced image-generating AI to translate study participants’ brain activity back into pictures that uncannily resembled the ones they viewed during the scans. The original and re-created images can be seen on the researchers’ website. “We can use these kinds of techniques to build potential brain-machine interfaces,” says Yu Takagi, a neuroscientist at Osaka University in Japan and one of the study’s authors. Such future interfaces could one day help people who currently cannot communicate, such as individuals who outwardly appear unresponsive but may still be conscious. The study was recently accepted for presentation at the 2023 Conference on Computer Vision and Pattern Recognition. It has made waves online since it was posted as a preprint (meaning it has not yet been peer-reviewed or published) in December 2022. Online commentators have even compared the technology to “mind reading.” But that description overstates what this technology is capable of, experts say. “I don’t think we’re mind reading,” says Shailee Jain, a computational neuroscientist at the University of Texas at Austin, who was not involved in the new study. “I don’t think the technology is anywhere near to actually being useful for patients—or to being used for bad things—at the moment. But we are getting better, day by day.”
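The general decoding idea, learning a map from fMRI voxel patterns to image features, can be sketched with synthetic data. Everything below is invented for illustration; the actual study pairs such a decoder with a generative image model that turns the decoded features into pictures.

```python
import numpy as np

# Invented, self-contained sketch: simulate voxel responses that are a
# noisy linear function of image features, then fit a ridge-regression
# decoder that recovers feature vectors from the brain activity.
rng = np.random.default_rng(0)
n_scans, n_voxels, n_features = 100, 500, 10

true_map = rng.normal(size=(n_features, n_voxels))
features = rng.normal(size=(n_scans, n_features))      # image features
voxels = features @ true_map + 0.1 * rng.normal(size=(n_scans, n_voxels))

# Ridge regression, closed form: W = (X^T X + lam*I)^(-1) X^T Y
lam = 1.0
W = np.linalg.solve(voxels.T @ voxels + lam * np.eye(n_voxels),
                    voxels.T @ features)

decoded = voxels @ W   # decoded feature vectors, one per scan
```

The regularization term (`lam`) matters because fMRI datasets typically have far more voxels than scans; without it the normal equations would be singular.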

Keyword: Vision; Brain imaging
Link ID: 28708 - Posted: 03.18.2023

Ari Daniel Fred Crittenden, 73, lost his sight to retinitis pigmentosa when he was 35 years old. Today he has no visual perception of light. "It's total darkness," he says. Still, he has cells in his eyes that use light to keep his internal clock ticking along nicely. Every baseball season, 73-year-old Fred Crittenden plants himself in front of his television in his small one-bedroom apartment an hour north of Toronto. "Oh, I love my sports — I love my Blue Jays," says Crittenden. "They need me to coach 'em — they'd be winning, I'll tell ya." He listens to the games in his apartment. He doesn't watch them, because he can't see. "I went blind," Crittenden recalls, when "I was 35 years young." Crittenden has retinitis pigmentosa, an inherited condition that led to the deterioration of his retinas. He lost all his rods (the cells that help us see in dim light) and all his cones (the cells that let us see color in brighter light). Within a single year, in 1985, Crittenden says he went from perfect vision to total blindness. Certain cells within Crittenden's retinas that contain melanopsin help his brain to detect light, even if what he sees is darkness. Among other things, these light-detecting cells help his body regulate his sleep cycles. "The last thing I saw clearly," he says, thinking back, "it was my daughter, Sarah. She was 5 years old then. I used to go in at night and just look at her when she was in the crib. And I could just barely still make her out — her little eyes or her nose or her lips or her chin, that kind of stuff. Even to this day it's hard." © 2022 npr

Keyword: Biological Rhythms; Vision
Link ID: 28598 - Posted: 12.17.2022

By Yasemin Saplakoglu Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it? “The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.” Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level. When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing. Simons Foundation © 2022

Keyword: Attention; Vision
Link ID: 28597 - Posted: 12.15.2022

By Lisa Mulcahy If you’ve ever had your vision “white out” (or “gray out”), you’ve probably felt a little unnerved by the experience. “You’ll see a bright light, and your vision will go pale,” says Teri K. Geist, an optometrist and trustee of the American Optometric Association. As disconcerting as they are, vision whiteouts are usually benign. Making sure, though, means talking with a physician or optometrist. Before you do, here are some things to consider. If you have recurrent whiteouts, counting their duration in real time can help get you the correct diagnosis. Note any specific details the whiteouts appear to have in common. Do they happen right after you stand up from a chair, for example? Most often, whiteouts occur when a person is ready to pass out because of a sudden drop in blood pressure. About 1 in 3 people will faint at some point in their lives. “Fainting can be benign when it’s related to a sudden stress,” says Sarah Thornton, a neuro-ophthalmologist at Wills Eye Hospital in Philadelphia. “Standing up too fast, overexerting, becoming dehydrated or taking certain medications can also lead to hypotension — low blood pressure — and potentially, a whiteout.” A less common risk: “Whiteouts can occur with changes in G force,” says Geist, for instance, in a car accident or on a roller coaster. A whiteout caused by physical stress or exertion will clear within just a few minutes. Although fainting is usually benign, always tell your doctor if you’ve fainted — occasionally, whiteouts and fainting are tied to something serious. “An underlying heart condition, such as aortic stenosis, could cause fainting symptoms, including whiteout,” says Dean M. Cestari, a neuro-ophthalmologist at Mass General Brigham Mass Eye and Ear in Boston and associate professor of ophthalmology at Harvard Medical School. Other such conditions can include arrhythmias, heart failure and atrial fibrillation.

Keyword: Vision
Link ID: 28568 - Posted: 11.30.2022

By Nicola Jones What color is a tree, or the sky, or a sunset? At first glance, the answers seem obvious. But it turns out there is plenty of variation in how people see the world — both between individuals and between different cultural groups. A lot of factors feed into how people perceive and talk about color, from the biology of our eyes to how our brains process that information, to the words our languages use to talk about color categories. There’s plenty of room for differences, all along the way. For example, most people have three types of cones — light receptors in the eye that are optimized to detect different wavelengths or colors of light. But sometimes, a genetic variation can cause one type of cone to be different, or absent altogether, leading to altered color vision. Some people are color-blind. Others may have color superpowers. Our sex can also play a role in how we perceive color, as well as our age and even the color of our irises. Our perception can change depending on where we live, when we were born and what season it is. To learn more about individual differences in color vision, Knowable Magazine spoke with visual neuroscientist Jenny Bosten of the University of Sussex in England, who wrote about the topic in the 2022 Annual Review of Vision Science. This conversation has been edited for length and clarity. How many colors are there in the rainbow? Physically, the rainbow is a continuous spectrum. The wavelengths of light vary smoothly between two ends within the visible range. There are no lines, no sharp discontinuities. The human eye can discriminate far more than seven colors within that range. But in our culture, we would say that we see seven color categories in the rainbow: red, orange, yellow, green, blue, indigo and violet. That’s historical and cultural. © 2022 Annual Reviews
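The link between a handful of cone types and fine hue discrimination can be sketched with a toy model: each wavelength of light produces a distinct triplet of cone responses, and it is the pattern across cones, not any single response, that encodes hue. The Gaussian sensitivity curves below are rough illustrative stand-ins, not measured cone fundamentals.

```python
import math

# Rough Gaussian stand-ins for human cone sensitivities. The peak
# wavelengths are approximate; the shared width is invented.
CONE_PEAKS_NM = {"S": 420.0, "M": 530.0, "L": 560.0}
WIDTH_NM = 40.0

def cone_responses(wavelength_nm):
    """Relative response of each cone type to monochromatic light."""
    return {
        cone: math.exp(-((wavelength_nm - peak) / WIDTH_NM) ** 2)
        for cone, peak in CONE_PEAKS_NM.items()
    }

# Nearby wavelengths yield distinct (S, M, L) triplets, which is what
# lets a trichromat tell the two hues apart. A genetic variation that
# removes or shifts one curve collapses some of these distinctions.
print(cone_responses(500.0))
print(cone_responses(520.0))
```

In this picture, the continuous rainbow maps onto a continuous curve of response triplets; the seven named bands are a cultural partition of that curve, as the interview notes.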

Keyword: Vision
Link ID: 28531 - Posted: 10.28.2022

By Yuta Senzai and Massimo Scanziani Do the rapid eye movements you make during sleep reveal where you’re looking in the scenery of your dreams, or are they simply the result of random jerks of your eye muscles? Since the discovery of REM sleep in the early 1950s, the significance of these rapid eye movements has intrigued and fascinated scores of scientists, psychologists and philosophers. REM sleep, as the name implies, is a period of sleep when your eyes move under your closed eyelids. It’s also the period when you experience vivid dreams. We are researchers who study how the brain processes sensory information during wakefulness and sleep. In our recently published study, we found that the eye movements you make while you sleep may reflect where you’re looking in your dreams. Past studies have attempted to address this question by monitoring the eye movements of people as they slept and waking them up to ask what they were dreaming. The goal was to find a possible connection between the content of a dream just before waking up (say, a car coming in from the left) and the direction the eyes moved at that moment. Unfortunately, these studies have led to contradictory results. It could be that some participants inaccurately reported dreams, and it’s technically difficult to match a given eye movement to a specific moment in a self-reported dream. We decided to bypass the problem of dream self-reporting. Instead, we used a more objective way to measure dreams: the electrical activity of a sleeping mouse brain. Mice, like humans and many other animals, also experience REM sleep. Additionally, they have a sort of internal compass in their brains that gives them a sense of head direction. When the mouse is awake and running around, the electrical activity of this internal compass precisely reports its head direction, or “heading,” as it moves in its environment. © 2010–2022, The Conversation US, Inc.

Keyword: Sleep; Vision
Link ID: 28453 - Posted: 08.27.2022

By Betsy Mason 08.05.2022 What is special about humans that sets us apart from other animals? Less than some of us would like to believe. As scientists peer more deeply into the lives of other animals, they’re finding that our fellow creatures are far more emotionally, socially, and cognitively complex than we typically give them credit for. A deluge of innovative research is revealing that behavior we would call intelligent if humans did it can be found in virtually every corner of the animal kingdom. Already this year scientists have shown that Goffin’s cockatoos can use multiple tools at once to solve a problem, Australian Magpies will cooperate to remove tracking devices harnessed to them by scientists, and a small brown songbird can sometimes keep time better than the average professional musician — and that’s just among birds. This pileup of fascinating findings may be at least partly responsible for an increase in people’s interest in the lives of other animals — a trend that’s reflected in an apparent uptick in books and television shows on the topic, as well as in legislation concerning other species. Public sentiment in part pushed the National Institutes of Health to stop supporting biomedical research on chimpanzees in 2015. In Canada, an outcry led to a ban in 2019 on keeping cetaceans like dolphins and orcas in captivity. And earlier this year, the United Kingdom passed an animal welfare bill that officially recognizes that many animals are sentient beings capable of suffering, including invertebrates like octopuses and lobsters. Many of these efforts are motivated by human empathy for animals we’ve come to see as intelligent, feeling beings like us, such as chimpanzees and dolphins. But how can we extend that concern to the millions of other species that share the planet with us?

Keyword: Vision; Hearing
Link ID: 28447 - Posted: 08.27.2022

By Fionna M. D. Samuels, Liz Tormes Experiencing art, whether through melody or oil paint, elicits in us a range of emotions. This speaks to the innate entanglement of art and the brain: Mirror neurons can make people feel like they are physically experiencing a painting. And listening to music can change their brain chemistry. For the past 11 years, the Netherlands Institute for Neuroscience in Amsterdam has hosted the annual Art of Neuroscience Competition and explored this intersection. This year’s competition received more than 100 submissions, some created by artists inspired by neuroscience and others by neuroscientists inspired by art. The top picks explore a breadth of ideas—from the experience of losing consciousness to the importance of animal models in research—but all of them tie back to our uniquely human brain. In the moment between wakefulness and sleep, we may feel like we are losing ourselves to the void of unconsciousness. This is the moment Daniela de Paulis explores with her interdisciplinary project Mare Incognito. “I always had a fascination for the moment of falling asleep,” she says. “Since I was a very small child, I always found this moment as quite transformative, also quite frightening in a way.” The winning Art of Neuroscience submission is the culmination of her project: a film that recorded de Paulis falling asleep among the silver, treelike antennas of the Square Kilometer Array at the Mullard Radio Observatory in Cambridge, England, while her brain activity was converted into radio waves and transmitted directly into space. “We combined the scientific interest with my poetic fascination in this idea of losing consciousness,” she says. In the clip above, Tristan Bekinschtein, a neuroscientist at the University of Cambridge, explains the massive change humans and their brain experience when they drift from consciousness into sleep. As someone falls asleep, their brain activity slows down in stages until they are fully out. Then bursts of activity light up their gray matter as their brain switches over to rapid eye movement (REM) sleep, and they begin to dream. © 2022 Scientific American,

Keyword: Vision; Brain imaging
Link ID: 28446 - Posted: 08.27.2022

By Diana Kwon During an embryo's development, a piece of the still-growing brain branches off to form the retina, a sliver of tissue in the back of the eye. This makes the retina, which is composed of several layers of neurons, a piece of the central nervous system. As evidence builds that changes in the brain can manifest in this region, scientists are turning to retinas as a potential screening target for early signs of Alzheimer's, an incurable neurodegenerative disease that affects an estimated six million people in the U.S. alone. Initially clinicians could diagnose Alzheimer's only through brain autopsies after patients died. Since the early 2000s, however, research advances have made it possible to pinpoint signs of the disease—and to begin to investigate treatment—years before symptoms first appear. Today positron emission tomography (PET) brain imaging and tests of cerebrospinal fluid (CSF), the clear liquid surrounding the brain and spinal cord, aid Alzheimer's diagnosis at its early stages. “There have been tremendous improvements in our ability to detect early disease,” says Peter J. Snyder, a neuropsychologist and neuroscientist at the University of Rhode Island. But these diagnostic methods are not always readily available, and they can be expensive and invasive. PET imaging requires injecting a radioactive tracer molecule into the bloodstream, and spinal fluid must be extracted with a needle inserted between vertebrae in the back. “We need ways of funneling the right high-risk individuals into the diagnostic process with low-cost screening tools that are noninvasive and simple to administer,” Snyder says. The retina is a particularly attractive target, he adds, because it is closely related to brain tissue and can be examined noninvasively through the pupil, including with methods routinely used to check for eye diseases. © 2022 Scientific American,

Keyword: Alzheimers; Vision
Link ID: 28442 - Posted: 08.24.2022