Most Recent Links
By Diana Kwon Your brain is a bit like a concert hall. To drive our cognitive processes, several groups of neurons need to become active—and, like the various sections of an orchestra, work in harmony to produce the symphony of computations that allow us to perceive and interact with our surroundings. As with an orchestra, the brain likely requires a conductor to keep all its active parts in sync. There are neuroscientists who think that gamma rhythms, fast brain waves that fluctuate at a frequency of approximately 40 cycles per second, play this role. By ticking at regular intervals, these oscillations are thought to act like a clock that coordinates information transfer from one group of neurons to another. There is ample evidence suggesting that gamma waves are important for the brain's computations: decades of studies in humans and other animals have found these patterns in many parts of the brain and have associated them with a range of cognitive processes, such as attention and the mental scratchpad of working memory. Some studies have even linked disturbances in these oscillations to various neurological diseases, including schizophrenia and Alzheimer's. But a consensus does not exist. Some neuroscientists think that these gamma waves may not do much at all. Rather than a relevant physiological signal, one camp believes that these rhythms are simply “an exhaust fume of computation,” says Chris Moore, a neuroscientist at the Carney Institute for Brain Science at Brown University. In the same way your car releases emissions each time you drive it, the gamma signal could be perfectly correlated with brain activity yet contribute nothing meaningful to the actual function of the car, he explains. © 2019 Scientific American
Keyword: Biological Rhythms; Attention
Link ID: 26429 - Posted: 07.19.2019
By Virginia Morell A bold claim about gorilla societies is drawing mixed reviews. Great apes, humans’ closest evolutionary relatives, were thought to lack our social complexity. Chimpanzees, for example, form only small bands that are aggressive toward strangers. But based on years of watching gorillas gather in food-rich forest clearings, a team of scientists has concluded the apes have hierarchical societies similar to those of humans, perhaps to help them exploit rich troves of food. The finding, reported in the current issue of the Proceedings of the Royal Society B, challenges the prevailing notion that such sophisticated societies evolved relatively recently, after humans split from chimpanzees. Instead, these researchers say, the origins of such social systems extend at least as far back as the common ancestor of humans and gorillas, but were lost in chimpanzees. The group has presented “a pretty convincing case for a hierarchical social structure in gorillas,” says Richard Connor, a cetacean biologist and expert on dolphin society at the University of Massachusetts in Dartmouth. But because other primates that are not great apes—notably baboons, geladas, and colobine monkeys—show similar hierarchies, he’s not surprised they have turned up in gorillas, too. Gorillas spend most of their time in dense forests, travel great distances to a new home spot daily, and are slow to get used to observers, making their social lives hard to study. But western gorillas in the Republic of Congo gather periodically at swampy clearings in the forests to feed primarily on the highly abundant vegetation, but also on favorite and rare foods such as certain fig trees that produce massive amounts of fruit only every 3 to 5 years, says Robin Morrison, a zoologist at the University of Cambridge in the United Kingdom and the study’s lead author. © 2019 American Association for the Advancement of Science.
Keyword: Evolution
Link ID: 26428 - Posted: 07.18.2019
By Tanya Lewis Late on Tuesday evening, Elon Musk, the charismatic and eccentric CEO of SpaceX and Tesla, took to the stage at the California Academy of Sciences to make a big announcement. This time, he was not unveiling a new rocket or electric car but a system for recording the activity of thousands of neurons in the brain. With typical panache, Musk talked about putting this technology into a human brain as early as next year. The work is the product of Neuralink, a company Musk founded in 2016 to develop a high-bandwidth, implantable brain-computer interface (BCI). He says the initial goal is to enable people with quadriplegia to control a computer or smartphone using just their thoughts. But Musk’s vision is much more ambitious than that: he seeks to enable humans to “merge” with AI, giving people superhuman intelligence—an objective that is much more hype than an actual plan for new technology development. [Image: Neuralink prototype device. Credit: Neuralink] On a more practical note, “the goal is to record from and stimulate [signals called] spikes in neurons” with an order of magnitude more bandwidth than what has been done to date and to have it be safe, Musk said at Tuesday’s event, which was livestreamed. The system unveiled last night was a long way from Musk’s sci-fi vision. But it nonetheless marked an impressive technical development. The team says it has now developed arrays with a very large number of “channels”—up to 3,072 flexible electrodes—which can be implanted in the brain’s outer layer, or cortex, using a surgical robot (a version of which was described as a “sewing machine” in a preprint paper posted on bioRxiv earlier this year). The electrodes are packaged in a small, implantable device containing custom-built integrated circuits, which connects to a USB port outside the brain (the team hopes to ultimately make the port wireless). © 2019 Scientific American
Keyword: Brain imaging; Regeneration
Link ID: 26427 - Posted: 07.18.2019
By Brian X. Chen For the last two weeks, I’ve added an extra step to my bedtime routine: strapping a computer around my wrist. The new nightly move was prompted by a cascade of wearable gadgets from companies like Fitbit and Apple, which claim that their sensor-laden bracelets and watches can improve our lives by helping us detect health problems so that we can come up with solutions. For many years, fitness gadgets have measured basic data, like footsteps or calories burned, to motivate us to stay active or shed pounds. Sleep tracking is still a nascent area that tech companies are experimenting with — one that I’ve watched with interest as someone who has been sleep deprived for many years. In the past, I’ve tried several gadgets with sleep tech, including Fitbit watches and Bose’s sleep-aid earbuds. But I hadn’t consistently tracked my sleep habits and patterns before. Would it really make a difference, I wondered, to have this data? Would it help me to sleep better? I decided to test it out. I wore an Apple Watch, since it is one of the most popular health-tracking devices. I also downloaded a top-rated app called AutoSleep, which uses the Apple Watch’s sensors to follow my movements and determine when I fell asleep and woke up. (The Apple Watch lacks a built-in sleep tracker.) AutoSleep duly gathered data on my sleep habits. But the excitement ended there. Ultimately, the technology did not help me sleep more. It didn’t reveal anything that I didn’t already know, which is that I average about five and a half hours of slumber a night. And the data did not help me answer what I should do about my particular sleep problems. In fact, I’ve felt grumpier since I started these tests. © 2019 The New York Times Company
Keyword: Sleep
Link ID: 26426 - Posted: 07.18.2019
Shuai Xu, Arun Jayaraman and John A. Rogers. Thin, soft electronic systems that stick onto skin are beginning to transform health care. Millions of early versions1 of sensors, computers and transmitters woven into flexible films, patches, bandages or tattoos are being deployed in dozens of trials in neurology applications alone2, and their numbers are growing rapidly. Within a decade, many people will wear such sensors all the time. The data they collect will be fed into machine-learning algorithms to monitor vital signs, spot abnormalities and track treatments. Medical problems will be revealed earlier. Doctors will monitor their patients’ recovery remotely while patients are at home, and intervene if their condition deteriorates. Epidemic spikes will be flagged quickly, allowing authorities to mobilize resources, identify vulnerable populations and monitor the safety and efficacy of the drugs issued. All of this will make health care more predictive, safe and efficient. Where are we now? The first generation of biointegrated sensors can track biophysical signals, such as cardiac rhythms, breathing, temperature and motion3. More advanced systems are emerging that can track certain biomarkers (such as glucose) as well as actions such as swallowing and speech. Small companies are commercializing soft biosensor systems that measure clinical data continuously. These include Vital Connect in San Jose, California; iRhythm in San Francisco, California; MC10 in Lexington, Massachusetts; and Sibel Health in Evanston, Illinois. For example, iRhythm’s single-use Zio patch monitors electrical pulses from the heart for 14 days, and is more effective than intermittent hospital check-ups at detecting abnormal rhythms4. But it is bulky and temporary, and the data must be downloaded after use, rather than transmitted in real time.
Keyword: Pain & Touch; Robotics
Link ID: 26425 - Posted: 07.18.2019
By Jessica Hamzelou Anorexia nervosa isn’t just a psychiatric condition – it is a metabolic one, too, according to a genetic study of around 72,500 people. The findings help to explain some of the symptoms of anorexia, and could help to shape future treatments. Anorexia affects between 0.9 and 4 per cent of women and 0.3 per cent of men, but is still poorly understood. “Anorexia has the highest mortality rate of any psychiatric disorder,” says Cynthia Bulik at the University of North Carolina at Chapel Hill. “We’re not very good at treating anorexia. There’s no medication, and that’s probably because we don’t understand the underlying causes.” Previous research has found that genetic factors, as well as environmental ones, can increase a person’s risk of anorexia. To investigate, Bulik and her colleagues compared the genomes of just under 17,000 people with anorexia with those of 55,500 people who didn’t have the condition. The team used a technique that applies thousands of markers to the genome, and compares these markers across all the volunteers. “It points you to where in the genome the differences lie,” says Bulik. The search pinpointed eight locations across the genome that seem to play a role in anorexia. But this is likely to represent only a tiny fraction of all the genetic factors involved in the condition, says Bulik. “It’s a complex trait, so we expect lots of genes to each have a small to moderate effect,” she says. © Copyright New Scientist Ltd.
Keyword: Anorexia & Bulimia
Link ID: 26424 - Posted: 07.16.2019
Nicola Davis The belief that men are more likely to get turned on by sexual images than women may be something of a fantasy, according to a study suggesting brains respond to such images the same way regardless of biological sex. The idea that, when it comes to sex, men are more “visual creatures” than women has often been used to explain why men appear to be so much keener on pornography. But the study casts doubt on the notion. “We are challenging that idea with this paper,” said Hamid Noori, co-author of the research from the Max Planck Institute for Biological Cybernetics in Germany. “At least at the level of neural activity … the brains of men and women respond the same way to porn.” Writing in the Proceedings of the National Academy of Sciences, Noori and his colleagues report how they came to their conclusions by analysing the results of 61 published studies involving adults of different biological sex and sexual orientation. The subjects were shown everyday images of people as well as erotic images while they lay inside a brain-scanning machine. Noori said all participants rated the sexual images as arousing before being scanned. Previous studies based on self-reporting have suggested men are more aroused by images than women, and it has been proposed that these differences could be down to the way the brain processes the stimuli – but such studies have returned conflicting results. Now, looking at the whole body of research, Noori and his colleagues say they have found little sign of functional differences. For both biological sexes, a change in activity was seen in the same brain regions including the amygdala, insula and striatum when sexual images were shown. © 2019 Guardian News & Media Limited
Keyword: Sexual Behavior
Link ID: 26423 - Posted: 07.16.2019
Laura Sanders A praying mantis depends on precision targeting when hunting insects. Now, scientists have identified nerve cells that help calculate the depth perception required for these predators’ surgical strikes. In addition to providing clues about insect vision, the principles of these cells’ behavior, described June 28 in Nature Communications, may also lead to advances in robot vision or other automated systems. So far, praying mantises are the only insects known to be able to see in 3-D. In the new study, neuroscientist Ronny Rosner of Newcastle University in England and colleagues used a tiny theater that played praying mantises’ favorite films — moving disks that mimic bugs. The disks appeared in three dimensions because the insects’ eyes were covered with different colored filters, creating minuscule 3-D glasses. As a praying mantis watched the films, electrodes monitored the behavior of individual nerve cells in the optic lobe, a brain structure responsible for many aspects of vision. There, researchers found four types of nerve cells that seem to help merge the two different views from each eye into a complete 3-D picture, a skill that human vision cells use to sense depth, too. One cell type called a TAOpro neuron possesses three elaborate, fan-shaped bundles that receive incoming visual information. Along with the three other cell types, TAOpro neurons are active when each eye’s view of an object is different, a mismatch that’s needed for depth perception. |© Society for Science & the Public 2000 - 2019.
Keyword: Vision
Link ID: 26422 - Posted: 07.16.2019
By Elizabeth Pennisi PROVIDENCE—Looking a squid in the eye is eerily like looking in a mirror. Squids, octopuses, and other cephalopods are on a very different part of the tree of life from vertebrates. But both have evolved sophisticated peepers that rely on a lens to focus light and provide excellent vision. This independent evolution of such complexity has puzzled biologists for centuries and has prompted searches for clues about how this might have come about. Evolutionary developmental biologists have now discovered that the genes that guide the initial formation of legs in us and other vertebrates also guide the formation of the squid’s lens (seen in cross section of eye above). The find is yet another example of how nature recruits genes used for one purpose to do another job for the body. The squid lens forms as extra-long membranes jutting out from specialized eye cells overlap to form a tight ball. Our lenses are actually degraded cells themselves, packed with a clear protein. To learn how the squid lenses form, these researchers carefully tracked where, when, and which genes turn on and off as embryos of Doryteuthis pealeii, a squid commonly served as fried appetizers, develop. © 2019 American Association for the Advancement of Science
Keyword: Vision; Evolution
Link ID: 26421 - Posted: 07.16.2019
By Ryan D'Agostino If you have a son, you have a one-in-seven chance that he has been diagnosed with ADHD. If you have a son who has been diagnosed, it's more than likely that he has been prescribed a stimulant—the most famous brand names are Ritalin and Adderall; newer ones include Vyvanse and Concerta—to deal with the symptoms of that psychiatric condition. The Drug Enforcement Administration classifies stimulants as Schedule II drugs, defined as having a "high potential for abuse" and "with use potentially leading to severe psychological or physical dependence." (According to a University of Michigan study, Adderall is the most abused brand-name drug among high school seniors.) In addition to stimulants like Ritalin, Adderall, Vyvanse, and Concerta, Schedule II drugs include cocaine, methamphetamine, Demerol, and OxyContin. According to manufacturers of ADHD stimulants, they are associated with sudden death in children who have heart problems, whether those heart problems have been previously detected or not. They can bring on a bipolar condition in a child who didn't exhibit any symptoms of such a disorder before taking stimulants. They are associated with "new or worse aggressive behavior or hostility." They can cause "new psychotic symptoms (such as hearing voices and believing things that are not true) or new manic symptoms." They commonly cause noticeable weight loss and trouble sleeping. In some children, some stimulants can cause the paranoid feeling that bugs are crawling on them. Facial tics. They can cause children's eyes to glaze over, their spirits to dampen. One study reported fears of being harmed by other children and thoughts of suicide. ©2019 Hearst Magazine Media, Inc.
Keyword: ADHD; Drug Abuse
Link ID: 26420 - Posted: 07.15.2019
Researchers at the University of Waterloo say they have developed a new, "kid-friendly" way of diagnosing autism in young children. It uses infrared technology to read the way a child's eyes move as they process the features of a person's face. "A neuro-typical child will spend a whole lot more time looking at the person's — or the face's — eye," Anita Layton, a professor of applied mathematics, pharmacy and biology, told CBC Kitchener-Waterloo. "A [child with autism] will look at the mouth a lot more." Layton and her team developed the technique by showing a group of 40 children 44 photographs on a screen connected to their eye-tracking device. The children were all around 5 years old. Seventeen had been previously diagnosed as on the spectrum; the other 23 were considered neuro-typical. The difference in eye tracking has been well documented, Layton said. Her team found a way to turn that difference into a diagnostic tool that works well for young children and even non-verbal kids on the more complicated end of the spectrum. Right now, the two most popular ways of diagnosing Autism Spectrum Disorder are to have the child or parent fill out a comprehensive questionnaire or to have the child evaluated by a psychologist. "It's not easy for a child. Imagine a four or five-year-old child, neuro-typical or [autistic], to sit there for a long time, to answer your questions. That simply is no fun for a kid," Layton said. The eye-tracking test, on the other hand, can be done in just a few minutes. ©2019 CBC/Radio-Canada
By Thomas Stackpole The run happened — or didn’t — maybe five days into the raw-diet experiment. I had formed a sort of fitness pact with a friend to forgo cooked food, and after days of nothing but salads, almonds, sashimi and black coffee, my body felt taut and ready for action. And for about half a mile, it was, my strides floating above the pavement as a few fistfuls of raw kale percolated in my belly. Then suddenly I sputtered, feeling an unambiguous alarm go off: Tank is empty, sorry, this is the end of the line. After a pause, I tried running again but made it maybe a block before my legs revolted again and I slowed to a walk. My new healthy diet, it seemed, didn’t accommodate any actual exercise. When I told all this to my co-workers the next morning, it was fodder for a good laugh. My obsessions were — and often still are — a kind of running joke. I’ve been conducting a series of shifting and poorly planned “wellness” experiments on myself for about a decade. I’ve eaten keto, low-carb and sometimes not at all. One time, I ate almost nothing but lean ground turkey and broccoli over greens for maybe two months as part of a YouTube bodybuilder’s plan. More than once, I’ve lost 10 pounds in a week. I’ve also obsessed over bulking up, gaining 25 pounds over about six months of lifting, before pivoting and deciding to train for a marathon to run it off. Then there were the gut biome vitamins, the metabolism-boosting mushrooms, the experiments with LSD microdosing and calorie trackers. Despite years of cycling through boutique insanities, it didn’t occur to me that I might have a problem until earlier this year, when the Twitter founder turned Silicon Valley wellness influencer Jack Dorsey detailed his fasting regimen. The news that he eats one meal a day during the week and nothing on the weekend provoked scornful cries that he was advocating little more than anorexia with a bro-y tech-world veneer. I, on the other hand, saw a kindred spirit. 
© 2019 The New York Times Company
Keyword: Anorexia & Bulimia
Link ID: 26418 - Posted: 07.15.2019
By Susan Berger Julie Staple was a child when her dad, Mark Womack, began exhibiting odd behavior. An award-winning violin, viola and cello maker, Womack was not following through for clients nor returning phone calls promptly. He was watching more TV and taking more breaks from work. He began drinking and was quick to become angry. The behavior lasted years and took its toll. Staple and her mom, Ginny Womack, a professional violinist, thought Mark Womack was depressed. Her parents got divorced. Mark Womack was fired from two jobs making instruments in Nebraska and Texas. There were other disturbing events. A body shop wouldn’t fix his car because he couldn’t recall insurance information. A drive to his parents’ home that normally took two hours took five. And then came a phone call from his boss to the family — Mark Womack was crying and couldn’t remember how to make a violin. The boss took him to a clinic. At age 53, Mark Womack was diagnosed with early onset Alzheimer’s in September 2015. Further evaluation a few months back revealed instead a diagnosis of frontotemporal dementia or FTD. Ginny Womack became his caregiver. “Had my mom known, she would never have divorced him and been his caretaker from the beginning,” Staple, of Deerfield, Ill., said. FTD often is misdiagnosed as a psychiatric disorder or Alzheimer’s. It affects the area of the brain generally associated with personality, behavior and language and is often diagnosed in people between the ages of 40 and 45. About 5.8 million people in the United States are living with Alzheimer’s and dementia, said Heather Snyder, senior director for medical and scientific operations for the Alzheimer’s Association. The number is expected to rise to 14 million by 2050. Approximately 16 million people are caregivers. © 1996-2019 The Washington Post
Keyword: Alzheimers
Link ID: 26417 - Posted: 07.15.2019
By Knvul Sheikh A tropical parasite transmitted through rats and snails has caught the attention of health officials in Hawaii. But few scientists have studied the infection once it makes its way into humans, and researchers can’t say for certain whether the disease is becoming more widespread. The parasite, Angiostrongylus cantonensis, typically resides in a rat’s pulmonary arteries and is commonly known as “rat lungworm.” When its eggs hatch, tiny larvae are shed in the animals’ feces and eaten by snails or slugs. Those slugs, in turn, are often mistakenly eaten by people, on unwashed produce or in drinks that have been left uncovered. Although the larvae can’t grow into adult worms in a human host, they still can cause various complications, including flulike symptoms, headaches, stiff necks and bursts of nerve pain that seem to shift from one part of the body to another. M.R.I. scans suggest that the worms can also wriggle into the brain, leading to eosinophilic meningitis, which in rare cases can cause paralysis. Doctors in the state have noted cases of rat lungworm disease since at least 1959. But it is difficult to diagnose. To better track it, and to identify areas that prevention efforts should target, the Hawaii Department of Health began monitoring rat lungworm infections about a decade ago. From 2007 to 2017, officials tallied 82 cases, two of which resulted in death. Another 10 cases were reported in 2018, and six more have been reported among visitors and residents already this year. The east side of the Big Island, in particular, has become a hot spot for infections, according to a review of cases published Monday in the American Journal of Tropical Medicine and Hygiene. Researchers are not sure why.
Rats may be more numerous there, or more heavily infected, or more likely to cross paths with humans and infect them. Increased awareness about the disease may also have led to more infections being recognized than in the past. © 2019 The New York Times Company
Keyword: Pain & Touch
Link ID: 26416 - Posted: 07.13.2019
By Emily Anthes In 2002, Marin Sardy and her younger brother Tom traveled to a small Costa Rican town for what they hoped would be a low-key beach vacation. The siblings, both in their 20s, planned to spend a few weeks relaxing, learning to surf, and just generally enjoying each other’s company. And then, one day, Tom began to complain about his face. His bones, he said, had detached from each other, and his jaw had separated from his head. He couldn’t get his face back into alignment, he told Sardy. He began to talk — excitedly and cryptically — about “building matrices” and his plans to swim from Alaska to Japan. His facial expressions turned blank. Sardy observed these developments with growing alarm. She and Tom had grown up with a mother whose life had been derailed by schizophrenia, and she was well acquainted with its signs and symptoms. “Memories unfurl inside as I watch Tom,” Sardy writes in her intimate, multigenerational memoir, “The Edge of Every Day: Sketches of Schizophrenia.” “It is as if I already know that doctors and medications and hospitals and our efforts will all fail him.” “The Edge of Every Day” is Sardy’s attempt to come to terms with a fundamentally mysterious disease and how its effects ripple throughout her family. It’s a deeply compassionate book about what it means to love someone who is mentally ill — about how hard it is to truly understand another person’s mind and the importance of continuing to try. Copyright 2019 Undark
Keyword: Schizophrenia
Link ID: 26415 - Posted: 07.13.2019
Katarina Zimmer About two years ago, 29 people visited a neuroscience lab in the Netherlands to sing karaoke. Wearing muffled headphones so they could hear the music but not their own voices, the participants almost inevitably sang “Silent Night” or the Dutch national anthem out of tune. Dutch researchers recorded each individual singing, then played the recording back to him or her. Listening to themselves sing solo evoked feelings of shame and embarrassment and sparked higher-than-normal activity in the subjects’ amygdalae. Fortunately for some study participants, a good night’s sleep was enough to lessen the amygdala’s response when they listened to the recording again the next day. But others who had experienced restless sleep—specifically poor-quality REM, or rapid eye movement, sleep—experienced the opposite: their amygdalae were just as sensitive, if not more, to the recording the next day. The findings suggest that poor-quality REM sleep can interfere with the amygdala’s ability to process emotional memories overnight, the scientists who conducted the study say. They posit that this has implications for people with psychological disorders linked to disturbed REM sleep patterns, such as depression, anxiety, and post-traumatic stress disorder (PTSD). The research appears today (July 11) in Current Biology. © 1986–2019 The Scientist.
Keyword: Sleep; Learning & Memory
Link ID: 26414 - Posted: 07.13.2019
Jon Hamilton In a waiting room at the Banner Alzheimer's Institute in Phoenix, a 74-year-old woman named Rubie is about to find out whether she has a gene that puts her at risk for Alzheimer's. "I'm a little bit apprehensive about it, and I hope I don't have it," she says. "But if I do, I want to be able to plan for my future." The gene is called APOE E4, and it's the most powerful known genetic risk factor for Alzheimer's after age 65. APOE E4 doesn't cause the disease, and many of those who carry it never develop Alzheimer's. Still, about 1 in 4 people who carries a single copy will develop Alzheimer's by 85. Among people who get two copies (one from each parent), up to 55% will develop Alzheimer's by age 85. Rubie is one of several participants in a research study at Banner who agreed to speak both before and after learning their APOE E4 status. The participants are identified only by first name to protect their privacy. Like many people in their 60s and 70s, Rubie has seen dementia up close. "My mother had Alzheimer's in the last stage of her life, and I've got friends and family that have Alzheimer's," she says. "It's a terrible sickness." Rubie wanted to do something to help researchers find a treatment for Alzheimer's. So she volunteered for the Generation Program, which is testing an experimental drug meant to prevent or delay the disease. © 2019 npr
Keyword: Alzheimers
Link ID: 26413 - Posted: 07.13.2019
By Stephen L. Macknik When Susana Martinez-Conde and I talk to audiences about NeuroMagic—our research initiative to study the brain with magic (and vice versa)—people often ask us how we bring both fields together. They want to know in what ways magic tricks can inform neuroscience, and what a day in the life of a neuromagic scientist looks like. How do we run a neuromagic experiment, from collecting the data to using the results to gain knowledge about the mind's inner secrets? Our new study, led by Anthony Barnhart (aka Magic Tony) and just published in the Journal of Eye Movement Research, illustrates some of the ways in which we investigate magic in the lab. You can download the paper for free, but as it is written for academics, I'll give you the gist here. The experiment addresses how various neural circuits interact in your brain while you watch a magic performance. There's the visual system—critical for perception—there's the oculomotor system—critical for targeting and moving the eyes—and there's the attentional system—critical for filtering out irrelevant information and allowing you to literally and figuratively focus both the visual and oculomotor systems at the right place and at the right time. Without all three of these systems working together, you would be unable to conduct most visual tasks. Magic is one of the inroads available to dissect the function of many perceptual and cognitive systems, and especially so in situations that are fairly similar to those we encounter in real life. This concept—ecological validity—is important to testing whether neuroscience theories will hold up outside of the lab, and one of the reasons why magic tricks are attractive for studying everyday perception and cognition. © 2019 Scientific American
Keyword: Vision; Attention
Link ID: 26412 - Posted: 07.13.2019
Partial sight has been restored to six blind people via an implant that transmits video images directly to the brain. Some vision was made possible – with the participants’ eyes bypassed – by a video camera attached to glasses which sent footage to electrodes implanted in the visual cortex of the brain. University College London lecturer and Optegra Eye Hospital surgeon Alex Shortt said it was a significant development by specialists from Baylor Medical College in Texas and the University of California Los Angeles. “Previously all attempts to create a bionic eye focused on implanting into the eye itself. It required you to have a working eye, a working optic nerve,” Shortt told the Daily Mail. “By bypassing the eye completely you open the potential up to many, many more people. “This is a complete paradigm shift for treating people with complete blindness. It is a real message of hope.” The technology has not been proven on those born blind. The US team behind the study asked participants, each of whom has been completely blind for years, to look at a blacked-out computer screen and identify a white square appearing randomly at different locations on the monitor. The majority of the time, they can find the square. © 2019 Guardian News & Media Limited
Keyword: Vision; Robotics
Link ID: 26411 - Posted: 07.13.2019
Tina Hesman Saey No one should have to sleep with the fishes, but new research on zebrafish suggests that we sleep like them. Sleeping zebrafish have brain activity similar to both deep slow-wave sleep and rapid eye movement, or REM, sleep that’s found in mammals, researchers report July 10 in Nature. And the team may have tracked down the cells that kick off REM sleep. The findings suggest that the basics of sleep evolved at least 450 million years ago in zebrafish ancestors, before the evolution of animals that give birth to live young instead of laying eggs. That’s 150 million years earlier than scientists thought when they discovered that lizards sleep like mammals and birds (SN: 5/28/16, p. 9). What’s more, sleep may have evolved underwater, says Louis C. Leung, a neuroscientist at Stanford University School of Medicine. “These signatures [of sleep] really have important functions — even though we may not know what they are — that have survived hundreds of millions of years of evolution.” In mammals, birds and lizards, sleep has several stages characterized by specific electrical signals. During slow-wave sleep, the brain is mostly quiet except for synchronized waves of electrical activity. The heart rate decreases and muscles relax. During REM or paradoxical sleep, the brain lights up with activity almost like it’s awake. But the muscles are paralyzed (except for rapid twitching of the eyes) and the heart beats erratically. |© Society for Science & the Public 2000 - 2019
Keyword: Sleep
Link ID: 26410 - Posted: 07.11.2019