Most Recent Links




Shawna Williams

In 1987, political scientist James Flynn of the University of Otago in New Zealand documented a curious phenomenon: broad intelligence gains in multiple human populations over time. Across 14 countries where decades’ worth of average IQ scores of large swaths of the population were available, all had upward swings—some of them dramatic. Children in Japan, for example, gained an average of 20 points on a test known as the Wechsler Intelligence Scale for Children between 1951 and 1975. In France, the average 18-year-old man performed 25 points better on a reasoning test in 1974 than did his 1949 counterpart.1

Flynn initially suspected the trend reflected faulty tests. Yet in the ensuing years, more data and analyses supported the idea that human intelligence was increasing over time. Proposed explanations for the phenomenon, now known as the Flynn effect, include increasing education, better nutrition, greater use of technology, and reduced lead exposure, to name but four. Beginning with people born in the 1970s, the trend has reversed in some Western European countries, deepening the mystery of what’s behind the generational fluctuations. But no consensus has emerged on the underlying cause of these trends.

A fundamental challenge in understanding the Flynn effect is defining intelligence. At the dawn of the 20th century, English psychologist Charles Spearman first observed that people’s average performance on a variety of seemingly unrelated mental tasks—judging whether one weight is heavier than another, for example, or pushing a button quickly after a light comes on—predicts our average performance on a completely different set of tasks. Spearman proposed that a single measure of general intelligence, g, was responsible for that commonality.

© 1986 - 2018 The Scientist
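For a rough sense of scale, the gains cited above can be converted to IQ points per decade. This is a back-of-the-envelope sketch (the function name and the per-decade framing are illustrative, not from the article):

```python
def gain_per_decade(points: float, start_year: int, end_year: int) -> float:
    """Average IQ-point gain per decade over a given interval."""
    return 10 * points / (end_year - start_year)

# Japan: 20 points on the WISC between 1951 and 1975
print(round(gain_per_decade(20, 1951, 1975), 1))  # 8.3

# France: 25 points on a reasoning test between 1949 and 1974
print(gain_per_decade(25, 1949, 1974))  # 10.0
```

Both figures are well above the roughly three points per decade often quoted for the Flynn effect overall, which is why these particular data sets stood out.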

Keyword: Intelligence; Learning & Memory
Link ID: 25714 - Posted: 11.24.2018

Ashley Yeager

For an hour a day, five days a week, mice in Hiroshi Maejima’s physiology lab at Hokkaido University in Sapporo, Japan, hit the treadmill. The researcher’s goal in having the animals follow the exercise routine isn’t to measure their muscle mass or endurance. He wants to know how exercise affects their brains.

Researchers have long recognized that exercise sharpens certain cognitive skills. Indeed, Maejima and his colleagues have found that regular physical activity improves mice’s ability to distinguish new objects from ones they’ve seen before. Over the past 20 years, researchers have begun to get at the root of these benefits, with studies pointing to increases in the volume of the hippocampus, development of new neurons, and infiltration of blood vessels into the brain. Now, Maejima and others are starting to home in on the epigenetic mechanisms that drive the neurological changes brought on by physical activity.

In October, Maejima’s team reported that the brains of rodents that ran had greater than normal histone acetylation in the hippocampus, the brain region considered the seat of learning and memory.1 The epigenetic marks resulted in higher expression of Bdnf, the gene for brain-derived neurotrophic factor (BDNF). By supporting the growth and maturation of new nerve cells, BDNF is thought to promote brain health, and higher levels of it correlate with improved cognitive performance in mice and humans.

With a wealth of data on the benefits of working out emerging from animal and human studies, clinicians have begun prescribing exercise to patients with neurodegenerative diseases such as Parkinson’s and Alzheimer’s, as well as to people with other brain disorders, from epilepsy to anxiety. Many clinical trials of exercise interventions for neurodegenerative diseases, depression, and even aging are underway. Promising results could bolster the use of exercise as a neurotherapy.

© 1986 - 2018 The Scientist

Keyword: Learning & Memory; Muscles
Link ID: 25713 - Posted: 11.24.2018

Marshall Allen

Last March, Tony Schmidt discovered something unsettling about the machine that helps him breathe at night. Without his knowledge, it was spying on him. From his bedside, the device was tracking when he was using it and sending the information not just to his doctor, but to the maker of the machine, to the medical supply company that provided it and to his health insurer. Schmidt, an information technology specialist from Carrollton, Texas, was shocked. "I had no idea they were sending my information across the wire."

Schmidt, 59, has sleep apnea, a disorder that causes worrisome breaks in his breathing at night. Like millions of people, he relies on a continuous positive airway pressure, or CPAP, machine that streams warm air into his nose while he sleeps, keeping his airway open. Without it, Schmidt would wake up hundreds of times a night; then, during the day, he'd nod off at work, sometimes while driving and even as he sat on the toilet. "I couldn't keep a job," he recalls. "I couldn't stay awake." The CPAP, he says, saved his career, maybe even his life.

As many CPAP users discover, the life-altering device comes with caveats: Health insurance companies are often tracking whether patients use them. If they aren't, the insurers might not cover the machines or the supplies that go with them. And, faced with the popularity of CPAPs — which can cost $400 to $800 — and their need for replacement filters, face masks and hoses, health insurers have deployed a host of tactics that can make the therapy more expensive or even price it out of reach. Patients have been required to rent CPAPs at rates that total much more than the retail price of the devices, or they've discovered that the supplies would be substantially cheaper if they didn't have insurance at all.

© 2018 npr

Keyword: Sleep
Link ID: 25712 - Posted: 11.24.2018

Abby Olena

Mice with faulty circadian clocks are prone to obesity and diabetes. So are mice fed a diet high in fat. Remarkably, animals that have both of these obesity-driving conditions can stay lean and metabolically healthy by simply limiting the time of day when they eat. In a study published today (August 30) in Cell Metabolism, researchers report that restricting feeding times to mice’s active hours can overcome both defective clock genes and an unhealthy diet, a finding that may have an impact in the clinic. The work corroborates previous research showing how powerful restricted feeding can be to improve clock function, says Kristin Eckel-Mahan, a circadian biologist at the University of Texas Health Science Center at Houston who did not participate in the study.

Over the last 20 years, biologists have found circadian clocks keeping physiologic time in almost every organ. They have also shown that mice with disrupted clocks often develop metabolic diseases, such as obesity, and that circadian clock proteins physically bind to the promoters of many metabolic regulators and instruct them when to turn on and off. For Satchidananda Panda of the Salk Institute, these lines of evidence came together in 2009, when his group published a study showing that in mice without the clock component Cryptochrome, feeding and fasting could drive the expression of some, but not all, of the metabolic regulators throughout the body. Other groups have also confirmed that even in the absence of the clock it is still possible to drive some genetic rhythms.

In this latest study, he and colleagues wanted to look more closely at how the cycling of clock and metabolic transcripts induced by time-restricted feeding, rather than normal genetic rhythms, influences the health of mice.

© 1986 - 2018 The Scientist

Keyword: Obesity
Link ID: 25711 - Posted: 11.24.2018

Selene Meza-Perez, Troy D. Randall

Fat is a loaded tissue. Not only is it considered unsightly, the excess flab that plagues more than two-thirds of adults in America is associated with many well-documented health problems. In fact, obesity (defined as having a body mass index of 30 or more) is a comorbidity for almost every other type of disease. But, demonized as all body fat is, deep belly fat known as visceral adipose tissue (VAT) also has a good side: it’s a critical component of the body’s immune system.

VAT is home to many cells of both the innate and adaptive immune systems. These cells influence adipocyte biology and metabolism, and in turn, adipocytes regulate the functions of the immune cells and provide energy for their activities. Moreover, the adipocytes themselves produce antimicrobial peptides, proinflammatory cytokines, and adipokines that together act to combat infection, modify the function of immune cells, and maintain metabolic homeostasis. Unfortunately, obesity disrupts both the endocrine and immune functions of VAT, thereby promoting inflammation and tissue damage that can lead to diabetes or inflammatory bowel disease.

As researchers continue to piece together the complex connections between immunity, gut microbes, and adipose tissues, including the large deposit of fat in the abdomen known as the omentum, they hope not only to gain an understanding of how fat and immunity are linked, but to also develop fat-targeted therapeutics that can moderate the consequences of infectious and inflammatory diseases.

© 1986 - 2018 The Scientist.
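The obesity threshold mentioned above (a body mass index of 30 or more) is a simple computation: weight in kilograms divided by the square of height in meters. A minimal sketch (function names are illustrative, not from the article):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    """Obesity as defined above: BMI of 30 or more."""
    return bmi(weight_kg, height_m) >= 30

print(round(bmi(100, 1.80), 1))  # 30.9
print(is_obese(100, 1.80))       # True
```

BMI is a population-level screening measure, not a diagnosis; the 30 cutoff is the conventional one the authors cite.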

Keyword: Obesity; Neuroimmunology
Link ID: 25710 - Posted: 11.24.2018

Abby Olena

Anticipating something tasty can lead to a watering mouth and grumbling stomach, but these familiar responses aren’t the only ways the body prepares for nourishment. According to a study published today (November 15) in Cell, sensing food primes mice to process incoming nutrients via signals sent from the central nervous system to the liver.

“It’s a great tour de force combining [several strategies] in one paper to then identify pathways by which food anticipation could alter hepatic metabolism,” says Christoph Buettner, a physician and researcher at Icahn School of Medicine at Mount Sinai in New York who was not involved in the study. “It’s interesting that even before your food hits your tongue or ends up in your stomach, there are changes that prepare an organism for nutrient storage.”

Two types of cells in the brain’s hypothalamus have been shown in previous studies to play opposing roles in regulating how much an organism eats. AgRP neurons are turned on when energy stores are low, making an animal seek out food, while POMC neurons, activated when an animal is sated, inhibit eating. Up until a few years ago, the prevailing wisdom was that ingested food resulted in hormonal changes and subsequent neuronal activation after some lag time, says Jens Brüning, an endocrinologist and geneticist at the Max Planck Institute for Metabolism Research in Germany. But in 2015, researchers from the University of California, San Francisco, showed in mice that these neurons change their state of activation nearly instantaneously in response to the sight or smell of food.

© 1986 - 2018 The Scientist

Keyword: Obesity
Link ID: 25709 - Posted: 11.24.2018

By Sharon Begley

The brain surgeon began as he always does, making an incision in the scalp and gently spreading it apart to expose the skull. He then drilled a 3-inch circular opening through the bone, down to the thick, tough covering called the dura. He sliced through that, and there in the little porthole he’d made was the glistening, blood-flecked, pewter-colored brain, ready for him to approach the way spies do a foreign embassy: He bugged it.

Dr. Ashesh Mehta, a neurosurgeon at the Feinstein Institute for Medical Research on Long Island, was operating on his epilepsy patient to determine the source of seizures. But the patient agreed to something more: to be part of an audacious experiment whose ultimate goal is to translate thoughts into speech. While he was in there, Mehta carefully placed a flat array of microelectrodes on the left side of the brain’s surface, over areas involved in both listening to and formulating speech.

By eavesdropping on the electrical impulses that crackle through the gray matter when a person hears in the “mind’s ear” what words he intends to articulate (often so quickly it’s barely conscious), then transmitting those signals wirelessly to a computer that decodes them, the electrodes and the rest of the system hold the promise of being the first “brain-computer interface” to go beyond movement and sensation. If all goes well, it will conquer the field’s Everest: developing a brain-computer interface that could enable people with a spinal cord injury, locked-in syndrome, ALS, or other paralyzing condition to talk again.

© 2018 Scientific American

Keyword: Brain imaging; Robotics
Link ID: 25708 - Posted: 11.21.2018

By Gina Kolata

Whenever I see a photo from the 1960s or 1970s, I am startled. It’s not the clothes. It’s not the hair. It’s the bodies. So many people were skinny. In 1976, 15 percent of American adults were obese. Now it’s nearly 40 percent.

No one really knows why bodies have changed so much. Scientists do a lot of hand-waving about our “obesogenic environment” and point to favorite culprits: the abundance of cheap fast foods and snacks; food companies making products so tasty they are addictive; larger serving sizes; the tendency to graze all day. Whatever the combination of factors at work, something about the environment is making many people as fat as their genetic makeup permits. Obesity has always been with us, but never has it been so common.

Everyone — from doctors to drug companies, from public health officials to overweight people themselves — would love to see a cure, a treatment that brings weight to normal and keeps it there. Why hasn’t anyone discovered one? It’s not for lack of trying. Yes, some individuals have managed to go from fat to thin with diets and exercise, and have kept off the weight. But they are the rare exceptions. Most spend years dieting and regaining, dieting and regaining, in a fruitless, frustrating cycle.

There is just one almost uniformly effective treatment, and it is woefully underused: only about 1 percent of the 24 million American adults who are eligible get the procedure. That treatment is bariatric surgery, a drastic operation that turns the stomach into a tiny pouch and, in one version, also reroutes the intestines. Most who have it lose significant amounts of weight — but many of them remain overweight, or even obese. Their health usually improves anyway. Many with diabetes no longer need insulin. Cholesterol and blood pressure levels tend to fall. Sleep apnea disappears. Backs, hips and knees stop aching.

© 2018 The New York Times Company

Keyword: Obesity
Link ID: 25707 - Posted: 11.21.2018

National Institutes of Health scientists and their colleagues have found evidence of the infectious agent of sporadic Creutzfeldt-Jakob disease (CJD) in the eyes of deceased CJD patients. The finding suggests that the eye may be a source for early CJD diagnosis and raises questions about the safety of routine eye exams and corneal transplants.

Sporadic CJD, a fatal neurodegenerative prion disease of humans, is untreatable and difficult to diagnose. Prion diseases originate when normally harmless prion protein molecules become abnormal and gather in clusters and filaments in the body and brain. Scientists hope that early diagnosis of prion and related diseases—such as Alzheimer’s, Parkinson’s and dementia with Lewy bodies—could lead to effective treatments that slow or prevent these diseases.

Scientists from NIH’s National Institute of Allergy and Infectious Diseases (NIAID) collaborated on the research with colleagues from the University of California at San Diego and UC-San Francisco. About 40 percent of sporadic CJD patients develop eye problems that could lead to an eye exam, meaning the potential exists for the contamination of eye exam equipment designed for repeat use. Further, cadaveric corneal transplants from undiagnosed CJD patients have led to two probable and three possible cases of disease transmission, the researchers say.

Previous studies have shown that the eyes of CJD patients contain infectious prions, though the distribution of prions among the various components of the eye was not known. To address this question, the scientists recruited 11 CJD patients who agreed to donate their eyes upon death. The researchers found evidence of prion infection throughout the eyes of all 11 deceased patients using real-time quaking-induced conversion (RT-QuIC), a highly sensitive test NIAID scientists developed that detects prion seeding activity in a sample as evidence of infection.

Keyword: Prions; Vision
Link ID: 25706 - Posted: 11.21.2018

By Virginia Morell

Like any fad, the songs of humpback whales don’t stick around for long. Every few years, males swap their chorus of squeaks and groans for a brand new one. Now, scientists have figured out how these “cultural revolutions” take place.

All male humpbacks in a population sing the same song, and they appear to learn new ones somewhat like people do. Males in the eastern Australian population of humpbacks, for example, pick up a new song every few years from the western Australian population at shared feeding grounds or while migrating. Over the next few years, the songs spread to all South Pacific populations.

To understand how the whales learn the novel ballads, scientists analyzed eastern Australian whale songs over 13 consecutive years. Using spectrograms of 412 song cycles from 95 singers, the scientists scored each tune’s complexity for the number of sounds and themes, and studied the subtle variations individual males can add to stand out. Complexity increased as the songs evolved, the team reports today in the Proceedings of the Royal Society B. But after a song revolution, the ballads became shorter with fewer sounds and themes. The revolutionary songs may be less complex than the old ones because the whales can only learn a certain amount of new material at a time, the scientists conclude. That could mean that although humpback whales are still the crooners of the sea, their learning skills are a bit limited.

© 2018 American Association for the Advancement of Science

Keyword: Animal Communication; Language
Link ID: 25705 - Posted: 11.21.2018

By Pam Belluck

It’s a rare person in America who doesn’t know of someone with Alzheimer’s disease. The most common type of dementia, it afflicts about 44 million people worldwide, including 5.5 million in the United States. Experts predict those numbers could triple by 2050 as the older population increases. So why is there still no effective treatment for it, and no proven way to prevent or delay its effects? Why is there still no comprehensive understanding of what causes the disease or who is destined to develop it? The answer, you could say, is: “It’s complicated.” And that is certainly part of it.

For nearly two decades, researchers, funding agencies and clinical trials have largely focused on one strategy: trying to clear the brain of the clumps of beta amyloid protein that form the plaques integrally linked to the disease. But while some drugs have reduced the accumulation of amyloid, none have yet succeeded in stopping or reversing dementia. And amyloid doesn’t explain everything about Alzheimer’s — not everyone with amyloid plaques has the disease. “It’s not that amyloid is not an important factor,” said Dr. John Morris, director of the Knight Alzheimer’s Disease Research Center at the Washington University School of Medicine in St. Louis. “On the other hand, we’ve had some 200-plus trials since 2001 that have been negative.”

Not all trials have targeted amyloid. Some have focused on tau, a protein that, in Alzheimer’s, forms threads that stick together in tangles inside neurons, sandbagging their communications with one another. Tau tangles seem to spread after amyloid accumulates into plaques between neurons. But so far, anti-tau drugs haven’t successfully attacked Alzheimer’s itself. Only five drugs have been approved to treat this dementia, but they address early symptoms and none have been shown to work very well for very long. It’s been 15 years since the last one was approved.

© 2018 The New York Times Company

Keyword: Alzheimers
Link ID: 25704 - Posted: 11.20.2018

By Benedict Carey

Nothing humbles history’s great thinkers more quickly than reading their declarations on the causes of madness. Over the centuries, mental illness has been attributed to everything from a “badness of spirit” (Aristotle) and a “humoral imbalance” (Galen) to autoerotic fixation (Freud) and the weakness of the hierarchical state of the ego (Jung).

The arrival of biological psychiatry, in the past few decades, was expected to clarify matters, by detailing how abnormalities in the brain gave rise to all variety of mental distress. But that goal hasn’t been achieved — nor is it likely to be, in this lifetime. Still, the futility of the effort promises to inspire a change in the culture of behavioral science in the coming decades. The way forward will require a closer collaboration between scientists and the individuals they’re trying to understand, a mutual endeavor based on a shared appreciation of where the science stands, and why it hasn’t progressed further. “There has to be far more give and take between researchers and the people suffering with these disorders,” said Dr. Steven Hyman, director of the Stanley Center for Psychiatric Research at the Broad Institute of M.I.T. and Harvard. “The research cannot happen without them, and they need to be convinced it’s promising.”

The course of Science Times coincides almost exactly with the tear-down and rebuilding of psychiatry. Over the past 40 years, the field remade itself from the inside out, radically altering how researchers and the public talked about the root causes of persistent mental distress. The blueprint for reassembly was the revision in 1980 of psychiatry’s field guide, the Diagnostic and Statistical Manual of Mental Disorders, which effectively excluded psychological explanations. Gone was the rich Freudian language about hidden conflicts, along with the empty theories about incorrect or insufficient “mothering.” Depression became a cluster of symptoms and behaviors; so did obsessive-compulsive disorder, bipolar disorder, schizophrenia, autism and the rest.

© 2018 The New York Times Company

Keyword: Schizophrenia; Depression
Link ID: 25703 - Posted: 11.20.2018

A countrywide shortage of a common antidepressant medication has caused alarm among doctors, pharmacists and patients with mental illnesses. Nearly a dozen pharmacies in Saskatoon and Regina have told CBC News that they have run out of bupropion — both the brand-name product Wellbutrin and its generic counterparts — and can't get more from their suppliers.

More than 12,000 patients in Saskatchewan take bupropion, according to the Ministry of Health. National figures are not readily available. The prescription antidepressant is used to treat major depressive disorder and seasonal affective disorder.

"This might have been the drug that gave you the energy to live your life, do the things you needed to do, get on with your job, do your studies," said Dr. Sara Dungavell, a Saskatoon psychiatrist. She said she fielded anxious phone calls from patients about the shortage.

Two pharmaceutical companies that produce generic bupropion are reporting a shortage or anticipated shortage on the Health Canada website. The company that manufactures Wellbutrin, Bausch Health, reported its shortage to Health Canada six weeks ago. On Thursday, it told CBC News it had resolved its shortage, and Canadian pharmacies would receive the drug "imminently," depending on delivery schedules. By Saturday afternoon, pharmacies in Calgary, Saskatoon, Regina and Winnipeg said they had yet to receive a shipment, and their pharmacists said it was still listed as unavailable in their system.

©2018 CBC/Radio-Canada

Keyword: Depression
Link ID: 25702 - Posted: 11.20.2018

By Scott Barry Kaufman

"We experience ourselves, our thoughts and feelings as something separate from the rest. A kind of optical delusion of consciousness." — Albert Einstein

"In our quest for happiness and the avoidance of suffering, we are all fundamentally the same, and therefore equal. Despite the characteristics that differentiate us - race, language, religion, gender, wealth and many others - we are all equal in terms of our basic humanity." — Dalai Lama (on Twitter)

The belief that everything in the universe is part of the same fundamental whole exists throughout many cultures and philosophical, religious, spiritual, and scientific traditions, as captured by the phrase 'all that is.' The Nobel winner Erwin Schrödinger once observed that quantum physics is compatible with the notion that there is indeed a basic oneness of the universe. Therefore, despite it seeming as though the world is full of many divisions, many people throughout the course of human history and even today truly believe that individual things are part of some fundamental entity.

Despite the prevalence of this belief, there has been a lack of a well validated measure in psychology that captures this belief. While certain measures of spirituality do exist, the belief in oneness questions are typically combined with other questions that assess other aspects of spirituality, such as meaning, purpose, sacredness, or having a relationship with God. What happens when we secularize the belief in oneness?

© 2018 Scientific American

Keyword: Consciousness
Link ID: 25701 - Posted: 11.19.2018

By John Horgan

Don't Make Me One with Everything

The mystical doctrine of oneness is metaphysically disturbing, and it can foster authoritarian behavior and encourage an unhealthy detachment.

A recurring claim of sages east and west is that reality, which seems to consist of many things that keep changing, is actually one thing that never changes. This is the mystical doctrine of oneness. Enlightenment supposedly consists of realizing your oneness with reality, hence the old joke: What did the Buddhist say to the hotdog vendor? Make me one with everything.

A column by my fellow Scientific American blogger, psychologist Scott Barry Kaufman, touts the oneness doctrine. “The belief that everything in the universe is part of the same fundamental whole exists throughout many cultures and philosophical, religious, spiritual, and scientific traditions,” Kaufman writes. His column considers, as his headline puts it, “What Would Happen If Everyone Truly Believed Everything Is One?"

Kaufman notes that psychologists Kate Diebels and Mark Leary have explored this question. They define oneness, among other ways, as the idea that “beneath surface appearances, everything is one,” and “the separation among individual things is an illusion.” Diebels and Leary found that 20 percent of their respondents have thought about oneness “often or many times,” and many report having spiritual experiences related to oneness. Diebels and Leary state that “a belief in oneness was related to values indicating a universal concern for the welfare of other people, as well as greater compassion for other people.” Believers “have a more inclusive identity that reflects their sense of connection with other people, nonhuman animals, and aspects of nature.”

© 2018 Scientific American

Keyword: Consciousness
Link ID: 25700 - Posted: 11.19.2018

Andrew Anthony

Of all the mysteries of the mind, perhaps none is greater than memory. Why do we remember some things and forget others? What is memory’s relationship to consciousness and our identities? Where and how is memory stored? How reliable are our memories? And why did our memory evolve to be so rich and detailed?

In a sense there are two ways of looking at memory: the literary and the scientific. There is the Proustian model in which memory is about meaning, an exploration of the self, a subjective journey into the past. And then there is the analytical model, where memory is subjected to neurological study, psychological experiments and magnetic resonance imaging. A new book – or rather a recent translation of a two-year-old book – by a pair of Norwegian sisters seeks to marry the two approaches. The co-authors of Adventures in Memory: The Science and Secrets of Remembering and Forgetting are Ylva Østby, a clinical neuropsychologist, and Hilda Østby, an editor and novelist.

Their book begins in 1564, with Julius Caesar Arantius performing a dissection of a human brain. Cutting deep into the temporal lobe, where it meets the brain stem, he encounters a small, wormlike ridge of tissue that resembles a sea horse. He calls it hippocampus – or “horse sea monster” in Latin. The significance of this discovery would take almost 400 years to come to light.

As with so much to do with our understanding of the brain, the breakthrough came through a malfunction. An American named Henry Molaison suffered from acute epilepsy, and in 1953 he underwent an operation in which the hippocampi from both sides of his brain were removed. The surgery succeeded in controlling his epilepsy but at the cost of putting an end to his memory.

© 2018 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 25699 - Posted: 11.19.2018

By Perri Klass, M.D.

More than 30 years ago, I went to a parent meeting at my oldest child’s day care center, when he was in the 2-year-old room, and it turned out that many of the children in the room were not reliably sleeping through the night. It felt like a revelation, discovering that mine was not the only child who occasionally — or regularly — woke in the night and needed some attention. In our family, we had come to terms with this, and we had managed to make — and generally keep — some rules: no food, no drink, no coming out of the crib, but yes, once a night one of your parents is willing to stagger down the hall, look in on you, rub your back and say something like, “We haven’t moved away and left you, now go back to sleep.” (Or maybe sometimes it was, “Go back to sleep or we will move away and leave you,” but that is lost in the mists of history.) It wasn’t ideal, but we were managing.

In the current issue of the journal Pediatrics, researchers describe a study of almost 400 mothers in Canada who were asked to report: “During the night, how many consecutive hours does your child sleep without waking up?” The researchers took six or eight hours of uninterrupted sleep as definitions of “sleeping through the night.” They found that at 6 months of age, 62.4 percent of mothers reported that their infants slept for 6 hours or more at a stretch, and only 43 percent of the mothers reported eight-hour blocks of consecutive sleep.

At 12 months, 72.1 percent of the mothers reported six hours of consecutive sleep, and 56.6 percent reported eight hours; since all infants wake several times a night, those who were reported as sleeping consecutively presumably awoke and went back to sleep by themselves without the mothers knowing it. So by these criteria, a significant number of the babies were not “sleeping through the night” at 6 months, and even at 12 months. At some time points, girls were more likely to sleep for longer periods than boys, but at other times there was no significant difference.

© 2018 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 25698 - Posted: 11.19.2018

By Michael Price

Patient BAA, who is 35, lost her sight when she was 27. She can still detect light and dark, but for all intents and purposes, she is blind. Now, she—and other formerly sighted people—may one day regain a limited form of vision using electrodes implanted in the brain. In a new study, such electrodes caused parts of BAA’s and other people’s visual cortexes to light up in specific patterns, allowing them to see shapes of letters in their mind’s eyes. The work is a step forward in a field that emerged more than 40 years ago but has made relatively little progress. The findings suggest technical ways to stimulate images in the brain “are now within reach,” says Pieter Roelfsema, a neuroscientist who directs the Netherlands Institute for Neuroscience in Amsterdam and wasn’t involved in the work.

Research to electrically spur blind people’s brains to see shapes began in the 1970s, when biomedical researcher William Dobelle, then at The University of Utah in Salt Lake City, first implanted electrodes in the brain to stimulate the visual cortex. Typically, the rods and cones in retinas translate light waves into neural impulses that travel to the brain. Specialized layers of cells there, known as the visual cortex, process that information for the rest of the brain to use.

Dobelle’s implants took advantage of a phenomenon known as retinal mapping. The visual field—the plane of space you see when you look out into the world—roughly maps onto a segment of the visual cortex. By electrically stimulating parts of this brain map, Dobelle could cause flashes of light called phosphenes to appear in the minds of people who were blind, but who had experienced at least a few years of vision. By stimulating different electrodes, he could get phosphenes to flash in different parts of a person’s visual field.

© 2018 American Association for the Advancement of Science

Keyword: Vision; Robotics
Link ID: 25697 - Posted: 11.17.2018

Exposure to uncomfortable sensations elicits a wide range of quick, appropriate reactions, from reflexive withdrawal to more complex feelings and behaviors. To better understand the body’s innate response to harmful stimuli, researchers at the National Center for Complementary and Integrative Health (NCCIH), part of the National Institutes of Health, have identified activity in the brain that governs these reactions. In experiments that used heat as the source of discomfort, the center’s intramural program showed that bodily responses to pain are controlled by a neural pathway involving heightened activity in the spinal cord and two parts of the brainstem. Results of the study were published in the journal Neuron.

“Much is known about local spinal cord circuits for simple reflexive responses, but the mechanisms underlying more complex behaviors remain poorly understood,” said Alexander T. Chesler, Ph.D., a Stadtman Investigator at NCCIH and senior author of the study. “We set out to describe the brain pathway that controls motor responses and involuntary behaviors when the body is faced with painful experiences.”

Just as people respond to increasingly uncomfortable surfaces like a sandy beach on a hot day by lifting their feet, hopping, and eventually running to a water source, so, too, do laboratory models show a predictable sequence of behaviors. The experiments showed that the parts of the brainstem involved in this circuit are the lateral parabrachial nucleus (PBNl) and the dorsal reticular formation in the medulla (MdD). A specific group of nerve cells in the PBNl is activated by standing on a hot surface, triggering escape responses through connections to the MdD. These PBNl cells express a gene called Tac1, which codes for substances called tachykinins that participate in many functions in the body and contribute to multiple disease processes. The MdD cells involved in this circuit also express Tac1. A different group of cells in the PBNl participates in the aspects of the response to noxious heat that involve the forebrain.

Keyword: Pain & Touch
Link ID: 25696 - Posted: 11.17.2018

By Donald G. McNeil Jr. The first treatment for sleeping sickness that relies on pills alone was approved on Friday by Europe’s drug regulatory agency, paving the way for its use in Africa, the last bastion of the horrific disease. With treatment radically simplified, sleeping sickness could become a candidate for elimination, experts said, because there are usually fewer than 2,000 cases in the world each year.

The disease, also called human African trypanosomiasis, is transmitted by tsetse flies. The protozoan parasites, injected as the flies suck blood, burrow into the brain. Before they kill, the parasites drive their victims mad in ways that resemble the last stages of rabies. The personalities of the infected change. They have terrifying hallucinations and fly into rages; they have been known to beat their children and even attack family members with machetes. They may become ravenous and scream with pain if water touches their skin. Only in the end do they lapse into a long coma and die.

The new drug, fexinidazole, cures all stages of the disease within 10 days. Previously, everyone with parasites found in a blood test also had to undergo a spinal tap to see whether the parasites had reached the brain. If so, patients had to suffer through a complex and sometimes dangerous intravenous regimen requiring hospitalization. An oral treatment that can safely be taken at home “is a completely new paradigm — it could let us bring treatment down to the village level,” said Dr. Bernard Pecoul, founder and executive director of the Drugs for Neglected Diseases initiative, which was started in 2003 by the medical charity Doctors Without Borders to find new cures for tropical diseases. © 2018 The New York Times Company

Keyword: Neurotoxins
Link ID: 25695 - Posted: 11.17.2018