Most Recent Links

By James Gorman There’s something about a really smart dog that makes it seem as if there might be hope for the world. China is in the midst of a frightening disease outbreak and nobody knows how far it will spread. The warming of the planet shows no signs of stopping; it reached a record 70 degrees in Antarctica last week. Not to mention international tensions and domestic politics. But there’s a dog in Norway that knows not only the names of her toys, but also the names of different categories of toys, and she learned all this just by hanging out with her owners and playing her favorite game. So who knows what other good things could be possible? Right? This dog’s name is Whisky. She is a Border collie that lives with her owners and almost 100 toys, so it seems like things are going pretty well for her. Even though I don’t have that many toys myself, I’m happy for her. You can’t be jealous of a dog. Or at least you shouldn’t be. Whisky’s toys have names. Most are dog-appropriate like “the colorful rope” or “the small Frisbee.” However, her owner, Helge O. Svela, said on Thursday that since the research was done, her toys have grown in number from 59 to 91, and he has had to give some toys “people” names, like Daisy or Wenger. “That’s for the plushy toys that resemble animals like ducks or elephants (because the names Duck and Elephant were already taken),” he said. During the research, Whisky proved in tests that she knew the names for at least 54 of her 59 toys. That’s not just the claim of a proud owner (and Mr. Svela is quite proud of Whisky) but the finding of Claudia Fugazza, an animal behavior researcher from Eötvös Loránd University in Budapest, who tested her. That alone makes Whisky part of a very select group, although not a champion. You may recall Chaser, another Border collie that knew the names of more than 1,000 objects and also knew words for categories of objects. And there are a few other dogs with shockingly large vocabularies, Dr. Fugazza said, including mixed breeds and a Yorkie. These canine verbal prodigies are, however, few and far between. “It is really, really unusual, and it is really difficult to teach object names to dogs,” Dr. Fugazza said. © 2020 The New York Times Company

Keyword: Language; Learning & Memory
Link ID: 27063 - Posted: 02.21.2020

By Sara Reardon To many people’s eyes, artist Mark Rothko’s enormous paintings are little more than swaths of color. Yet a Rothko can fetch nearly $100 million. Meanwhile, Pablo Picasso’s warped faces fascinate some viewers and terrify others. Why do our perceptions of beauty differ so widely? The answer may lie in our brain networks. Researchers have now developed an algorithm that can predict art preferences by analyzing how a person’s brain breaks down visual information and decides whether a painting is “good.” The findings show for the first time how intrinsic features of a painting combine with human judgment to give art value in our minds. Most people—including researchers—consider art preferences to be all over the map, says Anjan Chatterjee, a neurologist and cognitive neuroscientist at the University of Pennsylvania who was not involved in the study. Many preferences are rooted in biology: sugary foods, for instance, help us survive. And people tend to share similar standards of beauty when it comes to human faces and landscapes. But when it comes to art, “There are relatively arbitrary things we seem to care about and value,” Chatterjee says. To figure out how the brain forms value judgments about art, computational neuroscientist Kiyohito Iigaya and his colleagues at the California Institute of Technology first asked more than 1300 volunteers on the crowdsourcing website Amazon Mechanical Turk to rate a selection of 825 paintings from four Western genres including impressionism, cubism, abstract art, and color field painting. Volunteers were all over the age of 18, but researchers didn’t specify their familiarity with art or their ethnic or national origin. © 2020 American Association for the Advancement of Science

Keyword: Vision; Attention
Link ID: 27062 - Posted: 02.21.2020

By Viviane Callier In 1688 Irish philosopher William Molyneux wrote to his colleague John Locke with a puzzle that continues to draw the interest of philosophers and scientists to this day. The idea was simple: Would a person born blind, who has learned to distinguish objects by touch, be able to recognize them purely by sight if he or she regained the ability to see? The question, known as Molyneux’s problem, probes whether the human mind has a built-in concept of shapes that is so innate that such a blind person could immediately recognize an object with restored vision. The alternative is that the concepts of shapes are not innate but have to be learned by exploring an object through sight, touch and other senses, a process that could take a long time when starting from scratch. An attempt was made to resolve this puzzle a few years ago by testing Molyneux's problem in children who were congenitally blind but then regained their sight, thanks to cataract surgery. Although the children were not immediately able to recognize objects, they quickly learned to do so. The results were equivocal. Some learning was needed to identify an object, but it appeared that the study participants were not starting completely from scratch. Lars Chittka of Queen Mary University of London and his colleagues have taken another stab at finding an answer, this time using another species. To test whether bumblebees can form an internal representation of objects, Chittka and his team first trained the insects to discriminate spheres and cubes using a sugar reward. The bees were trained in the light, where they could see but not touch the objects that were isolated inside a closed petri dish. Then they were tested in the dark, where they could touch but not see the spheres or cubes. The researchers found that the invertebrates spent more time in contact with the shape they had been trained to associate with the sugar reward, even though they had to rely on touch rather than sight to discriminate the objects. © 2020 Scientific American

Keyword: Development of the Brain; Vision
Link ID: 27061 - Posted: 02.21.2020

By Richard Klasco, M.D. A. The theory of the “sugar high” has been debunked, yet the myth persists. The notion that sugar might make children behave badly first appeared in the medical literature in 1922. But the idea did not capture the public’s imagination until Dr. Ben Feingold’s best-selling book, “Why Your Child Is Hyperactive,” was published in 1975. In his book, Dr. Feingold describes the case of a boy who might well be “patient zero” for the putative connection between sugar and hyperactivity: [The mother’s] fair-haired, wiry son loved soft drinks, candy and cake — not exactly abnormal for any healthy child. He also seemed to go completely wild after birthday parties and during family gatherings around holidays. In the mid-’70s, stimulant drugs such as Ritalin and amphetamine were becoming popular for the treatment of attention deficit hyperactivity disorder. For parents who were concerned about drug side effects, the possibility of controlling hyperactivity by eliminating sugar proved to be an enticing, almost irresistible, prospect. Some studies supported the theory. They suggested that high sugar diets caused spikes in insulin secretion, which triggered adrenaline production and hyperactivity. But the data were weak and were soon questioned by other scientists. An extraordinarily rigorous study settled the question in 1994. Writing in the New England Journal of Medicine, a group of scientists tested normal preschoolers and children whose parents described them as being sensitive to sugar. Neither the parents, the children nor the research staff knew which of the children were getting sugary foods and which were getting a diet sweetened with aspartame and other artificial sweeteners. Urine was tested to verify compliance with the diets. Nine different measures of cognitive and behavioral performance were assessed, with measurements taken at five-second intervals. © 2020 The New York Times Company

Keyword: ADHD; Obesity
Link ID: 27060 - Posted: 02.21.2020

By David Grimm More than 3 years after it hosted a workshop on the science and ethics of biomedical studies on monkeys, the National Institutes of Health (NIH) this week convened another workshop on nonhuman primate research. And much like the previous event, the meeting is drawing sharply divergent reactions from biomedical and animal advocacy groups. “It was a very good look at the opportunities and challenges of doing this type of research,” says Alice Ra’anan, director of government relations and science policy at the American Physiological Society, a group that represents nearly 10,000 scientists, doctors, and veterinarians. It was “an excellent and robust discussion around fostering rigorous research in nonhuman primates,” adds Matthew Bailey, president of the National Association for Biomedical Research. But Emily Trunnell, a research associate at People for the Ethical Treatment of Animals, an animal rights group, counters that the event was a wasted opportunity to talk about the ethics of using nonhuman primates in the first place. “It was just a bunch of scientists clamoring for more money and more monkeys.” The workshop comes at a time when scientists are using a near-record number of rhesus macaques, marmosets, and other nonhuman primates in biomedical research. The animals, many researchers say, have become increasingly important in revealing how the human brain works and in developing treatments for infectious diseases. There’s been a particular surge in demand for marmosets, which are being genetically engineered to serve as models for autism, Parkinson’s, and other neurological disorders. © 2020 American Association for the Advancement of Science.

Keyword: Animal Rights
Link ID: 27059 - Posted: 02.21.2020

Diana Kwon In the 16th century, when the study of human anatomy was still in its infancy, curious onlookers would gather in anatomical theaters to catch a glimpse of public dissections of the dead. In the years since, scientists have carefully mapped the viscera, bones, muscles, nerves, and many other components of our bodies, such that a human corpse no longer holds that same sense of mystery that used to draw crowds. New discoveries in gross anatomy—the study of bodily structures at the macroscopic level—are now rare, and their significance is often overblown, says Paul Neumann, a professor who specializes in the history of medicine and anatomical nomenclature at Dalhousie University. “The important discoveries about anatomy, I think, are now coming from studies of tissues and cells.” Over the last decade, there have been a handful of discoveries that have helped overturn previous assumptions and revealed new insights into our anatomy. “What’s really interesting and exciting about almost all of the new studies is the illustration of the power of new [microscopy and imaging] technologies to give deeper insight,” says Tom Gillingwater, a professor of anatomy at the University of Edinburgh in the UK. “I would guess that many of these discoveries are the start, rather than the end, of a developing view of the human body.” © 1986–2020 The Scientist

Keyword: Brain imaging
Link ID: 27058 - Posted: 02.20.2020

By Kim Tingley Hearing loss has long been considered a normal, and thus acceptable, part of aging. It is common: Estimates suggest that it affects two out of three adults age 70 and older. It is also rarely treated. In the U.S., only about 14 percent of adults who have hearing loss wear hearing aids. An emerging body of research, however, suggests that diminished hearing may be a significant risk factor for Alzheimer’s disease and other forms of dementia — and that the association between hearing loss and cognitive decline potentially begins at very low levels of impairment. In November, a study published in the journal JAMA Otolaryngology — Head and Neck Surgery examined data on hearing and cognitive performance from more than 6,400 people 50 and older. Traditionally, doctors diagnose impairment when someone experiences a loss in hearing of at least 25 decibels, a somewhat arbitrary threshold. But for the JAMA study, researchers included hearing loss down to around zero decibels in their analysis and found that even these lower levels still predicted correspondingly lower scores on cognitive tests. “It seemed like the relationship starts the moment you have imperfect hearing,” says Justin Golub, the study’s lead author and an ear, nose and throat doctor at the Columbia University Medical Center and NewYork-Presbyterian. Now, he says, the question is: Does hearing loss actually cause the cognitive problems it has been associated with, and if so, how? Preliminary evidence linking dementia and hearing loss was published in 1989 by doctors at the University of Washington, Seattle, who compared 100 patients with Alzheimer’s-like dementia with 100 demographically similar people without it and found that those who had dementia were more likely to have hearing loss, and that the extent of that loss seemed to correspond with the degree of cognitive impairment. But that possible connection wasn’t rigorously investigated until 2011, when Frank Lin, an ear, nose and throat doctor at Johns Hopkins School of Medicine, and colleagues published the results of a longitudinal study that tested the hearing of 639 older adults who were dementia-free and then tracked them for an average of nearly 12 years, during which time 58 had developed Alzheimer’s or another cognitive impairment. They discovered that a subject’s likelihood of developing dementia increased in direct proportion to the severity of his or her hearing loss at the time of the initial test. The relationship seems to be “very, very linear,” Lin says, meaning that the greater the hearing deficit, the greater the risk a person will develop the condition. © 2020 The New York Times Company

Keyword: Hearing; Alzheimers
Link ID: 27057 - Posted: 02.20.2020

By Judith Graham, Kaiser Health News Do I know I’m at risk for developing dementia? You bet. My father died of Alzheimer’s disease at age 72; my sister was felled by frontotemporal dementia at 58. And that’s not all: Two maternal uncles had Alzheimer’s, and my maternal grandfather may have had vascular dementia. (In his generation, it was called senility.) So what happens when I misplace a pair of eyeglasses or can’t remember the name of a movie I saw a week ago? “Now comes my turn with dementia,” I think. Then I talk myself down from that emotional cliff. Am I alone in this? Hardly. Many people, like me, who’ve watched this cruel illness destroy a family member, dread the prospect that they, too, might become demented. The lack of a cure or effective treatments only adds to the anxiety. It seems a common refrain, the news that another treatment to stop Alzheimer’s has failed. How do we cope as we face our fears and peer into our future? Andrea Kline, whose mother, as well as her aunt and uncle, had Alzheimer’s disease, just turned 71 and lives in Boynton Beach, Fla. She’s a retired registered nurse who teaches yoga to seniors at community centers and assisted-living facilities. “I worry about dementia incessantly: Every little thing that goes wrong, I’m convinced it’s the beginning,” she told me. Because Ms. Kline has had multiple family members with Alzheimer’s, she’s more likely to have a genetic vulnerability than someone with a single occurrence in their family. But that doesn’t mean this condition lies in her future. A risk is just that: It’s not a guarantee. The age of onset is also important. People with close relatives struck by dementia early — before age 65 — are more likely to be susceptible genetically. Ms. Kline was the primary caregiver for her mother, Charlotte Kline, who received an Alzheimer’s diagnosis in 1999 and passed away in 2007 at age 80. “I try to eat very healthy. I exercise. I have an advance directive, and I’ve discussed what I want” in the way of care “with my son,” she said. © 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27056 - Posted: 02.20.2020

Maternal obesity may increase a child’s risk for attention-deficit hyperactivity disorder (ADHD), according to an analysis by researchers from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), part of the National Institutes of Health. The researchers found that mothers — but not fathers — who were overweight or obese before pregnancy were more likely to report that their children had been diagnosed with ADHD or had symptoms of hyperactivity, inattentiveness or impulsiveness at ages 7 to 8 years old. Their study appears in The Journal of Pediatrics. The study team analyzed the NICHD Upstate KIDS Study, which recruited mothers of young infants and followed the children through age 8 years. In this analysis of nearly 2,000 children, the study team found that women who were obese before pregnancy were approximately twice as likely to report that their child had ADHD or symptoms of hyperactivity, inattention or impulsiveness, compared to women of normal weight before pregnancy. The authors suggest that, if their findings are confirmed by additional studies, healthcare providers may want to screen children of obese mothers for ADHD so that they could be offered earlier interventions. The authors also note that healthcare providers could use evidence-based strategies to counsel women considering pregnancy on diet and lifestyle. Resources for plus-size pregnant women and their healthcare providers are available as part of NICHD’s Pregnancy for Every Body initiative.

Keyword: ADHD; Development of the Brain
Link ID: 27055 - Posted: 02.20.2020

Merrit Kennedy As doctors in London performed surgery on Dagmar Turner's brain, the sound of a violin filled the operating room. The music came from the patient on the operating table. In a video from the surgery, the violinist moves her bow up and down as surgeons behind a plastic sheet work to remove her brain tumor. The King's College Hospital surgeons woke her up in the middle of the operation in order to ensure they did not compromise parts of the brain necessary for playing the violin, such as parts that control precise hand movements and coordination. "We knew how important the violin is to Dagmar, so it was vital that we preserved function in the delicate areas of her brain that allowed her to play," Keyoumars Ashkan, a neurosurgeon at King's College Hospital, said in a press release. Turner, 53, learned that she had a slow-growing tumor in 2013. Late last year, doctors found that it had become more aggressive and the violinist decided to have surgery to remove it. In an interview with ITV News, Turner recalled doctors telling her, "Your tumor is on the right-hand side, so it will not affect your right-hand side, it will affect your left-hand side." "And I'm just like, 'Oh, hang on, this is my most important part. My job these days is playing the violin,' " she said, making a motion of pushing down violin strings with her left hand. Ashkan, an accomplished pianist, and his colleagues came up with a plan to keep the hand's functions intact. © 2020 npr

Keyword: Epilepsy; Movement Disorders
Link ID: 27054 - Posted: 02.20.2020

Amy Schleunes New Zealand’s North Island robins (Petroica longipes), known as toutouwai in Maori, are capable of remembering a foraging task taught to them by researchers for up to 22 months in the wild, according to a study published on February 12 in Biology Letters. These results echo the findings of a number of laboratory studies of long-term memory in animals, but offer a rare example of a wild animal retaining a learned behavior with no additional training. The study also has implications for conservation and wildlife management: given the birds’ memory skills, researchers might be able to teach them about novel threats and resources in their constantly changing habitat. “This is the first study to show [memory] longevity in the wild,” says Vladimir Pravosudov, an animal behavior researcher at the University of Nevada, Reno, who was not involved in the study. Rachael Shaw, a coauthor and behavioral ecologist at Victoria University in New Zealand, says she was surprised that the birds remembered the new skill she had taught them. “Wild birds have so much that they have to contend with in their daily lives,” she says. “You don’t really expect that it’s worth their while to retain this learned task they hardly had the opportunity to do, and they can’t predict that they will have an opportunity to do again.” Shaw is generally interested in the cognitive abilities of animals and the evolution of intelligence, and the toutouwai, trainable food caching birds that can live up to roughly 10 years, make perfect subjects for her behavioral investigations. “They’ve got this kind of boldness and curiosity that a lot of island bird species share,” says Shaw. These qualities make them vulnerable to predation by invasive cats, rats, and ermines (also known as stoats), but also inquisitive and relatively unafraid of humans, an ideal disposition for testing memory retention in the field. © 1986–2020 The Scientist

Keyword: Learning & Memory; Evolution
Link ID: 27053 - Posted: 02.20.2020

By Rachel Cericola A year ago, I was diagnosed with nasal polyps and regularly snored like a wild boar. I’ve had the polyps removed, but the snoring continues. I’m not alone. According to Principles and Practice of Sleep Medicine (Fifth Edition), “about 40 percent of the adult population” snores. Sometimes my snoring wakes up my husband (and vice versa), so I decided to try out six popular over-the-counter anti-snoring contraptions. To get a baseline measurement of how much I was snoring without any intervention, I used SnoreLab, a highly rated app that listens for snoring sounds, records clips, and analyzes your resting audio. After calculating an average of four nights’ intervention-free snoring readings to get a starting “sleep score,” I then slept with each anti-snoring device for several nights and tracked my SnoreLab results against that baseline. (Note that some of these devices may work for you and not me — and none of them should be used to treat sleep apnea. If you’re having restless sleep, gasping awake, or even feeling tired and foggy in the daytime, see a doctor.) While longer-term testing is needed before we could confidently recommend any of these, a few devices showed promise in our preliminary — and far from scientific — trials. Here’s how they did, in order of how much I found they helped: Smart Nora, $329 at the time of publication This system will slightly move your head when it catches you snoring. It includes a wireless, mic-equipped device that can sit bedside or be wall-mounted to detect snoring. Once it does that, it communicates with an under-bed base station that pumps air through a tube to an insert that lives inside your pillow. That motion gently adjusts your head position to reduce snoring (in my case, it effectively did so without waking me up). It sounds bizarre, but this was actually the most effective device I tried, cutting my total snoring in half, according to my SnoreLab sleep scores. It is also the most expensive. There are many options for personalization, which we will continue to test. © 2020 The New York Times Company

Keyword: Sleep
Link ID: 27052 - Posted: 02.20.2020

By Katherine Kornei Imagine a frog call, but with a metallic twang—and the intensity of a chainsaw. That’s the “boing” of a minke whale. And it’s a form of animal communication in danger of being drowned out by ocean noise, new research shows. By analyzing more than 42,000 minke whale boings, scientists have found that, as background noise intensifies, the whales are losing their ability to communicate over long distances. This could limit their ability to find mates and engage in important social contact with other whales. Tyler Helble, a marine acoustician at the Naval Information Warfare Center Pacific, and colleagues recorded minke whale boings over a 1200-square-kilometer swathe of the U.S. Navy’s Pacific Missile Range Facility near the Hawaiian island of Kauai from 2012 to 2017. By measuring when a single boing arrived at various underwater microphones, the team pinpointed whale locations to within 10 to 20 meters. The researchers then used these positions, along with models of how sound propagates underwater, to calculate the intensity of each boing when it was emitted. The team compared these measurements with natural ambient noise, including waves, wind, and undersea earthquakes (no military exercises were conducted nearby during the study period). They found that minke whale boings grew louder in louder conditions. That’s not surprising—creatures across the animal kingdom up their volume when there’s background noise. (This phenomenon, dubbed the Lombard effect, holds true for humans, too—think of holding a conversation at a loud concert.) © 2019 American Association for the Advancement of Science.

Keyword: Animal Communication; Hearing
Link ID: 27051 - Posted: 02.19.2020

Ian Sample Science editor Consuming a western diet for as little as one week can subtly impair brain function and encourage slim and otherwise healthy young people to overeat, scientists claim. Researchers found that after seven days on a high fat, high added sugar diet, volunteers in their 20s scored worse on memory tests and found junk food more desirable immediately after they had finished a meal. The finding suggests that a western diet makes it harder for people to regulate their appetite, and points to disruption in a brain region called the hippocampus as the possible cause. “After a week on a western-style diet, palatable food such as snacks and chocolate becomes more desirable when you are full,” said Richard Stevenson, a professor of psychology at Macquarie University in Sydney. “This will make it harder to resist, leading you to eat more, which in turn generates more damage to the hippocampus and a vicious cycle of overeating.” Previous work in animals has shown that junk food impairs the hippocampus, a brain region involved in memory and appetite control. It is unclear why, but one idea is that the hippocampus normally blocks or weakens memories about food when we are full, so looking at a cake does not flood the mind with memories of how nice cake can be. “When the hippocampus functions less efficiently, you do get this flood of memories, and so food is more appealing,” Stevenson said. To investigate how the western diet affects humans, the scientists recruited 110 lean and healthy students, aged 20 to 23, who generally ate a good diet. Half were randomly assigned to a control group who ate their normal diet for a week. The other half were put on a high energy western-style diet, which featured a generous intake of Belgian waffles and fast food. © 2020 Guardian News & Media Limited

Keyword: Learning & Memory; Obesity
Link ID: 27050 - Posted: 02.19.2020

Researchers at the National Institutes of Health found evidence that specific immune cells may play a key role in the devastating effects of cerebral malaria, a severe form of malaria that mainly affects young children. The results, published in the Journal of Clinical Investigation, suggest that drugs targeting T cells may be effective in treating the disease. The study was supported by the NIH Intramural Research Program. “This is the first study showing that T cells target blood vessels in brains of children with cerebral malaria,” said Dorian McGavern, Ph.D., chief of the Viral Immunology and Intravital Imaging Section at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) who co-directed the study with Susan Pierce, Ph.D., chief of the Laboratory of Immunogenetics at the National Institute of Allergy and Infectious Diseases (NIAID). “These findings build a bridge between mouse and human cerebral malaria studies by implicating T cells in the development of disease pathology in children. It is well established that T cells cause the brain vasculature injury associated with cerebral malaria in mice, but this was not known in humans.” More than 200 million people worldwide are infected annually with mosquito-borne parasites that cause malaria. In a subset of those patients, mainly young children, the parasites accumulate in brain blood vessels causing cerebral malaria, which leads to increased brain pressure from swelling. Even with available treatment, cerebral malaria still kills up to 25% of those affected resulting in nearly 400,000 deaths annually. Children who survive the infection will often have long-lasting neurological problems such as cognitive impairment.

Keyword: Neuroimmunology
Link ID: 27049 - Posted: 02.19.2020

Nicola Davis Parents should not worry about their teenagers’ delinquent behaviour provided they were well behaved in their earlier childhood, according to researchers behind a study that suggests those who offend throughout their life showed antisocial behaviour from a young age and have a markedly different brain structure as adults. According to figures from the Ministry of Justice, 24% of males in England and Wales aged 10–52 in 2006 had a conviction, compared with 6% of females. Previous work has shown that crime rises in adolescence and young adulthood but that most perpetrators go on to become law-abiding adults, with only a minority – under 10% of the general population – continuing to offend throughout their life. Such trends underpin many modern criminal justice strategies, including in the UK where police can use their discretion as to whether a young offender should enter the formal justice system. Now researchers say they have found that adults with a long history of offences show striking differences in brain structure compared with those who have stuck to the straight and narrow or who transgressed only as adolescents. “These findings underscore prior research that really highlights that there are different types of young offenders – they are not all the same. They should not all be treated the same,” said Prof Essi Viding, a co-author of the study from University College London. Prof Terrie Moffitt, another co-author of the research from Duke University in North Carolina, said the study helped to shed light on what may be behind persistent antisocial behaviour. © 2020 Guardian News & Media Limited

Keyword: Aggression; Brain imaging
Link ID: 27048 - Posted: 02.18.2020

By Laura Sanders SEATTLE — Live bits of brain look like any other piece of meat — pinkish, solid chunks of neural tissue. But unlike other kinds of tissue or organs donated for research, they hold the memories, thoughts and feelings of a person. “It is identified with who we are,” Karen Rommelfanger, a neuroethicist at Emory University in Atlanta, said February 13 in a news conference at the annual meeting of the American Association for the Advancement of Science. That uniqueness raises a whole new set of ethical quandaries when it comes to experimenting with living brain tissue, she explained. Such donations are crucial to emerging research aimed at teasing out answers to what makes us human. For instance, researchers at the Seattle-based Allen Institute for Brain Science conduct experiments on live brain tissue to get clues about how the cells in the human brain operate (SN: 8/7/19). These precious samples, normally discarded as medical waste, are donated by patients undergoing brain surgery and raced to the lab while the nerve cells are still viable. Other experiments rely on systems that are less sophisticated than a human brain, such as brain tissue from other animals and organoids. These clumps of neural tissue, grown from human stem cells, are still a long way from mimicking the complexities of the human brain (SN: 10/24/19). But with major advances, these systems might one day be capable of much more advanced behavior, which might ultimately lead to awareness, a conundrum that raises ethical issues. © Society for Science & the Public 2000–2020

Keyword: Consciousness; Emotions
Link ID: 27047 - Posted: 02.18.2020

Fergus Walsh Medical correspondent A new gene therapy has been used to treat patients with a rare inherited eye disorder which causes blindness. It's hoped the NHS treatment will halt sight loss and even improve vision. Matthew Wood, 48, one of the first patients to receive the injection, told the BBC: "I value the remaining sight I have so if I can hold on to that it would be a big thing for me." The treatment costs around £600,000 but NHS England has agreed a discounted price with the manufacturer Novartis. Luxturna (voretigene neparvovec), has been approved by The National Institute for Health and Care Excellence (NICE), which estimates that just under 90 people in England will be eligible for the treatment. The gene therapy is for patients who have retinal dystrophy as a result of inheriting a faulty copy of the RPE65 gene from both parents. The gene is important for providing the pigment that light sensitive cells need to absorb light. Initially this affects night vision but eventually, as the cells die, it can lead to complete blindness. An injection is made into the back of the eye - this delivers working copies of the RPE65 gene. These are contained inside a harmless virus, which enables them to penetrate the retinal cells. Once inside the nucleus, the gene provides the instructions to make the RPE65 protein, which is essential for healthy vision. © 2020 BBC

Keyword: Vision; Genes & Behavior
Link ID: 27046 - Posted: 02.18.2020

By Jane E. Brody I’ve long thought the human body was not meant to run on empty, that fasting was done primarily for religious reasons or political protest. Otherwise we needed a reliably renewed source of fuel to function optimally, mentally and emotionally as well as physically. Personal experience reinforced that concept; I’m not pleasant to be around when I’m hungry. There’s even an official name for that state of mind, confirmed by research: Hangry! But prompted by recent enthusiasm for fasting among people concerned about their health, weight or longevity, I looked into the evidence for possible benefits — and risks — of what researchers call intermittent fasting. Popular regimens range from ingesting few if any calories all day every other day or several times a week to fasting for 16 hours or more every day. A man I know in his early 50s said he had lost 12 pounds in about two months on what he calls the 7-11 diet: He eats nothing from 7 p.m. until 11 a.m. the next morning, every day. I was skeptical, but it turns out there is something to be said for practicing a rather prolonged diurnal fast, preferably one lasting at least 16 hours. Mark P. Mattson, a neuroscientist at the National Institute on Aging and Johns Hopkins University School of Medicine, explained that the liver stores glucose, which the body uses preferentially for energy before it turns to burning body fat. “It takes 10 to 12 hours to use up the calories in the liver before a metabolic shift occurs to using stored fat,” Dr. Mattson told me. After meals, glucose is used for energy and fat is stored in fat tissue, but during fasts, once glucose is depleted, fat is broken down and used for energy. Most people trying to lose weight should strive for 16 calorie-free hours, he said, adding that “the easiest way to do this is to stop eating by 8 p.m., skip breakfast the next morning and then eat again at noon the next day.” (Caffeine-dependent people can have sugar-free black coffee or tea before lunch.) But don’t expect to see results immediately; it can take up to four weeks to notice an effect, he said. © 2020 The New York Times Company

Keyword: Obesity
Link ID: 27045 - Posted: 02.18.2020

By Tam Hunt Strangely, modern science was long dominated by the idea that to be scientific means to remove consciousness from our explanations, in order to be “objective.” This was the rationale behind behaviorism, a now-dead theory of psychology that took this trend to a perverse extreme. Behaviorists like John Watson and B.F. Skinner scrupulously avoided any discussion of what their human or animal subjects thought, intended or wanted, and focused instead entirely on behavior. They thought that because thoughts in other people’s heads, or in animals, are impossible to know with certainty, we should simply ignore them in our theories. We can only be truly scientific, they asserted, if we focus solely on what can be directly observed and measured: behavior. Erwin Schrödinger, one of the key architects of quantum mechanics in the early part of the 20th century, labeled this approach in his philosophical 1958 book Mind and Matter the “principle of objectivation” and expressed it clearly: “By [the principle of objectivation] I mean … a certain simplification which we adopt in order to master the infinitely intricate problem of nature. Without being aware of it and without being rigorously systematic about it, we exclude the Subject of Cognizance from the domain of nature that we endeavor to understand. We step with our own person back into the part of an onlooker who does not belong to the world, which by this very procedure becomes an objective world.” Schrödinger did, however, identify both the problem and the solution. He recognized that “objectivation” is just a simplification that is a temporary step in the progress of science in understanding nature. © 2020 Scientific American

Keyword: Consciousness
Link ID: 27044 - Posted: 02.18.2020