Most Recent Links



Links 4901 - 4920 of 29488

Rhitu Chatterjee People addicted to prescription opioids or heroin are far more likely to have run-ins with the law than those who don't use opioids, according to a study published Friday in JAMA Network Open. The study provides the first nationwide estimate for the number of people using opioids who end up in the American criminal justice system. The results suggest a need to engage law enforcement officials and corrections systems to tackle the opioid epidemic. The connection between the criminal justice system and substance abuse is well-known. About 65 percent of people who are incarcerated are known to have a substance use disorder, according to the National Institute on Drug Abuse. And yet there is little national data tracking the intersection of the criminal justice system and the ongoing opioid epidemic. "There have been reports that jails and prisons are bearing the brunt of the opioid epidemic, but we didn't know nationally how many people who use opioids are involved in the criminal justice system," says Tyler Winkelman, a clinician-investigator at Hennepin Healthcare in Minneapolis and the lead author of the study. To get that national picture, Winkelman and his colleagues analyzed data from 78,976 respondents to the annual National Survey on Drug Use and Health, which collects information on substance use by respondents, as well as information on their socioeconomic status, education and health. © 2018 npr

Keyword: Drug Abuse
Link ID: 25179 - Posted: 07.07.2018

Laura Sanders Newly identified nerve cells deep in the brains of mice compel them to eat. Similar cells exist in people, too, and may ultimately represent a new way to target eating disorders and obesity. These neurons, described in the July 6 Science, are not the first discovered to control appetite. But because of the mysterious brain region where they are found and the potential relevance to people, the mouse results “are worth pursuing,” says neurobiologist and physiologist Sabrina Diano of Yale University School of Medicine. Certain nerve cells in the human brain region called the nucleus tuberalis lateralis, or NTL, are known to malfunction in neurodegenerative diseases such as Huntington’s and Alzheimer’s. But “almost nothing is known about [the region],” says study coauthor Yu Fu of the Singapore Bioimaging Consortium, Agency for Science, Technology and Research. In people, the NTL is a small bump along the bottom edge of the hypothalamus, a brain structure known to regulate eating behavior. But in mice, a similar structure wasn’t thought to exist at all, until Fu and colleagues discovered it by chance. The researchers were studying cells that produce a hormone called somatostatin — a molecular signpost of some NTL cells in people. In mice, that cluster of cells in the hypothalamus seemed to correspond to the human NTL. Not only do these cells exist in mice, but they have a big role in eating behavior. The neurons sprang into action when the mice were hungry, or when the hunger-signaling hormone ghrelin was around, the team found. © Society for Science & the Public 2000 - 2018

Keyword: Obesity
Link ID: 25178 - Posted: 07.06.2018

Christine Calder A yawn consists of an extended gaping of the mouth followed by a more rapid closure. In mammals and birds, a long intake of breath and shorter exhale follows the gaping of the mouth, but in other species such as fish, amphibians and snakes there is no intake of breath. But what’s behind a yawn? Why does it occur? In the past, people have had many hypotheses. As far back as 400 B.C., Hippocrates thought yawning removed bad air from the lungs before a fever. In the 17th and 18th centuries, doctors believed yawning increased oxygen in the blood, blood pressure, heart rate and blood flow itself. More recently, consensus moved toward the idea that yawning cools down the brain, so when ambient conditions and temperature of the brain itself increase, yawning episodes increase. Despite all these theories, the truth is that scientists do not know the true biological function of a yawn. What we do know is that yawning occurs in just about every species. It happens when an animal is tired. It can be used as a threat display in some species. Yawning can occur during times of social conflict and stress, something researchers call a displacement behavior. And that wide-open mouth can be contagious, especially in social species such as humans, chimpanzees, bonobos, macaques and wolves. Watching someone yawn – heck, even reading about yawns – can lead you to yawn yourself. Why? © 2010–2018, The Conversation US, Inc.

Keyword: Sleep; Attention
Link ID: 25177 - Posted: 07.06.2018

Philip Lieberman In the 1960s, researchers at Yale University’s Haskins Laboratories attempted to produce a machine that would read printed text aloud to blind people. Alvin Liberman and his colleagues figured the solution was to isolate the “phonemes,” the ostensible beads-on-a-string equivalent to movable type that linguists thought existed in the acoustic speech signal. Linguists had assumed (and some still do) that phonemes were roughly equivalent to the letters of the alphabet and that they could be recombined to form different words. However, when the Haskins group snipped segments from tape recordings of words or sentences spoken by radio announcers or trained phoneticians, and tried to link them together to form new words, the researchers found that the results were incomprehensible. That’s because, as most speech scientists agree, there is no such thing as pure phonemes (though some linguists still cling to the idea). Discrete phonemes do not exist as such in the speech signal, and instead are always blended together in words. Even “stop consonants,” such as [b], [p], [t], and [g], don’t exist as isolated entities; it is impossible to utter a stop consonant without also producing a vowel before or after it. As such, the consonant [t] in the spoken word tea, for example, sounds quite different from that in the word to. To produce the vowel sound in to, the speakers’ lips are protruded and narrowed, while they are retracted and open for the vowel sound in tea, yielding different acoustic representations of the initial consonant. Moreover, when the Haskins researchers counted the number of putative phonemes that would be transmitted each second during normal conversations, the rate exceeded that which can be interpreted by the human auditory system—the synthesized phrases would have become an incomprehensible buzz. © 1986 - 2018 The Scientist.

Keyword: Language; Evolution
Link ID: 25176 - Posted: 07.06.2018

by Sarah Kaplan For years, scientists at the Smithsonian Tropical Research Institute in Panama had whispered about the remote island where monkeys used stone tools. A botanist had witnessed the phenomenon during a long-ago survey — but, being more interested in flora than fauna at the time, she couldn't linger to investigate. A return to the site would require new funds, good weather for a treacherous 35-mile boat ride, and days of swimming, hiking and camping amid rocky, wave-pounded shorelines and dense tropical forest. “For a while, it kind of just stayed a rumor,” said Brendan Barrett, a behavioral ecologist at the Max Planck Institute in Germany and a visiting researcher at STRI. But when Barrett and his colleagues finally arrived at Jicarón Island in Panama's Coiba National Park last year, what they found was well worth the effort: Tiny white-faced capuchin monkeys were using stones almost half their body weight as hammers to smash open shellfish, nuts and other foods. “We were stunned,” said Barrett, the lead author of a new paper on the discovery posted on the preprint website bioRxiv. The capuchins are the first animals of their genus observed using stone tools, and only the fourth group of nonhuman primates known to do so. Sophisticated, social, and tolerant of observation, they also provide scientists with an ideal system for studying what causes a species to venture into the stone age — and could help researchers understand how and why our own ancestors first picked up stone tools more than 2 million years ago. © 1996-2018 The Washington Post

Keyword: Evolution
Link ID: 25175 - Posted: 07.06.2018

By JoAnna Klein Owl eyes are round, but not spherical. These immobile, tubular structures sit on the front of an owl’s face like a pair of built-in binoculars. They allow the birds to focus in on prey and see in three dimensions, kind of like humans — except we don’t have to turn our whole heads to spot a slice of pizza beside us. Although owls and humans both have binocular vision, it has been unclear whether these birds of prey process information they collect from their environments like humans, because their brains aren’t as complex. But in a study published in the Journal of Neuroscience on Monday, scientists tested the ability of barn owls to find a moving target among various shifting backgrounds, a visual processing task earlier tested only in primates. The research suggests that barn owls, with far simpler brains than humans and other primates, also group together different elements as they move in the same direction, to make sense of the world around them. “Humans are not so different from birds as you may think,” said Yoram Gutfreund, a neuroscientist at Technion Israel Institute of Technology who led the study with colleagues from his university and RWTH Aachen University in Germany. A critical part of perception is being able to distinguish an object from its background. One way humans do this is by grouping elements of a scene together to perceive each part as a whole. In some cases, that means combining objects that move similarly, like birds flying in a flock, or the single bird that breaks away from it. Scientists have generally considered this type of visual processing as a higher level task that requires complex brain structures. As such, they’ve only studied it in humans and primates. But Dr. Gutfreund and his team believed this ability was more basic — like seeing past camouflage. A barn owl, for example, might have evolved a similar mechanism to detect a mouse moving in a meadow as wind blows the grass in the same direction. 
© 2018 The New York Times Company

Keyword: Vision; Evolution
Link ID: 25174 - Posted: 07.05.2018

By Elizabeth Pennisi Bats and their prey are in a constant arms race. Whereas the winged mammals home in on insects with frighteningly accurate sonar, some of their prey—such as the tiger moth—fight back with sonar clicks and even jamming signals. Now, in a series of bat-moth skirmishes, scientists have shown how other moths create an “acoustic illusion,” with long wing-tails that fool bats into striking the wrong place. The finding helps explain why some moths have such showy tails, and it may also provide inspiration for drones of the future. Moth tails vary from species to species: Some have big lobes at the bottom of the hindwing instead of a distinctive tail; others have just a short protrusion. Still others have long tails that are thin strands with twisted cuplike ends. In 2015, sensory ecologist Jesse Barber of Boise State University in Idaho and colleagues discovered that some silk moths use their tails to confuse bat predators. Now, graduate student Juliette Rubin has shown just what makes the tails such effective deterrents. Working with three species of silk moths—luna, African moon, and polyphemus—Rubin shortened or cut off some of their hindwings and glued longer or differently shaped tails to others. She then tied the moths to a string hanging from the top of a large cage and released a big brown bat (Eptesicus fuscus) inside. She used high-speed cameras and microphones to record the ensuing fight. © 2018 American Association for the Advancement of Science.

Keyword: Hearing; Evolution
Link ID: 25173 - Posted: 07.05.2018

By Gretchen Reynolds Can working out help us to drop pounds after all? A provocative new study involving overweight men and women suggests that it probably can, undercutting a widespread notion that exercise, by itself, is worthless for weight loss. But the findings also indicate that, to benefit, we may need to exercise quite a bit. In theory, exercise should contribute substantially to weight loss. It burns calories. If we do not replace them, our bodies should achieve negative energy balance, use stored fat for fuel and shed pounds. But life and our metabolisms are not predictable or fair, as multiple exercise studies involving people and animals show. In these experiments, participants lose less weight than would be expected, given the energy they expend during exercise. The studies generally have concluded that the exercisers had compensated for the energy they had expended during exercise, either by eating more or moving less throughout the day. These compensations were often unwitting but effective. Some researchers had begun to wonder, though, if the amount of exercise might matter. Many of the past human experiments had involved about 30 minutes a day or so of moderate exercise, which is the amount generally recommended by current guidelines to improve health. But what if people exercised more, some researchers asked. Would they still compensate for all the calories that they burned? To find out, scientists from the University of North Dakota and other institutions decided to invite 31 overweight, sedentary men and women to a lab for measurements of their resting metabolic rate and body composition. The volunteers also recounted in detail what they had eaten the previous day and agreed to wear a sophisticated activity tracker for a week. © 2018 The New York Times Company
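The “in theory” arithmetic the article describes can be made concrete with a back-of-envelope sketch. This assumes the common but rough rule of thumb that a kilogram of body fat stores about 7,700 kcal, and it deliberately ignores the compensation effects the studies document:

```python
# Naive energy-balance model: every calorie burned in exercise comes out
# of stored fat, with no compensatory eating or reduced daily movement.
KCAL_PER_KG_FAT = 7700  # rough rule-of-thumb estimate, an assumption here

def naive_fat_loss_kg(kcal_burned_per_day, days):
    """Predicted fat loss if exercise calories are never replaced."""
    return kcal_burned_per_day * days / KCAL_PER_KG_FAT

# e.g. a workout burning 300 kcal/day, sustained for 30 days:
print(round(naive_fat_loss_kg(300, 30), 2))  # about 1.17 kg
```

Real-world results fall short of this prediction precisely because, as the article notes, exercisers tend to unwittingly eat more or move less during the rest of the day.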

Keyword: Obesity
Link ID: 25172 - Posted: 07.05.2018

by Amy Ellis Nutt The possibility of using brain stimulation to help prevent future violence just passed a proof of concept stage, according to new research published Monday in the Journal of Neuroscience. In a double-blind, randomized controlled study, a group of volunteers who received a charge to their dorsolateral prefrontal cortex — the part of the brain that lies directly behind the forehead and is responsible for planning, reasoning and inhibition — were less likely to say they would consider engaging in aggressive behavior compared to a similar group that received a sham treatment. The experiment looked at aggressive intent as well as how people reasoned about violence and found that a sense of moral wrongfulness about hypothetical acts of aggression was heightened in the group receiving the transcranial direct current stimulation (tDCS). This form of brain stimulation delivers targeted impulses to the brain through electrodes placed on a person's scalp. "Zapping offenders with an electrical current to fix their brains sounds like pulp fiction, but it might not be as crazy as it sounds," said Adrian Raine, a neurocriminologist at the University of Pennsylvania and one of the study's investigators. "This study goes some way toward documenting a causal association by showing that enhancing the prefrontal cortex puts the brakes on the impulse to act aggressively." © 1996-2018 The Washington Post

Keyword: Aggression
Link ID: 25171 - Posted: 07.03.2018

David Cyranoski An ambitious Chinese study tracking tens of thousands of babies and their mothers has begun to bear fruit — just six years after the study’s leaders recruited their first sets of mothers and babies. Researchers have already published results based on the cohort study, which collects biological, environmental and social data, some with important public-health implications. And many more investigations are under way. One, in particular, will examine infants’ microbiomes, the collections of bacteria and other microorganisms that inhabit their bodies — a hot topic in health research and a key goal of the cohort study. The Born in Guangzhou Cohort Study has recruited about 33,000 babies and their mothers since 2012. The study’s leaders are hoping to reach 50,000 baby–mother sets by 2020. And this year, investigators started recruiting 5,000 maternal grandmothers to the project, enabling studies across multiple generations. “The data is vast, and there is space for many different groups globally to mine this information,” says Maria Gloria Dominguez-Bello, a microbiologist at Rutgers, The State University of New Jersey, in New Brunswick, who is not involved in the study. “I really admire this effort from the Chinese team. Very few countries can achieve this scale.” © 2018 Macmillan Publishers Limited

Keyword: Obesity; Schizophrenia
Link ID: 25170 - Posted: 07.03.2018

Allison Aubrey Coffee is far from a vice. There's now lots of evidence pointing to its health benefits, including a possible longevity boost for those of us with a daily coffee habit. The latest findings come from a study published Monday in JAMA Internal Medicine that included about a half-million people in England, Scotland and Wales. Participants ranged in age from 38 to 73. "We found that people who drank two to three cups per day had about a 12 percent lower risk of death compared to non-coffee drinkers" during the decade-long study, says Erikka Loftfield, a research fellow at the National Cancer Institute. This was true among all coffee drinkers — even those who were determined to be slow metabolizers of caffeine. (These are people who tend to be more sensitive to the effects of caffeine.) And the association held up among drinkers of decaffeinated coffee, too. In the U.S., there are similar findings linking higher consumption of coffee to a lower risk of early death in African-Americans, Japanese-Americans, Latinos and white adults, both men and women. A daily coffee habit is also linked to a decreased risk of stroke and Type 2 diabetes. What is it about coffee that may be protective? It's not likely to be the caffeine. While studies don't prove that coffee extends life, several studies have suggested a longevity boost among drinkers of decaf as well as regular coffee. © 2018 npr

Keyword: Drug Abuse
Link ID: 25169 - Posted: 07.03.2018

NPR's Lulu Garcia-Navarro talks with Dr. Randi Hutter Epstein about her new book Aroused, which tells the story of the scientific quest to understand human hormones. LULU GARCIA-NAVARRO, HOST: What do sleep, sex, insulin, mood and hunger have in common? Well, they're all controlled by hormones. But just a century ago, the power of our chemical messengers was barely understood. A new book by Dr. Randi Hutter Epstein called "Aroused" tells the stories of the scientists who work to explore and explain our hormones. Dr. Epstein joins us now from our New York bureau. Welcome to the program. RANDI HUTTER EPSTEIN: Thanks for having me. GARCIA-NAVARRO: The book is organized around stories from key moments in hormone research. And I have to say, many of the studies they were doing in the early days were pretty gruesome. EPSTEIN: When we say study, we tend now to think of the randomised clinical controlled trial. You know, you have one sample here. You compare it to another. When they were doing studies, they were doing sort of weird experiments on people and dogs and all kind of things. So there was Harvey Cushing. He was one of the first people to talk about that pituitary tumors can really muck you up and like send a lot of hormones awry. But here's what he tried to do that didn't work out that's kind of a wacky experiment. He had a 48-year-old man that had a pituitary tumor that was making him have double vision and headaches and other endocrine issues. And Harvey Cushing thought, what if we take a nice, healthy pituitary of a baby that just died if there is a newborn that didn't make it and just implant that in this old man, and then we just revive him and he'd be back to normal. Newspapers got a hold of it, as media tends to do. And there were wonderful headlines like baby brain, you know, broken brain fixed by baby. And it went wild in terms of, wow, we can now cure broken, old brains. 
And, spoiler alert, let's just say that we don't replace baby pituitary glands into grownups when they have pituitary tumors anymore.

Keyword: Hormones & Behavior
Link ID: 25168 - Posted: 07.03.2018

By Michael Shermer In 1967 British biologist and Nobel laureate Sir Peter Medawar famously characterized science as, in book title form, The Art of the Soluble. “Good scientists study the most important problems they think they can solve. It is, after all, their professional business to solve problems, not merely to grapple with them,” he wrote. For millennia, the greatest minds of our species have grappled to gain purchase on the vertiginous ontological cliffs of three great mysteries—consciousness, free will and God—without ascending anywhere near the thin air of their peaks. Unlike other inscrutable problems, such as the structure of the atom, the molecular basis of replication and the causes of human violence, which have witnessed stunning advancements of enlightenment, these three seem to recede ever further away from understanding, even as we race ever faster to catch them in our scientific nets. Are these “hard” problems, as philosopher David Chalmers characterized consciousness, or are they truly insoluble “mysterian” problems, as philosopher Owen Flanagan designated them (inspired by the 1960s rock group Question Mark and the Mysterians)? The “old mysterians” were dualists who believed in nonmaterial properties, such as the soul, that cannot be explained by natural processes. The “new mysterians,” Flanagan says, contend that consciousness can never be explained because of the limitations of human cognition. I contend that not only consciousness but also free will and God are mysterian problems—not because we are not yet smart enough to solve them but because they can never be solved, not even in principle, relating to how the concepts are conceived in language. Call those of us in this camp the “final mysterians.” © 2018 Scientific American

Keyword: Consciousness
Link ID: 25167 - Posted: 07.02.2018

By Hannah Furfaro It was a sunny California afternoon in January 2015 when Dennis Wall received an unexpected gift: ‘smart glasses’ made by Google that had failed to live up to their hype in the press. An employee from the company pulled up to Wall’s lab at Stanford University in a sleek gray Tesla, popped open the sedan’s trunk and unloaded a brown cardboard box with long, dangling cords. It was a scene straight out of the television comedy “Silicon Valley,” which satirizes the absurdity of the tech world. Wall’s ambition for the Google Glass, however, is dead earnest: He aims to help people with autism interpret others’ emotions. Many people with autism have trouble understanding social cues and emotions, and this can greatly limit how they fare in the world. Wall developed an algorithm that relies on artificial intelligence. His plan was to incorporate the algorithm into the glasses, so that someone wearing the glasses would see a tiny emoticon that matches the expression on the face of another person. The algorithm was all set to go, and Wall had been waiting for the glasses to test his idea. “It was lifesaving for us because we were desperate to get started,” he recalls. © 2018 Scientific American

Keyword: Autism; Emotions
Link ID: 25166 - Posted: 07.02.2018

By Alex Therrien Health reporter, BBC News Think of magic mushrooms and LSD and it's likely that science is not the first thing that springs to mind. Psychedelic drugs are more likely to be associated with hippies and the counterculture of the 1960s than people in white lab coats and clinical trials. But that might soon change. Increasingly, scientists are looking at whether these mind-altering drugs - which also include mescaline and DMT among others - might also have the potential to be mind-healing. A number of small studies have found psychedelics to show promise in treating mental health disorders like depression, addiction and post-traumatic stress disorder, often where other treatments have failed. Now UK researchers are about to take part in the first major trials into whether one of these hallucinogenic drugs could be more effective than a leading antidepressant in the treatment of depression. Researchers at Imperial College London are to compare the magic mushroom compound psilocybin with a leading SSRI (selective serotonin reuptake inhibitor) antidepressant, escitalopram, in a large trial expected to take at least two years. "[Psychedelics] have a revolutionary potential, and that's not an exaggeration," says Dr Robin Carhart-Harris, who will lead the study. But it is not the first time scientists have been excited about these mind-bending substances. More than 50 years ago they rapidly came to scientific attention, before research in the field came to a sudden halt. During the 1950s and 60s psychedelics were considered to be a promising potential treatment for numerous mental health disorders, with more than 1,000 studies taking place. © 2018 BBC.

Keyword: Depression; Drug Abuse
Link ID: 25165 - Posted: 07.02.2018

By Eric Allen Been “A new generation of scientists is not satisfied merely to watch and describe brain activity,” writes David Adam. “They want to interfere, to change and improve the brain — to neuroenhance it.” In his new book “The Genius Within: Unlocking Your Brain’s Potential” (Pegasus), Adam offers a many-sided investigation of neuroenhancement — a hodgepodge of technologies and drug treatments aimed at improving intelligence. A London-based science writer and editor, he previously wrote about obsessive-compulsive disorder, its history, and his own struggle with it in “The Man Who Couldn’t Stop” (2014). “We wonder at the stars, and then we start to work out how far away things are. And then we design a spacecraft that’s going to take us up there. I think that’s happened with neuroscience.” For this installment of the Undark Five, I talked with Adam about neuroenhancement — among other things, whether it’s fair to enhance some people’s cognitive abilities but not others’, why the subject of intelligence makes so many people uncomfortable, and whether “smart drugs” will one day make us all Einsteins. Here’s our conversation, edited for length and clarity. UNDARK — There’s been a shift within neuroscience from not just trying to understand how the brain works but to enhance it. How did that happen? Copyright 2018 Undark

Keyword: Attention
Link ID: 25164 - Posted: 07.02.2018

Bruce Bower England’s King George III descended into mental chaos, or what at the time was called madness, in 1789. Physicians could not say whether he would recover or if a replacement should assume the throne. That political crisis jump-started the study of human heredity. Using archival records, science historian Theodore M. Porter describes how the king’s deteriorating condition invigorated research at England’s insane asylums into the inheritance of madness. Well before DNA’s discovery, heredity started out as a science of record keeping and statistical calculations. In the 1800s, largely forgotten doctors in both Europe and North America meticulously collected family histories of madness, intellectual disability and crime among the growing numbers of people consigned to asylums, schools for “feebleminded” children and prisons. Some physicians who specialized in madness, known as alienists, saw severe mental deficits as a disease caused by modern life’s pressures. But most alienists regarded heredity, the transmission of a presumed biological factor among family members, as the true culprit. Asylum directors launched efforts to track down all sick relatives of patients. The increasing number of people institutionalized for mental deficits fueled the view that individuals from susceptible families should be discouraged from reproducing. © Society for Science & the Public 2000 - 2018

Keyword: Schizophrenia; Genes & Behavior
Link ID: 25163 - Posted: 07.02.2018

By Denise Gellene Dr. Arvid Carlsson, a Swedish scientist whose discoveries about the brain led to the development of drugs for Parkinson’s disease and earned him a Nobel Prize, died on Friday. He was 95. His death was announced by the Sahlgrenska Academy at the University of Gothenburg, where he had been a professor of pharmacology. It did not say where he died. When Dr. Carlsson started his research in the 1950s, dopamine, a chemical in the brain, was thought to have little significance. Dr. Carlsson discovered that it was, in fact, an important neurotransmitter — a brain chemical that passes signals from one neuron to the next. He then found that dopamine was concentrated in the basal ganglia, the portion of the brain that controls movement. He showed that rabbits lost their ability to move after they were given a drug that lowered their dopamine stores; their mobility was restored after they received L-dopa, a drug that is converted into dopamine in the brain. Noting that the movement difficulties of his rabbits were similar to those of people with Parkinson’s disease, Dr. Carlsson proposed that the illness was related to a loss of dopamine. Other scientists confirmed that dopamine is depleted in people with Parkinson’s disease, a degenerative condition that causes tremors and rigidity, and L-dopa soon became the standard treatment for the illness. Dr. Carlsson shared the 2000 Nobel Prize in Physiology or Medicine with two American researchers, Dr. Eric Kandel and Paul Greengard, who made their own discoveries about the transmission of chemical signals in the brain. In awarding the Nobel, the Karolinska Institute of Sweden said the contributions of the three scientists were “crucial for an understanding of the normal function of the brain” and for how signal disturbances could “give rise to neurological and psychiatric disorders.” © 2018 The New York Times Company

Keyword: ADHD
Link ID: 25162 - Posted: 07.02.2018

By Karen Weintraub New Caledonian crows are known for their toolmaking, but Alex Taylor and his colleagues wanted to understand just how advanced they could be. Crows from New Caledonia, an island in the South Pacific, can break off pieces of a branch to form a hook, using it to pull a grub out of a log, for instance. Once, in captivity, when a New Caledonian male crow had taken all the available hooks, its mate Betty took a straight piece of wire and bent it to make one. “They are head and shoulders above almost every other avian subjects” at toolmaking, said Irene Pepperberg, an avian cognition expert and research associate in Harvard University’s department of psychology. “These crows are just amazing.” Dr. Taylor, a researcher at the University of Auckland in New Zealand, and several European colleagues wondered how the crows, without an ability to talk and showing no evidence of mimicry, might learn such sophisticated toolmaking. Perhaps, the scientists hypothesized in a new paper published Thursday in Scientific Reports, they used “mental template matching,” where they formed an image in their heads of tools they’d seen used by others and then copied it. “Could they look at a tool and just based on mental image of the tool — can they recreate that tool design?” Dr. Taylor said. “That’s what we set out to test, and that’s what our results show.” In a series of steps, the researchers taught the birds to feed pieces of paper into a mock vending machine to earn food rewards. The scientists chose a task that was similar enough to something the animals do in the wild — while also brand new. The birds had never seen card stock before, but learned how to rip it into big or little shapes after being shown they would get a reward for the appropriate size. 
The template used to show the birds the right size of paper was not available to them when they made their “tools,” yet the crows were able to use their beaks to tear off bits of paper, which they sometimes held between their feet for leverage. © 2018 The New York Times Company

Keyword: Intelligence; Evolution
Link ID: 25161 - Posted: 06.29.2018

A small-molecule drug is one of the first to preserve hearing in a mouse model of an inherited form of progressive human deafness, report investigators at the University of Iowa, Iowa City, and the National Institutes of Health’s National Institute on Deafness and Other Communication Disorders (NIDCD). The study, which appears online in Cell (link is external), sheds light on the molecular mechanism that underlies a form of deafness (DFNA27), and suggests a new treatment strategy. “We were able to partially restore hearing, especially at lower frequencies, and save some sensory hair cells,” said Thomas B. Friedman, Ph.D., chief of the Laboratory of Human Molecular Genetics at the NIDCD, and a coauthor of the study. “If additional studies show that small-molecule-based drugs are effective in treating DFNA27 deafness in people, it’s possible that using similar approaches might work for other inherited forms of progressive hearing loss.” The seed for the advance was planted a decade ago, when NIDCD researchers led by Friedman and Robert J. Morell, Ph.D., another coauthor of the current study, analyzed the genomes of members of an extended family, dubbed LMG2. Deafness is genetically dominant in the LMG2 family, meaning that a child needs to inherit only one copy of the defective gene from a parent to have progressive hearing loss. The investigators localized the deafness-causing mutation to a region on chromosome four called DFNA27, which includes a dozen or so genes. The precise location of the mutation eluded the NIDCD team, however.

Keyword: Hearing; Regeneration
Link ID: 25160 - Posted: 06.29.2018