Most Recent Links




By Elizabeth Pennisi The “brrreeet” you hear in the video above is not coming from this broadbill’s beak, but rather from its wings. Charles Darwin marveled at the “instrumental music” of birds—from the rattled quills of peacocks to the wing-drumming of grouse and the wing “booming” of nightjars. But those percussive noises are no match for the distinctive tones generated by the three Smithornis broadbills (S. rufolateralis, S. capensis, and S. sharpei) that live in remote forests in sub-Saharan Africa. In 1986, one bird acoustics specialist was so intrigued by a recording of this “song” that he vowed to hear it for himself. More than 2 years ago, he and his colleagues tracked down two of these species in the wild. Synchronized high-speed video and acoustic recordings revealed that the downstroke of the wings produces the tones as the bird flies in a meter-wide oval from its perch and back again. At first the researchers thought the outermost flight feathers fluttered to make the sounds, but studies of a wing and of the feathers themselves in a wind tunnel showed that the inner flight feathers are “singing” the most, the team reports today in the Journal of Experimental Biology. The tones may scale with the species’ body and feather size, with bigger birds producing deeper tones, the researchers suggest. The wing tones seem to have replaced vocal singing, they note, and are likely unique to this group of birds. Audible 100 meters away in dense forest, they represent yet another innovation for communicating with one’s peers. © 2016 American Association for the Advancement of Science

Keyword: Sexual Behavior; Animal Communication
Link ID: 22056 - Posted: 04.01.2016

Ewen Callaway Homo floresiensis, the mysterious and diminutive species found in Indonesia in 2003, is tens of thousands of years older than originally thought — and may have been driven to extinction by modern humans. After researchers discovered H. floresiensis, which they nicknamed the hobbit, in Liang Bua cave on the island of Flores, they concluded that its skeletal remains were as young as 11,000 years old. But later excavations that have dated more rock and sediment around the remains now suggest that hobbits were gone from the cave by 50,000 years ago, according to a study published in Nature on 30 March. That is around the time that modern humans moved through southeast Asia and Australia. “I can’t believe that it is purely coincidence, based on what else we know happens when modern humans enter a new area,” says Richard Roberts, a geochronologist at the University of Wollongong, Australia. He notes that Neanderthals vanished soon after early modern humans arrived in Europe from Africa. Roberts co-led the study with archaeologist colleague Thomas Sutikna (who also helped coordinate the 2003 dig), and Matthew Tocheri, a paleoanthropologist at Lakehead University in Thunder Bay, Canada. The first hobbit fossil, known as LB1, was found in 2003 beneath about 6 metres of dirt and rock. Its fragile bones were too precious for radiocarbon dating, so the team collected nearby charcoal, on the assumption that it had accrued at the same time as the bones. That charcoal was as young as 11,000 years old, researchers reported at the time. “Somehow these tiny people had survived on this island 30,000 years after modern humans arrived,” says Roberts. “We were scratching our heads. It couldn’t add up.” © 2016 Nature Publishing Group

Keyword: Evolution
Link ID: 22055 - Posted: 03.31.2016

By Jordana Cepelewicz The bacteria that inhabit our guts have become key players for neuroscientists. A growing body of research links them to a wide array of mental and neurological disorders—from anxiety and depression to schizophrenia and Alzheimer’s disease. Now a study in mice published this week in Nature Medicine suggests that striking the right microbial balance could cause changes in the immune system that significantly reduce brain damage after a stroke—the second leading cause of both death and disability for people around the globe. (Scientific American is part of Springer Nature.) Experts have known for some time that stroke severity is influenced by the presence of two types of cell, found abundantly within the intestine, that calibrate immune responses: Regulatory T cells have a beneficial, inflammation-damping effect, protecting an individual from stroke damage. But gamma delta T cells produce a cytokine that causes harmful inflammation after a stroke. A team of researchers at Weill Cornell Medical College and Memorial Sloan Kettering Cancer Center set about investigating whether they could tip the balance of these cells in favor of the protective type by tinkering with the body’s bacterial residents. To do so, they bred two colonies of mice: one group’s intestinal flora was resistant to antibiotics, whereas the other’s gut bacteria were vulnerable to treatment. As a result, when given a combination of antibiotics over the course of two weeks, only the latter’s microbiota underwent change. The researchers then obstructed the cerebral arteries of the mice, inducing an ischemic stroke (the most common type). They found that subsequent brain damage was 60 percent smaller in the drug-susceptible mice than it was in the other group. © 2016 Scientific American

Keyword: Stroke
Link ID: 22054 - Posted: 03.31.2016

By Ariana Eunjung Cha LAS VEGAS — Jamie Tyler was stressed. He had just endured a half-hour slog through airport security and needed some relief. Many travelers in this situation might have headed for the nearest bar or popped an aspirin. But Tyler grabbed a triangular piece of gadgetry from his bag and held it to his forehead. As he closed his eyes, the device zapped him with low-voltage electrical currents. Within minutes, Tyler said, he was feeling serene enough to face the crowds once again. This is no science fiction. The Harvard-trained neurobiologist was taking advantage of one of his own inventions, a device called Thync, which promises to help users activate their body's “natural state of energy or calm” — for a retail price of a mere $199. Americans’ obsession with wellness is fueling a new category of consumer electronics, one that goes far beyond the ubiquitous Fitbits and UP activity wristbands that only passively monitor users' physical activity. The latest wearable tech, to put it in the simplest terms, is about hacking your brain. These gadgets claim to be able to make you have more willpower, think more creatively and even jump higher. One day, their makers say, the technology may even succeed in delivering on the holy grail of emotions: happiness. There’s real, peer-reviewed science behind the theory driving these devices. It involves stimulating key regions of the brain — with currents or magnetic fields — to affect emotions and physical well-being.

Keyword: Emotions
Link ID: 22053 - Posted: 03.31.2016

By Matthew Hutson Earlier this month, a computer program called AlphaGo defeated a (human) world champion of the board game Go, years before most experts expected computers to rival the best flesh-and-bone players. But then last week, Microsoft was forced to silence its millennial-imitating chatbot Tay for blithely parroting Nazi propaganda and misogynistic attacks after just one day online, her failure a testimony to the often underestimated role of human sensibility in intelligent behavior. Why are we so compelled to pit human against machine, and why are we so bad at predicting the outcome? As the number of jobs susceptible to automation rises, and as Stephen Hawking, Elon Musk, and Bill Gates warn that artificial intelligence poses an existential threat to humanity, it’s natural to wonder how humans measure up to our future robot overlords. But even those tracking technology’s progress in taking on human skills have a hard time setting an accurate date for the uprising. That’s in part because one prediction strategy popular among both scientists and journalists—benchmarking the human brain with digital metrics such as bits, hertz, and million instructions per second, or MIPS—is severely misguided. And doing so could warp our expectations of what technology can do for us and to us. Since their development, digital computers have become a standard metaphor for the mind and brain. The comparison makes sense, in that brains and computers both transform input into output. Most human brains, like computers, can also manipulate abstract symbols. (Think arithmetic or language processing.) But like any metaphor, this one has limitations.

Keyword: Brain imaging; Robotics
Link ID: 22052 - Posted: 03.31.2016

By David Z. Hambrick Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.” But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential. A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries. The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor. © 2016 Scientific American
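For readers who want to see how behavioral geneticists put a number on “genetic influence,” here is a minimal sketch using Falconer’s classic twin-study approximation. It is not the model Tucker-Drob and Bates actually fit (they used more sophisticated structural models), and the correlation values below are invented purely to illustrate what a gene-by-SES interaction would look like.

    def falconer_h2(r_mz, r_dz):
        # Falconer's approximation: heritability (h^2) is roughly twice
        # the gap between identical-twin (MZ) and fraternal-twin (DZ)
        # correlations, because MZ pairs share about 100% of segregating
        # genes and DZ pairs about 50%.
        return 2 * (r_mz - r_dz)

    # Hypothetical correlations, NOT taken from the study:
    h2_low_ses = falconer_h2(r_mz=0.60, r_dz=0.45)   # -> 0.30
    h2_high_ses = falconer_h2(r_mz=0.80, r_dz=0.45)  # -> 0.70

    print(f"Illustrative h^2 at low SES:  {h2_low_ses:.2f}")
    print(f"Illustrative h^2 at high SES: {h2_high_ses:.2f}")

A pattern like this, with heritability rising alongside SES, is what the study reports for the United States but not for Western Europe or Australia.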

Keyword: Genes & Behavior; Intelligence
Link ID: 22051 - Posted: 03.30.2016

Chris French The fallibility of human memory is one of the best-established findings in psychology. There have been thousands of demonstrations of the unreliability of eyewitness testimony under well-controlled conditions, dating back to the very earliest years of the discipline. Relatively recently, it was discovered that some apparent memories are not just distorted memories of witnessed events: they are false memories for events that simply never took place at all. Psychologists have developed several reliable methods for implanting false memories in a sizeable proportion of experimental participants. It is only in the last few years, however, that scientists have begun to systematically investigate the phenomenon of non-believed memories. These are subjectively vivid memories of personal experiences that an individual once believed were accurate but now accepts are not based upon real events. Prior to this, there were occasional anecdotal reports of non-believed memories. One of the most famous was provided by the influential developmental psychologist Jean Piaget. He had a clear memory of almost being kidnapped at about the age of two and of his brave nurse beating off the attacker. His grateful family were so impressed with the nurse that they gave her a watch as a reward. Years later, the nurse confessed that she had made the whole story up. Even after he no longer believed that the event had taken place, Piaget still retained his vivid and detailed memory of it. © 2016 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 22050 - Posted: 03.30.2016

We might finally be figuring out how an increasingly popular therapy that uses electricity to boost the brain’s functioning exerts its effects – by pushing up levels of calcium in cells. Transcranial direct current stimulation (tDCS) involves using electrodes to send a weak current across the brain. Stimulating brain tissue like this has been linked to effects ranging from accelerated learning to improved symptoms of depression and faster recovery from strokes. The broad consensus is that tDCS does this by lowering the threshold at which neurons fire, making it easier for them to pass on electrical signals. This leads to changes in the connectivity between neurons and alters information processing. But the cellular mechanisms that lead to such broad neurological changes are not clear, and some researchers suggest that tDCS may not have any effect on the brain at all. Despite the doubts, devices are being developed for sale to people keen to influence their own brains. Now Hajime Hirase at the RIKEN Brain Science Institute in Tokyo, Japan, and his colleagues may have found an answer. They have identified large, sudden surges in calcium flow in the brains of mice seconds after the animals receive low doses of tDCS. These surges seem to start in cells called astrocytes – star-shaped cells that don’t fire themselves, but help to strengthen the connections between neurons and regulate the electrical signals that pass between them. © Copyright Reed Business Information Ltd.
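To make the “lowered firing threshold” idea concrete, here is a minimal leaky integrate-and-fire sketch. This is illustrative only: it is not the RIKEN team’s model, and the parameter values are arbitrary. It shows how a small constant bias current, standing in for the weak tDCS field, lets the same synaptic input push a neuron past threshold sooner.

    def time_to_first_spike(i_input, i_bias=0.0, tau=20.0, v_thresh=15.0, dt=0.1):
        # Leaky integrate-and-fire membrane: dv/dt = (-v + I) / tau.
        # Returns the time in ms at which the voltage first crosses
        # threshold, or None if it never does within 500 ms.
        v, t = 0.0, 0.0
        while v < v_thresh and t < 500.0:
            v += (-v + i_input + i_bias) * (dt / tau)
            t += dt
        return round(t, 1) if v >= v_thresh else None

    print("same input, no bias:  ", time_to_first_spike(i_input=16.0), "ms")
    print("same input, weak bias:", time_to_first_spike(i_input=16.0, i_bias=2.0), "ms")

With the bias the neuron reaches threshold in roughly two-thirds the time; a real tDCS field would nudge millions of neurons (and, per this study, astrocytes) at once.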

Keyword: Depression; Glia
Link ID: 22049 - Posted: 03.30.2016

Brendan Maher It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear. Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape. League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. © 2016 Nature Publishing Group

Keyword: Aggression; Learning & Memory
Link ID: 22048 - Posted: 03.30.2016

By Ariana Eunjung Cha In the movie "Concussion," which is based on the life of Bennet Omalu, a doctor who studied traumatic brain injury, Omalu explains that the prognosis is so poor for many brain-injured patients because their symptoms went undiagnosed. When head injuries aren't treated or are under-treated, patients are at risk of more serious injury. This is why children with concussions are often asked not to return to class or sports until their symptoms have resolved, and why adults often have to take days off work. One of the challenges has been that concussions are tricky to diagnose, and it isn't uncommon for a patient to rush to the ER only to be met with a vague response from the doctor about whether there's anything worrisome. Symptoms often aren't apparent for hours or even days after the initial injury, and current imaging technology can't pick up anything other than larger bleeds and lesions. How different could things be if there were a simple blood test to detect a concussion? In a paper published in JAMA Neurology on Monday, researchers reported that they may be closer than ever to such a test. The study involved 600 patients admitted to a trauma center from March 2010 to March 2014. All had suffered some kind of head injury resulting in loss of consciousness, amnesia or disorientation.

Keyword: Brain Injury/Concussion; Glia
Link ID: 22047 - Posted: 03.30.2016

Opioids are becoming the latest serious addiction problem in this country. Among the drugs manufactured from opium, heroin is the most serious and dangerous: cheap and available everywhere. In April's edition of Harper's Magazine, Dan Baum has examined a new response to this latest addiction problem: the legalization of drugs. NPR's Linda Wertheimer asks Baum about how he began to delve into the topic of America's war on drugs and why he calls attempts at legalization a big risk, based on our approach to solving the widespread problem. Interview Highlights You go back, covering the war on drugs. I wonder if you could tell us the story which kicks off your article. I was starting a book on the politics of drug enforcement. And in 1994 I got word that John Ehrlichman was doing minority recruitment at an engineering firm in Atlanta. Well, I'm 60. Ehrlichman was one of the great villains of American history, a Watergate villain. And he was Richard Nixon's drug policy advisor. And Richard Nixon was the one who coined the phrase "war on drugs." And he told me an amazing thing. I started asking him some earnest, wonky policy questions and he waved them away. He said, "Can we cut the B.S.? Can I just tell you what this was all about?" The Nixon campaign in '68 and the Nixon White House had two enemies: black people and the anti-war left. He said, "We knew that if we could associate heroin with black people and marijuana with the hippies, we could project the police into those communities, arrest their leaders, break up their meetings and, most of all, demonize them night after night on the evening news." And he looked me in the eyes and said, "Did we know we were lying about the drugs? Of course we did." © 2016 npr

Keyword: Drug Abuse
Link ID: 22046 - Posted: 03.29.2016

by Sarah Zielinski There must be something wrong with the guy who never leaves home, right? Maybe not — at least if that guy is a male spotted hyena. Males that stay with their birth clan, instead of taking off to join a new group, may simply be making a good choice, a new study suggests. Spotted hyenas live in a matriarchal society. Females are in charge. They rank higher than every male in the clan. And the females generally stay with the clan for their entire lives. But males face a choice when they reach two and a half years of age. They can stay with the clan, or they can leave and join a new one. Each choice has its pros and cons. Staying with the clan means that a male hyena keeps a place at the top of the male pecking order. He'll probably have his mother around to help. But he'll be limited in the number of females he can mate with, because many of the females in his home clan won't mate with him, since they might be related. If he joins a new clan, the male hyena might have access to more females — and they might even be better than the ones in his home clan — but he'll start with the lowest social rank and have to spend years fighting his way to the top. Among most group-living mammal species, the guys that stay at home turn out to be losers, siring fewer offspring. But spotted hyenas, it appears, are an exception. Eve Davidian of the Leibniz Institute for Zoo and Wildlife Research in Berlin and colleagues tracked 254 male spotted hyenas that lived in eight clans in Ngorongoro Crater in Tanzania throughout their lives, a study lasting 20 years. When these males reached the age of maturity, they left their clans to take a look at the other options available to them. Forty-one hyenas returned to their home clans, and 213 settled with new ones. © Society for Science & the Public 2000 - 2016

Keyword: Sexual Behavior; Evolution
Link ID: 22045 - Posted: 03.29.2016

By Roni Caryn Rabin Here’s another reason to eat your fruits and veggies: You may reduce your risk of vision loss from cataracts. Cataracts that cloud the lenses of the eye develop naturally with age, but a new study is one of the first to suggest that diet may play a greater role than genetics in their progression. Researchers had about 1,000 pairs of female twins in Britain fill out detailed food questionnaires that tracked their nutrient intake. Their mean age was just over 60. The study participants underwent digital imaging of the eye to measure the progression of cataracts. The researchers found that women who consumed diets rich in vitamin C and who ate about two servings of fruit and two servings of vegetables a day had a 20 percent lower risk of cataracts than those who ate a less nutrient-rich diet. Ten years later, the scientists followed up with 324 of the twin pairs and found that those who had reported consuming more vitamin C in their diet — at least twice the recommended dietary allowance of 75 milligrams a day for women (the R.D.A. for adult men is 90 milligrams) — had a 33 percent lower risk of their cataracts progressing than those who got less vitamin C. The researchers concluded that genetic factors account for about 35 percent of the difference in cataract progression, while environmental factors like diet account for 65 percent. “We found no beneficial effect from supplements, only from the vitamin C in the diet,” said Dr. Christopher Hammond, a professor of ophthalmology at King’s College London and an author of the study, published in Ophthalmology. Foods high in vitamin C include oranges, cantaloupe, kiwi, broccoli and dark leafy greens. © 2016 The New York Times Company

Keyword: Vision
Link ID: 22044 - Posted: 03.29.2016

By Patrick Monahan Yesterday, mountaineer Richard Parks set out for Kathmandu to begin some highly unusual data-gathering. As part of Project Everest Cynllun, he will climb Mount Everest without supplemental oxygen and perform—on himself—a series of blood draws, muscle biopsies, and cognitive tests. If he makes it to the summit, these will be the highest-elevation blood and tissue samples ever collected. Damian Bailey, a physiologist at the University of South Wales, Pontypridd, in the United Kingdom and the project’s lead scientist, hopes the risky experiment will yield new information about how the human body responds to low-oxygen conditions, and how similar mechanisms might drive cognitive decline with aging. As Parks began the acclimatization process with warm-up climbs on two smaller peaks, Bailey told ScienceInsider about his ambitions for the project. This interview has been edited for clarity and brevity. Q: Parks is an extreme athlete who has climbed Everest before. What can his performance tell us about regular people? A: What we’re trying to understand is, what is it about Richard’s brain that is potentially different from other people’s brains, and can that provide us with some clues to accelerated cognitive decline, which occurs with aging [and] dementia. We know that sedentary aging is associated with a progressive decline in blood flow to the brain. … And the main challenge for sedentary aging is we have to wait so long to see the changes occurring. So this is almost a snapshot, a day in the life of a patient with cognitive decline. © 2016 American Association for the Advancement of Science.

Keyword: Alzheimers; Learning & Memory
Link ID: 22043 - Posted: 03.29.2016

By C. CLAIBORNE RAY Q. Why do we become desensitized to a perfume we are wearing while others can still smell it? A. Ceasing to smell one’s perfume after continuous exposure while casual passers-by can still smell it is just one example of a phenomenon called olfactory adaptation or odor fatigue. After some time without exposure, sensitivity is usually restored. A similar weakening of odor signals with continued exposure also takes place in animals other than humans, and researchers often rely on animal studies to try to understand the cellular and molecular bases for the condition. It has been suggested that odor fatigue is useful because it enables animals to sort out the signals of a new odor from the background noise of continuous odors. It may also enable them to sense when an odor grows stronger. Studies published in the journal Science in 2002 pinpointed a chemical that seems to act as a gatekeeper for neurons involved in smell, opening and closing their electric signal channels. Genetically engineered mice that did not produce the substance, a protein called CNGA4, had profoundly impaired olfactory adaptation. A separate test-tube study found similar changes on a cellular level, with the signal channels remaining open when CNGA4 was absent. question@nytimes.com © 2016 The New York Times Company
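The dynamics described above can be captured in a toy first-order model (my illustration, not from the cited studies; the time constants are arbitrary): under constant odorant exposure the response fades toward nothing, and sensitivity gradually returns once the odor is removed.

    def simulate(total_minutes, odor_off_at, dt=1.0):
        # 'a' is the adaptation level of the olfactory response: it climbs
        # while the odor is present (perceived intensity fades) and relaxes
        # in clean air (sensitivity returns).
        a = 0.0
        for minute in range(total_minutes):
            odor_present = minute < odor_off_at
            sensitivity = 1.0 - a   # fraction of the odor still perceived
            if minute % 10 == 0:
                state = "odor on " if odor_present else "clean air"
                print(f"minute {minute:2d} ({state}): sensitivity {sensitivity:.2f}")
            if odor_present:
                a += (1.0 - a) * (dt / 10.0)   # adapt, ~10-minute time constant
            else:
                a -= a * (dt / 20.0)           # recover, ~20-minute time constant

    simulate(total_minutes=61, odor_off_at=30)

Running this shows sensitivity falling from 1.00 toward 0.04 over half an hour of exposure, then climbing back toward 0.79 after 30 minutes in clean air, which is the wearer-versus-passer-by asymmetry the column describes.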

Keyword: Chemical Senses (Smell & Taste)
Link ID: 22042 - Posted: 03.29.2016

By Emily Underwood This tangle of wiry filaments is not a bird’s nest or a root system. Instead, it’s the largest map to date of the connections between brain cells—in this case, about 200 from a mouse’s visual cortex. To map the roughly 1300 connections, or synapses, between the cells, researchers used an electron microscope to take millions of nanoscopic pictures from a speck of tissue not much bigger than a dust mite, carved into nearly 3700 slices. Then, teams of “annotators” traced the spindly projections of the synapses, digitally stitching stacked slices together to form the 3D map. The completed map reveals some interesting clues about how the mouse brain is wired: Neurons that respond to similar visual stimuli, such as vertical or horizontal bars, are more likely to be connected to one another than to neurons that carry out different functions, the scientists report online today in Nature. (In the image above, some neurons are color-coded according to their sensitivity to various line orientations.) Ultimately, by speeding up and automating the process of mapping such networks in both mouse and human brain tissue, researchers hope to learn how the brain’s structure enables us to sense, remember, think, and feel. © 2016 American Association for the Advancement of Science

Keyword: Brain imaging
Link ID: 22041 - Posted: 03.29.2016

By BENEDICT CAREY BEDFORD, Mass. — In a small room banked by refrigerators of preserved brains, a pathologist held a specimen up to the light in frank admiration. Then it was time to cut — once in half and then a thick slice from the back, the tissue dense and gray-pink, teeming with folds and swirls. It was the brain of a professional running back. “There,” said Dr. Ann McKee, the chief of neuropathology at the V.A. Boston Healthcare System and a professor of neurology and pathology at Boston University’s medical school, pointing to a key area that had an abnormal separation. “That’s one thing we look for right away.” Over the past several years, Dr. McKee’s lab, housed in a pair of two-story brick buildings in suburban Boston, has repeatedly made headlines by revealing that deceased athletes, including at least 90 former N.F.L. players, were found to have had a degenerative brain disease called chronic traumatic encephalopathy, or C.T.E., that is believed to cause debilitating memory and mood problems. This month, after years of denying or playing down a connection, a top N.F.L. official acknowledged at a hearing in Washington that playing football and having C.T.E. were “certainly” linked. His statement effectively ended a very public dispute over whether head blows sustained while playing football are associated with the disorder. But it will not resolve a quieter debate among scientists about how much risk each football player has of developing it, or answer questions about why some players seem far more vulnerable to it than others. Some researchers worry that the rising drumbeat of C.T.E. diagnoses is far outpacing scientific progress in pinpointing the symptoms, risks and prevalence of the disease. The American Academy of Clinical Neuropsychology, an organization of brain injury specialists, is preparing a public statement to point out that much of the science of C.T.E. is still unsettled and to contend that the evidence to date should not be interpreted to mean that parents must keep their children off sports teams, officials of the group say. © 2016 The New York Times Company

Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 22040 - Posted: 03.28.2016

By Esther Hsieh Spinal implants have suffered from problems similar to those of brain implants—they tend to abrade tissue, causing inflammation and ultimately rejection by the body. Now an interdisciplinary research collaboration based in Switzerland has made a stretchable implant that appears to solve this problem. Like Lieber's new brain implant, it matches the physical qualities of the tissue where it is embedded. The “e-dura” implant is made from a silicone rubber that has the same elasticity as dura mater, the protective membrane that surrounds the spinal cord and brain, explains Stéphanie Lacour, a professor at the school of engineering at the Swiss Federal Institute of Technology in Lausanne. This feature allows the implant to mimic the movement of the surrounding tissues. Embedded in the e-dura are electrodes for stimulation and microchannels for drug delivery. Ultrathin gold wires are made with microscopic cracks that allow them to stretch, and the electrodes are coated with a special platinum-silicone mixture that is also stretchable. In an experiment that lasted two months, the scientists found that healthy rats with an e-dura spinal implant could walk across a ladder as well as a control group with no implant. Yet rats with a traditional plastic implant (which is flexible but not stretchable) started stumbling and missing rungs a few weeks after surgery. The researchers removed the implants and found that rats with a traditional implant had flattened, damaged spinal cords—but the e-dura implants had left spinal cords intact. Cellular testing also showed a strong immune response to the traditional implant, which was minimal in rats with the e-dura implant. © 2016 Scientific American

Keyword: Regeneration; Movement Disorders
Link ID: 22039 - Posted: 03.28.2016

By DAVID FRANK and JAMES GORMAN Social life is good for you, even when your friends have lice — if you’re a Japanese macaque. Whether the same is true for humans hasn’t been tested directly, at least not the way researchers in Japan conducted their experiments with networks of female macaques. Julie Duboscq, a researcher at Kyoto University’s Primate Research Institute in Japan, tracked louse infestation and grooming interactions in about 20 adult female macaques. As she, Andrew J.J. MacIntosh and their colleagues noted in describing their research in Scientific Reports, grooming is known to reduce lice, but such close physical contact can also make it easy for lice to pass from one animal to another. Dr. Duboscq is interested in the costs and benefits of social behavior. For animals that live in social groups, as macaques and people do, the benefits of social life are many, from defense against predators (for wild monkeys, and no doubt for humans at some point in their history) to emotional health and well-being (for humans, and probably monkeys, too). But there are negatives associated with sociality, like the transmission of parasites and diseases. “We don’t fully understand the costs and benefits,” Dr. Duboscq said. In this study, she and her colleagues estimated the degree of louse infestation from the number of nits picked during grooming: the more nits groomers removed, the more egg-laying adult lice a monkey was presumed to carry. © 2016 The New York Times Company

Keyword: Stress
Link ID: 22038 - Posted: 03.28.2016

New York's Tribeca Film Festival will not show Vaxxed, a controversial film about the MMR vaccine, its founder Robert De Niro says. As recently as Friday, Mr De Niro stood by his decision to include the film by anti-vaccination activist Andrew Wakefield in next month's festival. The link the film makes between the measles, mumps and rubella vaccine and autism has been widely discredited. "We have concerns with certain things in this film," said Mr De Niro. Mr De Niro, who has a child with autism, said he had hoped the film would provide the opportunity for discussion of the issue. But after reviewing the film with festival organisers and scientists, he said: "We do not believe it contributes to or furthers the discussion I had hoped for." Vaxxed was directed and co-written by Mr Wakefield, who described it as a "whistle-blower documentary". In a statement issued following the Tribeca Film Festival's decision, he and the film's producer Del Bigtree said that "we have just witnessed yet another example of the power of corporate interests censoring free speech, art and truth". The British doctor was the lead author of a controversial study published in 1998, which argued there might be a link between MMR and autism and bowel disease. Mr Wakefield suggested that parents should opt for single jabs against mumps, measles and rubella instead of the three-in-one vaccine. His comments and the subsequent media furore led to a sharp drop in the number of children being vaccinated against these diseases. But the study, first published in The Lancet, was later retracted by the medical journal. Mr Wakefield's research methods were subsequently investigated by the General Medical Council and he was struck off the medical register.

Keyword: Autism
Link ID: 22037 - Posted: 03.28.2016