Most Recent Links
After a string of scandals involving accusations of misconduct and retracted papers, social psychology is engaged in intense self-examination—and the process is turning out to be painful. This week, a global network of nearly 100 researchers unveiled the results of an effort to replicate 27 well-known studies in the field. In more than half of the cases, the result was a partial or complete failure. As the replicators see it, the failed do-overs are a healthy corrective. “Replication helps us make sure what we think is true really is true,” says Brent Donnellan, a psychologist at Michigan State University in East Lansing who has undertaken three recent replications of studies from other groups—all of which came out negative. “We are moving forward as a science,” he says. But rather than a renaissance, some researchers on the receiving end of this organized replication effort see an inquisition. “I feel like a criminal suspect who has no right to a defense and there is no way to win,” says psychologist Simone Schnall of the University of Cambridge in the United Kingdom, who studies embodied cognition, the idea that the mind is unconsciously shaped by bodily movement and the surrounding environment. Schnall’s 2008 study finding that hand-washing reduced the severity of moral judgment was one of those Donnellan could not replicate. About half of the replications are the work of Many Labs, a network of about 50 psychologists around the world. The results of their first 13 replications, released online in November, were greeted with a collective sigh of relief: Only two failed. Meanwhile, Many Labs participant Brian Nosek, a psychologist at the University of Virginia in Charlottesville, put out a call for proposals for more replication studies. After 40 rolled in, he and Daniël Lakens, a psychologist at Eindhoven University of Technology in the Netherlands, chose another 14 to repeat. © 2014 American Association for the Advancement of Science.
Keyword: Attention; Emotions
Link ID: 19636 - Posted: 05.20.2014
By Isaac Bédard Very few animals have revealed an ability to consciously think about the future—behaviors such as storing food for the winter are often viewed as a function of instinct. Now a team of anthropologists at the University of Zurich has evidence that wild orangutans have the capacity to perceive the future, prepare for it and communicate those future plans to other orangutans. The researchers observed 15 dominant male orangutans in Sumatra for several years. These males roam through immense swaths of dense jungle, emitting loud yells every couple of hours so that the females they mate with and protect can locate and follow them. The shouts also warn away any lesser males that might be in the vicinity. These vocalizations had been observed by primatologists before, but the new data reveal that the apes' last daily call, an especially long howl, is aimed in the direction they will travel in the morning—and the other apes take note. The females stop moving when they hear this special 80-second call, bed down for the night, and in the morning begin traveling in the direction indicated the evening before. The scientists believe that the dominant apes are planning their route in advance and communicating it to other orangutans in the area. They acknowledge, however, that the dominant males might not intend their long calls to have such an effect on their followers. Karin Isler, a Zurich anthropologist who co-authored the study in PLOS ONE last fall, explains, “We don't know whether the apes are conscious. This planning does not have to be conscious. But it is also more and more difficult to argue that they [do not have] some sort of mind of their own.” © 2014 Scientific American
Keyword: Evolution; Attention
Link ID: 19635 - Posted: 05.20.2014
By BENEDICT CAREY SAN DIEGO – The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world’s foremost card virtuoso against its premier numbers wizard. If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that’s what they got. The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class “memory athletes” from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too. One of the tournament’s sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive. Yet as the two finalists, both Germans, prepared to face off — Simon Reinhard, 35, a lawyer who holds the world record in card memorization (a deck in 21.19 seconds), and Johannes Mallow, 32, a teacher with the record for memorizing digits (501 in five minutes) — the Washington group had one preliminary finding that wasn’t obvious. “We found that one of the biggest differences between memory athletes and the rest of us,” said Henry L. Roediger III, the psychologist who led the research team, “is in a cognitive ability that’s not a direct measure of memory at all but of attention.” People have been performing feats of memory for ages, scrolling out pi to hundreds of digits, or phenomenally long verses, or word pairs. Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line. The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently “Moonwalking With Einstein,” by Joshua Foer. © 2014 The New York Times Company
Keyword: Learning & Memory; Attention
Link ID: 19634 - Posted: 05.20.2014
By David Grimm, A shaggy brown terrier approaches a large chocolate Labrador in a city park. When the terrier gets close, he adopts a yogalike pose, crouching on his forepaws and hiking his butt into the air. The Lab gives an excited bark, and soon the two dogs are somersaulting and tugging on each other’s ears. Then the terrier takes off and the Lab gives chase, his tail wagging wildly. When the two meet once more, the whole thing begins again. Watch a couple of dogs play, and you’ll probably see seemingly random gestures, lots of frenetic activity and a whole lot of energy being expended. But decades of research suggest that beneath this apparently frivolous fun lies a hidden language of honesty and deceit, empathy and perhaps even a humanlike morality. Take those two dogs. That yogalike pose is known as a “play bow,” and in the language of play it’s one of the most commonly used words. It’s an instigation and a clarification, a warning and an apology. Dogs often adopt this stance as an invitation to play right before they lunge at another dog; they also bow before they nip (“I’m going to bite you, but I’m just fooling around”) or after some particularly aggressive roughhousing (“Sorry I knocked you over; I didn’t mean it.”). All of this suggests that dogs have a kind of moral code — one long hidden to humans until a cognitive ethologist named Marc Bekoff began to crack it. A wiry 68-year-old with reddish-gray hair tied back in a long ponytail, Bekoff is a professor emeritus at the University of Colorado at Boulder, where he taught for 32 years. He began studying animal behavior in the early 1970s, spending four years videotaping groups of dogs, wolves and coyotes in large enclosures and slowly playing back the tapes, jotting down every nip, yip and lick. “Twenty minutes of film could take a week to analyze,” he says. © 1996-2014 The Washington Post
Keyword: Development of the Brain; Evolution
Link ID: 19633 - Posted: 05.20.2014
By ABIGAIL ZUGER, M.D. Sweet revenge comes in many delectable forms, among them the receipt of accolades for work long scorned. And then to get to tell the whole story at length and without a single interruption — small wonder that the Nobel laureate Dr. Stanley B. Prusiner, a renowned neurologist at the University of California, San Francisco, writes with a cheerful bounce. Once disparaged, his scientific work is now hailed as visionary, and his memoir takes the reader on a leisurely and immensely readable victory lap from then to now. In the process, two stories unfold. The first is the progress of Dr. Prusiner’s thinking on the transmissible proteins he named prions (PREE-ons) in 1982, starting with his first experiments on an obscure disease of sheep and ending with the most recent work linking prions to an array of human neurological catastrophes, including Alzheimer’s disease. The science is convoluted, like the proteins, and for the uninitiated the best way to achieve a rudimentary grasp of the subject is to hear it the way Dr. Prusiner tells it, from the very beginning. But a parallel narrative turns out to be equally fascinating: perhaps not since James D. Watson’s 1968 memoir “The Double Helix“ has the down and dirty business of world-class science been given such an airing. Dr. Watson raised eyebrows with his gossipy account of the serious task of unraveling the genetic code — and he was working in genteel postwar Britain at the time, with experimental science still at least in theory a gentleman’s game. That illusion is long gone: The stakes are considerably higher now, the competition fierce, the pace frantic, and Dr. Prusiner, 71, revisits quite a few of the battles that punctuated his long research career. He was an underachiever in high school and then an achiever in college and medical school, captivated by the laboratory early on. He finished his medical training on the neurology wards in San Francisco, where he met the patient who would set the course of his career: a slim, tanned 60-year-old woman from Marin County who was having trouble unzipping her golf bag. Months later she was dead of Creutzfeldt-Jakob disease, one of several related and invariably fatal neurological diseases (mad cow among them) that leave the brain of the affected human or animal riddled with holes, a useless sponge. © 2014 The New York Times Company
Keyword: Prions
Link ID: 19632 - Posted: 05.20.2014
Sara Reardon The researchers' technique shows neurons throughout the body twinkling with activity. Researchers have for the first time imaged all of the neurons firing in a living organism, the nematode worm Caenorhabditis elegans. The achievement, reported today in Nature Methods, shows how signals travel through the body in real time. Scientists mapped the connections among all 302 of the nematode's neurons in 1986 — a first that has not been repeated with any other organism. But this wiring diagram, or 'connectome', does not allow researchers to determine the neuronal pathways that lead to a particular action. Nor does it allow researchers to predict what the nematode will do at any point in time, says neuroscientist Alipasha Vaziri of the University of Vienna. By providing a means of displaying signaling activity between neurons in three dimensions and in real time, the new technique should allow scientists to do both. Vaziri and his colleagues engineered C. elegans so that when a neuron fires and calcium ions pass through its cell membranes, the neuron lights up. To capture those signals, they imaged the whole worm using a technique called light-field deconvolution microscopy, which combines images from a set of tiny lenses and analyses them using an algorithm to give a high-resolution three-dimensional image. The researchers took as many as 50 images per second of the entire worm, enabling them to watch the neurons firing in the brain, ventral cord, and tail. Next, the group applied the technique to the transparent larvae of the zebrafish (Danio rerio), imaging the entire brain as the fish responded to the odours of chemicals pumped into their water. They were able to capture the activity of about 5,000 neurons simultaneously (the zebrafish has about 100,000 total neurons). © 2014 Nature Publishing Group
Keyword: Brain imaging
Link ID: 19631 - Posted: 05.19.2014
By NATASHA SINGER Joseph J. Atick cased the floor of the Ronald Reagan Building and International Trade Center in Washington as if he owned the place. In a way, he did. He was one of the organizers of the event, a conference and trade show for the biometrics security industry. Perhaps more to the point, a number of the wares on display, like an airport face-scanning checkpoint, could trace their lineage to his work. A physicist, Dr. Atick is one of the pioneer entrepreneurs of modern face recognition. Having helped advance the fundamental face-matching technology in the 1990s, he went into business and promoted the systems to government agencies looking to identify criminals or prevent identity fraud. “We saved lives,” he said during the conference in mid-March. “We have solved crimes.” Thanks in part to his boosterism, the global business of biometrics — using people’s unique physiological characteristics, like their fingerprint ridges and facial features, to learn or confirm their identity — is booming. It generated an estimated $7.2 billion in 2012, according to reports by Frost & Sullivan. Making his rounds at the trade show, Dr. Atick, a short, trim man with an indeterminate Mediterranean accent, warmly greeted industry representatives at their exhibition booths. Once he was safely out of earshot, however, he worried aloud about what he was seeing. What were those companies’ policies for retaining and reusing consumers’ facial data? Could they identify individuals without their explicit consent? Were they running face-matching queries for government agencies on the side? Now an industry consultant, Dr. Atick finds himself in a delicate position. While promoting and profiting from an industry that he helped foster, he also feels compelled to caution against its unfettered proliferation. He isn’t so much concerned about government agencies that use face recognition openly for specific purposes — for example, the many state motor vehicle departments that scan drivers’ faces as a way to prevent license duplications and fraud. Rather, what troubles him is the potential exploitation of face recognition to identify ordinary and unwitting citizens as they go about their lives in public. Online, we are all tracked. But to Dr. Atick, the street remains a haven, and he frets that he may have abetted a technology that could upend the social order. © 2014 The New York Times Company
Keyword: Robotics
Link ID: 19630 - Posted: 05.19.2014
By Beth Skwarecki The protein family notorious for causing neurodegenerative diseases such as Parkinson's—not to mention mad cow—appears to play an important role in healthy cells. “Do you think God created prions just to kill?” muses Eric R. Kandel of Columbia University. “These things have evolved initially to have a physiological function.” Kandel's work on memory helped to reveal that animals make and use prions in their nervous systems as part of an essential function: stabilizing the synapses involved with forming long-term memories. These natural prions are not infectious, but on a molecular level they chain up exactly the same way as their disease-causing brethren. (Some researchers call them “prionlike” to avoid confusion.) Now neuroscientist Kausik Si of the Stowers Institute for Medical Research in Kansas City, Mo., one of Kandel's former students, has shown that the prion's action is tightly controlled by the cell and can be turned on when a new long-term memory needs to be formed. Once the prion's chain reaction gets started, it is self-perpetuating, and thus the synapse—where neurons connect—can be maintained after the initial trigger is gone, perhaps for a lifetime. But that still does not explain how the first prion is triggered or why it happens at only certain of the synapses, which play a crucial role in forming memories. Si's work, published February 11 in PLOS Biology, traces the biochemistry of this protein-preservation process in fruit flies, showing how the cell turns on the machinery responsible for the persistence of memory—and how the memory can be stabilized at just the right time and in the right place. © 2014 Scientific American
Keyword: Prions; Learning & Memory
Link ID: 19629 - Posted: 05.19.2014
By GINIA BELLAFANTE The opening shots of “The Normal Heart,” HBO’s adaptation of Larry Kramer’s 1985 play about the early days of the AIDS crisis in New York, reveal a crew of sinewy and amorous young men disembarking from a ferry on Fire Island on a beautiful July day in 1981. The tableau is meant to suggest the final hour of unburdened desire, the moment before so much youth and beauty would be sacrificed to the cruelest attacks of physiology and cultural indifference. On the way home from the weekend, Ned Weeks, Mr. Kramer’s proxy, a man distanced from the surrounding hedonism, is shown reading a piece in The New York Times headlined, “Rare Cancer Seen in 41 Homosexuals.” The film (which will make its debut on May 25) arrives at a transformative time in the history of AIDS prevention. On May 14, federal health officials, in a move that would have been unimaginable 30 years ago, recommended the use of a prophylactic drug regimen to prevent infection with H.I.V. The drug currently used is known as Truvada, and two years ago, David Duran, a writer and gay-rights campaigner, coined the term “Truvada whore,” controversially, as a judgment against gay men who were abandoning safer sex in favor of taking the antiretroviral. Though he has since characterized this view as “prudish,” there are doctors in the city who continue to harangue patients for what the longtime AIDS activist Peter Staley calls “any break with the condom code.” And yet whatever ideological divisions existed in the period Mr. Kramer’s narrative recalls and whatever have emerged since, the fight against AIDS has been one of the most successful and focused public health movements. In another distinguishing moment, the city health department announced this year that for the first time AIDS had fallen out of the 10 leading causes of death in New York. Replacing it was Alzheimer’s, whose damage is sure to multiply as the number of older New Yorkers increases — by 2030 there will be close to 500,000 more people over age 60 than there were at the beginning of the century. According to a study from Rush University Medical Center in March, the number of deaths attributable to the disease had been vastly undercalculated. The research showed that Alzheimer’s was the underlying cause in 500,000 deaths in the United States in 2010, a figure close to six times the estimate from the Centers for Disease Control. This means that in a single year, Alzheimer’s claimed nearly as many lives as AIDS — responsible for 636,000 deaths in this country — had taken in more than three decades. © 2014 The New York Times Company
Keyword: Alzheimers
Link ID: 19628 - Posted: 05.19.2014
By ALAN SCHWARZ ATLANTA — More than 10,000 American toddlers 2 or 3 years old are being medicated for attention deficit hyperactivity disorder outside established pediatric guidelines, according to data presented on Friday by an official at the Centers for Disease Control and Prevention. The report, which found that toddlers covered by Medicaid are particularly prone to be put on medication such as Ritalin and Adderall, is among the first efforts to gauge the diagnosis of A.D.H.D. in children below age 4. Doctors at the Georgia Mental Health Forum at the Carter Center in Atlanta, where the data was presented, as well as several outside experts strongly criticized the use of medication in so many children that young. The American Academy of Pediatrics standard practice guidelines for A.D.H.D. do not even address the diagnosis in children 3 and younger — let alone the use of such stimulant medications, because their safety and effectiveness have barely been explored in that age group. “It’s absolutely shocking, and it shouldn’t be happening,” said Anita Zervigon-Hakes, a children’s mental health consultant to the Carter Center. “People are just feeling around in the dark. We obviously don’t have our act together for little children.” Dr. Lawrence H. Diller, a behavioral pediatrician in Walnut Creek, Calif., said in a telephone interview: “People prescribing to 2-year-olds are just winging it. It is outside the standard of care, and they should be subject to malpractice if something goes wrong with a kid.” Friday’s report was the latest to raise concerns about A.D.H.D. diagnoses and medications for American children beyond what many experts consider medically justified. Last year, a nationwide C.D.C. survey found that 11 percent of children ages 4 to 17 have received a diagnosis of the disorder, and that about one in five boys will get one during childhood. A vast majority are put on medications such as methylphenidate (commonly known as Ritalin) or amphetamines like Adderall, which often calm a child’s hyperactivity and impulsivity but also carry risks for growth suppression, insomnia and hallucinations. Only Adderall is approved by the Food and Drug Administration for children below age 6. However, because off-label use of methylphenidate in preschool children had produced some encouraging results, the most recent American Academy of Pediatrics guidelines authorized it in 4- and 5-year-olds — but only after formal training for parents and teachers to improve the child’s environment was unsuccessful. © 2014 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 19627 - Posted: 05.17.2014
Eleven years on, I still remember the evening I decided to kill my baby daughter. It's not something you're supposed to feel as a new parent with a warm, tiny bundle in your arms. But this is how postnatal depression can twist your logic. At the time it made perfect sense. Catherine was screaming, in pain. She had colic, there was nothing I could do about it. If an animal were in this much pain you'd put it out of its misery, so why not a human? Postnatal depression can have this kind of effect even on the most reasonable woman, yet you won't find much about it in baby books. We're expected to love our kids the moment they pop out, even while the memory of the labour pains is still raw. I knew a baby would be hard work, of course, but I expected motherhood to be fulfilling. As it happened I had a wonderful pregnancy, followed by a quick and easy birth. But the problems started soon after. Catherine wouldn’t feed, her blood sugar levels tumbled and I ended up bottle-feeding her, in tears, in a hospital room filled with posters promoting the breast. I was a Bad Mother within 48 hours. Things were no better after the first month. This was meant to be a joyous time, but all I seemed to feel was rage and resentment. In pregnancy all the attention had been on me, and suddenly I was a sideshow to this wailing thing in a crib. I was tired, tetchy and resentful. My daughter had rapidly become a ball and chain. My freedom was over. I kept hoping this was just the “baby blues” and that it would soon pass, but things only got worse. When colic set in, for around five hours each evening Catherine would scream, her face a mix of red and purple rage. No amount of pacing, tummy-rubbing or soothing words could stop this tiny demanding creature. So one night, alone with her in her room, I decided it would be best to put her out of her misery. © 2014 Guardian News and Media Limited
Keyword: Depression; Hormones & Behavior
Link ID: 19626 - Posted: 05.17.2014
By Sam Kean It is possible to take the idea of left/right differences within the brain too far: it’s not like one side of the brain talks or emotes or recognizes faces all by itself while the other one just sits there twiddling its neurons. But the left and right hemispheres of the human brain do show striking differences in some areas, especially with regard to language, the trait that best defines us as human beings. Scientists suspect that left-right specialization first evolved many millions of years ago, since many other animals show subtle hemispheric differences: they prefer to use one claw or paw to eat, for instance, or they strike at prey more often in one direction than another. Before this time, the left brain and right brain probably monitored sensory data and recorded details about the world to an equal degree. But there’s no good reason for both hemispheres to do the same basic job, not if the corpus callosum—a huge bundle of fibers that connects the left and right brain—can transmit data between them. So the brain eliminated the redundancy, and the left brain took on new tasks. This process accelerated in human beings, and we humans show far greater left/right differences than any other animal. In the course of its evolution the left brain also took on the crucial role of master interpreter. Neuroscientists have long debated whether certain people have two independent minds running in parallel inside their skulls. That sounds spooky, but some evidence suggests yes. For example, there are split-brain patients, who had their corpus callosums surgically severed to help control epilepsy and whose left and right brain cannot communicate as a result. Split-brain patients have little trouble drawing two different geometric figures at the same time, one with each hand. Normal people bomb this test. (Try it, and you’ll see how mind-bendingly hard it is.) Some neuroscientists scoff at these anecdotes, saying the claims for two separate minds are exaggerated. But one thing is certain: two minds or no, split-brain people feel mentally unified; they never feel the two hemispheres fighting for control, or feel their consciousness flipping back and forth. That’s because one hemisphere, usually the left, takes charge. And many neuroscientists argue that the same thing happens in normal brains. One hemisphere probably always dominates the mind, a role that neuroscientist Michael Gazzaniga called the interpreter. (Per George W. Bush, you could also call it “the decider.”) © 2014 Scientific American
Keyword: Laterality
Link ID: 19625 - Posted: 05.17.2014
Tastes are a privilege. The oral sensations not only satisfy foodies but also, on a primal level, protect animals from toxic substances. Yet cetaceans—whales and dolphins—may lack this crucial ability, according to a new study. Mutations in a cetacean ancestor obliterated their basic machinery for four of the five primary tastes, making them the first group of mammals to have lost the majority of this sensory system. The five primary tastes are sweet, bitter, umami (savory), sour, and salty. These flavors are recognized by taste receptors—proteins that coat neurons embedded in the tongue. For the most part, taste receptor genes are present across all vertebrates. Except, it seems, in cetaceans. Researchers uncovered a massive loss of taste receptors in these animals by screening the genomes of 15 species. The investigation spanned the two major lineages of cetaceans: Krill-loving baleen whales—such as bowheads and minkes—were surveyed along with those with teeth, like bottlenose dolphins and sperm whales. The taste genes weren’t gone per se, but were irreparably damaged by mutations, the team reports online this month in Genome Biology and Evolution. Genes encode proteins, which in turn execute certain functions in cells. Certain errors in the code can derail protein production—at which point the gene becomes a “pseudogene,” or a lingering shell of a trait forgotten. Identical pseudogene corpses were discovered across the different cetacean species for sweet, bitter, umami, and sour taste receptors. Salty tastes were the only exception. © 2014 American Association for the Advancement of Science.
Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 19624 - Posted: 05.17.2014
Katia Moskvitch The hundreds of suckers on an octopus’s eight arms latch reflexively onto almost anything they come into contact with — but never grasp the animal itself, even though an octopus does not always know what its arms are doing. Today, researchers reveal that the animal’s skin produces a chemical that stops the octopus’s suckers from grabbing hold of its own body parts, and getting tangled up. “Octopus arms have a built-in mechanism that prevents the suckers from grabbing octopus skin,” says neuroscientist Guy Levy at the Hebrew University of Jerusalem, the lead author of the work, which appears today in Current Biology. It is the first demonstration of a chemical self-recognition mechanism in motor control, and could help scientists to build better bio-inspired soft robots. To find out just how an octopus avoids latching onto itself, Levy and his colleagues cut off an octopus’s arm and subjected it to a series of tests. (The procedure is not considered traumatic, says Levy, because octopuses occasionally lose an arm in nature and behave normally while the limb regenerates.) The severed arms remained active for more than an hour after amputation, firmly grabbing almost any object, with three exceptions: the former host; any other live octopus; and other amputated arms. “But when we peeled the skin off an amputated arm and submitted it to another amputated arm, we were surprised to see that it grabbed the skinned arm as any other item,” says co-author Nir Nesher, also a neuroscientist at the Hebrew University. © 2014 Nature Publishing Group
Keyword: Miscellaneous
Link ID: 19623 - Posted: 05.16.2014
By Brady Dennis The Food and Drug Administration is worried that a sleeping pill you take tonight could make for a riskier drive to work tomorrow. In its latest effort to make sure that the millions of Americans taking sleep medications don’t drowsily endanger themselves or others, the agency on Thursday said it will require the manufacturer of the popular drug Lunesta to lower the recommended starting dose, after data showed that people might not be alert enough to drive the morning after taking the drug, even if they feel totally awake. The current recommended starting dose of eszopiclone, the drug marketed as Lunesta, is 2 milligrams at bedtime for both men and women. The FDA said that initial dose should be cut in half to 1 milligram, though it could be increased if needed. People currently taking 2 and 3 milligram doses should ask a doctor about how to safely continue taking the medication, as higher doses are more likely to impair driving and other activities that require alertness the following morning, the agency said. “To help ensure patient safety, health care professionals should prescribe, and patients should take, the lowest dose of a sleep medicine that effectively treats their insomnia,” Ellis Unger, of FDA’s Center for Drug Evaluation and Research, said in a statement. In 2013, the FDA said, approximately 3 million prescriptions of Lunesta were dispensed to nearly a million patients in the United States. Lunesta, made by Sunovion Pharmaceuticals, also recently became available in generic form. The new rules, including changes to existing labels, will apply both to the brand-name and generic forms of the drug. FDA officials said the decision came, in part, after seeing findings from a study of 91 healthy adults between the ages of 25 and 40. Compared to patients on a placebo, those taking a 3 milligram dose of Lunesta were associated with “severe next-morning psychomotor and memory impairment in both men and women,” the agency said. The study found that even people taking the recommended dose could suffer from impaired driving skills, memory and coordination as long as 11 hours after taking the drug. Even scarier: The patients often claimed that they felt completely alert, with no hint of drowsiness. © 1996-2014 The Washington Post
Keyword: Sleep
Link ID: 19622 - Posted: 05.16.2014
By Venkat Srinivasan In 1995, Ivan Goldberg, a New York psychiatrist, published one of the first diagnostic tests for Internet Addiction Disorder. The criteria appeared on psycom.net, a psychiatry bulletin board, and began with an air of earnest authenticity: “A maladaptive pattern of Internet use, leading to clinically significant impairment or distress as manifested by three (or more) of the following.” The test listed seven symptoms. You might have a problem if you were online “for longer periods of time than was intended,” or if you made “unsuccessful efforts to cut down or control Internet use.” Hundreds of people heard of the diagnostic test, logged on, clicked through and diagnosed themselves as being Internet addicts. Goldberg’s test, however, was a parody of the rigid language in the Diagnostic and Statistical Manual of Mental Disorders (DSM), the American Psychiatric Association (APA)’s psychiatric research manual. In a New Yorker story in January 1997, Goldberg said having an Internet addiction support group made “about as much sense as having a support group for coughers.” I’ve been researching the science and controversy over the last five years and wrote a long story about it last year for The Caravan. Since Goldberg’s prank, about one hundred scientific journals in psychology, sociology, neuroscience, anthropology, health policy and computer science have taken up the addiction question in some form. And after two decades of ridicule, research, advocacy and pushbacks, the debate is still about four basic questions. What do you call it? Does the ‘it’ exist? How do we size up such an addiction? Does it matter? © 2014 Scientific American
Keyword: Drug Abuse
Link ID: 19621 - Posted: 05.16.2014
A single alcohol binge can cause bacteria to leak from the gut and increase levels of bacterial toxins in the blood, according to a study funded by the National Institutes of Health. Increased levels of these bacterial toxins, called endotoxins, were shown to affect the immune system, with the body producing more immune cells involved in fever, inflammation, and tissue destruction. Binge drinking is defined by NIAAA as a pattern of drinking alcohol that brings blood alcohol concentration (BAC) to 0.08 g/dL or above. For a typical adult, this pattern corresponds to consuming five or more drinks for men, or four or more drinks for women, in about two hours. Some individuals will reach a 0.08 g/dL BAC sooner depending on body weight. Binge drinking is known to pose health and safety risks, including car crashes and injuries. Over the long term, binge drinking can damage the liver and other organs. “While the negative health effects of chronic drinking are well-documented, this is a key study to show that a single alcohol binge can cause damaging effects such as bacterial leakage from the gut into the blood stream,” said Dr. George Koob, director of the National Institute on Alcohol Abuse and Alcoholism, part of NIH. The study was led by Gyongyi Szabo, M.D., Ph.D., Professor and Vice Chair of Medicine and Associate Dean for Clinical and Translational Sciences at the University of Massachusetts Medical School. The article appears online in PLOS ONE. In the study, 11 men and 14 women were given enough alcohol to raise their blood alcohol levels to at least 0.08 g/dL within an hour. Blood samples were taken every 30 minutes for four hours after the binge and again 24 hours later.
Keyword: Drug Abuse
Link ID: 19620 - Posted: 05.16.2014
By RONI CARYN RABIN For decades, scientists have embarked on the long journey toward a medical breakthrough by first experimenting on laboratory animals. Mice or rats, pigs or dogs, they were usually male: Researchers avoided using female animals for fear that their reproductive cycles and hormone fluctuations would confound the results of delicately calibrated experiments. That laboratory tradition has had enormous consequences for women. Name a new drug or treatment, and odds are researchers know far more about its effect on men than on women. From sleeping pills to statins, women have been blindsided by side effects and dosage miscalculations that were not discovered until after the product hit the market. Now the National Institutes of Health says that this routine gender bias in basic research must end. In a commentary published on Wednesday in the journal Nature, Dr. Francis Collins, director of the N.I.H., and Dr. Janine A. Clayton, director of the institutes’ Office of Research on Women’s Health, warned scientists that they must begin testing their theories in female lab animals and in female tissues and cells. The N.I.H. has already taken researchers to task for their failure to include adequate numbers of women in clinical trials. The new announcement is an acknowledgment that this gender disparity begins much earlier in the research process. “Most scientists want to do the most powerful experiment to get the most durable, powerful answers,” Dr. Collins said in an interview. “For most, this has not been on the radar screen as an important issue. What we’re trying to do here is raise consciousness.” Women now make up more than half the participants in clinical research funded by the institutes, but it has taken years to get to this point, and women still are often underrepresented in clinical trials carried out by drug companies and medical device manufacturers. © 2014 The New York Times Company
Keyword: Sexual Behavior; Depression
Link ID: 19619 - Posted: 05.15.2014
By Matty Litwack One year ago, I thought I was going to die. Specifically, I believed an amoeba was eating my brain. As I’ve done countless times before, I called my mother in a panic: “Mom, I think I’m dying.” As she has done countless times before, she laughed at me. She doesn’t really take me seriously anymore, because I’m a massive hypochondriac. If there exists a disease, I’ve probably convinced myself that I have it. Every time I have a cough, I assume it’s lung cancer. One time I thought I had herpes, but it was just a piece of candy stuck to my face. In the case of the brain amoeba, however, I had a legitimate reason to believe I was dying. Several days prior, I had visited a doctor to treat my nasal congestion. The doctor deemed my sickness not severe enough to warrant antibiotics and instead suggested I try a neti pot to clear up my congestion. A neti pot is a vessel shaped like a genie’s lamp that’s used to irrigate the sinuses with saline solution. My neti pot came with an instruction manual, which I immediately discarded. Why would I need instructions? Nasal irrigation seemed like a simple enough process: water goes up one nostril and flows down the other – that’s just gravity. I dumped a bottle of natural spring water into the neti pot, mixed in some salt, shoved it in my nostril and started pouring. If there was in fact a genie living in the neti pot, I imagine this was very unpleasant for him. The pressure in my sinuses was instantly reduced. It worked so well that over the next couple of days, I was raving about neti pots to anybody who would allow me to annoy them. It was honestly surprising how little people wanted to hear about nasal irrigation. Some nodded politely, others asked me to stop talking about it, but one friend had a uniquely interesting reaction: “Oh, you’re using a neti pot?” he asked. “Watch out for the brain-eating amoeba.” This was hands-down the strangest warning I had ever received. I assumed it was a joke, but I made a mental note to Google brain amoebas as soon as I was done proselytizing the masses on the merits of saltwater nose genies. © 2014 Scientific American
Keyword: Miscellaneous
Link ID: 19618 - Posted: 05.15.2014
by Nathan Collins There's a new twist in mental health. People with depression seem three times as likely as those without it to have two brain lobes curled around each other. The brains of people with depression can be physically different from other brains – they are often smaller, for example – but exactly why that is so remains unclear. In humans, some studies point to changes in the size of the hippocampi, structures near the back of the brain thought to support memory formation. "There are so many studies that show a smaller hippocampus in almost every psychiatric disorder," says Jerome Maller, a neuroscientist at the Monash Alfred Psychiatry Research Centre in Melbourne, Australia, who led the latest work looking at brain lobes. "But very few can actually show or hypothesize why that is." Maller thinks he has stumbled on an explanation. He had been using a brain stimulation technique known as transcranial magnetic stimulation as a therapy for antidepressant-resistant depression. This involved using fMRI scans to create detailed maps of the brain to determine which parts to stimulate. While poring over hundreds of those maps, Maller noticed that many of them showed signs of occipital bending. This is where the occipital lobes – which are important for vision and sit at the back of the brain's left and right hemispheres – twist around each other. So he and his colleagues scanned 51 people with and 48 without major depressive disorder. They found that about 35 per cent of those with depression and 12.5 per cent of the others showed signs of occipital bending. The difference was even greater in women: 46 per cent of women with depression had occipital bending compared with just 6 per cent of those without depression. © Copyright Reed Business Information Ltd.
Keyword: Depression; Laterality
Link ID: 19617 - Posted: 05.15.2014