Most Recent Links
By GINIA BELLAFANTE The opening shots of “The Normal Heart,” HBO’s adaptation of Larry Kramer’s 1985 play about the early days of the AIDS crisis in New York, reveal a crew of sinewy and amorous young men disembarking from a ferry on Fire Island on a beautiful July day in 1981. The tableau is meant to suggest the final hour of unburdened desire, the moment before so much youth and beauty would be sacrificed to the cruelest attacks of physiology and cultural indifference. On the way home from the weekend, Ned Weeks, Mr. Kramer’s proxy, a man distanced from the surrounding hedonism, is shown reading a piece in The New York Times headlined, “Rare Cancer Seen in 41 Homosexuals.” The film (which will make its debut on May 25) arrives at a transformative time in the history of AIDS prevention. On May 14, federal health officials, in a move that would have been unimaginable 30 years ago, recommended the use of a prophylactic drug regimen to prevent infection with H.I.V. The drug currently used is known as Truvada, and two years ago, David Duran, a writer and gay-rights campaigner, coined the term “Truvada whore,” controversially, as a judgment against gay men who were abandoning safer sex in favor of taking the antiretroviral. Though he has since characterized this view as “prudish,” there are doctors in the city who continue to harangue patients for what the longtime AIDS activist Peter Staley calls “any break with the condom code.” And yet whatever ideological divisions existed in the period Mr. Kramer’s narrative recalls and whatever have emerged since, the fight against AIDS has been one of the most successful and focused public health movements. In another distinguishing moment, the city health department announced this year that for the first time AIDS had fallen out of the 10 leading causes of death in New York. Replacing it was Alzheimer’s, whose damage is sure to multiply as the number of older New Yorkers increases — by 2030 there will be close to 500,000 more people over age 60 than there were at the beginning of the century. According to a study from Rush University Medical Center in March, the number of deaths attributable to the disease had been vastly undercalculated. The research showed that Alzheimer’s was the underlying cause in 500,000 deaths in the United States in 2010, a figure close to six times the estimate from the Centers for Disease Control. This means that in a single year, Alzheimer’s claimed nearly as many lives as AIDS — responsible for 636,000 deaths in this country — had taken in more than three decades. © 2014 The New York Times Company
Keyword: Alzheimers
Link ID: 19628 - Posted: 05.19.2014
By ALAN SCHWARZ ATLANTA — More than 10,000 American toddlers 2 or 3 years old are being medicated for attention deficit hyperactivity disorder outside established pediatric guidelines, according to data presented on Friday by an official at the Centers for Disease Control and Prevention. The report, which found that toddlers covered by Medicaid are particularly prone to be put on medications such as Ritalin and Adderall, is among the first efforts to gauge the diagnosis of A.D.H.D. in children below age 4. Doctors at the Georgia Mental Health Forum at the Carter Center in Atlanta, where the data was presented, as well as several outside experts, strongly criticized the use of medication in so many children that young. The American Academy of Pediatrics standard practice guidelines for A.D.H.D. do not even address the diagnosis in children 3 and younger — let alone the use of such stimulant medications, because their safety and effectiveness have barely been explored in that age group. “It’s absolutely shocking, and it shouldn’t be happening,” said Anita Zervigon-Hakes, a children’s mental health consultant to the Carter Center. “People are just feeling around in the dark. We obviously don’t have our act together for little children.” Dr. Lawrence H. Diller, a behavioral pediatrician in Walnut Creek, Calif., said in a telephone interview: “People prescribing to 2-year-olds are just winging it. It is outside the standard of care, and they should be subject to malpractice if something goes wrong with a kid.” Friday’s report was the latest to raise concerns about A.D.H.D. diagnoses and medications for American children beyond what many experts consider medically justified. Last year, a nationwide C.D.C. survey found that 11 percent of children ages 4 to 17 have received a diagnosis of the disorder, and that about one in five boys will get one during childhood. A vast majority are put on medications such as methylphenidate (commonly known as Ritalin) or amphetamines like Adderall, which often calm a child’s hyperactivity and impulsivity but also carry risks for growth suppression, insomnia and hallucinations. Only Adderall is approved by the Food and Drug Administration for children below age 6. However, because off-label use of methylphenidate in preschool children had produced some encouraging results, the most recent American Academy of Pediatrics guidelines authorized it in 4- and 5-year-olds — but only after formal training for parents and teachers to improve the child’s environment had proved unsuccessful. © 2014 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 19627 - Posted: 05.17.2014
Eleven years on, I still remember the evening I decided to kill my baby daughter. It's not something you're supposed to feel as a new parent with a warm, tiny bundle in your arms. But this is how postnatal depression can twist your logic. At the time it made perfect sense. Catherine was screaming, in pain. She had colic, there was nothing I could do about it. If an animal were in this much pain you'd put it out of its misery, so why not a human? Postnatal depression can have this kind of effect even on the most reasonable woman, yet you won't find much about it in baby books. We're expected to love our kids the moment they pop out, even while the memory of the labour pains is still raw. I knew a baby would be hard work, of course, but I expected motherhood to be fulfilling. As it happened I had a wonderful pregnancy, followed by a quick and easy birth. But the problems started soon after. Catherine wouldn’t feed, her blood sugar levels tumbled and I ended up bottle-feeding her, in tears, in a hospital room filled with posters promoting the breast. I was a Bad Mother within 48 hours. Things were no better after the first month. This was meant to be a joyous time, but all I seemed to feel was rage and resentment. In pregnancy all the attention had been on me, and suddenly I was a sideshow to this wailing thing in a crib. I was tired, tetchy and resentful. My daughter had rapidly become a ball and chain. My freedom was over. I kept hoping this was just the “baby blues” and that it would soon pass, but things only got worse. When colic set in, for around five hours each evening Catherine would scream, her face a mix of red and purple rage. No amount of pacing, tummy-rubbing or soothing words could stop this tiny demanding creature. So one night, alone with her in her room, I decided it would be best to put her out of her misery. © 2014 Guardian News and Media Limited
Keyword: Depression; Hormones & Behavior
Link ID: 19626 - Posted: 05.17.2014
By Sam Kean It is possible to take the idea of left/right differences within the brain too far: it’s not like one side of the brain talks or emotes or recognizes faces all by itself while the other one just sits there twiddling its neurons. But the left and right hemispheres of the human brain do show striking differences in some areas, especially with regard to language, the trait that best defines us as human beings. Scientists suspect that left-right specialization first evolved many millions of years ago, since many other animals show subtle hemispheric differences: they prefer to use one claw or paw to eat, for instance, or they strike at prey more often in one direction than another. Before this time, the left brain and right brain probably monitored sensory data and recorded details about the world to an equal degree. But there’s no good reason for both hemispheres to do the same basic job, not if the corpus callosum—a huge bundle of fibers that connects the left and right brain—can transmit data between them. So the brain eliminated the redundancy, and the left brain took on new tasks. This process accelerated in human beings, and we humans show far greater left/right differences than any other animal. In the course of its evolution the left brain also took on the crucial role of master interpreter. Neuroscientists have long debated whether certain people have two independent minds running in parallel inside their skulls. That sounds spooky, but some evidence suggests yes. For example, there are split-brain patients, who had their corpus callosums surgically severed to help control epilepsy and whose left and right brain cannot communicate as a result. Split-brain patients have little trouble drawing two different geometric figures at the same time, one with each hand. Normal people bomb this test. (Try it, and you’ll see how mind-bendingly hard it is.) Some neuroscientists scoff at these anecdotes, saying the claims for two separate minds are exaggerated. But one thing is certain: two minds or no, split-brain people feel mentally unified; they never feel the two hemispheres fighting for control, or feel their consciousness flipping back and forth. That’s because one hemisphere, usually the left, takes charge. And many neuroscientists argue that the same thing happens in normal brains. One hemisphere probably always dominates the mind, a role that neuroscientist Michael Gazzaniga called the interpreter. (Per George W. Bush, you could also call it “the decider.”) © 2014 Scientific American
Keyword: Laterality
Link ID: 19625 - Posted: 05.17.2014
Tastes are a privilege. The oral sensations not only satisfy foodies but also, on a primal level, protect animals from toxic substances. Yet cetaceans—whales and dolphins—may lack this crucial ability, according to a new study. Mutations in a cetacean ancestor obliterated their basic machinery for four of the five primary tastes, making them the first group of mammals to have lost the majority of this sensory system. The five primary tastes are sweet, bitter, umami (savory), sour, and salty. These flavors are recognized by taste receptors—proteins that coat neurons embedded in the tongue. For the most part, taste receptor genes are present across all vertebrates. Except, it seems, cetaceans. Researchers uncovered a massive loss of taste receptors in these animals by screening the genomes of 15 species. The investigation spanned the two major lineages of cetaceans: Krill-loving baleen whales—such as bowheads and minkes—were surveyed along with those with teeth, like bottlenose dolphins and sperm whales. The taste genes weren’t gone per se, but were irreparably damaged by mutations, the team reports online this month in Genome Biology and Evolution. Genes encode proteins, which in turn execute certain functions in cells. Certain errors in the code can derail protein production—at which point the gene becomes a “pseudogene” or a lingering shell of a trait forgotten. Identical pseudogene corpses were discovered across the different cetacean species for sweet, bitter, umami, and sour taste receptors. Salty tastes were the only exception. © 2014 American Association for the Advancement of Science.
Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 19624 - Posted: 05.17.2014
Katia Moskvitch The hundreds of suckers on an octopus’s eight arms latch reflexively onto almost anything they come into contact with — but never grasp the animal itself, even though an octopus does not always know what its arms are doing. Today, researchers reveal that the animal’s skin produces a chemical that stops the octopus’s suckers from grabbing hold of its own body parts and getting tangled up. “Octopus arms have a built-in mechanism that prevents the suckers from grabbing octopus skin,” says neuroscientist Guy Levy at the Hebrew University of Jerusalem, the lead author of the work, which appears today in Current Biology. It is the first demonstration of a chemical self-recognition mechanism in motor control, and could help scientists to build better bio-inspired soft robots. To find out just how an octopus avoids latching onto itself, Levy and his colleagues cut off an octopus’s arm and subjected it to a series of tests. (The procedure is not considered traumatic, says Levy, because octopuses occasionally lose an arm in nature and behave normally while the limb regenerates.) The severed arms remained active for more than an hour after amputation, firmly grabbing almost any object, with three exceptions: the former host; any other live octopus; and other amputated arms. “But when we peeled the skin off an amputated arm and submitted it to another amputated arm, we were surprised to see that it grabbed the skinned arm as any other item,” says co-author Nir Nesher, also a neuroscientist at the Hebrew University. © 2014 Nature Publishing Group
Keyword: Miscellaneous
Link ID: 19623 - Posted: 05.16.2014
By Brady Dennis The Food and Drug Administration is worried that a sleeping pill you take tonight could make for a riskier drive to work tomorrow. In its latest effort to make sure that the millions of Americans taking sleep medications don’t drowsily endanger themselves or others, the agency on Thursday said it will require the manufacturer of the popular drug Lunesta to lower the recommended starting dose, after data showed that people might not be alert enough to drive the morning after taking the drug, even if they feel totally awake. The current recommended starting dose of eszopiclone, the drug marketed as Lunesta, is 2 milligrams at bedtime for both men and women. The FDA said that initial dose should be cut in half to 1 milligram, though it could be increased if needed. People currently taking 2 and 3 milligram doses should ask a doctor about how to safely continue taking the medication, as higher doses are more likely to impair driving and other activities that require alertness the following morning, the agency said. “To help ensure patient safety, health care professionals should prescribe, and patients should take, the lowest dose of a sleep medicine that effectively treats their insomnia,” Ellis Unger, of FDA’s Center for Drug Evaluation and Research, said in a statement. In 2013, the FDA said, approximately 3 million prescriptions of Lunesta were dispensed to nearly a million patients in the United States. Lunesta, made by Sunovion Pharmaceuticals, also recently became available in generic form. The new rules, including changes to existing labels, will apply to both the brand-name and generic forms of the drug. FDA officials said the decision came, in part, after seeing findings from a study of 91 healthy adults between the ages of 25 and 40. Compared with patients on a placebo, a 3 milligram dose of Lunesta was associated with “severe next-morning psychomotor and memory impairment in both men and women,” the agency said. The study found that even people taking the recommended dose could suffer from impaired driving skills, memory and coordination as long as 11 hours after taking the drug. Even scarier: The patients often claimed that they felt completely alert, with no hint of drowsiness. © 1996-2014 The Washington Post
Keyword: Sleep
Link ID: 19622 - Posted: 05.16.2014
By Venkat Srinivasan In 1995, Ivan Goldberg, a New York psychiatrist, published one of the first diagnostic tests for Internet Addiction Disorder. The criteria appeared on psycom.net, a psychiatry bulletin board, and began with an air of earnest authenticity: “A maladaptive pattern of Internet use, leading to clinically significant impairment or distress as manifested by three (or more) of the following.” The test listed seven symptoms. You might have a problem if you were online “for longer periods of time than was intended,” or if you made “unsuccessful efforts to cut down or control Internet use.” Hundreds of people heard of the diagnostic test, logged on, clicked through and diagnosed themselves as being Internet addicts. Goldberg’s test, however, was a parody of the rigid language in the Diagnostic and Statistical Manual of Mental Disorders (DSM), the American Psychiatric Association (APA)’s psychiatric research manual. In a New Yorker story in January 1997, Goldberg said having an Internet addiction support group made “about as much sense as having a support group for coughers.” I’ve been researching the science and controversy over the last five years and wrote a long story about it last year for The Caravan. Since Goldberg’s prank, about one hundred scientific journals in psychology, sociology, neuroscience, anthropology, healthy policy and computer science have taken up the addiction question in some form. And after two decades of ridicule, research, advocacy and pushbacks, the debate is still about four basic questions. What do you call it? Does the ‘it’ exist? How do we size up such an addiction? Does it matter? © 2014 Scientific American
Keyword: Drug Abuse
Link ID: 19621 - Posted: 05.16.2014
A single alcohol binge can cause bacteria to leak from the gut and increase levels of bacterial toxins in the blood, according to a study funded by the National Institutes of Health. Increased levels of these bacterial toxins, called endotoxins, were shown to affect the immune system, with the body producing more immune cells involved in fever, inflammation, and tissue destruction. Binge drinking is defined by NIAAA as a pattern of drinking alcohol that brings blood alcohol concentration (BAC) to 0.08 g/dL or above. For a typical adult, this pattern corresponds to consuming five or more drinks for men, or four or more drinks for women, in about two hours. Some individuals will reach a 0.08 g/dL BAC sooner depending on body weight. Binge drinking is known to pose health and safety risks, including car crashes and injuries. Over the long term, binge drinking can damage the liver and other organs. “While the negative health effects of chronic drinking are well-documented, this is a key study to show that a single alcohol binge can cause damaging effects such as bacterial leakage from the gut into the blood stream,” said Dr. George Koob, director of the National Institute on Alcohol Abuse and Alcoholism, part of NIH. The study was led by Gyongyi Szabo, M.D., Ph.D., Professor and Vice Chair of Medicine and Associate Dean for Clinical and Translational Sciences at the University of Massachusetts Medical School. The article appears online in PLOS ONE. In the study, 11 men and 14 women were given enough alcohol to raise their blood alcohol levels to at least 0.08 g/dL within an hour. Blood samples were taken every 30 minutes for four hours after the binge and again 24 hours later.
Keyword: Drug Abuse
Link ID: 19620 - Posted: 05.16.2014
By RONI CARYN RABIN For decades, scientists have embarked on the long journey toward a medical breakthrough by first experimenting on laboratory animals. Mice or rats, pigs or dogs, they were usually male: Researchers avoided using female animals for fear that their reproductive cycles and hormone fluctuations would confound the results of delicately calibrated experiments. That laboratory tradition has had enormous consequences for women. Name a new drug or treatment, and odds are researchers know far more about its effect on men than on women. From sleeping pills to statins, women have been blindsided by side effects and dosage miscalculations that were not discovered until after the product hit the market. Now the National Institutes of Health says that this routine gender bias in basic research must end. In a commentary published on Wednesday in the journal Nature, Dr. Francis Collins, director of the N.I.H., and Dr. Janine A. Clayton, director of the institutes’ Office of Research on Women’s Health, warned scientists that they must begin testing their theories in female lab animals and in female tissues and cells. The N.I.H. has already taken researchers to task for their failure to include adequate numbers of women in clinical trials. The new announcement is an acknowledgment that this gender disparity begins much earlier in the research process. “Most scientists want to do the most powerful experiment to get the most durable, powerful answers,” Dr. Collins said in an interview. “For most, this has not been on the radar screen as an important issue. What we’re trying to do here is raise consciousness.” Women now make up more than half the participants in clinical research funded by the institutes, but it has taken years to get to this point, and women still are often underrepresented in clinical trials carried out by drug companies and medical device manufacturers. © 2014 The New York Times Company
Keyword: Sexual Behavior; Depression
Link ID: 19619 - Posted: 05.15.2014
By Matty Litwack One year ago, I thought I was going to die. Specifically, I believed an amoeba was eating my brain. As I’ve done countless times before, I called my mother in a panic: “Mom, I think I’m dying.” As she has done countless times before, she laughed at me. She doesn’t really take me seriously anymore, because I’m a massive hypochondriac. If there exists a disease, I’ve probably convinced myself that I have it. Every time I have a cough, I assume it’s lung cancer. One time I thought I had herpes, but it was just a piece of candy stuck to my face. In the case of the brain amoeba, however, I had a legitimate reason to believe I was dying. Several days prior, I had visited a doctor to treat my nasal congestion. The doctor deemed my sickness not severe enough to warrant antibiotics and instead suggested I try a neti pot to clear up my congestion. A neti pot is a vessel shaped like a genie’s lamp that’s used to irrigate the sinuses with saline solution. My neti pot came with an instruction manual, which I immediately discarded. Why would I need instructions? Nasal irrigation seemed like a simple enough process: water goes up one nostril and flows down the other – that’s just gravity. I dumped a bottle of natural spring water into the neti pot, mixed in some salt, shoved it in my nostril and started pouring. If there was in fact a genie living in the neti pot, I imagine this was very unpleasant for him. The pressure in my sinuses was instantly reduced. It worked so well that over the next couple of days, I was raving about neti pots to anybody who would allow me to annoy them. It was honestly surprising how little people wanted to hear about nasal irrigation. Some nodded politely, others asked me to stop talking about it, but one friend had a uniquely interesting reaction: “Oh, you’re using a neti pot?” he asked. “Watch out for the brain-eating amoeba.” This was hands-down the strangest warning I had ever received. I assumed it was a joke, but I made a mental note to Google brain amoebas as soon as I was done proselytizing the masses on the merits of saltwater nose genies. © 2014 Scientific American
Keyword: Miscellaneous
Link ID: 19618 - Posted: 05.15.2014
by Nathan Collins There's a new twist in mental health. People with depression seem three times as likely as those without it to have two brain lobes curled around each other. The brains of people with depression can be physically different from other brains – they are often smaller, for example – but exactly why that is so remains unclear. In humans, some studies point to changes in the size of the hippocampi, structures deep within the brain thought to support memory formation. "There are so many studies that show a smaller hippocampus in almost every psychiatric disorder," says Jerome Maller, a neuroscientist at the Monash Alfred Psychiatry Research Centre in Melbourne, Australia, who led the latest work looking at brain lobes. "But very few can actually show or hypothesize why that is." Maller thinks he has stumbled on an explanation. He had been using a brain stimulation technique known as transcranial magnetic stimulation as a therapy for antidepressant-resistant depression. This involved using fMRI scans to create detailed maps of the brain to determine which parts to stimulate. While poring over hundreds of those maps, Maller noticed that many of them showed signs of occipital bending. This is where occipital lobes – which are important for vision – at the back of the brain's left and right hemispheres twist around each other. So he and his colleagues scanned 51 people with and 48 without major depressive disorder. They found that about 35 per cent of those with depression and 12.5 per cent of the others showed signs of occipital bending. The difference was even greater in women: 46 per cent of women with depression had occipital bending compared with just 6 per cent of those without depression. © Copyright Reed Business Information Ltd.
Keyword: Depression; Laterality
Link ID: 19617 - Posted: 05.15.2014
By Pippa Stephens Health reporter, BBC News An anti-depressant drug could be used to slow the onset of Alzheimer's disease, say scientists in the US. Research into 23 people and transgenic mice found citalopram hampered a protein which helps to build destructive plaques in the brains of Alzheimer's patients. Scientists said they hoped the study could help prevent the disease. Experts said the study was "interesting" and that using an approved drug could be beneficial. Alzheimer's disease is the most common cause of dementia, affecting around 496,000 people in the UK. It affects the brain through protein plaques and tangles which lead to the death of brain cells, and a shortage of chemicals important for transmitting messages. Symptoms include loss of memory, mood changes, and problems with communication and reasoning. Researchers at the University of Pennsylvania and Washington University School of Medicine carried out the study between 2012 and 2014. They bred mice with Alzheimer's disease and looked at the levels of the peptide - or protein component - amyloid beta (AB) in the brain. AB clusters in plaques which, alongside the tau protein, are thought to trigger Alzheimer's. After the mice were given citalopram, the level of AB fell by 25% compared with a control group that received no anti-depressant. And after two months of anti-depressants, the growth of new plaques was reduced, and existing plaques did not grow any further, the study said. But it noted the drug could not cause existing plaques to shrink, or decrease in number. BBC © 2014
Keyword: Alzheimers; Depression
Link ID: 19616 - Posted: 05.15.2014
By ANAHAD O’CONNOR Two medications could help tens of thousands of alcoholics quit drinking, yet the drugs are rarely prescribed to patients, researchers reported on Tuesday. The medications, naltrexone and acamprosate, reduce cravings for alcohol by fine-tuning the brain’s chemical reward system. They have been approved for treating alcoholism for over a decade. But questions about their efficacy and a lack of awareness among doctors have resulted in the drugs’ being underused, the researchers said. Less than a third of all people with alcohol problems receive treatment of any kind, and less than 10 percent are prescribed medications. The Affordable Care Act requires that insurers provide coverage for substance abuse treatments and services, and addiction specialists expect to see increases this year in the number of people seeking help for alcoholism. George Koob, the director of the National Institute on Alcohol Abuse and Alcoholism, said the new study should reassure doctors that naltrexone and acamprosate, while not silver bullets, can help many patients. “This is an important paper,” said Dr. Koob, who was not involved in the study. “There are effective medications for the treatment of alcoholism, and it would be great if the world would use them.” In the new study, which was published online on Tuesday in JAMA, the journal of the American Medical Association, a team of researchers based mostly at the University of North Carolina at Chapel Hill compiled findings from the most rigorous trials of medications for alcoholism in the past few decades. Ultimately, they analyzed data on roughly 23,000 people from 122 randomized trials. The researchers focused on a measure known as the “number needed to treat,” an indicator of how many people need to take a pill for one person to be helped. The study found that to prevent one person from returning to drinking, the number needed to treat for acamprosate was 12; for naltrexone, the number was 20. By comparison, large studies of widely used drugs, like the cholesterol-lowering statins, have found that 25 to more than 100 people need treatment to prevent one cardiovascular event. © 2014 The New York Times Company
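[Editor's note] For readers unfamiliar with the measure, the arithmetic behind "number needed to treat" is simple: it is the reciprocal of the absolute risk reduction, the difference in event rates between the control and treatment groups. The short sketch below, in Python, illustrates the calculation. The relapse rates used are hypothetical values chosen only to land near the acamprosate figure quoted above; they are not data from the JAMA meta-analysis.

    # Illustrative only: the event rates below are hypothetical, not figures from the study.
    def number_needed_to_treat(control_event_rate: float, treatment_event_rate: float) -> float:
        """Return the number needed to treat: 1 / absolute risk reduction."""
        absolute_risk_reduction = control_event_rate - treatment_event_rate
        if absolute_risk_reduction <= 0:
            raise ValueError("Treatment shows no benefit over control.")
        return 1.0 / absolute_risk_reduction

    # Example: if 75% of patients on placebo return to drinking versus 67% on the drug,
    # the absolute risk reduction is 0.08, so about 12-13 people must be treated for one
    # person to benefit -- the same ballpark as the figure reported for acamprosate.
    print(number_needed_to_treat(0.75, 0.67))  # prints 12.5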
Keyword: Drug Abuse
Link ID: 19615 - Posted: 05.15.2014
By Andrea Anderson Our knack for language helps us structure our thinking. Yet the ability to wax poetic about trinkets, tools or traits may not be necessary to think about them abstractly, as was once suspected. A growing body of evidence suggests nonhuman animals can group living and inanimate things based on less than obvious shared traits, raising questions about how creatures accomplish this task. In a study published last fall in the journal PeerJ, for example, Oakland University psychology researcher Jennifer Vonk investigated how well four orangutans and a western lowland gorilla from the Toronto Zoo could pair photographs of animals from the same biological groups. Vonk presented the apes with a touch-screen computer and got them to tap an image of an animal—for instance, a snake—on the screen. Then she showed each ape two side-by-side animal pictures: one from the same category as the animal in the original image and one from another—for example, images of a different reptile and a bird. When they correctly matched animal pairs, they received a treat such as nuts or dried fruit. When they got it wrong, they saw a black screen before beginning the next trial. After hundreds of such trials, Vonk found that all five apes could categorize other animals better than expected by chance (although some individuals were better at it than others). The researchers were impressed that the apes could learn to classify animals of vastly different visual characteristics together—such as turtles and snakes—suggesting the apes had developed concepts for reptiles and other categories of animals based on something other than shared physical traits. Dogs, too, seem to have better than expected abstract-thinking abilities. They can reliably recognize pictures of other dogs, regardless of breed, as a study in the July 2013 Animal Cognition showed. The results surprised scientists not only because dog breeds vary so widely in appearance but also because it had been unclear whether dogs could routinely identify fellow canines without the advantage of smell and other senses. Other studies have found feats of categorization by chimpanzees, bears and pigeons, adding up to a spate of recent research that suggests the ability to sort things abstractly is far more widespread than previously thought. © 2014 Scientific American
Keyword: Intelligence; Evolution
Link ID: 19614 - Posted: 05.15.2014
The Presidential Commission for the Study of Bioethical Issues today released its first set of recommendations for integrating ethics into neuroscience research in the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. Last July, President Barack Obama charged the commission with identifying key ethical questions that may arise through the BRAIN Initiative and wider neuroscience research. The report is “a dream come true,” says Judy Illes, a neuroethicist at the University of British Columbia in Vancouver, Canada, who was a guest presenter to the commission. Brain research raises unique ethical issues because it “strikes at the very core of who we are,” said political scientist and philosopher Amy Gutmann of the University of Pennsylvania, who chairs the commission, in a call with reporters yesterday. Specific areas of concern identified in the report include questions of brain privacy raised by advances in neuroimaging research; whether research participants and patients with dementia can give informed consent to participate in experimental trials; and research into cognitive enhancement, which raises “issues of distributive justice and fairness,” Gutmann says. Parsing hope from hype is key to ethical neuroscience research and its application, Gutmann notes. Citing the troubled ethical history of psychosurgery in the United States, in which more than 40,000 people were lobotomized based on shaky evidence that the procedure could treat psychiatric illnesses such as schizophrenia and depression, Gutmann cautions that a similar ethical derailment is possible in contemporary neuroscience research. A misstep with invasive experimental treatments such as deep brain stimulation surgery would not only be tragic for patients, but have “devastating consequences” for scientific progress, she says. © 2014 American Association for the Advancement of Science
Keyword: Emotions; Brain imaging
Link ID: 19613 - Posted: 05.15.2014
By KATIE THOMAS Almost overnight, a powerful new painkiller has become a $100 million business and a hot Wall Street story. But nearly as quickly, questions are emerging about how the drug is being sold, and to whom. The drug, Subsys, is a form of fentanyl, a narcotic that is often used when painkillers like morphine fail to provide relief. The product was approved in 2012 for a relatively small number of people — cancer patients — but has since become an outsize moneymaker for the obscure company that makes it, Insys Therapeutics. In the last year, the company’s sales have soared and its share price has jumped nearly 270 percent. Behind that business success is an unusual marketing machine that may have pushed Subsys far beyond the use envisioned by the Food and Drug Administration. The F.D.A. approved Subsys only for cancer patients who are already using round-the-clock painkillers, and warned that it should be prescribed only by oncologists and pain specialists. But just 1 percent of prescriptions are written by oncologists, according to data provided by Symphony Health, which analyzes drug trends. About half of the prescriptions were written by pain specialists, and a wide range of doctors prescribed the rest, including general practice physicians, neurologists and even dentists and podiatrists. Interviews with several former Insys sales representatives suggest the company, based in Chandler, Ariz., has aggressively marketed the painkiller, including to physicians who did not treat many cancer patients and by paying its sales force higher commissions for selling higher doses of the drug. Under F.D.A. rules, manufacturers may market prescription drugs only for approved uses. But doctors may prescribe drugs as they see fit. Over the last decade, pharmaceutical companies have paid billions of dollars to settle claims that they encouraged doctors to use drugs for nonapproved treatments, or so-called off-label uses, to increase sales and profits. © 2014 The New York Times Company
Keyword: Drug Abuse; Pain & Touch
Link ID: 19612 - Posted: 05.15.2014
Ewen Callaway In the silk business, sex is money. Male silkworms weave cocoons with more silk of a higher quality than females do, and the multibillion dollar sericulture industry has long sought an easy way to breed only males. That might now be a realistic goal, as researchers have identified the process that determines sex in the silkworm Bombyx mori. The sex factor is found to be a small RNA molecule — the first time that anything other than a protein has been implicated in a sex-determination process. In silkworms, as in nearly all Lepidoptera — the order that includes moths and butterflies — sex is determined by a WZ chromosome system, in contrast to the XY system used in mammals. Female silkworms carry W and Z sex chromosomes, whereas males boast a pair of Z chromosomes. Last year, researchers showed how to genetically modify silkworms so that the females would express a deadly protein (see 'Genetic kill switch eradicates female silkworms for a better crop'). But efforts to identify the genes on the W chromosome that make silkworms female have come up short: the W does not seem to have any protein-making genes, and is instead almost completely filled with parasitic, mobile genetic elements called transposons. In 2011, a team led by entomologist Susumu Katsuma at the University of Tokyo reported that the W chromosome produces short RNA molecules that keep transposons at bay in newly formed egg cells. Katsuma and his team report in Nature today that one such molecule, which the authors called Fem, is specific to female silkworms, suggesting that it has a role in sex determination. The Fem RNA breaks down a corresponding molecule made by a gene known as Masculinizer, which is found on the Z chromosome. When the researchers silenced Masculinizer, embryos executed a genetic programme that makes female tissue. © 2014 Nature Publishing Group
Keyword: Sexual Behavior
Link ID: 19611 - Posted: 05.15.2014
Bullying casts a long shadow. Children who are bullied are more prone to depression and suicidal tendencies even when they grow up; they're also more likely to get sick and have headaches and stomach troubles, researchers have discovered. A new study may have found the underlying cause: A specific indicator of illness, called C-reactive protein (CRP), is higher than normal in bullying victims, even when they get older. In contrast, the bullies, by the same gauge, seem to be healthier. The researchers focused on CRP because it's a common, easily tested marker of inflammation, the runaway immune system activity that's a feature of many chronic illnesses including cardiovascular disease, diabetes, chronic pain, and depression, explains lead author William Copeland, a psychologist and epidemiologist at Duke University Medical Center in Durham, North Carolina. To link inflammation to bullying, the researchers asked 1420 youngsters between the ages of 9 and 16 whether, and how often, they had been bullied or had bullied others. Interviewers asked participants whether they felt more teased, bullied, or treated meanly by siblings, friends, and peers than other children—and whether they had upset or hurt other people on purpose, tried to get others in trouble, or forced people to do something by threatening or hurting them. The researchers took finger stick blood tests at each assessment. Interviews took place once a year until the participants turned 16, and again when they were 19 and 21. The children interviewed were participants in the larger Great Smoky Mountains Study, in which some 12,000 children in North Carolina were assessed to track the development of psychiatric conditions. In the short term, the effect of bullying on the victims was immediate. CRP levels increased along with the number of reported bullying instances, and more than doubled in those who said they'd been bullied three times or more in the previous year, compared with kids who had never been bullied. No change was seen in bullies, or in kids who hadn't been involved with bullying one way or the other, the researchers report online today in the Proceedings of the National Academy of Sciences. © 2014 American Association for the Advancement of Science.
Keyword: Aggression; Genes & Behavior
Link ID: 19610 - Posted: 05.13.2014
by Anil Ananthaswamy Children born with split brains – whereby the two hemispheres of their brains are not connected – can develop new brain wiring that helps to connect the two halves, according to brain scans of people with the condition. Such circuitry is not present in normal brains, and explains how some people with split brains can still maintain normal function. It also suggests that the developing brain is even more adaptable than previously thought. Research into people with split brains goes back to the 1960s, when neuroscientists studied people who had undergone brain surgery to treat particularly severe epilepsy. The surgery involved cutting the corpus callosum, the thick bundle of neuronal fibres that connects the brain's two halves. This disconnection prevented epileptic seizures spreading from one brain hemisphere to the other. The recipients of such split-brain surgery showed a form of disconnection syndrome whereby the two halves of their brains could not exchange information. For instance, if a patient touched an object with their left hand without seeing the object, they would be unable to name it. That is because sensory-motor signals from the left hand are processed in the right hemisphere. To put a name to the object, the tactile information from the hand has to reach the brain's left hemisphere, the seat of language. With the central connection between hemispheres severed, the object's naming information cannot be retrieved. Conversely, if that person were to touch an object with their right hand without seeing it, the sensory-motor signals from that hand would go to the left hemisphere, which hosts the brain's language centres, making naming the object easy. However, children born without a corpus callosum – and therefore whose two brain hemispheres are separated – can often pass such tactile naming tests when they are old enough to take them. Their brain hemispheres are obviously communicating, but it wasn't clear how. © Copyright Reed Business Information Ltd
Keyword: Laterality; Development of the Brain
Link ID: 19609 - Posted: 05.13.2014