Most Recent Links
By LISA FELDMAN BARRETT Think about the people in your life who are 65 or older. Some of them are experiencing the usual mental difficulties of old age, like forgetfulness or a dwindling attention span. Yet others somehow manage to remain mentally sharp. My father-in-law, a retired doctor, is 83 and he still edits books and runs several medical websites. Why do some older people remain mentally nimble while others decline? “Superagers” (a term coined by the neurologist Marsel Mesulam) are those whose memory and attention aren’t merely above average for their age, but are actually on par with healthy, active 25-year-olds. My colleagues and I at Massachusetts General Hospital recently studied superagers to understand what made them tick. Our lab used functional magnetic resonance imaging to scan and compare the brains of 17 superagers with those of other people of similar age. We succeeded in identifying a set of brain regions that distinguished the two groups. These regions were thinner for regular agers, a result of age-related atrophy, but in superagers they were indistinguishable from those of young adults, seemingly untouched by the ravages of time. What are these crucial brain regions? If you asked most scientists to guess, they might nominate regions that are thought of as “cognitive” or dedicated to thinking, such as the lateral prefrontal cortex. However, that’s not what we found. Nearly all the action was in “emotional” regions, such as the midcingulate cortex and the anterior insula. My lab was not surprised by this discovery, because we’ve seen modern neuroscience debunk the notion that there is a distinction between “cognitive” and “emotional” brain regions. © 2017 The New York Times Company
Keyword: Alzheimers; Learning & Memory
Link ID: 23045 - Posted: 01.02.2017
Michael Byrne Hunger is complicated. It's not merely a single drive, though this is mostly how we experience it consciously: a single dimension of hunger magnitude. We are more or less hungry, sometimes not at all. But there's something else lurking in the brain: anti-hunger. We can be hungry and not hungry simultaneously, in a sense. In more concrete terms, we can imagine that there is in the brain a certain subset of "hunger neurons." When we feel hungry—as during periods of fasting—it means that these neurons are active. Otherwise, the hunger neurons are silent. Hunger neurons are quite real: neuroscientists have demonstrated their function by stimulating hunger neurons artificially, causing mice to eat at weird times and gain weight. But something interesting happens as we start cranking hunger neurons (agouti-related protein, or AgRP, neurons) up. There's a limit. Mice won't just eat themselves to death. This indicates that there's something else to hunger, a moderating factor. This factor is described for the first time this week in Nature Neuroscience by researchers at Harvard Medical School: a new population of neurons that intermingle with AgRP neurons and basically have the opposite effect. Anti-hunger. Anti-hunger is in itself not a brand new idea. For a long time, neuroscientists looked to pro-opiomelanocortin (POMC) neurons, which are likewise intermingled with the AgRP hunger neurons, to fill this role. This is reasonable: genetic mutations and manipulations to the POMC neurons have been observed to lead to obesity in mice. © 2017 Vice Media LLC
Keyword: Obesity
Link ID: 23044 - Posted: 01.02.2017
By JANE E. BRODY The adornments in the office of Eric L. Adams, the Brooklyn borough president, are hardly typical: a full-size refrigerator stocked with fresh fruits and vegetables; a work station where he prepares and blends these plant-based ingredients for his meals and snacks; and a convection oven and hot plate where he cooks them. In an adjacent anteroom, there’s a stationary bike, 15-pound weights, a multipurpose fitness tower and a TRX suspension trainer hanging on the door. His laptop is mounted on a music stand so he can use it while working out on a mini-stepper. Eight months ago, Mr. Adams learned during a health checkup for abdominal pain that he had Type 2 diabetes. He said his average blood sugar level was so high that the doctor was surprised he had not already lapsed into a coma. His hemoglobin A1C level — a lab test that shows the average level of blood glucose over the previous three months — was 17 percent, about three times normal. He wasted no time in tackling his disease with fervor. Spurning the American tendency to treat every ailment with medication, he instead explored the body’s ability to heal itself. Mr. Adams, a 56-year-old former police captain, now needs a new publicity photo. He no longer resembles the roly-poly image on official posters. By adopting a vegan diet, preparing his own meals and working exercise into his everyday routines, he’s shed 30 pounds and completely reversed his diabetes, a pancreatic disorder that can lead to heart attacks, stroke, nerve damage, kidney disease, visual loss and cognitive impairment. Within three months, his A1C level was down to a normal 5.7. He now strives to inform his millions of constituents about how to counter this health- and life-robbing disease, which has reached epidemic proportions in this country, even among children. 
Starting on the home front, he stripped the Brooklyn Borough Hall drink machine of sugary beverages and the snack machine of everything cooked in oil or unnaturally sweetened. Those searching for a pick-me-up can indulge in plain or sparkling water, diet soda, nuts, dried fruit, protein bars and whole-grain baked chips. © 2017 The New York Times Company
Keyword: Obesity
Link ID: 23043 - Posted: 01.02.2017
By Susana Martinez-Conde Our perceptual and cognitive systems like to keep things simple. We describe the line drawings below as a circle and a square, even though their imagined contours consist—in reality—of discontinuous line segments. The Gestalt psychologists of the 19th and early 20th century branded this perceptual legerdemain as the Principle of Closure, by which we tend to recognize shapes and concepts as complete, even in the face of fragmentary information. Now at the end of the year, it is tempting to seek a cognitive kind of closure: we want to close the lid on 2016, wrap it with a bow and start a fresh new year from a blank slate. Of course, it’s just an illusion, the Principle of Closure in one of its many incarnations. The end of the year is just as arbitrary as the end of the month, or the end of the week, or any other date we choose to highlight in the earth’s recurrent journey around the sun. But it feels quite different. That’s why we have lists of New Year’s resolutions, or why we start new diets or exercise regimes on Mondays rather than Thursdays. Researchers have also found that, even though we measure time in a continuous scale, we assign special meaning to idiosyncratic milestones such as entering a new decade. What should we do about our brain’s oversimplification tendencies concerning the New Year—if anything? One strategy would be to fight our feelings of closure and rebirth as we (in truth) seamlessly move from the last day of 2016 to the first day of 2017. But that approach is likely to fail. Try as we might, the Principle of Closure is just too ingrained in our perceptual and cognitive systems. In fact, if you already have the feeling that the beginning of the year is somewhat special (hey, it only happens once a year!), you might as well decide that resistance is futile, and not just embrace the illusion, but do your best to channel it. © 2017 Scientific American
Keyword: Vision; Attention
Link ID: 23042 - Posted: 01.02.2017
Bret Stetka With a president-elect who has publicly supported the debunked claim that vaccines cause autism, suggested that climate change is a hoax dreamed up by the Chinese, and appointed to his Cabinet a retired neurosurgeon who doesn't buy the theory of evolution, things might look grim for science. Yet watching Patti Smith sing "A Hard Rain's a-Gonna Fall" live streamed from the Nobel Prize ceremony in early December to a room full of physicists, chemists and physicians — watching her twice choke up, each time stopping the song altogether, only to push on through all seven wordy minutes of one of Bob Dylan's most beloved songs — left me optimistic. Taking nothing away from the very real anxieties about future funding and support for science, neuroscience in particular has had plenty of promising leads that could help fulfill Alfred Nobel's mission to better humanity. In the spirit of optimism, and with input from the Society for Neuroscience, here are a few of the noteworthy neuroscientific achievements of 2016. One of the more fascinating fields of neuroscience of late entails mapping the crosstalk between our biomes, brains and immune systems. In July, a group from the University of Virginia published a study in Nature showing that the immune system, in addition to protecting us from a daily barrage of potentially infectious microbes, can also influence social behavior. The researchers had previously shown that a type of white blood cells called T cells influence learning behavior in mice by communicating with the brain. Now they've shown that blocking T cell access to the brain influences rodent social preferences. © 2016 npr
Keyword: Alzheimers; Learning & Memory
Link ID: 23041 - Posted: 12.31.2016
By KATIE THOMAS The Food and Drug Administration has approved the first drug to treat patients with spinal muscular atrophy, a savage disease that, in its most severe form, kills infants before they turn 2. “This is a miracle — seriously,” Dr. Mary K. Schroth, a lung specialist in Madison, Wis., who treats children who have the disease, said of the approval, which was made last week. “This is a life-changing event, and this will change the course of this disease.” Dr. Schroth has previously worked as a paid consultant to Biogen, which is selling the drug. The drug, called Spinraza, will not come cheap — and, by some estimates, will be among the most expensive drugs in the world. Biogen, which is licensing Spinraza from Ionis Pharmaceuticals, said this week that one dose will have a list price of $125,000. That means the drug will cost $625,000 to $750,000 to cover the five or six doses needed in the first year, and about $375,000 annually after that, to cover the necessary three doses a year. Patients will presumably take Spinraza for the rest of their lives. The pricing could put the drug in the cross hairs of lawmakers and other critics of high drug prices, and perhaps discourage insurers from covering it. High drug prices have attracted intense scrutiny in the last year, and President-elect Donald J. Trump has singled them out as an important issue. “We believe the Spinraza pricing decision is likely to invite a storm of criticism, up to and including presidential tweets,” Geoffrey C. Porges, an analyst for Leerink Partners, said in a note to investors on Thursday. Mr. Porges said the price could lead some insurers to balk or to limit the drug to patients who are the most severely affected, such as infants, even though the F.D.A. has approved Spinraza for all patients with the condition. © 2016 The New York Times Company
Keyword: Movement Disorders; Genes & Behavior
Link ID: 23040 - Posted: 12.31.2016
Alan Yu Being overweight can raise your blood pressure, cholesterol and risk for developing diabetes. It could be bad for your brain, too. A diet high in saturated fats and sugars, the so-called Western diet, actually affects the parts of the brain that are important to memory and make people more likely to crave the unhealthful food, says psychologist Terry Davidson, director of the Center for Behavioral Neuroscience at American University in Washington, D.C. He didn't start out studying what people ate. Instead, he was interested in learning more about the hippocampus, a part of the brain that's heavily involved in memory. He was trying to figure out which parts of the hippocampus do what. He did that by studying rats that had very specific types of hippocampal damage and seeing what happened to them. In the process, Davidson noticed something strange. The rats with the hippocampal damage would go to pick up food more often than the other rats, but they would eat a little bit, then drop it. Davidson realized these rats didn't know they were full. He says something similar may happen in human brains when people eat a diet high in fat and sugar. Davidson says there's a vicious cycle of bad diets and brain changes. He points to a 2015 study in the Journal of Pediatrics that found obese children performed more poorly on memory tasks that test the hippocampus compared with kids who weren't overweight. He says if our brain system is impaired by that kind of diet, "that makes it more difficult for us to stop eating that diet. ... I think the evidence is fairly substantial that you have an effect of these diets and obesity on brain function and cognitive function." © 2016 npr
Keyword: Obesity; Learning & Memory
Link ID: 23039 - Posted: 12.31.2016
By Nicole Mortillaro Post-traumatic stress disorder can be a debilitating condition. It's estimated that it affects nearly one in 10 Canadian veterans who served in Afghanistan. Now, there's promising research that could lead to better treatment of the disorder. Following a particularly traumatic event — one where there is the serious threat of death or a circumstance that was overwhelming — we often exhibit physical symptoms immediately. But the effects in our brains actually take some time to form. That's why symptoms of PTSD — reliving an event, nightmares, anxiety — don't show up until some time later. Research has shown that, after such an event, the hippocampus — which is important in dealing with emotions and memory — shrinks, while our amygdala — also important to memory and emotions — becomes hyperactive. In earlier research, Sumantra Chattarji from the National Centre for Biological Sciences (NCBS) and the Institute for Stem Cell Biology and Regenerative Medicine (inStem), in Bangalore, India, discovered that traumatic events cause new nerve connections to form in the amygdala, which also causes hyperactivity. This plays a crucial role in people dealing with post-traumatic stress disorder. Chattarji has been studying changes in the brain after traumatic events for more than a decade. In an earlier study, he concluded that a single stress event had no immediate effect on the amygdala of rats. However, 10 days later, the rats exhibited increased anxiety. There were even changes to the brain, and in particular to the amygdala. So Chattarji set out to see if there was a way to prevent these changes. The new research focused on a particular cell receptor in the brain, called N-Methyl-D-Aspartate Receptor (NMDA-R), which is crucial in forming memories. ©2016 CBC/Radio-Canada.
Keyword: Stress; Learning & Memory
Link ID: 23038 - Posted: 12.31.2016
By Alice Callahan Can psychiatric medications alter the mother-baby bond? I am having a baby in a month and am on an antidepressant, antipsychotic and mood stabilizer. I don't feel a natural instinct to mother or connect to my baby yet. Could it be because of my medications? It’s normal for expectant parents to worry if they don’t feel a strong connection to the baby right away. “Those kinds of mixed fears and anxieties are really common in most pregnancies, certainly first pregnancies,” said Dorothy Greenfeld, a licensed clinical social worker and professor of obstetrics and gynecology at Yale School of Medicine. Bonding is a process that takes time, and while it can begin in pregnancy, the relationship between parent and child mostly develops after birth. Psychiatric conditions, and the medicines used to treat them, can complicate the picture. Antidepressants, the most widely used class of psychiatric drugs, do not seem to interfere with a woman’s attachment to the fetus during pregnancy, as measured by the amount of time the mother spends thinking about and planning for the baby, a 2011 study in the Archives of Women’s Mental Health found. On the other hand, the study found that women with major depression in pregnancy had lower feelings of maternal-fetal attachment, and this sense of disconnection intensified with more severe symptoms of depression. “Depression can definitely affect a person’s ability to bond with their baby, to feel those feelings of attachment, which is why we encourage treatment so strongly,” said Dr. Amy Salisbury, the study leader and a professor of pediatrics and psychiatry at the Alpert Medical School at Brown University. “That’s more likely to interfere than the medication itself.” There is less research on the effects of other types of mental health medications on mother-baby bonding, but psychiatric medications can have side effects that might interfere with parenting. 
For example, a small percentage of people taking mood-stabilizing medications have feelings of apathy, and that could hinder the bonding process, said Dr. Salisbury. And some mental health medications, depending on dosage and combination, might make a person feel too sedated. But again, letting mental illness go untreated is likely far riskier for both the mother and the baby. © 2016 The New York Times Company
Keyword: Depression; Sexual Behavior
Link ID: 23037 - Posted: 12.31.2016
By Laura Beil Justin Shamoun began to hate his body a few weeks into seventh grade. He was a year younger than his suburban Detroit classmates, having skipped a grade. Many of his peers were entering puberty, their bodies solidifying into sleek young men. Justin still had the doughy build of a boy. After gym class one day, someone told Justin he could probably run faster if he weren't so fat. The remark crushed him. Ashamed, he started hiding his body under ever-baggier clothes and making excuses to skip P.E., the pool, anywhere he would be required to expose bare skin. Finally, he decided to fix himself. He dove headlong into sports and cut back on food. Before long, he was tossing his lunch into the garbage and picking at his dinner. He ate just enough to blunt his hunger, until the time came when he ate barely at all. The thought that he had an eating disorder never occurred to him. Long considered an affliction of women, eating disorders — the most deadly of all mental illnesses — are increasingly affecting men. The National Eating Disorders Association predicts that 10 million American men alive today will be affected, but that number is only an estimate based on the limited research available. The official criteria for diagnosing eating disorders were updated to be more inclusive of men only in 2013. And last year, Australian researchers writing in the Journal of Eating Disorders noted that “the prevalence of extreme weight control behaviors, such as extreme dietary restriction and purging” may be increasing at a faster rate in men than women. © 2016 Scientific American
Keyword: Anorexia & Bulimia; Sexual Behavior
Link ID: 23036 - Posted: 12.31.2016
By BENEDICT CAREY She was all there, all the time: exuberant in describing her mania, savage and tender when recalling her despair. And for decades, she gracefully wore the legacy of her legendary role as Princess Leia, worshiped by a generation of teenage girls as the lone female warrior amid the galactic male cast of the “Star Wars” trilogy. In her long, openhearted life, the actress and author Carrie Fisher brought the subject of bipolar disorder into the popular culture with such humor and hard-boiled detail that her death on Tuesday triggered a wave of affection on social media and elsewhere, from both fans and fellow bipolar travelers, whose emotional language she knew and enriched. She channeled the spirit of people like Patty Duke, who wrote about her own bipolar illness, and Kitty Dukakis, who wrote about depression and alcoholism, and turned it into performance art. Ms. Fisher’s career coincided with the growing interest in bipolar disorder itself, a mood disorder characterized by alternating highs and lows, paralyzing depressions punctuated by flights of exuberant energy. Her success fed a longstanding debate on the relationship between mental turmoil and creativity. And her writing and speaking helped usher in a confessional era in which mental disorders have entered the pop culture with a life of their own: Bipolar is now a prominent trait of another famous Carrie, Claire Danes’s character Carrie Mathison in the Showtime television series “Homeland.” “She was so important to the public because she was telling the truth about bipolar disorder, not putting on airs or pontificating, just sharing who she is in an honest-to-the-bone way,” said Judith Schlesinger, a psychologist and author of “The Insanity Hoax: Exposing the Myth of the Mad Genius.” © 2016 The New York Times Company
Keyword: Schizophrenia
Link ID: 23035 - Posted: 12.29.2016
By Heather M. Snyder For more than 25 years, Mary Read was a successful nurse in Lititz, Pennsylvania. But in 2010, at the age of 50, she started having trouble with her memory and thinking, making it difficult for her to complete routine tasks and follow instructions at work. The problems worsened, bringing her career to an abrupt end. In 2011, her doctor conducted a comprehensive evaluation, including a cognitive assessment, and found that she was in the early stages of younger-onset Alzheimer’s, which affects hundreds of thousands of people under 65. A year earlier, Elizabeth Wolf faced another sort of upheaval. The 36-year-old community health program director was forced to abandon her own career, home and community in Vermont when both of her parents were diagnosed with Alzheimer’s three months apart. Wolf took the difficult decision to move back into her childhood home in Mount Laurel, New Jersey in order to become their primary caregiver. These stories are not unusual. Alzheimer’s dementia disproportionately affects women in a variety of ways. Two and a half times as many women as men provide 24-hour care for an affected relative. Nearly 19 percent of these wives, sisters and daughters have had to quit work to do so. In addition, women make up nearly two-thirds of the more than 5 million Americans living with Alzheimer’s today. According to the Alzheimer’s Association 2016 Alzheimer’s Disease Facts and Figures, an estimated 3.3 million women aged 65 and older in the United States have the disease. To put that number in perspective, a woman in her sixties is now about twice as likely to develop Alzheimer’s as breast cancer within her lifetime. © 2016 Scientific American
Keyword: Alzheimers; Sexual Behavior
Link ID: 23034 - Posted: 12.29.2016
Ian Sample Science editor The first subtle hints of cognitive decline may reveal themselves in an artist’s brush strokes many years before dementia is diagnosed, researchers believe. The controversial claim is made by psychologists who studied renowned artists, from the founder of French impressionism, Claude Monet, to the abstract expressionist Willem de Kooning. While Monet aged without obvious mental decline, de Kooning was diagnosed with Alzheimer’s disease more than a decade before his death in 1997. Alex Forsythe at the University of Liverpool analysed more than 2,000 paintings from seven famous artists and found what she believes are progressive changes in the works of those who went on to develop Alzheimer’s. The changes became noticeable when the artists were in their 40s. Though intriguing, the small number of artists involved in the study means the findings are highly tentative. While Forsythe said the work does not point to an early test for dementia, she hopes it may open up fresh avenues for investigating the disease. The research provoked mixed reactions from other scientists. Richard Taylor, a physicist at the University of Oregon, described the work as a “magnificent demonstration of art and science coming together”. But Kate Brown, a physicist at Hamilton College in New York, was less enthusiastic and dismissed the research as “complete and utter nonsense”. © 2016 Guardian News and Media Limited
Keyword: Alzheimers
Link ID: 23033 - Posted: 12.29.2016
By KEVIN DEUTSCH An anesthetic commonly used for surgery has surpassed heroin to become the deadliest drug on Long Island, killing at least 220 people there in 2016, according to medical examiners’ records. The drug, fentanyl, is a synthetic opioid, which can be 100 times more potent than morphine. The numbers from Long Island are part of a national pattern, as fentanyl fatalities have already surpassed those from heroin in other parts of the country, including New England, as its use has skyrocketed. Part of the reason for the increase is economic — because fentanyl can be manufactured in the lab, it is much cheaper and easier than cultivating heroin. In New York City, more than 1,000 people are expected to die from drug overdoses this year — the first recorded four-digit death total in city history, according to statistics compiled by the Department of Health and Mental Hygiene. Nearly half of all unintentional drug overdose deaths in the city since July have involved fentanyl, the health department said. The medical examiners of Long Island’s two counties, Nassau and Suffolk, compiled the new numbers. “Fentanyl has surpassed heroin as the most commonly detected drug in fatal opioid overdoses,” Dr. Michael J. Caplan, the Suffolk County medical examiner, said in a written statement about the statistics, which were obtained by The New York Times ahead of their release. “The influx of illicitly manufactured fentanyl from overseas is a nationwide issue that requires a multidisciplinary intervention from all levels of government.” Nationwide, recorded deaths from opioids surpassed 30,000 in 2015, according to data compiled by the Centers for Disease Control and Prevention. And overdoses caused by synthetic opioids like fentanyl increased by 72.2 percent in 2015 over 2014 — one of the deadliest year-over-year surges for any drug in United States history, the same data shows. © 2016 The New York Times Company
Keyword: Drug Abuse; Pain & Touch
Link ID: 23032 - Posted: 12.29.2016
Perry Link People who study other cultures sometimes note that they benefit twice: first by learning about the other culture and second by realizing that certain assumptions of their own are arbitrary. In reading Colin McGinn’s fine recent piece, “Groping Toward the Mind,” in The New York Review, I was reminded of a question I had pondered in my 2013 book Anatomy of Chinese: whether some of the struggles in Western philosophy over the concept of mind—especially over what kind of “thing” it is—might be rooted in Western language. The puzzles are less puzzling in Chinese. Indo-European languages tend to prefer nouns, even when talking about things for which verbs might seem more appropriate. The English noun inflation, for example, refers to complex processes that were not a “thing” until language made them so. Things like inflation can even become animate, as when we say “we need to combat inflation” or “inflation is killing us at the check-out counter.” Modern cognitive linguists like George Lakoff at Berkeley call inflation an “ontological metaphor.” (The inflation example is Lakoff’s.) When I studied Chinese, though, I began to notice a preference for verbs. Modern Chinese does use ontological metaphors, such as fāzhǎn (literally “emit and unfold”) to mean “development” or xìnxīn (“believe mind”) for “confidence.” But these are modern words that derive from Western languages (mostly via Japanese) and carry a Western flavor with them. “I firmly believe that…” is a natural phrase in Chinese; you can also say “I have a lot of confidence that…” but the use of a noun in such a phrase is a borrowing from the West. © 1963-2016 NYREV, Inc
Keyword: Consciousness; Language
Link ID: 23031 - Posted: 12.28.2016
Morwenna Ferrier Is my face attractive? Don’t answer that. Not because I’m ducking out of this, but because you can’t. Attractiveness is subjective, perhaps the most subjective question of all; that we outsource the answer to Google (and we do, in our droves) is ironic since it depends on a bias that is impossible to unpack. Yet in searching the internet for an answer, it also reveals the question to be one of the great existential tensions of our time. Because, as we all know, being attractive is absolutely 100% the A-road to happiness. If you are Googling to rate your attractiveness, then you are probably working on the assumption that you aren’t. You’re also, possibly, more vulnerable and susceptible to being told that you aren’t. In short, you’re a sitting duck, someone who had a sore throat and who asked good old Dr Google for advice only to be told it was cancer. Still, it’s only in investigating precisely why Google is the last person you should ask – it being a search engine and therefore insentient – that you can start cobbling together an idea of what attractiveness really is. It’s worth starting with semantics. Beauty is not attractiveness and vice versa, though we commonly confuse the two. Beauty (arguably) has a template against which we intuit and against which we measure ourselves. It is hinged around genetics and a particular look associated with this politically correct (and largely western-governed) model. Darwin wouldn’t agree: “It is certainly not true that there is in the mind of man any universal standards of beauty with respect to the human body,” he said. But a lot has changed since his time. © 2016 Guardian News and Media Limited
Keyword: Sexual Behavior
Link ID: 23030 - Posted: 12.28.2016
By Ben Andrew Henry Traveling from the forests and fields of Europe to the grasslands south of the Sahara desert is a monumental trip for anyone, and especially for a diminutive insect. Yet every year, populations of the painted lady (Vanessa cardui) butterfly make that journey over the course of several generations. The logistics of this migratory feat had been the subject of speculation for some time, but never fully understood, in part because of the difficulty of tracking the tiny insects across long distances. In a study published October 4 in Biology Letters, researchers reported having measured the isotopic composition of butterfly wings in Europe and south of the Sahara. Since the fraction of heavy hydrogen isotopes in the environment varies geographically, the team used its analysis to identify the origins of butterflies captured, confirming that groups of butterflies south of the Sahara did originate in Europe. The butterflies do not linger in Africa long. They most likely make their trip, the authors suggested, to take advantage of the burst of productivity in the tropical savannah that follows the rainy season—and to breed the generation that will start the trip back. Europe’s freshwater eels (Anguilla anguilla) live out their days in rivers and streams, but they never spawn there. Massive catches of larval eels in the Sargasso Sea tipped researchers off a century ago that eels must spawn in the swirling mid-Atlantic gyre of free-floating seaweed and then migrate to Europe. Eels leave their homes in the late fall, but other than that, the details of their journey have been a mystery. © 1986-2016 The Scientist
Keyword: Animal Migration
Link ID: 23029 - Posted: 12.28.2016
by Bethany Brookshire An opioid epidemic is upon us. Prescription painkillers such as fentanyl and morphine can ease terrible pain, but they can also cause addiction and death. The Centers for Disease Control and Prevention estimates that nearly 2 million Americans are abusing or addicted to prescription opiates. Politicians are attempting to stem the tide at state and national levels, with bills to change and monitor how physicians prescribe painkillers and to increase access to addiction treatment programs. Those efforts may make access to painkillers more difficult for some. But pain comes to everyone eventually, and opioids are one of the best ways to make it go away. Morphine is the king of pain treatment. “For hundreds of years people have used morphine,” says Lakshmi Devi, a pharmacologist at the Icahn School of Medicine at Mount Sinai in New York City. “It works, it’s a good drug, that’s why we want it. The problem is the bad stuff.” The “bad stuff” includes tolerance — patients have to take higher and higher doses to relieve their pain. Drugs such as morphine depress breathing, an effect that can prove deadly. They also cause constipation, drowsiness and vomiting. But “for certain types of pain, there are no medications that are as effective,” says Bryan Roth, a pharmacologist and physician at the University of North Carolina at Chapel Hill. The trick is constructing a drug with all the benefits of an opioid painkiller, and few to none of the side effects. Here are three ways that scientists are searching for the next big pain buster, and three of the chemicals they’ve turned up. |© Society for Science & the Public 2000 - 2016
Keyword: Pain & Touch; Drug Abuse
Link ID: 23028 - Posted: 12.27.2016
By GINA KOLATA It was Oct. 11, 2015, and a middle-aged man and a young woman, both severely obese, were struggling with the same lump-in-the-throat feeling. The next day they were going to have an irreversible operation. Were they on the threshold of a new beginning or a terrible mistake?

They were strangers, scheduled for back-to-back bariatric surgery at the University of Michigan with the same doctor. He would cut away most of their stomachs and reroute their small intestines. They were almost certain to lose much of their excess weight. But despite the drastic surgery, their doctor told them it was unlikely that they would ever be thin.

Nearly 200,000 Americans have bariatric surgery each year. Yet far more — an estimated 24 million — are heavy enough to qualify for the operation, and many of them are struggling with whether to have such a radical treatment, the only one that leads to profound and lasting weight loss for virtually everyone who has it.

Most people believe that the operation simply forces people to eat less by making their stomachs smaller, but scientists have discovered that it actually causes profound changes in patients’ physiology, altering the activity of thousands of genes in the human body as well as the complex hormonal signaling from the gut to the brain. It often leads to astonishing changes in the way things taste, making cravings for a rich slice of chocolate cake or a bag of White Castle hamburgers simply vanish. Those who have the surgery naturally settle at a lower weight. © 2016 The New York Times Company
Keyword: Obesity
Link ID: 23027 - Posted: 12.27.2016
By GINA KOLATA Bariatric surgery is an option that obesity medicine specialists say is too often ignored or dismissed. Yet it is the only option that almost always works to help very heavy people lose a lot of weight, and it can also mysteriously make some chronic conditions vanish. Here are some answers about bariatric surgery and what it does.

HOW MANY AMERICANS ARE ELIGIBLE FOR BARIATRIC SURGERY? Twenty-four million, according to the American Society for Metabolic and Bariatric Surgery. The criteria are a body mass index above 40, or a B.M.I. of at least 35 along with other medical conditions like diabetes, hypertension, sleep apnea or acid reflux.

HOW MANY HAVE THE SURGERY EACH YEAR? Fewer than 200,000.

WHAT ARE THE OPERATIONS? There are four in use today. The two most popular procedures are the Roux-en-Y gastric bypass and the gastric sleeve. Both make the stomach smaller. The bypass also reroutes the small intestine. A simpler procedure, the gastric band, is less effective and has fallen out of favor. And a much more drastic operation, the biliopancreatic diversion with duodenal switch, which bypasses a large part of the small intestine, is rarely used because it has higher mortality and complication rates.

HOW MUCH DO THE OPERATIONS COST? The average cost of a sleeve gastrectomy is $16,000 to $19,000, and the average cost of a gastric bypass is $20,000 to $25,000. Most insurance plans cover the cost for patients who qualify, though some plans require that patients try dieting for a certain amount of time first.

DOES THE SURGERY SAVE MONEY ON OTHER HEALTH CARE COSTS IN THE END? © 2016 The New York Times Company
Keyword: Obesity
Link ID: 23026 - Posted: 12.27.2016