Most Recent Links
By Scicurious I’ve got a terrible sweet tooth. And I am kind of proud of it, in a way. Yeah, I CAN eat that whole chocolate cake. I’d even LIKE it. Honeycomb dipped in chocolate? YES PLEASE. There are very few sweet things that I’d refuse. But should I really be ok with my sweet tooth? Could my sweet tooth correlate with something more sinister…a preference for alcohol? I can blame my sweet tooth on my parents, probably. Studies have shown that variability in preference for sweet things (though, to a greater or lesser extent, we all like sweet things) has a genetic basis. But the sweet tooth doesn’t go alone. In animals (mice especially), a preference for sugar in their water correlates with preference for alcohol as well. When you breed mice to make sure they drink alcohol (this is done to study alcoholism, for example), they also tend to really prefer sweet things, above and beyond mice that aren’t so into martinis. There is a correlation in humans, too. Humans who are more into sweet things are slightly more likely to abuse alcohol. But what is the basis? The authors of this study wanted to look at the reward related systems of humans, and see how sweet taste might compare to alcohol drinking. They took 16 people, put them in an fMRI scanner, and then carefully sprayed their tongues with sugar water. fMRI looks at the blood oxygen levels in various areas of the brain. Higher blood oxygen levels are thought to correlate with increased “activity” of the brain (the idea being that more neurons in use means the area needs more oxygen). An example of this would be that your visual cortex will show increased blood oxygen levels when you are looking at something. © 2013 Scientific American
Keyword: Drug Abuse; Obesity
Link ID: 18431 - Posted: 07.30.2013
Erika Check Hayden The hair of three Incan mummies bears evidence that one of them used large amounts of coca and alcohol in the year before she died, which may have been fed to her as part of a ritual that led to her death. The children, who were found in 1999 near the summit of the Llullaillaco volcano in Argentina, probably died about 500 years ago in a sacrificial ritual known as capacocha. In the study, published today in Proceedings of the National Academy of Sciences, researchers led by archaeologist Timothy Taylor of the University of Bradford, UK, used mass spectrometry to analyse variations in levels of chemical residues in the children’s hair in the months before their deaths. The researchers looked for by-products of the metabolism of coca and alcohol — both important in Andean culture and ritual — and found that all three children ingested both substances in the year before they died. But the eldest — a 13-year-old girl known as the Maiden — took much more of both substances than the younger children. The pattern of consumption suggests that a series of rituals preparing her for her fate began about a year before she was left to die on top of the 6,739-metre-high Llullaillaco. The levels of metabolites in her hair, for instance, increased about a year before her death and then shot up to very high levels about a month and a half before she died — her hair recorded the highest level of coca ever found in Andean archaeological remains, says John Verano, a biological anthropologist at Tulane University in New Orleans, Louisiana. © 2013 Nature Publishing Group
Keyword: Drug Abuse
Link ID: 18430 - Posted: 07.30.2013
By CARL ZIMMER The golden lion tamarin, a one-pound primate that lives in Brazil, is a stunningly monogamous creature. A male will typically pair with a female and they will stay close for the rest of their lives, mating only with each other and then working together to care for their young. To biologists, this deeply monogamous way of life — found in 9 percent of mammal species — is puzzling. A seemingly better evolutionary strategy for male mammals would be to spend their time looking for other females with which to mate. “Monogamy is a problem,” said Dieter Lukas of the University of Cambridge in a telephone news conference on Monday. “Why should the male keep to one female?” The evolution of monogamy has inspired many different ideas. “These hypotheses have been suggested for the past 40 years, and there’s been no resolution of the debate,” said Kit Opie of University College London in an interview. On Monday, Dr. Opie and Dr. Lukas each published a large-scale study of monogamy that they hoped would resolve the debate. But they ended up coming to opposing conclusions, which means the debate over monogamy continues. Dr. Lukas, co-author of a paper in the journal Science with Tim Clutton-Brock of Cambridge, looked at 2,545 species of mammals, tracing their mating evolution from their common ancestor some 170 million years ago. The scientists found that mammals shifted from solitary living to monogamy 61 times over their evolution. They then searched for any factors that these mammals had in common. They concluded that monogamy evolves when females become hostile with one another and live in ranges that do not overlap. When females live this way, they set up so much distance between one another that a single male cannot prevent other males from mating with them. Staying close to one female became a better strategy. Once males began doing so, they sometimes evolved to provide care to their offspring as well. © 2013 The New York Times Company
Keyword: Sexual Behavior; Evolution
Link ID: 18429 - Posted: 07.30.2013
By David Brown. Charles Sabine, who spent more than two decades as a television reporter for NBC covering wars, revolutions and natural disasters, is familiar with something he calls “real fear.” He’s seen it in the eyes of people about to die or be killed. It chilled his blood when a Bosnian guerrilla held a gun to his chest as he stood near a bullet-pocked execution wall. He felt it when he walked point for his camera crew in Baghdad during Iraq’s sectarian war. But nothing terrified him like the news he got eight years ago after taking the gene test for Huntington’s disease, whose slow downward course toward death makes it one of mankind’s most dread afflictions. “I learned that the disease that took my father and is inflicting on my brother the same terrible decline in his prime will take me, too,” said Sabine, 53, an Englishman who worked for NBC for 26 years. And yet Sabine has turned that knowledge to a purpose that can only be called thrilling. He’s on a mission to make Huntington’s the model for a Hopeless Disease About Which There’s Hope. He wants to put it at the forefront of the “patient-centered care” movement, the effort to always ask patients what they consider success or hope to get out of treatment. He wants to make sure there are Huntington’s patients ready for clinical trials that are just around the corner. He wants to get everybody to think a little more sophisticatedly about genetic testing. Closer to home, he’s turning the knowledge of his biological fate into a tool to help him savor every day, be a good father and husband, make amends, not deceive. © 1996-2013 The Washington Post
Keyword: Huntingtons; Genes & Behavior
Link ID: 18428 - Posted: 07.30.2013
By JAMES GORMAN Should some of the most social, intelligent and charismatic animals on the planet be kept in captivity by human beings? That is a question asked more frequently than ever by both scientists and animal welfare advocates, sometimes about close human cousins like chimpanzees and other great apes, but also about another animal that is remarkable for its intelligence and complex social organization — the killer whale, or orca. Killer whales, found in all the world’s oceans, were once as despised as wolves. But in the last half century these elegant black-and-white predators — a threat to seals and other prey as they cruise the oceans, but often friendly to humans in the wild — have joined the pantheon of adored wildlife, along with the familiar polar bears, elephants and lions. With life spans that approach those of humans, orcas have strong family bonds, elaborate vocal communication and cooperative hunting strategies. And their beauty and power, combined with a willingness to work with humans, have made them legendary performers at marine parks since they were first captured and exhibited in the 1960s. They are no longer taken from the wild as young to be raised and trained, but are bred in captivity in the United States for public display at marine parks. Some scientists and activists have argued for years against keeping them in artificial enclosures and training them for exhibition. They argue for more natural settings, like enclosed sea pens, as well as an end to captive breeding and to the orcas’ use in what opponents call entertainment and marine parks call education. Now the issue has been raised with new intensity in the documentary film “Blackfish” and the book “Death at SeaWorld,” by David Kirby, just released in paperback. © 2013 The New York Times Company
Keyword: Aggression; Intelligence
Link ID: 18427 - Posted: 07.30.2013
By Glen Tellis, Rickson C. Mesquita, and Arjun G. Yodh Terrence Murgallis, a 20-year-old undergraduate student in the Department of Speech-Language Pathology at Misericordia University, has stuttered all his life and approached us recently about conducting brain research on stuttering. His timing was perfect because our research group, in collaboration with a team led by Dr. Arjun Yodh in the Department of Physics and Astronomy at the University of Pennsylvania, had recently deployed two novel optical methods to compare blood flow and hemoglobin concentration differences in the brains of those who stutter with those who are fluent. These noninvasive methods employ diffusing near-infrared light and have been dubbed near-infrared spectroscopy (NIRS) for concentration dynamics, and diffuse correlation spectroscopy (DCS) for flow dynamics. The near-infrared light readily penetrates through intact skull to probe cortical regions of the brain. The low power light has no known side-effects and has been successfully utilized for a variety of clinical studies in infants, children, and adults. DCS measures fluctuations of scattered light due to moving targets in the tissue (mostly red blood cells). The technique measures relative changes in cerebral blood flow. NIRS uses the relative transmission of different colors of light to detect hemoglobin concentration changes in the interrogated tissues. Though there are numerous diagnostic tools available to study brain activity, including positron emission tomography (PET), magnetic resonance imaging (MRI), and magnetoencephalography (MEG), these methods are often invasive and/or expensive to administer. In the particular case of electroencephalography (EEG), its low spatial resolution is a significant limitation for investigations of verbal fluency. © 2013 Scientific American
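NIRS analyses of the kind described above commonly recover hemoglobin concentration changes from light attenuation via the modified Beer-Lambert law: measuring optical density changes at two wavelengths gives a two-equation linear system for oxy- and deoxyhemoglobin. The sketch below illustrates that arithmetic only; the extinction coefficients, path length, and differential pathlength factor are made-up illustrative values, not parameters from the study.

```python
# Modified Beer-Lambert sketch: at each wavelength,
#   dOD = (e_HbO2 * dC_HbO2 + e_HbR * dC_HbR) * path * DPF
# Two wavelengths -> a 2x2 linear system in the two concentration changes.

def hemoglobin_changes(dOD1, dOD2, e1_hbo2, e1_hbr, e2_hbo2, e2_hbr,
                       path_cm, dpf):
    """Solve the two-wavelength system for (dC_HbO2, dC_HbR)."""
    L = path_cm * dpf  # effective photon path length
    a, b = e1_hbo2 * L, e1_hbr * L
    c, d = e2_hbo2 * L, e2_hbr * L
    det = a * d - b * c
    if det == 0:
        raise ValueError("wavelength pair gives a singular system")
    dC_hbo2 = (dOD1 * d - dOD2 * b) / det  # Cramer's rule
    dC_hbr = (a * dOD2 - c * dOD1) / det
    return dC_hbo2, dC_hbr

# All numbers below are illustrative, not measured values.
hbo2, hbr = hemoglobin_changes(dOD1=0.02, dOD2=0.03,
                               e1_hbo2=1.0, e1_hbr=3.0,
                               e2_hbo2=2.5, e2_hbr=0.8,
                               path_cm=3.0, dpf=6.0)
```

In practice the wavelengths are chosen on either side of the ~800 nm isosbestic point, where the two hemoglobin species absorb equally, so that the system above is well conditioned.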
Keyword: Language
Link ID: 18426 - Posted: 07.30.2013
By RONI CARYN RABIN Marie Theriault started having trouble with her hands more than three years ago. She was the director of a day care center, but suddenly she couldn’t change diapers or tie shoelaces. She started dropping things. “People would say to me, ‘Look, you dropped your folder,’ ” Mrs. Theriault, 59, said. “I wasn’t aware I had dropped it.” Though she did not have any problems with memory, Mrs. Theriault eventually found out that she has a rare form of Alzheimer’s disease. The diagnosis enabled her family to plan ahead: Her husband took early retirement and found a clinical trial for her to enroll in, and the two went on a safari that had been a dream for years. “We’re front-loading a bit, enjoying life as much as we can, now that the disease is manageable,” said Paul Theriault, 57. “It can get pretty ugly.” For the Theriaults, getting an accurate diagnosis of Alzheimer’s disease brought a measure of relief, even though the future might be grim. Indeed, there is a growing interest in the early detection of dementia, not only in patients like Mrs. Theriault but also in people with normal age-related memory changes or even no symptoms at all. The idea is that treatments for Alzheimer’s disease and other dementias have been largely ineffective because the conditions aren’t caught early enough. Now researchers are starting clinical trials that focus on people in the “pre-symptomatic phase” of Alzheimer’s disease. Medicare is paying for wellness visits that include cognitive assessments and screening. Copyright 2013 The New York Times Company
Keyword: Alzheimers
Link ID: 18425 - Posted: 07.30.2013
By Daniel Engber Brain-bashing, once an idle pastime of the science commentariat, went mainstream in June. At the beginning of the month, Slate contributor Sally Satel and Scott O. Lilienfeld published Brainwashed: The Seductive Appeal of Mindless Neuroscience, a well-informed attack on the extravagances of “neurocentrist” thought. We’re living in a dangerous era, they warn in the book’s introduction. “Naïve media, slick neuroentrepreneurs, and even an occasional overzealous neuroscientist exaggerate the capacity of scans to reveal the contents of our minds, exalt brain physiology as inherently the most valuable level of explanation for understanding behavior, and rush to apply underdeveloped, if dazzling, science for commercial and forensic use.” In the United Kingdom, the neuro-gadfly Raymond Tallis—whose own attack on popular brain science, Aping Mankind, came out in 2011—added to the early-summer beat-down, complaining in the Observer that “studies that locate irreducibly social phenomena … in the function or dysfunction of bits of our brains are conceptually misconceived.” By mid-June, these sharp rebukes made their way into the mind of David Brooks, a long-time dabbler in neural data who proposed not long ago that “brain science helps fill the hole left by the atrophy of theology and philosophy.” Brooks read Brainwashed and became a convert to its cause: “From personal experience, I can tell you that you get captivated by [neuroscience] and sometimes go off to extremes,” he wrote in a recent column with the headline “Beyond the Brain.” Then he gave the following advice: “The next time somebody tells you what a brain scan says, be a little skeptical. The brain is not the mind.” © 2013 The Slate Group, LLC
Keyword: Brain imaging
Link ID: 18424 - Posted: 07.30.2013
By ABIGAIL ZUGER, M.D. A journey into the human brain starts with the usual travel decisions: will you opt for a no-frills sightseeing jaunt, a five-star luxury cruise, or trek a little off the beaten track, skipping the usual tourist attractions? Now that science’s newfound land is suddenly navigable, hordes of eager guides are offering up books that range from the basic to the lavishly appointed to the minutely subspecialized. But those who prefer wandering off trail may opt for two new ones, neither by a neuroscientist. When the philosopher Patricia S. Churchland explains that her book represents “the story of getting accustomed to my brain,” she is speaking as both a brain-owning human being and a career humanist. An emerita professor at the University of California, San Diego, she has spent a career probing the physical brain for the self and its moral center. And unlike many humanists who hate the science for the irritating violence it does to centuries of painstaking intellectual labor, she is entranced by the power of the data, and her delight is utterly contagious. She loses little time in dispatching the archaic notion of the soul, and suggests that near-death visions of heaven simply represent “neural funny business” in a malfunctioning brain. Can humans still live a moral and spiritual life even without the ideas of soul and heaven? You bet they can. “We may still say that the sun is setting even when we know full well that earth is turning,” Professor Churchland points out, and she is off and running. © 2013 The New York Times Company
Keyword: Emotions; Consciousness
Link ID: 18423 - Posted: 07.30.2013
By MOISES VELASQUEZ-MANOFF Although professionals may bemoan their long work hours and high-pressure careers, really, there’s stress, and then there’s Stress with a capital “S.” The former can be considered a manageable if unpleasant part of life; in the right amount, it may even strengthen one’s mettle. The latter kills. What’s the difference? Scientists have settled on an oddly subjective explanation: the more helpless one feels when facing a given stressor, they argue, the more toxic that stressor’s effects. That sense of control tends to decline as one descends the socioeconomic ladder, with potentially grave consequences. Those on the bottom are more than three times as likely to die prematurely as those at the top. They’re also more likely to suffer from depression, heart disease and diabetes. Perhaps most devastating, the stress of poverty early in life can have consequences that last into adulthood. Even those who later ascend economically may show persistent effects of early-life hardship. Scientists find them more prone to illness than those who were never poor. Becoming more affluent may lower the risk of disease by lessening the sense of helplessness and allowing greater access to healthful resources like exercise, more nutritious foods and greater social support; people are not absolutely condemned by their upbringing. But the effects of early-life stress also seem to linger, unfavorably molding our nervous systems and possibly even accelerating the rate at which we age. Even those who become rich are more likely to be ill if they suffered hardship early on. The British epidemiologist Michael Marmot calls the phenomenon “status syndrome.” He’s studied British civil servants who work in a rigid hierarchy for decades, and found that accounting for the usual suspects — smoking, diet and access to health care — won’t completely abolish the effect. There’s a direct relationship among health, well-being and one’s place in the greater scheme. 
“The higher you are in the social hierarchy,” he says, “the better your health.” © 2013 The New York Times Company
Keyword: Stress; Neuroimmunology
Link ID: 18422 - Posted: 07.29.2013
Adam Withnall Drinking several cups of coffee a day could halve the risk of suicide in men and women, scientists from Harvard suggest. In a study published in the World Journal of Biological Psychiatry, researchers analysed the caffeine consumption of more than 200,000 people spanning a period of nearly 20 years. They found that, for both men and women, those who took in 400mg of the stimulant a day – the equivalent of two to three cups of coffee – were statistically 50 per cent less likely to commit suicide. And while the research surveyed people on all sorts of caffeine sources, from tea to chocolate, they found that between 71 and 80 per cent of intake was from coffee. Lead researcher Michel Lucas, from the Department of Nutrition at the Harvard School of Public Health, said: “Unlike previous investigations, we were able to assess association of consumption of caffeinated and non-caffeinated beverages, and we identify caffeine as the most likely candidate of any putative protective effect of coffee.” The scientists said the statistics could possibly be explained by the fact that caffeine boosts production of serotonin, dopamine, and noradrenaline, effectively acting as a mild antidepressant. Coffee has in the past been shown to reduce the risk of depression in women, and it also stimulates the central nervous system. © independent.co.uk
Keyword: Stress; Drug Abuse
Link ID: 18421 - Posted: 07.29.2013
By James Gallagher Health and science reporter, BBC News Researchers believe they are closer to developing a blood test that could diagnose Alzheimer's. There is no definitive test for the brain-wasting disease. Doctors rely on cognition tests and brain scans. A technique published in the journal Genome Biology showed differences in the tiny fragments of genetic material floating in the blood could be used to identify patients. The test was accurate 93% of the time in trials on 202 people. One of the main goals of Alzheimer's research is to find ways of detecting the disease earlier. It starts years before symptoms appear and it is thought that future treatments will need to be given before large parts of the brain are destroyed. This will require new ways of testing for the condition. The team at the Saarland University, in Germany, analysed 140 microRNAs (fragments of genetic code) in patients with Alzheimer's disease and in healthy people. They found 12 microRNAs in the blood which were present in markedly different levels in people with Alzheimer's. These became the basis of their test. Early trials showed it was successful and was "able to distinguish with high diagnostic accuracies between Alzheimer's disease patients and healthy" people. BBC © 2013
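A headline figure like the 93% accuracy quoted above summarizes how often a diagnostic test's calls match the known disease status, and it decomposes into sensitivity (patients correctly flagged) and specificity (healthy people correctly cleared). The sketch below shows that arithmetic with a hypothetical confusion matrix for 202 people; these counts are invented for illustration and are not the study's actual results.

```python
# Hypothetical confusion-matrix counts for a diagnostic trial of 202 people.
# Illustrative only, not data from the Genome Biology paper.
tp, fn = 94, 7    # patients: correctly flagged / missed
tn, fp = 94, 7    # healthy controls: correctly cleared / falsely flagged

total = tp + fn + tn + fp
accuracy = (tp + tn) / total           # overall fraction of correct calls
sensitivity = tp / (tp + fn)           # fraction of patients detected
specificity = tn / (tn + fp)           # fraction of controls cleared

print(f"n={total}, accuracy={accuracy:.0%}, "
      f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")
```

Note that accuracy alone can mislead when the two groups are unequal in size, which is one reason diagnostic studies usually report sensitivity and specificity separately.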
Keyword: Alzheimers
Link ID: 18420 - Posted: 07.29.2013
If you look directly at the "spinning" ball in this illusion by Arthur Shapiro, it appears to fall straight down. But if you look to one side, the ball appears to curve to one side. The ball appears to swerve because our peripheral vision system cannot process all of its features independently. Instead, our brains combine the downward motion of the ball and its leftward spin to create the impression of a curve. Line-of-sight (or foveal) vision, on the other hand, can extract all the information from the ball's movement, which is why the curve disappears when you view the ball dead-on.
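The explanation above amounts to a vector sum: in the periphery, the brain folds part of the ball's internal spin into its perceived translation, so the straight drop is seen as a curve, while foveal vision keeps the two signals separate. This toy sketch models that idea only; the function name and all numbers are illustrative assumptions, not taken from Shapiro's work.

```python
# Toy model of the illusion: in peripheral viewing, a fraction of the ball's
# leftward spin is misread as sideways translation, bending the perceived path;
# in foveal viewing the spin is discounted and the ball falls straight down.

def perceived_path(steps, fall_speed, spin_drift, peripheral):
    """Return the perceived (x, y) positions of a dropping, spinning ball."""
    x, y = 0.0, 0.0
    path = []
    for _ in range(steps):
        y -= fall_speed          # true downward motion
        if peripheral:
            x -= spin_drift      # spin component leaking into perceived motion
        path.append((x, y))
    return path

foveal = perceived_path(10, fall_speed=1.0, spin_drift=0.3, peripheral=False)
peripheral = perceived_path(10, fall_speed=1.0, spin_drift=0.3, peripheral=True)
```

Under this model the foveal path ends directly below the start, while the peripheral path drifts steadily leftward, matching the curve you see when you look to the side.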
Keyword: Vision
Link ID: 18419 - Posted: 07.29.2013
John Hawks Humans are known for sporting big brains. On average, the size of primates' brains is nearly double what is expected for mammals of the same body size. Across nearly seven million years, the human brain has tripled in size, with most of this growth occurring in the past two million years. Determining brain changes over time is tricky. We have no ancient brains to weigh on a scale. We can, however, measure the inside of ancient skulls, and a few rare fossils have preserved natural casts of the interior of skulls. Both approaches to looking at early skulls give us evidence about the volumes of ancient brains and some details about the relative sizes of major cerebral areas. For the first two thirds of our history, the size of our ancestors' brains was within the range of those of other apes living today. The species of the famous Lucy fossil, Australopithecus afarensis, had skulls with internal volumes of between 400 and 550 milliliters, whereas chimpanzee skulls hold around 400 ml and gorillas between 500 and 700 ml. During this time, Australopithecine brains started to show subtle changes in structure and shape as compared with apes. For instance, the neocortex had begun to expand, reorganizing its functions away from visual processing toward other regions of the brain. The final third of our evolution saw nearly all the action in brain size. Homo habilis, the first of our genus Homo who appeared 1.9 million years ago, saw a modest hop in brain size, including an expansion of a language-connected part of the frontal lobe called Broca's area. The first fossil skulls of Homo erectus, 1.8 million years ago, had brains averaging a bit larger than 600 ml. © 2013 Scientific American
Keyword: Evolution; Intelligence
Link ID: 18418 - Posted: 07.29.2013
By DAVID CRARY, AP National Writer NEW YORK (AP) — There's extensive evidence that pigs are as smart and sociable as dogs. Yet one species is afforded affection and respect; the other faces mass slaughter en route to becoming bacon, ham and pork chops. Seeking to capitalize on that discrepancy, animal-welfare advocates are launching a campaign called The Someone Project that aims to highlight research depicting pigs, chickens, cows and other farm animals as more intelligent and emotionally complex than commonly believed. The hope is that more people might view these animals with the same empathy that they view dogs, cats, elephants, great apes and dolphins. "When you ask people why they eat chickens but not cats, the only thing they can come up with is that they sense cats and dogs are more cognitively sophisticated than the species we eat — and we know this isn't true," said Bruce Friedrich of Farm Sanctuary, the animal-protection and vegan-advocacy organization that is coordinating the new project. "What it boils down to is people don't know farm animals the way they know dogs or cats," Friedrich said. "We're a nation of animal lovers, and yet the animals we encounter most frequently are the animals we pay people to kill so we can eat them." The lead scientist for the project is Lori Marino, a lecturer in psychology at Emory University who has conducted extensive research on the intelligence of whales, dolphins and primates. She plans to review existing scientific literature on farm animals' intelligence, identify areas warranting new research, and prepare reports on her findings that would be circulated worldwide via social media, videos and her personal attendance at scientific conferences. © 2013 Hearst Communications Inc.
Keyword: Intelligence; Evolution
Link ID: 18417 - Posted: 07.29.2013
Kelly Servick Our imperfect memory is inconvenient at the grocery store and downright dangerous on the witness stand. In extreme cases, we may be confident that we remember something that never happened at all. Now, a group of neuroscientists say that they’ve identified a potential mechanism of false memory creation and have planted such a memory in the brain of a mouse. Neuroscientists are only beginning to tackle the phenomenon of false memory, says Susumu Tonegawa of the Massachusetts Institute of Technology in Cambridge, whose team conducted the new research. “It’s there, and it’s well established,” he says, “but the brain mechanisms underlying this false memory are poorly known.” With optogenetics—the precise stimulation of neurons with light—scientists can seek out the physical basis of recall and even tweak it a bit, using mouse models. Like us, mice develop memories based on context. When a mouse returns to an environment where it felt pain in the past, it recalls that experience and freezes with fear. Tonegawa’s team knew that the hippocampus, a part of the brain responsible for establishing memory, plays a role in encoding context-based experiences, and that stimulating cells in a part of the hippocampus called the dentate gyrus can make a mouse recall and react to a mild electric shock that it received in the past. The new goal was to connect that same painful shock memory to a context where the mouse had not actually received a shock. © 2012 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 18416 - Posted: 07.27.2013
Sleepless night, the moon is bright. People sleep less soundly when there's a full moon, researchers discovered when they analyzed data from a past sleep study. If you were tossing and turning and howling at your pillow this week, you’re not necessarily a lunatic, at least in the strictest sense of the word. The recent full moon might be to blame for your poor sleep. In the days close to a full moon, people take longer to doze off, sleep less deeply, and sleep for a shorter time, even if the moon isn’t shining in their window, a new study has found. “A lot of people are going to say, ‘Yeah, I knew this already. I never sleep well during a full moon.’ But this is the first data that really confirms it,” says biologist Christian Cajochen of the University of Basel in Switzerland, lead author of the new work. “There had been numerous studies before, but many were very inconclusive.” Anecdotal evidence has long suggested that people’s sleep patterns, moods, and even aggression are linked to moon cycles. But past studies of potential lunar effects have been tainted by statistical weaknesses, biases, or inconsistent methods, Cajochen says. Between 2000 and 2003, he and his colleagues had collected detailed data on the sleep patterns of 33 healthy volunteers for an unrelated study on the effects of aging on sleep. Using electroencephalograms (EEG) that measure brain activity, they recorded how deep and how long each participant’s nightly sleep was in a controlled, laboratory setting. Years after the initial experiment, the scientists were drinking in a pub—during a full moon—and came up with the idea of going back to the data to test for correlations with moon cycles. © 2012 American Association for the Advancement of Science.
Keyword: Sleep; Biological Rhythms
Link ID: 18415 - Posted: 07.27.2013
Brain cells talk to each other in a variety of tones. Sometimes they speak loudly but other times struggle to be heard. For many years scientists have asked why and how brain cells change tones so frequently. Today National Institutes of Health researchers showed that brief bursts of chemical energy coming from rapidly moving power plants, called mitochondria, may tune brain cell communication. “We are very excited about the findings,” said Zu-Hang Sheng, Ph.D., a senior principal investigator and the chief of the Synaptic Functions Section at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS). “We may have answered a long-standing, fundamental question about how brain cells communicate with each other in a variety of voice tones.” The network of nerve cells throughout the body typically controls thoughts, movements and senses by sending thousands of neurotransmitters, or brain chemicals, at communication points made between the cells called synapses. Neurotransmitters are sent from tiny protrusions found on nerve cells, called presynaptic boutons. Boutons are aligned, like beads on a string, on long, thin structures called axons. They help control the strength of the signals sent by regulating the amount and manner that nerve cells release transmitters. Mitochondria are known as the cell’s power plant because they use oxygen to convert many of the chemicals cells use as food into adenosine triphosphate (ATP), the main energy that powers cells. This energy is essential for nerve cell survival and communication. Previous studies showed that mitochondria can rapidly move along axons, dancing from one bouton to another.
Keyword: Miscellaneous
Link ID: 18414 - Posted: 07.27.2013
By Meghan Rosen In a spacious hotel room not far from the beach in La Jolla, Calif., Kelsey Heenan gripped her fiancé’s hand. Heenan, a 20-year-old anorexic woman, couldn’t believe what she was hearing. Walter Kaye, director of the eating disorders program at the University of California, San Diego, was telling a handful of rapt patients and their family members what the latest brain imaging research suggested about their disorder. It’s not your fault, he told them. Heenan had always assumed that she was to blame for her illness. Kaye’s data told a different story. He handed out a pile of black-and-white brain scans — some showed the brains of healthy people, others were from people with anorexia nervosa. The scans didn’t look the same. “People were shocked,” Heenan says. But above all, she remembers, the group seemed to sigh in relief, breathing out years of buried guilt about the disorder. “It’s something in the way I was wired — it’s something I didn’t choose to do,” Heenan says. “It was pretty freeing to know that there could be something else going on.” Years of psychological and behavioral research have helped scientists better understand some signs and triggers of anorexia. But that knowledge hasn’t straightened out the disorder’s tangled roots, or pointed scientists to a therapy that works for everyone. “Anorexia has a high death rate, it’s expensive to treat and people are chronically ill,” says Kaye. © Society for Science & the Public 2000 - 2013
Keyword: Anorexia & Bulimia
Link ID: 18413 - Posted: 07.27.2013
Heidi Ledford A procedure increasingly used to treat obesity by reducing the size of the stomach also reprogrammes the intestines, making them burn sugar faster, a study in diabetic and obese rats has shown. If the results, published today in Science, hold true in humans, they could explain how gastric bypass surgery improves sugar control in people with diabetes. They could also lead to less invasive ways to produce the same effects. “This opens up the idea that we could take the most effective therapy we have for obesity and diabetes and come up with ways to do it without a scalpel,” says Randy Seeley, an obesity researcher at the University of Cincinnati in Ohio, who was not involved in the work. As rates of obesity and diabetes skyrocket in many countries, physicians and patients are turning to operations that reconfigure the digestive tract so that only a small part of the stomach is used. Such procedures are intended to allow people to feel full after smaller meals, reducing the drive to consume extra calories. But clinical trials in recent years have shown that they can also reduce blood sugar levels in diabetics, even before weight is lost. “We have to think about this surgery differently,” says Seeley. “It’s not just changing the plumbing, it’s altering how the gut handles glucose.” © 2013 Nature Publishing Group
Keyword: Obesity
Link ID: 18412 - Posted: 07.27.2013