Most Recent Links



Links 11281 - 11300 of 28882

By Jeffrey A. Lieberman Like many psychiatrists, I have been amazed by the debates surrounding the DSM-5, the first major revision of the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders in nearly twenty years, which was just released. Never before has a thick medical text of diagnostic nomenclature been the subject of so much attention. Although I was heartened to see more and more people discussing the real-world issues and challenges—for patients, families, clinicians and caregivers—within mental health care, for which the book offers an up-to-the-minute diagnostic GPS, I was also alarmed at the harsh criticism of the field of psychiatry and the APA. Consequently, I believe that as you read and watch this increased coverage, it’s important to understand the difference between thoughtful, legitimate debate and the inevitable outcry from a small group of critics—made louder by social media and support from dubious sources—who have relentlessly sought to undermine the credibility of psychiatric medicine and question the validity of mental illness. DSM-5 has ignited a broad dialogue on mental illness and opened up a conversation about the state of psychiatry and mental healthcare in this country. Critiques have ranged in focus from the inclusion of specific disorders in DSM-5 to the concern over a lack of biological measures that define them. Some have even questioned the entire diagnostic system, urging us to look with an eye focused on the impact on patients. These are the kinds of debates that I hope will continue long after DSM-5’s shiny cover becomes worn and wrinkled. Such meaningful discourse only fuels our ability to produce a manual that best serves those touched by mental illness. But there is another type of critique that does not contribute to this goal: it comes from groups who are actually proud to identify themselves as “anti-psychiatry.” © 2013 Scientific American

Keyword: Schizophrenia; Depression
Link ID: 18175 - Posted: 05.21.2013

By Tara Haelle Identification and treatment issues surrounding attention deficit hyperactivity disorder (ADHD) are challenging enough. Now research is shedding light on long-term outcomes for people with ADHD. A recent study in Pediatrics reports that men who had ADHD in childhood are twice as likely to be obese in middle age, even if they no longer exhibit symptoms of ADHD. ADHD is a mental disorder characterized by hyperactivity, impulsivity, inattention and inability to focus. It affects approximately 6.8 percent of U.S. children ages 3 to 17 in any given year, according to a recent report by the CDC. Medications used to treat ADHD, such as Ritalin (methylphenidate) or Adderall (dextroamphetamine and amphetamine), are stimulants that can suppress appetite. However, a couple of recent retrospective studies have pointed to a possible increased risk for obesity among adults diagnosed with ADHD as children. The new 33-year prospective study started with 207 healthy middle-class white boys from New York City, between 6 and 12 years old, who had been diagnosed with ADHD. When the cohort reached an average age of 18, another 178 healthy boys without ADHD were recruited for comparison. At the most recent follow-up, when the participants were an average age of 41, a total of 222 men remained in the study. A troubling pattern emerged: a comparison of the men’s self-reported height and weight revealed that twice as many men with childhood ADHD were obese as those without childhood ADHD. The average body mass index (BMI) of the men with childhood ADHD was 30.1, and 41.4 percent were obese, whereas those without the condition as kids reported an average BMI of 27.6 and an obesity rate of 21.6 percent. The association held even after the researchers controlled for socioeconomic status, depression, anxiety and substance abuse disorders. © 2013 Scientific American
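A note on the numbers, since the article uses BMI without defining it: body mass index is weight divided by the square of height, and a BMI of 30 or above is the standard clinical cutoff for obesity. This is general background rather than anything stated in the study itself.

\[ \mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad \mathrm{BMI} \ge 30 \ \text{is classified as obese} \]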

Keyword: ADHD; Obesity
Link ID: 18174 - Posted: 05.20.2013

By JANE E. BRODY Sugar, and especially the high-fructose corn syrup that sweetens many processed foods and nearly all soft drinks, has been justly demonized for adding nutritionally empty calories to our diet and causing metabolic disruptions linked to a variety of diseases. But a closer look at what and how Americans eat suggests that simply focusing on sugar will do little to quell the rising epidemic of obesity. This is a multifaceted problem with deep historical roots, and we are doing too little about many of its causes. More than a third of American adults and nearly one child in five are now obese, according to the Centers for Disease Control and Prevention. Our failure to curtail this epidemic is certain to exact unprecedented tolls on health and increase the cost of medical care. Effective measures to achieve a turnaround require a clearer understanding of the forces that created the problem and continue to perpetuate it. The increase in obesity began nearly half a century ago with a rise in calories consumed daily and a decline in meals prepared and eaten at home. According to the Department of Agriculture, in 1970 the food supply provided 2,086 calories per person per day, on average. By 2010, this amount had risen to 2,534 calories, an increase of more than 20 percent. Consuming an extra 448 calories each day could add nearly 50 pounds to the average adult in a year. Sugar, it turns out, is a minor player in the rise. More than half of the added calories — 242 a day — have come from fats and oils, and another 167 calories from flour and cereal. Sugar accounts for only 35 of the added daily calories. Copyright 2013 The New York Times Company
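As a rough check on the article's arithmetic, using the common rule of thumb that roughly 3,500 excess calories correspond to one pound of body fat (an assumption on my part; the article does not state the conversion it used):

\[ \frac{448 \ \text{kcal/day} \times 365 \ \text{days}}{3{,}500 \ \text{kcal/lb}} \approx 47 \ \text{lb per year} \]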

Keyword: Obesity; Chemical Senses (Smell & Taste)
Link ID: 18173 - Posted: 05.20.2013

By Laura Beil When chemists Richard Marshall and Earl Kooi started fiddling with cornstarch, the powder made from the dense insides of corn kernels, their intention was to turn glucose, which is easily produced from the starch, into fructose, which is sweeter. The idea wasn’t that far-fetched. The two sugar molecules are cousins, both made from the same atomic parts slightly rearranged. The duo’s experiment, which took place at the Corn Products Refining Company in Argo, Ill., was a success. Marshall and Kooi discovered that the bacterium Aeromonas hydrophila produced an enzyme that could reconfigure the components of glucose from corn like so many Lego blocks. It was the first leap forward for a food industry dream: a mass-produced glucose-fructose-blend sweetener that would free commercial food manufacturers from the historical volatility of cane sugar crops. The scientists announced their triumph in a short report in Science in 1957. There the discovery sat in quiet obscurity for almost two decades, until a worldwide spike in sugar prices sent manufacturers scrambling. By the end of the 1980s, high fructose corn syrup had replaced cane sugar in soft drinks, and it soon became popular among makers of baked goods, dairy products, sauces and other foods. Few consumers seemed to care until 2004, when Barry Popkin, a nutrition scientist at the University of North Carolina at Chapel Hill, along with George Bray, at the Pennington Biomedical Research Center in Baton Rouge, La., published a commentary in the American Journal of Clinical Nutrition pointing out that the country’s obesity crisis appeared to rise in tandem with the embrace of high fructose corn syrup by food producers. That shift began in the early 1970s — just about the time Japanese researchers, who had noted Marshall and Kooi’s experiment with keen interest, overcame the technical hurdles of industrial production. © Society for Science & the Public 2000 - 2013

Keyword: Obesity; Chemical Senses (Smell & Taste)
Link ID: 18172 - Posted: 05.20.2013

By ALLISON HERSH LONDON — I’m in line at the supermarket holding three items close to my chest. But I might as well be juggling my Kleenex box, toothpaste tube and an orange. Because — as you’d surely notice if you were behind me in line — I’m bent forward at a sharp angle, which makes holding things difficult. I know you don’t want to stare, but you do. Maybe you think you’re being considerate when you say, apropos of nothing, “You look like you’re in pain.” Well, thanks, I am — but I’ll resist replying the way I want (“You look like you’re having a bad hair day”). I’m sorry. I know you mean well. Anyway, it’s my turn at the register, which means I’m closer to being at home, where I can lie down and wait for the spasms to subside. Besides, if I told you what my issue was, you would probably shrug and reply that you’d never heard of it. There aren’t any public service announcements about it or telethons. No Angelina Jolies to bravely inform the world. Just people like me, in supermarket checkout lines. And this, I realize, is at the core of a problem that extends beyond me and my condition and that affects the way all of us respond to illnesses, some of which are the subject of public attention — and resources — and some of which are not. I have dystonia, a neurological disorder. Some years ago, for reasons no one knows, the muscles in my back and neck began to spasm involuntarily; the spasms multiply quickly, fatigue the muscles and force the body into repetitive movements and awkward postures like mine. There is no cure, only treatment options like deep brain stimulation, which requires a surgery I underwent last year as a last resort. © 2013 The New York Times Company

Keyword: Movement Disorders; Pain & Touch
Link ID: 18171 - Posted: 05.20.2013

By Bruce Bower In its idealized form, science resembles a championship boxing match. Theories square off, each vying for the gold belt engraved with “Truth.” Under the stern eyes of a host of referees, one theory triumphs by best explaining available evidence — at least until the next bout. But in the real world, science sometimes works more like a fashion show. Researchers clothe plausible explanations of experimental findings in glittery statistical suits and gowns. These gussied-up hypotheses charm journal editors and attract media coverage with carefully orchestrated runway struts, never having to battle competitors. Then there’s psychology. Even more than other social scientists — and certainly more than physical scientists — psychologists tend to overlook or dismiss hypotheses that might topple their own, says Klaus Fiedler of the University of Heidelberg in Germany. They explain experimental findings with ambiguous terms that make no testable predictions at all; they build careers on theories that have never bested a competitor in a fair scientific fight. In many cases, no one knows or bothers to check how much common ground one theory shares with others that address the same topic. Problems like these, Fiedler and his colleagues contended last November in Perspectives on Psychological Science, afflict sets of related theories about such psychological phenomena as memory and decision making. In the end, that affects how well these phenomena are understood. © Society for Science & the Public 2000 - 2013

Keyword: Attention
Link ID: 18170 - Posted: 05.20.2013

By SUSANA MARTINEZ-CONDE Your eyes are the sharks of the human body: they never stop moving. In the past minute alone, your eyes made as many as 240 quick movements called “saccades” (French for “jolts”). In your waking hours today, you will very likely make some 200,000 of them, give or take a few thousand. When you sleep, your eyes keep moving — though in different ways and at varying speeds, depending on the stage of sleep. A portion of our eye movements we make consciously, or are at least aware of on some level: when we follow a moving bird or plane across the sky with our gaze, for instance. But most of these tiny back-and-forths and ups-and-downs — split-second moves that would make the Flying Karamazov Brothers weep with jealousy — are unconscious and nearly imperceptible to us. Our brain suppresses the feeling of our eye jumps, to avoid the sensation that the world is constantly quaking. Even when we think our gazes are petrified, in fact, we are still making eye motions, including tiny saccades called “microsaccades” — between 60 and 120 of them per minute. Just as we don’t notice most of our breathing, we are almost wholly unaware of this frenetic, nonstop ocular activity. Without it, though, we couldn’t see a thing. Humans are hardly unique in this way. Every known visual system depends on movement: we see things either because they move or because our eyes do. Some of the earliest clues to this came more than two centuries ago. Erasmus Darwin, a grandfather of Charles Darwin, observed in 1794 that staring at a small piece of scarlet silk on white paper for a long time — thereby minimizing (though not stopping) his eye movements — made it grow fainter in color, until it seemed to vanish. © 2013 The New York Times Company

Keyword: Vision
Link ID: 18169 - Posted: 05.20.2013

by Emily Underwood If you are one of the 20% of healthy adults who struggle with basic arithmetic, simple tasks like splitting the dinner bill can be excruciating. Now, a new study suggests that a gentle, painless electrical current applied to the brain can boost math performance for up to 6 months. Researchers don't fully understand how it works, however, and there could be side effects. The idea of using electrical current to alter brain activity is nothing new—electroshock therapy, which induces seizures for therapeutic effect, is probably the best known and most dramatic example. In recent years, however, a slew of studies has shown that much milder electrical stimulation applied to targeted regions of the brain can dramatically accelerate learning in a wide range of tasks, from marksmanship to speech rehabilitation after stroke. In 2010, cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford in the United Kingdom showed that, when combined with training, electrical brain stimulation can make people better at very basic numerical tasks, such as judging which of two quantities is larger. However, it wasn't clear how those basic numerical skills would translate to real-world math ability. © 2010 American Association for the Advancement of Science

Keyword: Learning & Memory
Link ID: 18168 - Posted: 05.18.2013

by Douglas Heaven Got a memory like a fish? The first study to visualise live memory retrieval in the whole brain has not only debunked the "three-second memory" myth but also shed light on the brain processes involved in forming long-term memories. Even the haziest recollections have a physical basis in the brain, but the mechanisms behind the formation and retrieval of memories are not well understood. By working with zebrafish, which are small and partially transparent, Hitoshi Okamoto at the RIKEN Brain Science Institute in Wako, Japan, and colleagues were able to study the whole brain at once. This allowed them to observe the roles played by different brain regions as a memory was retrieved. The team used fish with a genetically engineered fluorescent protein in the brain that glows less brightly when calcium levels increase – which occurs when neurons fire. They were able to study the activity of these proteins under a microscope. First, the team trained a group of fish to respond to a visual cue to avoid a small electric shock. Each fish was placed in a tank containing two compartments. When a red light shone in one compartment the fish had to swim to the other to avoid the shock. The researchers then selected the fish that had learned to perform the avoidance task successfully at least 80 per cent of the time and looked at the activity in their brains while a red light was switched on and off. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 18167 - Posted: 05.18.2013

By CARL ZIMMER Imagine a wolf catching a Frisbee a dozen times in a row, or leading police officers to a stash of cocaine, or just sleeping peacefully next to you on your couch. It’s a stretch, to say the least. Dogs may have evolved from wolves, but the minds of the two canines are profoundly different. Dog brains, as I wrote last month in The New York Times, have become exquisitely tuned to our own. Scientists are now zeroing in on some of the genes that were crucial to the rewiring of dog brains. Their results are fascinating, and not only because they can help us understand how dogs turned into man’s best friend. They may also teach us something about the evolution of our own brains: Some of the genes that evolved in dogs are the same ones that evolved in us. To trace the change in dog brains, scientists have first had to work out how dog breeds are related to one another, and how they’re all related to wolves. Ya-Ping Zhang, a geneticist at the Chinese Academy of Sciences, has led an international network of scientists who have compared pieces of DNA from different canines. They’ve come to the conclusion that wolves started their transformation into dogs in East Asia. Those early dogs then spread to other parts of the world. Many of the breeds we’re most familiar with, like German shepherds and golden retrievers, emerged only in the past few centuries. © 2013 The New York Times Company

Keyword: Aggression; Genes & Behavior
Link ID: 18166 - Posted: 05.18.2013

by Sara Reardon As suicide rates climb steeply in the US, a growing number of psychiatrists are arguing that suicidal behaviour should be considered a disease in its own right, rather than as a behaviour resulting from a mood disorder. They base their argument on mounting evidence showing that the brains of people who have committed suicide have striking similarities, quite distinct from what is seen in the brains of people who have similar mood disorders but who died of natural causes. Suicide also tends to be more common in some families, suggesting there may be genetic and other biological factors in play. What's more, most people with mood disorders never attempt to kill themselves, and about 10 per cent of suicides have no history of mental disease. The idea of classifying suicidal tendencies as a disease is being taken seriously. The team behind the fifth edition of the Diagnostic and Statistical Manual (DSM-5) – the newest version of psychiatry's "bible", released at the American Psychiatric Association's meeting in San Francisco this week – considered a proposal to have "suicide behaviour disorder" listed as a distinct diagnosis. It was ultimately put on probation: placed on a list of topics deemed to require further research for possible inclusion in future DSM revisions. Another argument for linking suicidal people together under a single diagnosis is that it could spur research into the neurological and genetic factors they have in common. This could allow psychiatrists to better predict someone's suicide risk, and even lead to treatments that stop suicidal feelings. © Copyright Reed Business Information Ltd.

Keyword: Depression; Aggression
Link ID: 18165 - Posted: 05.18.2013

By Tina Hesman Saey COLD SPRING HARBOR, N.Y. – Taming foxes changes not only the animals’ behavior but also their brain chemistry, a new study shows. The finding could shed light on how the foxes’ genetic cousins, wolves, morphed into man’s best friend. Lenore Pipes of Cornell University presented the results May 10 at the Biology of Genomes conference. The foxes she worked with come from a long line started in 1959 when a Russian scientist named Dmitry Belyaev attempted to recreate dog domestication, but using foxes instead of wolves. He bred silver foxes (Vulpes vulpes), which are actually a type of red fox with white-tipped black fur. Belyaev and his colleagues selected the least aggressive animals they could find at local fox farms and bred them. Each generation, the scientists picked the tamest animals to mate, creating ever friendlier foxes. Now, more than 50 years later, the foxes act like dogs, wagging their tails, jumping with excitement and leaping into the arms of caregivers for caresses. At the same time, the scientists also bred the most aggressive foxes on the farms. The descendants of those foxes crouch, flatten their ears, growl, bare their teeth and lunge at people who approach their cages. The foxes’ tame and aggressive behaviors are rooted in genetics, but scientists have not found DNA changes that account for the differences. Rather than search for changes in genes themselves, Pipes and her colleagues took an indirect approach, looking for differences in the activity of genes in the foxes’ brains. © Society for Science & the Public 2000 - 2013

Keyword: Genes & Behavior
Link ID: 18164 - Posted: 05.16.2013

by Michael Balter From the human perspective, few events in evolution were more momentous than the split among primates that led to apes (large, tailless primates such as today's gorillas, chimpanzees, and humans) and Old World monkeys (which today include baboons and macaques). DNA studies of living primates have estimated that the rift took place between 25 million and 30 million years ago, but the earliest known fossils of both groups date no earlier than 20 million years ago. Now, a team working in Tanzania has found teeth and partial jaws from what it thinks are 25-million-year-old ancestors of both groups. If the interpretations hold up, the finds would reconcile the molecular and fossil evidence and possibly provide insights into what led to the split in the first place. Researchers have long been frustrated by a paucity of fossils from this key period in evolution, which sits at the borderline between two major geological epochs: the Miocene (about 23 million to 5 million years ago) and the Oligocene (about 34 million to 23 million years ago). The earliest known fossils of early apes and Old World monkeys date from the early Miocene and have been found in just a handful of sites in Kenya, Uganda, and North Africa. Meanwhile, molecular studies of existing primates consistently suggest that these two groups arose during the Oligocene, leading scientists to wonder whether the molecular dates are wrong or if paleontologists have been looking in the wrong places. For more than a decade, researchers from the United States and Tanzania have been combing Tanzania's Rukwa Rift Basin, searching for fossils of all kinds. During the 2011 and 2012 seasons, a team led by Nancy Stevens, a vertebrate paleontologist at Ohio University in Athens, discovered fossils that it identified as belonging to two previously unknown species of primates: one, an apparent ape ancestor the team has named Rukwapithecus fleaglei; the other, a claimed Old World monkey ancestor dubbed Nsungwepithecus gunnelli. © 2010 American Association for the Advancement of Science.

Keyword: Evolution
Link ID: 18163 - Posted: 05.16.2013

By David Brown A team of researchers said Wednesday that it had produced embryonic stem cells — a possible source of disease-fighting spare parts — from a cloned human embryo. Scientists at the Oregon Health and Science University accomplished in humans what has been done over the past 15 years in sheep, mice, cattle and several other species. The achievement is likely to, at least temporarily, reawaken worries about “reproductive cloning” — the production of one-parent duplicate humans. But few experts think that production of stem cells through cloning is likely to be medically useful soon, or possibly ever. “An outstanding issue of whether it would work in humans has been resolved,” said Rudolf Jaenisch, a biologist at MIT’s Whitehead Institute in Cambridge, Mass., who added that he thinks the feat “has no clinical relevance.” “I think part of the significance is technical and part of the significance is historical,” said John Gearhart, head of the Institute for Regenerative Medicine at the University of Pennsylvania. “Many labs attempted it, and no one had ever been able to achieve it.” A far less controversial way to get stem cells is now available. It involves reprogramming mature cells (often ones taken from the skin) so that they return to what amounts to a second childhood from which they can grow into a new and different adulthood. Learning how to make and manipulate those “induced pluripotent stem” (IPS) cells is one of biology’s hottest fields. © 1996-2013 The Washington Post

Keyword: Stem Cells; Parkinsons
Link ID: 18162 - Posted: 05.16.2013

By Jason G. Goldman There is a rich tradition in psychology and neuroscience of using animals as models for understanding humans. Humans, after all, are enormously complicated creatures to begin with, even from a strictly biological perspective. Tacking on the messiness that comes with culture makes the study of the human mind tricky, at best. So, just as biomedical scientists have relied upon the humble mouse, psychological and cognitive scientists too have turned to our evolutionary cousins in the animal kingdom as a means of better understanding ourselves. In her new book Animal Wise, journalist Virginia Morell recounts a conversation with one researcher who pointed out that decades of research were built upon “rats, pigeons, and college sophomores, preferably male.” The college undergrads stood in for all of humanity, the rats served as representatives of all other mammals, and pigeons served as a model for the rest of the animal kingdom. The silly part isn’t that non-human animals can be used effectively as a means of understanding more about our own species. The idea is simple: understand how a simple system works, and you can make careful inferences about the way that complex systems work. That is (or should be) obvious. In his interview with CNN today, memory research pioneer and Nobel Prize winner Eric Kandel said as much: “Rather than studying the most complex form of memory in a very complicated animal, we had to take the most simple form — an implicit form of memory — in a very simple animal.” © 2013 Scientific American

Keyword: Learning & Memory; Evolution
Link ID: 18161 - Posted: 05.16.2013

by Meera Senthilingam Malaria parasites give mosquitoes a keener sense of smell, it seems. A small-scale study in the lab finds that mosquitoes infected by the parasite are three times as likely as uninfected mosquitoes to respond to human odours. If the same results are seen in malaria-carrying mosquitoes in the wild, it could lead to new ways to combat the disease. Female anopheles mosquitoes are attracted to the chemicals in human odours, which help them find the source of blood they need to grow their eggs. When these mosquitoes carry Plasmodium falciparum – the most lethal form of malaria parasite – the likelihood that they will target humans rises. "We knew already that mosquitoes bite more often when they're infected. They probe the skin more frequently," says James Logan from the London School of Hygiene and Tropical Medicine. To quantify the effect – and try to work out its cause – Logan and his colleagues infected some lab-grown Anopheles gambiae mosquitoes with Plasmodium parasites, while leaving others uninfected. They then tested how both groups were attracted to human smells. Mosquitoes are particularly attracted to foot odours, so Logan's team used nylon stockings containing the volatile chemicals produced by our feet. Over a period of three minutes, Plasmodium-infected mosquitoes landed and attempted to bite the stockings around 15 times on average. By contrast, the uninfected mosquitoes attempted to bite only around five times on average during that time. © Copyright Reed Business Information Ltd.

Keyword: Chemical Senses (Smell & Taste)
Link ID: 18160 - Posted: 05.16.2013

Ed Yong The US adolescents who signed up for the Study of Mathematically Precocious Youth (SMPY) in the 1970s were the smartest of the smart, with mathematical and verbal-reasoning skills within the top 1% of the population. Now, researchers at BGI (formerly the Beijing Genomics Institute) in Shenzhen, China, the largest gene-sequencing facility in the world, are searching for the quirks of DNA that may contribute to such gifts. Plunging into an area that is littered with failures and riven with controversy, the researchers are scouring the genomes of 1,600 of these high-fliers in an ambitious project to find the first common genetic variants associated with human intelligence. The project, which was launched in August 2012 and is slated to begin data analysis in the next few months, has spawned wild accusations of eugenics plots, as well as more measured objections by social scientists who view such research as a distraction from pressing societal issues. Some geneticists, however, take issue with the study for a different reason. They say that it is highly unlikely to find anything of interest — because the sample size is too small and intelligence is too complex. Earlier large studies with the same goal have failed. But scientists from BGI’s Cognitive Genomics group hope that their super-smart sample will give them an edge, because it should be enriched with bits of DNA that confer effects on intelligence. “An exceptional person gets you an order of magnitude more statistical power than if you took random people from the population — I’d say we have a fighting chance,” says Stephen Hsu, a theoretical physicist from Michigan State University in East Lansing, who acts as a scientific adviser to BGI and is one of the project’s leaders. © 2013 Nature Publishing Group,

Keyword: Genes & Behavior; Intelligence
Link ID: 18159 - Posted: 05.15.2013

by Douglas Heaven Genes in cells throughout the body switch on and off over the course of the day in a coordinated way. Or at least they should. In people with clinical depression, genes in their brain tissues appear to be significantly out of sync – a finding that could lead to new treatments for the condition. We know from previous studies that genes in cells elsewhere, such as the skin, follow a 24-hour cycle of activity. But identifying patterns of genetic activity in a living brain isn't easy to do. "We always assumed we would have a clock [in our brain]," says Huda Akil at the University of Michigan in Ann Arbor. "But it had never been shown before." Akil and her colleagues examined the brains of 55 people with a known time of death, looking at around 12,000 genes in tissues from six brain regions. By matching the time of death with molecular signs of genetic activity – whether each gene was actively expressing itself or not – the team identified hundreds of genes that follow a daily cycle. Akil says it was important to look at the brains of individuals who had died suddenly – through a heart attack or car accident, for example. Slower deaths can cause dramatic changes in the brain that would have obscured what they were looking for, but sudden death freezes the genetic activity. "We can capture an instant," she says. © Copyright Reed Business Information Ltd.
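A standard way to test for this kind of 24-hour rhythmicity, when each sample carries a time of day (here, the donor's time of death), is cosinor regression: fit a 24-hour cosine to each gene's expression and examine the fitted amplitude and peak time. The sketch below is a minimal illustration of that idea only; the function name and the simulated data are hypothetical, and the article does not say this was the study's actual statistical method.

import numpy as np

def fit_cosinor(times_h, expression, period_h=24.0):
    """Least-squares fit of expression ~ mesor + a*cos(w*t) + b*sin(w*t)."""
    w = 2.0 * np.pi / period_h
    X = np.column_stack([np.ones_like(times_h),
                         np.cos(w * times_h),
                         np.sin(w * times_h)])
    coef, *_ = np.linalg.lstsq(X, expression, rcond=None)
    mesor, a, b = coef
    amplitude = np.hypot(a, b)                      # size of the daily swing
    peak_hour = (np.arctan2(b, a) / w) % period_h   # hour of peak expression
    return mesor, amplitude, peak_hour

# Hypothetical example: 55 donors, one gene peaking around 6 a.m. plus noise.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 24.0, size=55)   # time of death, in hours after midnight
expr = 10 + 3 * np.cos(2 * np.pi * (t - 6) / 24) + rng.normal(0, 0.5, size=55)
mesor, amplitude, peak_hour = fit_cosinor(t, expr)
print(f"mesor={mesor:.2f}, amplitude={amplitude:.2f}, peak_hour={peak_hour:.1f}")

Genes with a negligible fitted amplitude would be treated as non-rhythmic; a large amplitude relative to the residual noise suggests a daily cycle.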

Keyword: Depression; Sleep
Link ID: 18158 - Posted: 05.14.2013

By Samyukta Mullangi A recent article in NYTimes [1] declared that the rising rate of suicides among our baby boomer generation now made suicides, by raw numbers alone, a bigger killer than motor vehicle accidents! Researchers quoted within the article pointed to complex reasons like the economic downturn over the past decade, the widespread availability of opioid drugs like oxycodone, and changes in marriage, social isolation and family roles. Then I scrolled down, as I always do, to peruse some of the readers’ comments, and that’s when I paused. I suppose in hindsight that I had expected readers to exclaim at the shocking statistics (suicide rates now stand at 27.3 per 100,000 for middle-aged men, 8.1 per 100,000 for women), or lament over personal stories of relatives or friends who took their own lives. While I certainly saw a few such comments, I was amazed to discover the number of readers who were sympathetic to the idea of suicide. “Molly” wrote, “Why is suicide usually looked upon as a desperate and forbidden act? Can’t we accept that in addition to poverty, loneliness, alienation, ill health, life in world [sic] that is sometimes personally pointless means that death is a relief? I believe the right to die, in a time and place (and wishfully peacefully without violence) is a basic human right.” This post was ‘recommended’ by 351 other readers at the time this essay was written. © 2013 Scientific American

Keyword: Depression; Sleep
Link ID: 18157 - Posted: 05.14.2013

Brian Owens The gut is home to innumerable different bacteria — a complex ecosystem that has an active role in a variety of bodily functions. In a study published this week in Proceedings of the National Academy of Sciences, a team of researchers finds that in mice, just one of those bacterial species plays a major part in controlling obesity and metabolic disorders such as type 2 diabetes. The bacterium, Akkermansia muciniphila, digests mucus and makes up 3–5% of the microbes in a healthy mammalian gut. But the intestines of obese humans and mice, and those with type 2 diabetes, have much lower levels. A team led by Patrice Cani, who studies the interaction between gut bacteria and metabolism at the Catholic University of Louvain in Belgium, decided to investigate the link. Mice that were fed a high-fat diet, the researchers found, had 100 times less A. muciniphila in their guts than mice fed normal diets. The researchers were able to restore normal levels of the bacterium by feeding the mice live A. muciniphila, as well as 'prebiotic' foods that encourage the growth of gut microbes. The effects of this treatment were dramatic. Compared with untreated animals, the mice lost weight and had a better ratio of fat to body mass, as well as reduced insulin resistance and a thicker layer of intestinal mucus. They also showed improvements in a host of other indicators related to obesity and metabolic disorders. “We found one specific common factor between all the different parameters that we have been investigating over the past ten years,” says Cani. © 2013 Nature Publishing Group

Keyword: Obesity
Link ID: 18156 - Posted: 05.14.2013