Most Recent Links




By Chris Wodskou, CBC News For the past 25 years, people suffering from depression have been treated with antidepressant drugs like Zoloft, Prozac and Paxil — three of the world’s best-selling selective serotonin reuptake inhibitors, or SSRIs. But people are questioning whether these drugs are the appropriate treatment for depression, and whether they could even be causing harm. The drugs are designed to address a chemical imbalance in the brain and thereby relieve the symptoms of depression. In this case, it’s a shortage of serotonin that antidepressants work to correct. In fact, there are pharmaceutical treatments targeting chemical imbalances for just about every form of mental illness, from schizophrenia to ADHD, and a raft of anxiety disorders. Hundreds of millions of prescriptions are written for antipsychotic, antidepressant and anti-anxiety medications every year in the United States alone, producing billions of dollars in revenue for pharmaceutical companies. But what if the very premise behind these drugs is flawed? What if mental illnesses like depression aren’t really caused by chemical imbalances, and millions of the people who are prescribed those drugs derive no benefit from them? And what if those drugs could actually make their mental illness worse and more intractable over the long term? In his 2010 book Anatomy of an Epidemic: Magic Bullets, Psychiatric Drugs and the Astonishing Rise of Mental Illness in America, investigative journalist Robert Whitaker argued that psychiatric drugs are a largely ineffective way of treating mental illness. Whitaker maintains that the foundation of modern psychiatry, the chemical imbalance model, is scientifically unproven. © CBC 2014

Keyword: Depression
Link ID: 19712 - Posted: 06.09.2014

By JOHN COATES Six years after the financial meltdown there is once again talk about market bubbles. Are stocks succumbing to exuberance? Is real estate? We thought we had exorcised these demons. It is therefore with something close to despair that we ask: What is it about risk taking that so eludes our understanding, and our control? Part of the problem is that we tend to view financial risk taking as a purely intellectual activity. But this view is incomplete. Risk is more than an intellectual puzzle — it is a profoundly physical experience, and it involves your body. Risk by its very nature threatens to hurt you, so when confronted by it your body and brain, under the influence of the stress response, unite as a single functioning unit. This occurs in athletes and soldiers, and it occurs as well in traders and people investing from home. The state of your body predicts your appetite for financial risk just as it predicts an athlete’s performance. If we understand how a person’s body influences risk taking, we can learn how to better manage risk takers. We can also recognize that mistakes governments have made have contributed to excessive risk taking. Consider the most important risk manager of them all — the Federal Reserve. Over the past 20 years, the Fed has pioneered a new technique of influencing Wall Street. Where before the Fed shrouded its activities in secrecy, it now informs the street in as clear terms as possible of what it intends to do with short-term interest rates, and when. Janet L. Yellen, the chairwoman of the Fed, declared this new transparency, called forward guidance, a revolution; Ben S. Bernanke, her predecessor, claimed it reduced uncertainty and calmed the markets. But does it really calm the markets? Or has eliminating uncertainty in policy spread complacency among the financial community and actually helped inflate market bubbles? We get a fascinating answer to these questions if we turn from economics and look into the biology of risk taking. © 2014 The New York Times Company

Keyword: Attention; Stress
Link ID: 19711 - Posted: 06.09.2014

By Jonathan Webb Science reporter, BBC News A new theory suggests that our male ancestors evolved beefy facial features as a defence against fist fights. The bones most commonly broken in human punch-ups also gained the most strength in early "hominin" evolution. They are also the bones that show most divergence between males and females. The paper, in the journal Biological Reviews, argues that the reinforcements evolved amid fighting over females and resources, suggesting that violence drove key evolutionary changes. For many years, this extra strength was seen as an adaptation to a tough diet including nuts, seeds and grasses. But more recent findings, examining the wear pattern and carbon isotopes in australopith teeth, have cast some doubt on this "feeding hypothesis". "In fact, [the australopith] boisei, the 'nutcracker man', was probably eating fruit," said Prof David Carrier, the new theory's lead author and an evolutionary biologist at the University of Utah. Instead of diet, Prof Carrier and his co-author, physician Dr Michael Morgan, propose that violent competition demanded the development of these facial fortifications: what they call the "protective buttressing hypothesis". In support of their proposal, Carrier and Morgan offer data from modern humans fighting. Several studies from hospital emergency wards, including one from the Bristol Royal Infirmary, show that faces are particularly vulnerable to violent injuries. BBC © 2014

Keyword: Aggression; Evolution
Link ID: 19710 - Posted: 06.09.2014

Jennifer Couzin-Frankel What if you could trick your body into thinking you were racing on a treadmill—and burning off calories at a rapid clip—while simply walking down the street? Changing our rate of energy expenditure is still far into the future, but work in mice explores how this might happen. Two teams of scientists suggest that activating immune cells in fat can convert the tissue from a type of fat that stores energy to one that burns it, opening up potential new therapies for obesity and diabetes. There are two types of fat in humans: white adipose tissue, which makes up nearly all the fat in adults, and brown adipose tissue, which is found in babies but disappears as they age. Brown fat protects against the cold (it’s also common in animals that hibernate), and researchers have found that mice exposed to cold show a temporary “browning” of some of their white fat. The same effect occurred in preliminary studies of people, where the browning—which creates a tissue known as beige fat—helps generate heat and burn calories. But cold is “the only stimulus we know that can increase beige fat mass or brown fat mass,” says Ajay Chawla, a physiologist at the University of California (UC), San Francisco. He wanted to better understand how cold caused this change in the tissue and whether there was a way to mimic cold and induce browning some other way. A few years ago, Chawla’s group had reported that cold exposure activated macrophages, a type of immune cell, in white adipose tissue. To further untangle what was going on, Chawla, his postdoc Yifu Qiu, and their colleagues used mice that lacked interleukin-4 (IL-4) and interleukin-13, proteins that help activate macrophages. When they exposed these mice to the cold, the animals developed far fewer beige fat cells than did normal animals, suggesting that macrophages were key to browning of white fat. © 2014 American Association for the Advancement of Science

Keyword: Obesity
Link ID: 19709 - Posted: 06.07.2014

Haroon Siddique The forehead and fingertips are the parts of the body most sensitive to pain, according to the first map scientists have created of how the ability to feel pain varies across the human body. It is hoped that the study, in which volunteers had pain inflicted on them without being touched, could help the estimated 10 million people in the UK who suffer from chronic pain by allowing physicians to use lasers to monitor nerve damage across the body. This would offer a quantitative way to monitor the progression or regression of a condition. Lead author Dr Flavia Mancini, of the UCL Institute of Cognitive Neuroscience, said: "Acuity for touch has been known for more than a century, and tested daily in neurology to assess the state of sensory nerves on the body. It is striking that until now nobody had done the same for pain." In the study, a pair of lasers was used to cause a brief sensation of pinprick pain, without any touch, on various parts of the bodies of 26 blindfolded healthy volunteers, in order to measure our ability to identify where it hurts, known as "spatial acuity". Sometimes only one laser would be activated, and sometimes both. At varying distances between the two beams, the participants were asked whether they felt one sting or two, and the researchers recorded the minimum distance at which people could accurately tell. © 2014 Guardian News and Media Limited

Keyword: Pain & Touch
Link ID: 19708 - Posted: 06.07.2014

by Laura Sanders Transplanted cells can flourish for over a decade in the brain of a person with Parkinson’s disease, scientists write in the June 26 Cell Reports. Finding that these cells have staying power may encourage clinicians to pursue stem cell transplants, a still-experimental way to counter the brain deterioration that comes with Parkinson’s. Penelope Hallett of Harvard University and McLean Hospital in Belmont, Mass., and colleagues studied postmortem brain tissue from five people with advanced Parkinson’s. The five had received stem cell transplants between four and 14 years earlier. In all five people’s samples, neurons that originated from the transplanted cells showed signs of good health and appeared capable of sending messages with the brain chemical dopamine, a neurotransmitter that Parkinson’s depletes. Results are mixed about whether these transplanted cells are a good way to ease Parkinson’s symptoms. Some patients have shown improvements after the new cells stitched themselves into the brain, while others didn’t benefit from them. The cells can also cause unwanted side effects such as involuntary movements. P. J. Hallett et al. Long-term health of dopaminergic neuron transplants in Parkinson’s disease patients. Cell Reports. Vol. 7, June 26, 2014. doi: 10.1016/j.celrep.2014.05.027. © Society for Science & the Public 2000 - 2013

Keyword: Parkinsons; Stem Cells
Link ID: 19707 - Posted: 06.07.2014

By C. CLAIBORNE RAY Q. Does the slit shape of a cat’s pupil confer any advantages over the more rounded pupils of other animals? A. “There are significant advantages,” said Dr. Richard E. Goldstein, chief medical officer of the Animal Medical Center in New York City. “A cat can quickly adjust to different lighting conditions, control the amount of light that reaches the eye and see in almost complete darkness,” he said. “Moreover, the slit shape protects the sensitive retina in daylight.” The slit-shaped pupil found in many nocturnal animals, including the domestic cat, presumably allows more effective control of how much light reaches the retina, in terms of both speed and completeness. “A cat has the capacity to alter the intensity of light falling on its retina 135-fold, compared to tenfold in a human, with a circular pupil,” Dr. Goldstein said. “A cat’s eye has a large cornea, which allows more light into the eye, and a slit pupil can dilate more than a round pupil, allowing more light to enter in dark conditions.” Cats have other visual advantages as well, Dr. Goldstein said. A third eyelid, between the regular eyelids and the cornea, protects the globe and also has a gland at the bottom that produces extra tears. The eyes’ location, facing forward in the front of the skull, gives cats a large area of binocular vision, providing depth perception and helping them to catch prey. © 2014 The New York Times Company

Keyword: Vision
Link ID: 19706 - Posted: 06.07.2014

by Moheb Costandi Rest easy after learning a new skill. Experiments in mice suggest that a good night's sleep helps us lay down memories by promoting the growth of new connections between brain cells. Neuroscientists believe that memory involves the modification of synapses, which connect brain cells, and numerous studies published over the past decade have shown that sleep enhances the consolidation of newly formed memories in people. But exactly how these observations were related was unclear. To find out, Wenbiao Gan of the Skirball Institute of Biomolecular Medicine at New York University Medical School and his colleagues trained 15 mice to run backwards or forwards on a rotating rod. They allowed some of them to fall asleep afterwards for 7 hours, while the rest were kept awake. The team monitored the activity and microscopic structure of the mice's motor cortex, the part of the brain that controls movement, through a small transparent "window" in their skulls. This allowed them to watch in real time how the brain responded to learning the different tasks. They found that learning a new task led to the formation of new dendritic spines – tiny structures that project from the end of nerve cells and help pass electric signals from one neuron to another – but only in the mice left to sleep. This happened during the non-rapid eye movement stage of sleep. Each task caused a different pattern of spines to sprout along the branches of the same motor cortex neurons. © Copyright Reed Business Information Ltd.

Keyword: Sleep; Learning & Memory
Link ID: 19705 - Posted: 06.06.2014

By Meeri Kim Many of us find ourselves swimming along in the tranquil sea of life when suddenly a crisis hits — a death in the family, the loss of a job, a bad breakup. Some power through and find calm waters again, while others drown in depression. Scientists continue to search for the underlying genes and neurobiology that dictate our reactions to stress. Now, a study using mice has found a switch-like mechanism between resilience and defeat in an area of the brain that plays an important role in regulating emotions and has been linked with mood and anxiety disorders. After artificially enhancing the activity of neurons in that part of the brain — the medial prefrontal cortex — mice that previously fought to avoid electric shocks started to act helpless. Rather than leaping for an open escape route, they sat in a corner taking the pain — presumably out of a belief that nothing they could do would change their circumstances. “This helpless behavior is quite similar to what clinicians see in depressed individuals — an inability to take action to avoid or correct a difficult situation,” said study author and neuroscientist Bo Li of the Cold Spring Harbor Laboratory in New York. The results were published online May 27 in the Journal of Neuroscience. © 1996-2014 The Washington Post

Keyword: Depression
Link ID: 19704 - Posted: 06.06.2014

By ANNA NORTH The “brain” is a powerful thing. Not the organ itself — though of course it’s powerful, too — but the word. Including it in explanations of human behavior might make those explanations sound more legitimate — and that might be a problem. Though neuroscientific examinations of everyday experiences have fallen out of favor somewhat recently, the word “brain” remains popular in media. Ben Lillie, the director of the science storytelling series The Story Collider, drew attention to the phenomenon last week on Twitter, mentioning in particular a recent Atlantic article: “Your Kid’s Brain Might Benefit From an Extra Year in Middle School.” In the piece, Jessica Lahey, a teacher and education writer, examined the benefits of letting kids repeat eighth grade. Mr. Lillie told Op-Talk the word “brain” could take the emphasis off middle-school students as people: The piece, he said, was “not ignoring the fact that the middle schooler (in this case) is a person, but somehow taking a quarter-step away by focusing on a thing we don’t really think of as human.” The New York Times isn’t immune to “brain”-speak — in her 2013 project “Brainlines,” the artist Julia Buntaine collected all Times headlines using the word “brain” since 1851. She told Op-Talk in an email that “the number of headlines about the brain increased exponentially since around the year 2000, where some years before there were none at all, after that there were at least 30, 40, 80 headlines.” Adding “brain” to a headline may make it sound more convincing — some research shows that talking about the brain has measurable effects on how people respond to scientific explanations. In a 2008 study, researchers found that adding phrases like “brain scans indicate” to explanations of psychological concepts like attention made those explanations more satisfying to nonexpert audiences. Perhaps disturbingly, the effect was greatest when the explanations were actually wrong. © 2014 The New York Times Company

Keyword: Miscellaneous
Link ID: 19703 - Posted: 06.06.2014

Neil Levy Can human beings still be held responsible in the age of neuroscience? Some people say no: they say once we understand how the brain processes information and thereby causes behaviour, there’s nothing left over for the person to do. This argument has not impressed philosophers, who say there doesn’t need to be anything left for the person to do in order to be responsible. People are not anything over and above the causal systems involved in information processing; we are our brains (plus some other, equally physical stuff). We are responsible if our information processing systems are suitably attuned to reasons, most philosophers think. There are big philosophical debates concerning what it takes to be suitably attuned to reasons, and whether this is really enough for responsibility. But I want to set those debates aside here. It’s more interesting to ask what we can learn from neuroscience about the nature of responsibility and about when we’re responsible. Even if neuroscience doesn’t tell us that no one is ever responsible, it might be able to tell us if particular people are responsible for particular actions. Consider a case like this: early one morning in 1987, a Canadian man named Ken Parks got up from the sofa where he had fallen asleep and drove to his in-laws’ house. There he stabbed them both before driving to the police station, where he told police he thought he had killed someone. He had: his mother-in-law died from her injuries. © 2010–2014, The Conversation Trust (UK)

Keyword: Consciousness; Emotions
Link ID: 19702 - Posted: 06.06.2014

By Sadie Dingfelder Want to become famous in the field of neuroscience? You could go the usual route, spending decades collecting advanced degrees, slaving away in science labs and publishing your results. Or you could simply fall victim to a freak accident. The stars of local science writer Sam Kean’s new book, “The Tale of the Dueling Neurosurgeons” (which he’ll discuss Saturday at Politics and Prose), took the latter route. Be it challenging the wrong guy to a joust, spinning out on a motorcycle, or suffering from a stroke, these folks sustained brain injuries with bizarre and fascinating results. One man, for instance, lost the ability to identify different kinds of animals but had no trouble naming plants and objects. Another man lost his short-term memory. The result? A diary filled with entries like: “I am awake for the very first time.” “Now, I’m really awake.” “Now, I’m really, completely awake.” Unfortunate mishaps like these have advanced our understanding of how the gelatinous gray mass that (usually) stays hidden inside our skulls gives rise to thoughts, feelings and ideas, Kean says. “Traditionally, every major discovery in the history of neuroscience came about this way,” he says. “We had no other way of looking at the brain for centuries and centuries, because we didn’t have things like MRI machines.” Rather than covering the case studies textbook-style, Kean provides all the gory details. Consider Phineas Gage. You may remember from Psych 101 that Gage, a railroad worker, survived having a metal rod launched through his skull. You might not know, however, that one doctor “shaved Gage’s scalp and peeled off the dried blood and gelatinous brains. He then extracted skull fragments from the wound by sticking his fingers in from both ends, Chinese-finger-trap-style,” as Kean writes in his new book. © 1996-2014 The Washington Post

Keyword: Learning & Memory; Attention
Link ID: 19701 - Posted: 06.06.2014

By Ian Randall The human tongue may have a sixth sense—and no, it doesn’t have anything to do with seeing ghosts. Researchers have found that in addition to recognizing sweet, sour, salty, savory, and bitter tastes, our tongues can also pick up on carbohydrates, the nutrients that break down into sugar and form our main source of energy. Past studies have shown that some rodents can distinguish between sugars of different energy densities, while others can still tell carbohydrate and protein solutions apart even when their ability to taste sweetness is lost. A similar ability has been proposed in humans, with research showing that merely having carbohydrates in your mouth can improve physical performance. How this works, however, has been unclear. In the new study, to be published in Appetite, the researchers asked participants to squeeze a sensor held between their right index finger and thumb when shown a visual cue. At the same time, the participants’ tongues were rinsed with one of three different fluids. The first two were artificially sweetened—to identical tastes—but with only one containing carbohydrate; the third, a control, was neither sweet nor carb-loaded. When the carbohydrate solution was used, the researchers observed a 30% increase in activity in the brain areas that control movement and vision. This reaction, they propose, is caused by our mouths reporting that additional energy in the form of carbs is coming. The finding may explain both why diet products are often viewed as not being as satisfying as their real counterparts and why carbohydrate-loaded drinks seem to immediately perk up athletes—even before their bodies can convert the carbs to energy. Learning more about how this “carbohydrate sense” works could lead to the development of artificially sweetened foods, the researchers propose, “as hedonistically rewarding as the real thing.” © 2014 American Association for the Advancement of Science

Keyword: Chemical Senses (Smell & Taste)
Link ID: 19700 - Posted: 06.06.2014

By Jenny Graves The claim that homosexual men share a “gay gene” created a furor in the 1990s. But new research two decades on supports this claim – and adds another candidate gene. To an evolutionary geneticist, the idea that a person’s genetic makeup affects their mating preference is unsurprising. We see it in the animal world all the time. There are probably many genes that affect human sexual orientation. But rather than thinking of them as “gay genes,” perhaps we should consider them “male-loving genes.” They may be common because these variant genes, in a female, predispose her to mate earlier and more often and to have more children. Likewise, it would be surprising if there were not “female-loving genes” in lesbian women that, in a male, predispose him to mate earlier and have more children. We can detect genetic variants that produce differences between people by tracking traits in families that display differences. Patterns of inheritance reveal variants of genes (called “alleles”) that affect normal differences, such as hair color, or disease states, such as sickle cell anemia. Quantitative traits, such as height, are affected by many different genes, as well as environmental factors. It’s hard to use these techniques to detect genetic variants associated with male homosexuality partly because many gay men prefer not to be open about their sexuality. It is even harder because, as twin studies have shown, shared genes are only part of the story. Hormones, birth order and environment play roles, too.

Keyword: Sexual Behavior; Genes & Behavior
Link ID: 19699 - Posted: 06.06.2014

By Kelly Servick During the World Cup next week, there may be 1 minute during the opening ceremony when the boisterous stadium crowd in São Paulo falls silent: when a paraplegic young person wearing a brain-controlled, robotic exoskeleton attempts to rise from a wheelchair, walk several steps, and kick a soccer ball. The neuroscientist behind the planned event, Miguel Nicolelis, is familiar with the spotlight. His lab at Duke University in Durham, North Carolina, pioneered brain-computer interfaces, using surgically implanted electrodes to read neural signals that can control robotic arms. Symbolically, the project is a homecoming for Nicolelis. He has portrayed it as a testament to the scientific progress and potential of his native Brazil, where he founded and directs the International Institute of Neuroscience of Natal. The press has showered him with attention, and the Brazilian government chipped in nearly $15 million in support. But scientifically, the project is a departure. Nicolelis first intended the exoskeleton to read signals from implanted electrodes, but decided instead to use a noninvasive, EEG sensor cap. That drew skepticism from Nicolelis’s critics—and he has a few—that the system wouldn’t really be a scientific advance. Others have developed crude EEG-based exoskeletons, they note, and it will be impossible to tell from the demo how this system compares. A bigger concern is that the event could generate false hope for paralyzed patients and give the public a skewed impression of the field’s progress. © 2014 American Association for the Advancement of Science

Keyword: Robotics
Link ID: 19698 - Posted: 06.06.2014

By JAMES GORMAN The National Institutes of Health set an ambitious $4.5 billion price tag on its part of President Obama’s Brain Initiative on Thursday, stamping it as an effort on the scale of the Human Genome Project. The goals of the Brain Initiative were clearly grand when Mr. Obama announced it a year ago — nothing less than developing and applying new technology to crack the toughest unsolved puzzles of how the brains of humans and animals function. The hope is to lay a foundation for future advances in the medical treatment of brain disorders. But the initiative began with $110 million budgeted for 2014, shared by three major entities: the National Science Foundation; the Defense Advanced Research Projects Agency; and the N.I.H., which has a $40 million share. By calling for such a major commitment, to be spread over 12 years, the institutes answered concerns among neuroscientists about the initial level of funding. “This is a realistic amount of money,” said Dr. Eric R. Kandel, director of the Kavli Institute for Brain Science at Columbia University, who, like some other neuroscientists, had been skeptical of what could be accomplished with the funding committed when the initiative was announced about a year ago. Gerald Rubin, the executive director of the Janelia Farm Research Campus in Virginia, also found that this budget request allayed some of his concerns, but not all. “I am much more concerned about convincing Congress to fund the Brain Initiative at this level,” he said. © 2014 The New York Times Company

Keyword: Brain imaging
Link ID: 19697 - Posted: 06.06.2014

Laura Spinney One day in 1991, neurologist Warren Strittmatter asked his boss to look at some bewildering data. Strittmatter was studying amyloid-β, the main component of the molecular clumps found in the brains of people with Alzheimer's disease. He was hunting for amyloid-binding proteins in the fluid that buffers the brain and spinal cord, and had fished out one called apolipoprotein E (ApoE), which had no obvious connection with the disease. Strittmatter's boss, geneticist Allen Roses of Duke University in Durham, North Carolina, immediately realized that his colleague had stumbled across something exciting. Two years earlier, the group had identified a genetic association between Alzheimer's and a region of chromosome 19. Roses knew that the gene encoding ApoE was also on chromosome 19. “It was like a lightning bolt,” he says. “It changed my life.” In humans, there are three common variants, or alleles, of the APOE gene, numbered 2, 3 and 4. The obvious step, Roses realized, was to find out whether individual APOE alleles influence the risk of developing Alzheimer's disease. The variants can be distinguished from one another using a technique called the polymerase chain reaction (PCR). But Roses had little experience with PCR, so he asked the postdocs in his team to test samples from people with the disease and healthy controls. The postdocs refused: they were busy hunting for genes underlying Alzheimer's, and APOE seemed an unlikely candidate. The feeling in the lab, recalls Roses, was that “the chief was off on one of his crazy ideas”. Roses then talked to his wife, Ann Saunders, a mouse geneticist who was skilled at PCR. She had just given birth to their daughter and was on maternity leave, so they struck a deal. “She did the experiments while I held the baby,” he says. Within three weeks, they had collected the data that would fuel a series of landmark papers showing that the APOE4 allele is associated with a greatly increased risk of Alzheimer's disease. © 2014 Nature Publishing Group

Keyword: Alzheimers; Genes & Behavior
Link ID: 19696 - Posted: 06.05.2014

A moderate dose of MDMA, commonly known as Ecstasy or Molly, that is typically nonfatal in cool, quiet environments can be lethal in rats exposed to conditions that mimic the hot, crowded social settings where the drug is often used by people, a study finds. Scientists have also identified the therapeutically relevant cooling mechanism, which could enable effective interventions for MDMA-induced hyperthermia. The study, publishing tomorrow in the Journal of Neuroscience, was conducted by researchers at the National Institute on Drug Abuse’s Intramural Research Program (NIDA IRP). NIDA is a part of the National Institutes of Health. While MDMA can have a range of adverse health effects, previous studies have shown that high doses of MDMA increase body temperature, whereas results with moderate doses have been inconsistent. This has led some people to assume that the drug is harmless if taken in moderation. However, this study shows that in rats even moderate doses of MDMA in certain environments can be dangerous because the drug interferes with the body’s ability to regulate temperature. “We know that high doses of MDMA can sharply increase body temperature to potentially lead to organ failure or even death,” said NIDA Director Dr. Nora D. Volkow. “However, this current study opens the possibility that even moderate doses could be deadly in certain conditions.” It is impossible to predict who will have an adverse reaction even to a low dose of MDMA. However, in this study scientists gave the rats low to moderate doses that have been shown in past studies not to be fatal. They monitored the rats to determine drug-induced changes in brain and body temperature and in the body’s ability to cool itself through blood vessel dilation. When rats were alone and in a room-temperature environment, a moderate dose of MDMA modestly increased brain and body temperature and moderately diminished the rats’ ability to eliminate excessive heat. However, when researchers injected the same dose in rats that were either in a warmer environment or in the presence of another rat in the cage, brain temperature increased, causing death in some rats. These fatal temperature increases occurred because the drug interfered with the body’s ability to eliminate heat.

Keyword: Drug Abuse
Link ID: 19695 - Posted: 06.05.2014

By Charles Q. Choi Scientists have found a kind of brain cell in mice that can instruct stem cells to start making more neurons, according to a new study. In addition, they found that electrical signals could trigger this growth in rodents, raising the intriguing possibility that devices could one day help the human brain repair itself. The study appears in the journal Nature Neuroscience. We knew the brain can generate new neurons, a process known as neurogenesis, via neural stem cells. And neuroscientists knew these stem cells got their instructions from a variety of sources: from chemicals in the bloodstream, for instance, and from cells in the structures that hold the cerebrospinal fluid that cushions the brain. Earlier research had suggested brain cells might also be able to command these stem cells to create neurons. Neuroscientist Chay Kuo at the Duke University School of Medicine in Durham, N.C., and his colleagues have now discovered such cells in mice. "It's really cool that the brain can tell stem cells to make more neurons," Kuo says. To begin their experiments, the researchers tested how well a variety of neurotransmitters performed at spurring mouse neural stem cells to produce new neurons; they found that a compound known as acetylcholine performed best. The team then discovered a previously unknown type of neuron that produces an enzyme needed to make acetylcholine. These neurons are found in a part of the adult mouse brain known as the subventricular zone, where neurogenesis occurs. ©2014 Hearst Communication, Inc

Keyword: Neurogenesis
Link ID: 19694 - Posted: 06.05.2014

By Denali Tietjen Meditation has long been known for its mental health benefits, but new research shows that just a few minutes of mindfulness can improve physical health and personal life as well. A recent study conducted by researchers at INSEAD and The Wharton School found that 15 minutes of mindful meditation can help you make better decisions. The research, published in the Association for Psychological Science’s journal Psychological Science, comes from four studies (varying in sample size from 69 to 178 adults) in which participants responded to sunk-cost scenarios at different degrees of mindful awareness. The results consistently showed that increased mindfulness decreases the sunk-cost bias. WOAH, hold the phone. What’s a sunk cost and what’s a sunk-cost bias?? Sunk cost is an economics term that psychologists have adopted. In economics, sunk costs are defined as non-recoverable investment costs like the cost of employee training or a lease on office space. In psychology, sunk costs are basically the same thing: The time and energy we put into our personal lives. Though we might not sit down with a calculator at the kitchen table when deciding who to take as our plus one to our second cousin’s wedding next weekend, we do a cost-benefit analysis every time we make a decision. And we take these sunk costs into account. The sunk-cost bias, then, is the tendency to allow sunk costs to overly influence current decisions. Mindfulness meditation can provide improved clarity, which helps you stay present and make better decisions, the study says. This protects you from that manipulative sunk-cost bias.

Keyword: Stress
Link ID: 19693 - Posted: 06.05.2014