Most Recent Links
By Bob Roehr

Retired American soccer star Brandi Chastain recently agreed to donate her brain to concussion research after her death. Females are often an unseen part of the concussion story even though they suffer more concussions than males, have more severe symptoms and are slower to recover. Just why is not completely clear, but the deficit in knowledge is slowly beginning to change thanks to women’s advocates behind Pink Concussions. The group gathered last weekend at Georgetown University to review the science behind concussions, and also to develop recommendations on gender-specific prevention protocols and clinical practices on how best to treat females with concussions. In comparable sports “female rates of concussions are much higher than those of their male counterparts,” says Zachary Kerr, director of the National Collegiate Athletic Association (NCAA) Injury Surveillance Program. Over a five-year period the rates per 1000 athlete-exposures were 6.3 in females versus 3.4 in males in soccer, 6.0 in females versus 3.9 in males in basketball and 3.3 in females versus 0.9 in males in baseball and softball. Only in swimming and diving did male rates (0.5) exceed those of females (0.3). Headache, dizziness and difficulty concentrating were roughly similar in both sexes, Kerr says. But among injured high school athletes, “larger portions of females are reporting sensitivity to light, sensitivity to noise, nausea and drowsiness,” he says. They were also slower to return to normal activity. The difference in the incidence and severity of concussions between the sexes does not start at birth: infants and young children of both sexes have similar concussion rates and symptoms. Puberty, however, which marks a significant developmental fork in the road for males and females, also marks a divergence for concussions. © 2016 Scientific American
Keyword: Brain Injury/Concussion; Sexual Behavior
Link ID: 21973 - Posted: 03.10.2016
Susan Gaidos

Most people would be happy to get rid of excess body fat. Even better: Trade the spare tire for something useful — say, better-functioning knees or hips, or a fix for an ailing heart or a broken bone. The idea is not far-fetched, some scientists say. Researchers worldwide are repurposing discarded fat to repair body parts damaged by injury, disease or age. Recent studies in lab animals and humans show that the much-maligned material can be a source of cells useful for treating a wide range of ills. At the University of Pittsburgh, bioengineer Rocky Tuan and colleagues extract buckets full of yellow fat from volunteers’ bellies and thighs and turn the liposuctioned material into tissue that resembles shock-absorbing cartilage. If the cartilage works as well in people as it has in animals, Tuan’s approach might someday offer a kind of self-repair for osteoarthritis, the painful degeneration of cartilage in the joints. He’s also using fat cells to grow replacement parts for the tendons and ligaments that support the joints. Foremost among fat’s virtues is its richness of stem cells, which have the ability to divide and grow into a wide variety of tissue types. Fat stem cells — also known as adipose-derived stem cells — can be coerced to grow into bone, cartilage, muscle tissue or, of course, more fat. Cells from fat are being tested to mend tissues found in damaged joints, hearts and muscle, and to regrow bone and heal wounds. © Society for Science & the Public 2000 - 2016
Keyword: Obesity; Stem Cells
Link ID: 21972 - Posted: 03.10.2016
Monya Baker

A surgical technique to treat cataracts in children spurs stem cells to generate a new, clear lens. Discs made of multiple types of eye tissue have been grown from human stem cells — and that tissue has been used to restore sight in rabbits. The work, reported today in Nature, suggests that induced pluripotent stem (iPS) cells — stem cells generated from adult cells — could one day be harnessed to provide replacement corneal or lens tissue for human eyes. The discs also could be used to study how eye tissue and congenital eye diseases develop. “The potential of this technique is mind-boggling,” says Mark Daniell, head of corneal research at the Centre for Eye Research Australia in Melbourne, who was not involved in the research. “It’s almost like an eye in a dish.” A second, unrelated paper in Nature describes a surgical procedure that activates the body’s own stem cells to regenerate a clear, functioning lens in the eyes of babies born with cataracts. The two studies are “amazing, almost like science fiction”, Daniell says. In the first study, a team led by Kohji Nishida, an ophthalmologist at Osaka University Graduate School of Medicine in Japan, cultivated human iPS cells to produce discs that contained several types of eye tissue. © 2016 Nature Publishing Group
Keyword: Vision
Link ID: 21971 - Posted: 03.10.2016
Rich Stanton

In 1976, the driving simulation Death Race was removed from an Illinois amusement park. There had, according to a news story at the time, been complaints that it encouraged players to run over pedestrians to score points. Through a series of subsequent newspaper reports, the US National Safety Council labelled the game “gross” and motoring groups demanded its removal from distribution. The first moral panic over video game violence had begun. This January, a group of four scholars published a paper analysing the links between playing violent video games at a young age and aggressive behaviour in later life. The titles mentioned in the report are around 15 years old – one of several troubling ambiguities to be found in the research. Nevertheless, the quality and quantity of the data make this an uncommonly valuable study. Given that game violence remains a favoured bogeyman for politicians, press and pressure groups, it should be shocking that such a robust study of the phenomenon is rare. But it is, and it’s important to ask why.

A history of violence

With the arrival of Pong in 1972, video games became a commercial reality, but now, in 2016, they are still on the rocky path to mass acceptance that all new media must traverse. The truth is that the big targets of moral concern – Doom, Grand Theft Auto, Call of Duty – are undeniably about killing and they are undeniably popular among male teenagers. An industry report estimates that 80% of the audience for the Call of Duty series is male, and 21% is aged 10-14. Going by the 18 rating on the last three entries, that means at least a fifth of the game’s vast audience shouldn’t be playing. © 2016 Guardian News and Media Limited
Keyword: Aggression; Development of the Brain
Link ID: 21970 - Posted: 03.09.2016
By Anahad O'Connor

Mark Mattson, a neuroscientist at the National Institute on Aging in Maryland, has not had breakfast in 35 years. Most days he practices a form of fasting — skipping lunch, taking a midafternoon run, and then eating all of his daily calories (about 2,000) in a six-hour window starting in the afternoon. “Once you get used to it, it’s not a big deal,” said Dr. Mattson, chief of the institute’s laboratory of neurosciences. “I’m not hungry at all in the morning, and this is other people’s experience as well. It’s just a matter of getting adapted to it.” In a culture in which it’s customary to eat three large meals a day while snacking from morning to midnight, the idea of regularly skipping meals may sound extreme. But in recent years intermittent fasting has been gaining popular attention and scientific endorsement. It has been promoted in best-selling books and endorsed by celebrities like the actors Hugh Jackman and Benedict Cumberbatch. The late-night talk show host Jimmy Kimmel claims that for the past two years he has followed an intermittent fasting program known as the 5:2 diet, which entails normal eating for five days and fasting for two — a practice Mr. Kimmel credits for his significant weight loss. Fasting to improve health dates back thousands of years, with Hippocrates and Plato among its earliest proponents. Dr. Mattson argues that humans are well suited for it: For much of human history, sporadic access to food was likely the norm, especially for hunter-gatherers. As a result, we’ve evolved with livers and muscles that store quickly accessible carbohydrates in the form of glycogen, and our fat tissue holds long-lasting energy reserves that can sustain the body for weeks when food is not available. “From an evolutionary perspective, it’s pretty clear that our ancestors did not eat three meals a day plus snacks,” Dr. Mattson said. © 2016 The New York Times Company
Keyword: Obesity
Link ID: 21969 - Posted: 03.09.2016
By Virginia Morell

Butterflies may not have a human’s sharp vision, but their eyes beat us in other ways. Their visual fields are larger, they’re better at perceiving fast-moving objects, and they can distinguish ultraviolet and polarized light. Now, it turns out that one species of swallowtail butterfly from Australasia, the common bluebottle (Graphium sarpedon), known for its conspicuous blue-green markings, is even better equipped for such visual tasks. Each of their eyes, scientists report in Frontiers in Ecology and Evolution, contains at least 15 different types of photoreceptors, the light-detecting cells required for color vision. These are comparable to the rods and cones found in our eyes. To understand how the spectrally complex retinas of butterflies evolved, the researchers used physiological, anatomical, and molecular experiments to examine the eyes of 200 male bluebottles collected in Japan. (Only males were used because the scientists failed to catch a sufficient number of females.) They found that different colors stimulate each class of receptor. For instance, UV light stimulates one, while slightly different blue lights set off three others; and green lights trigger four more. Most insect species have only three classes of photoreceptors. Even humans have only three cones, yet we still see millions of colors. Butterflies need only four receptor classes for color vision, including spectra in the UV region. So why did this species evolve 11 more? The scientists suspect that some of the receptors must be tuned to perceive specific things of great ecological importance to these iridescent butterflies—such as sex. For instance, with eyes alert to the slightest variation in the blue-green spectrum, male bluebottles can spot and chase their rivals, even when they’re flying against a blue sky. © 2016 American Association for the Advancement of Science
Keyword: Vision
Link ID: 21968 - Posted: 03.09.2016
By GINA KOLATA

Marty and Matt Reiswig, two brothers in Denver, knew that Alzheimer’s disease ran in their family, but neither of them understood why. Then a cousin, Gary Reiswig, whom they barely knew, wrote a book about their family, “The Thousand Mile Stare.” When the brothers read it, they realized what they were facing. In the extended Reiswig family, Alzheimer’s disease is not just a random occurrence. It results from a mutated gene that is passed down from parent to child. If you inherit the mutated gene, Alzheimer’s will emerge at around age 50 — with absolute certainty. Your child has a 50-50 chance of suffering the same fate. The revelation came as a shock. And so did the next one: The brothers learned that there is a blood test that can reveal whether one carries the mutated gene. They could decide to know if they had it. Or not. It’s a dilemma more people are facing as scientists discover more genetic mutations linked to diseases. Often the newly discovered gene increases risk, but does not guarantee it. Sometimes knowing can be useful: If you have a gene mutation that makes colon cancer much more likely, for example, then frequent colonoscopies may help doctors stave off trouble. But then there are genes that make a dreaded disease a certainty: There is no way to prevent it, and no way to treat it. Marty Reiswig, 37, saw his father, now in the final stages of Alzheimer’s, slowly lose his ability to think, to remember, to care for himself, or even to recognize his wife and sons. Mr. Reiswig knows that if he has the gene, he has perhaps a bit more than a decade before the first symptoms appear. If he has it, his two young children may have it, too. He wavers about getting tested. © 2016 The New York Times Company
Keyword: Alzheimers; Genes & Behavior
Link ID: 21967 - Posted: 03.08.2016
Our eyes constantly send bits of information about the world around us to our brains, where the information is assembled into objects we recognize. Along the way, a series of neurons in the eye uses electrical and chemical signals to relay the information. In a study of mice, National Institutes of Health scientists showed how one type of neuron may do this to distinguish moving objects. The study suggests that the NMDA receptor, a protein normally associated with learning and memory, may help neurons in the eye and the brain relay that information. “The eye is a window onto the outside world and the inner workings of the brain,” said Jeffrey S. Diamond, Ph.D., senior scientist at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS), and the senior author of the study published in Neuron. “Our results show how neurons in the eye and the brain may use NMDA receptors to help them detect motion in a complex visual world.” Vision begins when light enters the eye and hits the retina, which lines the back of the eyeball. Neurons in the retina convert light into nerve signals, which are then sent to the brain. Using retinas isolated from mice, Alon Poleg-Polsky, Ph.D., a postdoctoral fellow in Dr. Diamond’s lab, studied neurons called directionally selective retinal ganglion cells (DSGCs), which are known to fire and send signals to the brain in response to objects moving in specific directions across the eye. Electrical recordings showed that some of these cells fired when a bar of light passed across the retina from left to right, whereas others responded to light crossing in the opposite direction. Previous studies suggested these unique responses are controlled by incoming signals sent from neighboring cells at chemical communication points called synapses. In this study, Dr. Poleg-Polsky discovered that the activity of NMDA receptors at one set of synapses may regulate whether DSGCs send direction-sensitive information to the brain.
Keyword: Vision
Link ID: 21966 - Posted: 03.08.2016
By Daniel Engber

Nearly 20 years ago, psychologists Roy Baumeister and Dianne Tice, a married couple at Case Western Reserve University, devised a foundational experiment on self-control. “Chocolate chip cookies were baked in the room in a small oven,” they wrote in a paper that has been cited more than 3,000 times. “As a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.” Here’s how that experiment worked. Baumeister and Tice stacked their fresh-baked cookies on a plate, beside a bowl of red and white radishes, and brought in a parade of student volunteers. They told some of the students to hang out for a while unattended, eating only from the bowl of radishes, while another group ate only cookies. Afterward, each volunteer tried to solve a puzzle, one that was designed to be impossible to complete. Baumeister and Tice timed the students in the puzzle task, to see how long it took them to give up. They found that the ones who’d eaten chocolate chip cookies kept working on the puzzle for 19 minutes, on average—about as long as people in a control condition who hadn’t snacked at all. The group of kids who noshed on radishes flubbed the puzzle test. They lasted just eight minutes before they quit in frustration. The authors called this effect “ego depletion” and said it revealed a fundamental fact about the human mind: We all have a limited supply of willpower, and it decreases with overuse. © 2016 The Slate Group LLC.
Keyword: Attention
Link ID: 21965 - Posted: 03.08.2016
Shefali Luthra

Depression prompts people to make about 8 million doctors' appointments a year, and more than half are with primary care physicians. A study suggests those doctors often fall short in treating depression because of insurance issues, time constraints and other factors. More often than not, primary care doctors fail to teach patients how to manage their care and don't follow up to see how they're doing, according to the study, which was published Monday in Health Affairs. Those are considered effective tactics for treating chronic illnesses. "The approach to depression should be like that of other chronic diseases," said Dr. Harold Pincus, vice chair of psychiatry at Columbia University's College of Physicians and Surgeons and one of the study's co-authors. But "by and large, primary care practices don't have the infrastructure or haven't chosen to implement those practices for depression." Most people with depression seek help from their primary care doctors, the study notes. That can be because patients often face shortages and limitations of access to specialty mental health care, including lack of insurance coverage, the authors write. Plus there's stigma: Patients sometimes feel nervous or ashamed to see a mental health specialist, according to the authors. Meanwhile, physicians and researchers have increasingly been calling for mental health conditions such as depression and anxiety to be treated like physical illnesses. Historically, those have been handled separately and not always with the same attention and care as things like high blood pressure and heart disease. © 2016 npr
Keyword: Depression
Link ID: 21964 - Posted: 03.08.2016
By Diana Kwon

All of us have snapped at some point. A stranger cuts in line or a distracted driver nearly hits us and we lose our cool in a sudden fit of rage. As mass shootings continue to make headlines, it is becoming increasingly important to understand the brain circuits that underlie these flashes of emotion. R. Douglas Fields, a neuroscientist at the University of Maryland, College Park, and the National Institutes of Health, explores this very issue in his new book Why We Snap: Understanding the Rage Circuit in Your Brain (Dutton, 2016; 408 pages). After his own experience of sudden rage Fields began studying the topic and uncovered nine specific triggers, which he summarizes using the mnemonic “LIFEMORTS”: Life or limb (defending yourself against attackers); Insult; Family (protecting loved ones); Environment (protecting your territory); Mate; Order in society (responding to social injustice); Resources (gaining and safeguarding possessions); Tribe (defending your group); Stopped (escaping restraint or imprisonment).

You mention in the introduction that you were compelled to write Why We Snap after a personal “snapping” incident in Barcelona: When a pickpocket snatched your wallet, you pinned him to the ground and grabbed it back. What was it like to try to understand that moment while writing and researching this book?

Being pickpocketed was the inspiration for the book, but I also had the realization that this is an enormous problem that seems to be overlooked. © 2016 Scientific American
Keyword: Aggression; Emotions
Link ID: 21963 - Posted: 03.07.2016
By Roberto A. Ferdman

In the mid-1970s, psychologist Merrill Elias began tracking the cognitive abilities of more than a thousand people in the state of New York. The goal was fairly specific: to observe the relationship between people's blood pressure and brain performance. And for decades he did just that, eventually expanding the Maine-Syracuse Longitudinal Study (MSLS) to observe other cardiovascular risk factors, including diabetes, obesity, and smoking. There was never an inkling that his research would lead to any sort of discovery about chocolate. And yet, 40 years later, it seems to have done just that. Late in the study, Elias and his team had an idea. Why not ask the participants what they were eating too? It wasn't unreasonable to wonder if what someone ate might add to the discussion. Diets, after all, had been shown to affect the risk factors Elias was already monitoring. Plus, they had this large pool of participants at their disposal, a perfect chance to learn a bit more about the decisions people were making about food. The researchers incorporated a new questionnaire into the sixth wave of their data collection, which spanned the five years between 2001 and 2006 (there have been seven waves in all, each conducted at five-year intervals). The questionnaire gathered all sorts of information about the dietary habits of the participants. And the dietary habits of the participants revealed an interesting pattern. "We found that people who eat chocolate at least once a week tend to perform better cognitively," said Elias. "It's significant—it touches a number of cognitive domains." © 1996-2016 The Washington Post
Keyword: Obesity; Learning & Memory
Link ID: 21962 - Posted: 03.07.2016
By KATHARINE Q. SEELYE

CAMBRIDGE, Mass. — In Philadelphia last spring, a man riding a city bus at rush hour injected heroin into his hand, in full view of other passengers, including one who captured the scene on video. In Cincinnati, a woman died in January after she and her husband overdosed in their baby’s room at Cincinnati Children’s Hospital Medical Center. The husband was found unconscious with a gun in his pocket, a syringe in his arm and needles strewn around the sink. Here in Cambridge a few years ago, after several people overdosed in the bathrooms of a historic church, church officials reluctantly closed the bathrooms to the public. “We weren’t medically equipped or educated to handle overdoses, and we were desperately afraid we were going to have something happen that was way out of our reach,” said the Rev. Joseph O. Robinson, rector of the church, Christ Church Cambridge. With heroin cheap and widely available on city streets throughout the country, users are making their buys and shooting up as soon as they can, often in public places. Police officers are routinely finding drug users — unconscious or dead — in cars, in the bathrooms of fast-food restaurants, on mass transit and in parks, hospitals and libraries. The visibility of drug users may be partly attributed to the nature of the epidemic, which has grown largely out of dependence on legal opioid painkillers and has spread to white, urban, suburban and rural areas. © 2016 The New York Times Company
Keyword: Drug Abuse
Link ID: 21961 - Posted: 03.07.2016
By ANNA FELS

There was something odd about my new patient. She was elegantly dressed and self-possessed, and yet she was slowly, rhythmically chewing gum, something I rarely see in my psychiatry sessions. Was she trying to cover up anxiety about this first encounter, I wondered, or was she perhaps hoping to project a kind of cool, laid-back style? We talked for a long time about why she had come to see me. Then, as is my practice with a new patient, I asked what, if any, psychiatric medications and nonprescription, psychoactive substances — legal or illegal — she had used. Her answer was a new one for me. She stated that she chewed approximately 40 pieces of nicotine gum per day and had done so for well over a decade. Responses to this question are often illuminating and can be rather humbling. Although doctors are trained to focus on prescription medications, there are and have always been nonprescription “remedies” for psychiatric conditions. And people’s preferences for one type of substance over another can give a glimpse into their symptoms and even their brain chemistry. If a patient tells me he falls asleep on cocaine, I wonder if he might have attention deficit disorder. A patient who smokes marijuana to calm down before important business meetings leads me in the direction of social phobia or other anxiety disorders. I often wonder if people who take ketamine recreationally might be depressed, since this anesthetic has been shown to have antidepressant effects and is, in fact, being investigated for potential therapeutic use. Sorting through patients’ uses of psychoactive substances, from cocaine to alcohol to coffee, leaves me with an appreciation of the wildly different neurochemistry of people’s brains. One person will drink alcohol and feel euphoric, witty and extroverted, and the next will be logy and nauseated. © 2016 The New York Times Company
Keyword: Drug Abuse
Link ID: 21960 - Posted: 03.07.2016
By Kerry Grens

On a closed-circuit television I watch Marie settle into her room, unpacking her toiletries in the bathroom and arranging her clothes for the next day. Her digs at the University of Chicago sleep lab look like an ordinary hotel room, with a bed, TV, desk, nightstand. Ordinary, except for the camera keeping watch from across the bed and the small metal door in the wall next to the headboard. The door, about one foot square, is used when researchers want to sample the study participants’ blood during the night without disturbing them; an IV line passes from the person’s arm through the door and into the master control room where I’m watching Marie on the screen. She’s come to the lab on a weekday evening to be screened for possible inclusion in a study on insomnia. Marie says her sleep problems started almost 20 years ago, on the first day of her job as a flight attendant. “The phone rang in the middle of the night,” she recalls. It was work, scheduling her for a flight. “Something was triggered in my mind. It was the first time in my life I experienced a night with no sleep. Something clicked. Then the second night I couldn’t sleep. It just went on. I lost my ability to sleep.” After a few years, Marie (not her real name—she asked to remain anonymous for privacy) stopped working. Most nights she’ll sleep for a short stretch—maybe a few hours—then wake up and lie awake for hours as pain in her neck consumes her and makes her uneasy and restless. “I’ve seen psychologists, physical therapists, doctors. I’ve been prescribed medications for depression. But it didn’t work,” she says. “Every single day it’s a struggle . . . I feel like when Job was attacked by the devil. Someone is trying to take my vitality away.” © 1986-2016 The Scientist
Keyword: Sleep
Link ID: 21959 - Posted: 03.07.2016
By C. CLAIBORNE RAY

Q. What’s the No. 1 cause of blindness in seniors in the United States?

A. “It sounds like a simple question, but there’s no perfect answer,” said Dr. Susan Vitale, a research epidemiologist at the National Eye Institute of the National Institutes of Health. “It depends on age, how blindness is measured and how statistics are collected.” For example, some studies have relied on the self-reported answer to the vague question: “Do you have vision problems?” The best available estimates, she said, come from a 2004 paper aggregating many other studies, some in the United States and some in other countries, updated by applying later census data. This paper and others have found striking differences by age and by racial and socioeconomic groups, Dr. Vitale said. In white people, she said, the major cause of blindness at older ages is usually age-related macular degeneration, progressive damage to the central portion of the retina. In older black people, the major causes are likely to be glaucoma or cataracts. In older people of working age, from their 40s to their 60s, the major cause, regardless of race, is diabetic retinopathy, damage to the retina as a result of diabetes. Many studies have shown that white people are more likely to have age-related macular degeneration, Dr. Vitale said, but as for cataracts, for which blindness is preventable by surgery, there are questions about access to health care and whether those affected can get the needed surgery. It is not known why black people are at higher risk of glaucoma. There are also some gender differences, she said, with white women more likely than white men to become blind. Studies have not found the same difference by gender in black and Hispanic people. Because many of the causes of blindness at all ages are preventable, Dr. Vitale said, it is essential to have regular eye checkups, even if there are no obvious symptoms. © 2016 The New York Times Company
Keyword: Vision
Link ID: 21958 - Posted: 03.07.2016
By DONALD G. McNEIL Jr. and CATHERINE SAINT LOUIS

The Zika virus damages many fetuses carried by infected and symptomatic mothers, regardless of when in pregnancy the infection occurs, according to a small but frightening study released on Friday by Brazilian and American researchers. In a separate report published on Friday, other scientists suggested a mechanism for the damage, showing in laboratory experiments that the virus targets and destroys fetal cells that eventually form the brain’s cortex. The reports are far from conclusive, but the studies help shed light on a mysterious epidemic that has swept across more than two dozen countries in the Western Hemisphere, alarming citizens and unnerving public health officials. In the first study, published in The New England Journal of Medicine, researchers found that 29 percent of women who had ultrasound examinations after testing positive for infection with the Zika virus had fetuses that suffered “grave outcomes.” They included fetal death, tiny heads, shrunken placentas and nerve damage that suggested blindness. “This is going to have a chilling effect,” said Dr. Anthony S. Fauci, the director of the National Institute of Allergy and Infectious Diseases. “Now there’s almost no doubt that Zika is the cause.” The small size of the study, which looked at 88 women at one clinic in Rio de Janeiro, was a limitation, Dr. Fauci added. From such a small sample, it is impossible to be certain how often fetal damage may occur in a much larger population. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21957 - Posted: 03.05.2016
Anxious people perceive the world differently. An anxious brain appears to process sounds in an altered way, ramping up the expectation that something bad – or good – might happen. There’s no doubt that some degree of anxiety is vital for survival. When we learn that something is dangerous, we generalise that memory to apply the same warning signal to other, similar situations to avoid getting into trouble. If you’re bitten by a large, aggressive dog, for instance, it makes sense to feel slightly anxious around similar dogs. “It’s better to be safe than sorry,” says Rony Paz at the Weizmann Institute of Science in Rehovot, Israel. The trouble begins when this process becomes exaggerated. In the dog bite example, a person who went on to become anxious around all dogs, even small ones, would be described as overgeneralising. Overgeneralisation is thought to play a role in post-traumatic stress disorder and general anxiety disorder, a condition characterised by anxiety about many situations, leaving people in a state of near-constant restlessness. A study carried out by Paz suggests that overgeneralisation is not limited to anxious thoughts and memories – for such people the same process seems to affect their perception of the world. © Copyright Reed Business Information Ltd.
Link ID: 21956 - Posted: 03.05.2016
By Amy Ellis Nutt

Surgeons snaked the electrodes under the 65-year-old woman’s scalp. Thirty years of Parkinson’s disease had almost frozen her limbs. The wires, connected to a kind of pacemaker under the skin, were aimed at decreasing the woman’s rigidity and allowing for more fluid movement. But five seconds after the first electrical pulse was fired into her brain, something else happened. Although awake and fully alert, she seemed to plunge into sadness, bowing her head and sobbing. One of the doctors asked what was wrong. “I no longer wish to live, to see anything, to hear anything, feel anything,” she said. Was she in some kind of pain? “No, I’m fed up with life. I’ve had enough,” she replied. “Everything is useless.” The operating team turned off the current. Less than 90 seconds later, the woman was smiling and joking, even acting slightly manic. Another five minutes more, and her normal mood returned. The patient had no history of depression. Yet in those few minutes after the electrical pulse was fired, the despair she expressed met nine of the 11 criteria for severe major depressive disorder in the Diagnostic and Statistical Manual of Mental Disorders. Fascinated by the anomaly, the French physicians wrote up the episode for the New England Journal of Medicine. The year was 1999, and hers was one of the first documented cases of an electrically induced, instantaneous, yet reversible depression. © 1996-2016 The Washington Post
Keyword: Parkinsons; Emotions
Link ID: 21955 - Posted: 03.05.2016
Angus Chen

We know we should put the cigarettes away or make use of that gym membership, but in the moment, we just don't do it. There is a cluster of neurons in our brain critical for motivation, though. What if you could hack them to motivate yourself? These neurons are located in the middle of the brain, in a region called the ventral tegmental area. A paper published Thursday in the journal Neuron suggests that we can activate the region with a little bit of training. The researchers stuck 73 people into an fMRI, a scanner that can detect what part of the brain is most active, and focused on that area associated with motivation. When the researchers said "motivate yourself and make this part of your brain light up," people couldn't really do it. "They weren't that reliable when we said, 'Go! Get psyched. Turn on your VTA,' " says Dr. Alison Adcock, a psychiatrist at Duke and senior author on the paper. That changed when the participants were allowed to watch a neurofeedback meter that displayed activity in their ventral tegmental area. When activity ramps up, the participants see the meter heat up while they're in the fMRI tube. "Your whole mind is allowed to speak to a specific part of your brain in a way you never imagined before. Then you get feedback that helps you discover how to turn that part of the brain up or down," says John Gabrieli, a neuroscientist at the Massachusetts Institute of Technology who was not involved with the work. © 2016 npr
Keyword: Attention
Link ID: 21954 - Posted: 03.05.2016