Most Recent Links
By Rachel Ehrenberg A computer can decode the stuff of dreams. By comparing brain activity during sleep with activity patterns collected while study participants looked at certain objects, a computer learned to identify some contents of people’s unconscious reveries. “It’s striking work,” says cognitive psychologist Frank Tong of Vanderbilt University in Nashville, who was not involved in the research. “It’s a demonstration that brain activity during dreaming is very similar to activity during wakefulness.” The work, reported April 4 in Science by Japanese researchers led by Yukiyasu Kamitani of Advanced Telecommunications Research Institute International, adds to somewhat scant knowledge of how the brain constructs dreams, says Tong. The research could lead to a better understanding of what the brain does during different states of consciousness, such as those experienced by some coma patients. Dreams are a bit of a black box and difficult to study. Experiments with mice have revealed aspects of sleep and dreaming, such as how the experiences contribute to forming memories. But a mouse can’t tell you what it dreamed about. And the sleep stage that’s richest in dreams — REM sleep — typically kicks in about 90 minutes after a person conks out, making it time consuming to gather data on dreams. The noisy fMRI brain scanning machine doesn’t help. To skirt these experimental issues, the researchers recorded brain activity in three adult male volunteers during the early stages of sleep. After the subjects had dozed off, they were repeatedly awakened and asked for detailed reports on what they had seen while sleeping. In an example, one participant stated: “Well, there were persons, about three persons, inside some sort of hall. There was a male, a female and maybe like a child. Ah, it was like a boy, a girl and a mother. I don't think that there was any color.” © Society for Science & the Public 2000 - 2013
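The decoding approach the article describes — learn category-specific activity patterns while subjects view images, then match sleep-onset activity against them — can be sketched as a toy nearest-centroid classifier. Everything below (the feature vectors, the category names) is invented for illustration; the actual study trained machine-learning classifiers on real fMRI voxel data.

```python
import math

def centroid(vectors):
    """Average activity pattern for one image category."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def nearest_label(pattern, centroids):
    """Return the label whose waking-state centroid is closest to this pattern."""
    return min(centroids, key=lambda label: math.dist(pattern, centroids[label]))

# Toy "fMRI patterns" recorded while awake subjects viewed each category
waking_patterns = {
    "person":   [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]],
    "building": [[0.1, 0.9, 0.2], [0.0, 1.0, 0.1]],
}
centroids = {label: centroid(vs) for label, vs in waking_patterns.items()}

# A pattern recorded just before the sleeper was awakened and questioned
dream_pattern = [0.8, 0.15, 0.05]
print(nearest_label(dream_pattern, centroids))  # prints "person"
```

The key idea is the one Tong highlights: because dreaming activity resembles waking activity, a model trained only on waking data can say something about a dream's content.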
Keyword: Sleep; Brain imaging
Link ID: 17995 - Posted: 04.05.2013
by Gisela Telis Insomniacs desperate for some zzzs may one day have a safer way to get them. Scientists have developed a new sleep medication that has induced sleep in rodents and monkeys without apparently impairing cognition, a potentially dangerous side effect of common sleep aids. The discovery, which originated in work explaining narcolepsy, could lead to a new class of drugs that help people who don't respond to other treatments. Between 10% and 15% of Americans chronically struggle to fall or stay asleep. Many of them turn to sleeping pills for relief, and most are prescribed drugs, such as zolpidem (Ambien) and eszopiclone (Lunesta), that slow down the brain by binding to receptors for GABA, a neurotransmitter that's involved in mood, cognition, and muscle tone. But because the drugs target GABA indiscriminately, they can also impair cognition, causing amnesia, confusion, and other problems with learning and memory, along with a number of strange sleepwalking behaviors, including wandering, eating, and driving while asleep. This has led many researchers to seek out alternative mechanisms for inducing sleep. Neuroscientist Jason Uslaner of Merck Research Laboratories in West Point, Pennsylvania, and colleagues decided to tap into the brain's orexin system. Orexin (also known as hypocretin) is a protein that controls wakefulness and is missing in people with narcolepsy. Past studies successfully induced sleep by inhibiting orexin, but had not looked into the effects of doing so on cognition. The researchers developed a new orexin-inhibiting compound called DORA-22 and confirmed that it could induce sleep in rats and rhesus monkeys as effectively as the GABA-modulating drugs. © 2010 American Association for the Advancement of Science.
Keyword: Sleep
Link ID: 17994 - Posted: 04.05.2013
By Nathan Seppa Tiny components of amyloid plaques, the notorious protein clumps found littering the brains of people with Alzheimer’s disease, might fight inflammation. Researchers report that several of these sticky protein fragments, or peptides, glom onto inflammatory compounds and reverse paralysis in mice that have a condition similar to multiple sclerosis. A fragment of tau protein, which shows up in other brain deposits in Alzheimer’s patients, has a similar effect. When tested on blood taken from three MS patients, the tau peptide weeded out some inflammatory culprits there, too, researchers report in the April 3 Science Translational Medicine. “This is a seriously good study. It opens up more questions than it answers,” says Jian-Guo Geng, a cell biologist at the University of Michigan in Ann Arbor who wasn’t part of the research team. “But I don’t think we’re anywhere close to using these peptides for treatments.” Amyloid is a broad term for clusters of protein in the brain, including those arising with the aid of misfolded versions of tau or another protein implicated in brain disease called a prion. Viewing amyloid-forming peptides as good guys runs against the scientific thinking, since amyloid plaques are a hallmark of Alzheimer’s disease. But study coauthor Lawrence Steinman, a neurologist at Stanford University, points out that the actual role of amyloid plaques in the disease is unclear. He suggests the tiny peptides holding the plaques together might have an alternative, useful role in the body. © Society for Science & the Public 2000 - 2013
Keyword: Alzheimers
Link ID: 17993 - Posted: 04.05.2013
Genetic markers that could help highlight who is at risk of developing Alzheimer's disease have been identified by US scientists. The research in Neuron identifies mutations that affect the build-up of certain proteins in the brain. High levels of these tau proteins increase the chance of having the disease. UK experts said the study could help understand the changes that occur in the brains of Alzheimer's patients. Tangles of a kind of tau called phosphorylated tau (ptau) are a hallmark of the disease. One of the new gene variants identified by the Washington University School of Medicine team was also shown to be linked to a small increased risk of developing Alzheimer's and a greater risk of cognitive decline. The team used genetic information from more than 1,200 people, a sample significantly larger than those of previous studies in this area. Dr Alison Goate, who led the study, said: "We anticipate that knowledge about the role of these genes in Alzheimer's disease may lead to the identification of new targets for therapies or new animal or cellular models of the disease." UK experts said the study adds to the number of genetic markers that have been linked to the development of Alzheimer's disease, noting that lifestyle also plays a role. BBC © 2013
Keyword: Alzheimers; Genes & Behavior
Link ID: 17992 - Posted: 04.05.2013
By Puneet Kollipara Rats that will go to great lengths to get a cocaine fix might blame a group of sluggish neurons. Controlling the problem may come down to a flick of a light switch: Stimulating those brain cells with lasers reduces the addicted rats’ cocaine use, researchers report in the April 4 Nature. “It's an outstanding piece of work,” says neuroscientist A.J. Robison of Michigan State University, who wasn’t involved in the study. The findings could help researchers better understand the role of neural circuitry in drug addiction in humans, he says. Scientists know that when certain neurons fire less frequently in the prelimbic cortex, a brain region that handles impulse control and reward-driven behavior, a person’s self-control can decrease. But researchers didn’t know whether using cocaine chronically could make the neurons drowsy to begin with, and whether that sluggishness could also promote drug use in spite of ill consequences. Billy Chen, then of the National Institutes of Health, and colleagues trained rats to take cocaine. The rats learned to press levers to receive a dose of drug through an IV. After about two months, researchers started giving the rats shocks roughly one-third of the time when the animals pressed the levers. Most of the rats stopped taking cocaine, but about 30 percent continued. These were compulsive cocaine users, says coauthor Antonello Bonci, a neuroscientist at the NIH’s National Institute on Drug Abuse. © Society for Science & the Public 2000 - 2013
Keyword: Drug Abuse
Link ID: 17991 - Posted: 04.05.2013
by Emily Underwood For neuroscientist Rafael Yuste, sitting in an ornate White House chamber yesterday listening to President Barack Obama heap praise—and some $100 million—on a brain-mapping initiative that he helped hatch was a "luminous" experience. "It felt like history," says the researcher, who works at Columbia University. "There is this enormous mystery waiting to be unlocked," Obama told the East Room crowd packed with leaders of American neuroscience during a 12-minute paean to brain research (likely the most expansive yet delivered by an American president). By "giving scientists the tools they need to get a dynamic picture of the brain in action," he said, the new initiative will help scientists find cures for complex brain disorders such as traumatic brain injury and Parkinson's, and create jobs that "we haven't even dreamt up yet." For all the lofty rhetoric, however, the White House didn't provide many details about how the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative will accomplish its mission. And the lack of detail is worrying not only BRAIN skeptics—who argue that it targets the wrong goal and could detract from other research efforts—but also some staunch advocates such as Yuste. The way that the White House has packaged and plans to fund and coordinate the initiative, they say, is creating some unease. "As the proposal stands, it's still awfully vague, so it's hard not to have some reservations," says biophysicist Jeremy Berg of the University of Pittsburgh in Pennsylvania, who is a former director of the National Institute of General Medical Sciences at the National Institutes of Health (NIH). © 2010 American Association for the Advancement of Science
Keyword: Brain imaging
Link ID: 17990 - Posted: 04.05.2013
Dana Smith In January, the European Commission pledged 500 million euros to work towards creating a functional model of the human brain. Then, yesterday, Barack Obama officially announced an initiative to advance neuroscience, funding a large-scale research project aimed at unlocking the secrets of the brain that involves over $100 million in federal spending in the first year alone, as well as investments from private organizations. Both projects are geared towards creating a working model of the brain, mapping its 100 billion neurons. The first, the Human Brain Project, is being spearheaded by Professor Henry Markram of École Polytechnique Fédérale de Lausanne. He and collaborators from 86 other European institutions aim to simulate the workings of the human brain using a giant supercomputer. This would mean compiling information about the activity of individual neurons and neuronal circuits throughout the brain in a massive database. They then hope to integrate the biological actions of these neurons to create theoretical maps of different subsystems, and eventually, through the magic of computer simulation, a working model of the entire brain. Neurologic and psychiatric disorders collectively "affect 100 million Americans and cost us $500 billion each year in terms of health-care costs." Similarly, the United States' recently renamed Brain Research Through Advancing Innovative Neurotechnologies, or BRAIN (previously the Brain Activity Map Project, or BAM), is an initiative that will be organized through the National Institutes of Health, National Science Foundation, and Defense Advanced Research Projects Agency, and carried out in a number of universities and research institutes throughout the U.S. © 2013 by The Atlantic Monthly Group.
Keyword: Brain imaging
Link ID: 17989 - Posted: 04.05.2013
Kerri Smith The experiment helped to change John-Dylan Haynes's outlook on life. In 2007, Haynes, a neuroscientist at the Bernstein Center for Computational Neuroscience in Berlin, put people into a brain scanner in which a display screen flashed a succession of random letters. He told them to press a button with either their right or left index fingers whenever they felt the urge, and to remember the letter that was showing on the screen when they made the decision. The experiment used functional magnetic resonance imaging (fMRI) to reveal brain activity in real time as the volunteers chose to use their right or left hands. The results were quite a surprise. "The first thought we had was 'we have to check if this is real'," says Haynes. "We came up with more sanity checks than I've ever seen in any other study before." The conscious decision to push the button was made about a second before the actual act, but the team discovered that a pattern of brain activity seemed to predict that decision by as many as seven seconds. Long before the subjects were even aware of making a choice, it seems, their brains had already decided. As humans, we like to think that our decisions are under our conscious control — that we have free will. Philosophers have debated that concept for centuries, and now Haynes and other experimental neuroscientists are raising a new challenge. They argue that consciousness of a decision may be a mere biochemical afterthought, with no influence whatsoever on a person's actions. According to this logic, they say, free will is an illusion. "We feel we choose, but we don't," says Patrick Haggard, a neuroscientist at University College London. © 2013 Nature Publishing Group
Keyword: Consciousness
Link ID: 17988 - Posted: 04.05.2013
by Dennis Normile Puberty has always been a time of stress and emotional turmoil for adolescents and for their parents. And scientists have long recognized that kids who start puberty ahead of their peers are particularly likely to have trouble getting along with other children and with adults. New research suggests that those difficulties can be traced back to even earlier ages, indicating that early puberty may not be the root cause. Australian researchers drew on data for 3491 children, roughly half boys and half girls, who were recruited at ages 4 or 5 and then followed until they reached ages 10 or 11. Every 2 years, a researcher visited each subject's home, evaluated the child, and interviewed the primary caregiver, which in most cases was a parent, who later completed and returned a questionnaire about their child's behavior. The primary caregiver was also asked to judge the child's pubertal status, based on indicators of an early phase of puberty, such as breast growth in girls, adult-type body odor, and body hair, and of a later stage, such as growth spurts, deepening voices in boys, and menstruation in girls. Girls typically enter puberty at age 10 or 11 and boys at 11 or 12. The researchers found that 16% of the girls and 6% of the boys in the study had entered puberty early, at age 8 or 9. Previously, researchers thought that any negative effects of early puberty showed up only after puberty's onset. But by tracking a cohort of children from ages 4 to 5 to ages 10 to 11, they found that problems thought to be restricted to the post-puberty years actually appeared well before puberty. Retrospectively, they were able to show that children who later had early-onset puberty had difficulty playing with other children and participating in normal school activities, even when they were 4 or 5 years old. Boys, though not girls, in this group had also shown behavior problems, such as being overactive, losing their tempers, and preferring to play alone from a young age.
© 2010 American Association for the Advancement of Science.
Keyword: Development of the Brain; Hormones & Behavior
Link ID: 17987 - Posted: 04.03.2013
By Meghan Rosen Save the clunky tricorders for Star Trek. One day, tiny biological computers with DNA-based circuitry could diagnose diseases. Using snippets of DNA and DNA-clipping chemicals, researchers have created one key component of a computer’s brain: the transistor, a switch that helps electronics perform logic. The biological switch, dubbed a transcriptor, could be plugged together with other biological devices to boost the power of DNA-based computers, researchers report March 28 in Science. With these switches, researchers might be able to program probiotic bacteria — the kind found in yogurt — to detect signs of colon cancer and then spit out warning signals, says study coauthor Jerome Bonnet of Stanford University. “The bacteria could actually travel through your gut and make a color in your poop,” he says. Inside every smartphone, television and iPod, a computer chip holds circuits loaded with millions of transistors. By flipping on or off, the tiny switches direct electrical current to different parts of the chip. But inside cells, even just a few linked-up switches could be powerful, says synthetic biologist Timothy Lu of MIT. The simple circuits “probably wouldn’t be able to compute square roots,” he says, “but you don’t need to put a MacBook chip inside a cell to get some really interesting functions.” And genetic computers can go places conventional electronics can’t. Instead of controlling the flow of electrons across metal circuit wires, the biological switches control the flow of a protein along a “wire” of DNA in living bacteria. As the protein chugs along the wire, it sends out messages telling the cell to make specific molecules — molecules that color a person’s poop green, for example. © Society for Science & the Public 2000 - 2013
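The switching logic the article describes — a protein reading along a DNA "wire" until a terminator blocks it, with integrase inputs flipping terminators out of the way — can be sketched in a few lines. This is a toy model: the element names, the `intA`/`intB` signals, and the reporter gene below are invented for illustration, not the paper's actual constructs.

```python
def polymerase_flows(dna_wire, signals):
    """Walk RNA polymerase along a DNA 'wire'.

    Each terminator on the wire blocks flow unless the integrase signal
    controlling it has flipped that terminator out of the way. Returns
    True if polymerase reaches the end, i.e. the output gene is expressed.
    """
    for element in dna_wire:
        if element.startswith("terminator:"):
            controlling_signal = element.split(":", 1)[1]
            if not signals.get(controlling_signal, False):
                return False  # halted at an un-flipped terminator
    return True

# An AND gate: two terminators, each controlled by a different integrase,
# so the reporter gene is reached only when both inputs are present.
and_gate = ["terminator:intA", "terminator:intB", "gene:reporter"]
print(polymerase_flows(and_gate, {"intA": True, "intB": True}))   # prints True
print(polymerase_flows(and_gate, {"intA": True, "intB": False}))  # prints False
```

Just as with electronic transistors, chaining a handful of such gates is enough to compute simple logic inside a cell — the "color in your poop" readout would simply be the output gene at the end of the wire.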
Keyword: Robotics
Link ID: 17986 - Posted: 04.03.2013
By James Gallagher Health and science reporter, BBC News Eye drops designed to lower cholesterol may be able to prevent one of the most common forms of blindness, according to US researchers. They showed how high cholesterol levels could affect the immune system and lead to macular degeneration. Tests on mice and humans, published in the journal Cell Metabolism, showed that immune cells became destructive when they were clogged with fats. Others cautioned that the research was still at an early stage. The macula is the sweet spot in the eye which is responsible for fine detail. It is essential for reading, driving and recognising people's faces. Macular degeneration is more common in old age. It starts in a "dry" form in which the light-sensing cells in the eye become damaged, but can progress into the far more threatening "wet" version, when newly formed blood vessels can rapidly cause blindness. Doctors at the Washington University School of Medicine investigated the role of macrophages, a part of the immune system, in the transition from the dry to the wet form of the disease. One of the researchers, Dr Rajendra Apte, said the role of macrophages changed and they triggered the production of new blood vessels. "Instead of being protective, they accelerate the disease, but we didn't understand why they switched to become the bad cells," he told the BBC. Normally the cells can "eat" fatty deposits and send them back into the blood. However, their research showed that older macrophages struggle. They could still eat the fats, but they could not expel them. So they became "bloated", causing inflammation which in turn led to the creation of new blood vessels. BBC © 2013
Keyword: Vision
Link ID: 17985 - Posted: 04.03.2013
By Puneet Kollipara President Barack Obama has unveiled a long-term neuroscience research initiative that will develop new tools and technologies to study human and animal brains on larger scales than currently possible. Announced April 2, the BRAIN Initiative could ultimately help researchers better understand human behavior and thought and develop new ways to diagnose, treat and cure neurological and psychiatric diseases. The initiative is slated to begin in October, with $100 million budgeted for the project in fiscal year 2014. The National Institutes of Health, the Defense Advanced Research Projects Agency and the National Science Foundation will lead the effort, which Obama likened to the Human Genome Project in terms of its ambitious aims and the scientific and health benefits the initiative could yield. The human brain remains one of the greatest scientific mysteries. Researchers can now probe only a small number of neurons simultaneously or get relatively crude looks at specific regions or the entirety of the brain. But scientists believe that understanding the action of circuits containing thousands or millions of coordinated neurons could lead to a better understanding of how the brain works — as well as what goes wrong when it doesn’t. Short for Brain Research through Advancing Innovative Neurotechnologies, the BRAIN Initiative would seek to develop tools and technologies to measure and manipulate the firing patterns of all neurons in a circuit. Other new tools — hardware, software and databases — would store the data, make it public and analyze it. The initiative takes its inspiration from a research vision known as the Brain Activity Map, which originated from a group of neuroscientists, nanoscientists and research groups. © Society for Science & the Public 2000 - 2013
Keyword: Brain imaging
Link ID: 17984 - Posted: 04.03.2013
By DOUGLAS QUENQUA A new study suggests that primates’ ability to see in three colors may not have evolved as a result of daytime living, as has long been thought. The findings, published in the journal Proceedings of the Royal Society B, are based on a genetic examination of tarsiers, the nocturnal, saucer-eyed primates that long ago branched off from monkeys, apes and humans. By analyzing the genes that encode photopigments in the eyes of modern tarsiers, the researchers concluded that the last ancestor that all tarsiers had in common had highly acute three-color vision, much like that of modern-day primates. Such vision would normally indicate a daytime lifestyle. But fossils show that the tarsier ancestor was also nocturnal, strongly suggesting that the ability to see in three colors somehow predated the shift to daytime living. The coexistence of the two normally incompatible traits suggests that primates were able to function during twilight or bright moonlight for a time before making the transition to a fully diurnal existence. “Today there is no mammal we know of that has trichromatic vision that lives during night,” said an author of the study, Nathaniel J. Dominy, associate professor of anthropology at Dartmouth. “And if there’s a pattern that exists today, the safest thing to do is assume the same pattern existed in the past. “We think that tarsiers may have been active under relatively bright light conditions at dark times of the day,” he added. “Very bright moonlight is bright enough for your cones to operate.” © 2013 The New York Times Company
Keyword: Vision; Evolution
Link ID: 17983 - Posted: 04.02.2013
By ANAHAD O'CONNOR Doctors have plenty of good reasons to persuade people with sleep apnea to get it treated. The widespread disorder causes disruptions in breathing at night, which can ruin sleep and raise the likelihood of problems like obesity and fatigue. The standard treatment for the condition, a mask worn at night that delivers continuous positive airway pressure, or CPAP, significantly improves apnea, even though many people don’t like to wear it. But the mask may do more than restore normal breathing at night. Some research suggests it reduces inflammation, benefiting overall health. Many studies have looked at the link between sleep apnea and high levels of inflammatory markers. To get a clearer picture of the connection, a team of researchers recently carried out a meta-analysis, published last month, that pooled data from two dozen trials involving over 1,000 patients. The data suggested that treating apnea with CPAP significantly reduces levels of two proteins associated with inflammation: tumor necrosis factor and C-reactive protein, or CRP. Sleep apnea is a risk factor for several severe chronic conditions like Type 2 diabetes and heart disease. It’s not clear whether apnea helps drive the development of these disorders or vice versa. But reducing inflammation may be one way in which treatment with CPAP reverses some of the long-term consequences of the sleep disorder. THE BOTTOM LINE: Treating sleep apnea with positive airway pressure helps to lower systemic inflammation, which might prevent some of the other problems associated with the disorder. Copyright 2013 The New York Times Company
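Pooling two dozen trials into one estimate, as a meta-analysis does, can be sketched with a fixed-effect, inverse-variance average. The trial values below are made up for illustration (the study's actual model and per-trial numbers are not given in the article); the point is only how precise trials get more weight.

```python
def pool_fixed_effect(trials):
    """Inverse-variance (fixed-effect) pooled estimate.

    trials: list of (effect_size, variance) pairs, one per study.
    Each study is weighted by 1/variance, so precise studies
    (small variance) contribute proportionally more.
    Returns (pooled_effect, standard_error).
    """
    weights = [1.0 / var for _, var in trials]
    pooled = sum(w * eff for (eff, _), w in zip(trials, weights)) / sum(weights)
    standard_error = (1.0 / sum(weights)) ** 0.5
    return pooled, standard_error

# Hypothetical CRP reductions (mg/L) after CPAP from three made-up trials
trials = [(-0.5, 0.04), (-0.3, 0.09), (-0.6, 0.02)]
pooled, se = pool_fixed_effect(trials)
print(round(pooled, 2), round(se, 2))  # prints -0.53 0.11
```

A real meta-analysis would typically also test for heterogeneity between trials and may use a random-effects model instead, but the weighting idea is the same.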
Keyword: Sleep; Neuroimmunology
Link ID: 17982 - Posted: 04.02.2013
By Kathryn Doyle Despite concerns that antidepressant use during pregnancy might affect infants’ growth and development, a small new study finds no size differences in the first year of life between babies exposed and not exposed to the drugs. The medications — known as selective serotonin reuptake inhibitors, or SSRIs, which include fluoxetine (marketed as Prozac) and citalopram (Celexa) — have been tied to premature births and lower birth weight. But their effect on growth during infancy had not been studied. “It’s a reassuring finding in that when you have an illness during pregnancy, you want to know what is the impact of the illness and what is the impact of the medication,” Katherine Wisner, the study’s lead author, said. Untreated depression also didn’t seem to influence infant growth, according to Wisner, the director of Northwestern University’s Asher Center for the Study and Treatment of Depressive Disorders. That’s important because a baby’s most rapid growth happens in the first year, which sets the stage for growth patterns for the whole life span, she added. Wisner and her colleagues tracked 97 pregnant women without depression, 46 on antidepressants and 31 with depression that was not treated with medication. Their babies were measured and weighed four times over the first year of life. © 1996-2013 The Washington Post
Keyword: Depression; Development of the Brain
Link ID: 17981 - Posted: 04.02.2013
A new manual for mental disorders is slated to be released in May and video-game addiction experts are hoping for a new addition. The Diagnostic and Statistical Manual of Mental Disorders doesn’t currently list video-game addiction in its list of disorders, but tech-addiction expert Hilarie Cash said that needs to change. Cash runs Restart in Seattle, Wash., one of the few known internet and video game addiction rehabilitation centres in North and South America. “People’s lives completely fall apart, and there are people who die from it,” said Cash. “It’s rewarding, right? You get this pump and then it fades and you miss it and you want it back,” said one gamer, Valesquez. “It’s like a cycle.” He began gaming when he got his first Nintendo. When he shifted to gaming online, he saw major success. He was even sponsored by a gaming company at age 12. “If I couldn’t play, it was like profound boredom,” he said. That’s when it turned from a hobby into a habit for Valesquez. “I was generally playing, at the very peak, six to ten hours a day,” he said. As a result, his grades slid and he began to replace his real-life friends with ones that were online. “Even if you don’t want to play, you feel a responsibility to go online. It’s like a community,” said Valesquez. Cash said those are classic signs of video-game addiction. “Most people understand that gambling can become, can develop into a serious addiction, so it’s like that,” said Cash. She said the most addictive games have a social component and are competitive. © CBC 2013
Keyword: Drug Abuse
Link ID: 17980 - Posted: 04.02.2013
By C. CLAIBORNE RAY Q. Can cataracts grow back after they have been removed? A. “Once a cataract is removed, it cannot grow back,” said Dr. Jessica B. Ciralsky, an ophthalmologist at NewYork-Presbyterian Hospital/Weill Cornell Medical Center. Blurred vision may develop after cataract surgery, mimicking the symptoms of the original cataract. This is not a recurrence of the cataract and is from a condition that is easily treated, said Dr. Ciralsky, who is a cornea and cataract specialist. Cataracts, which affect about 22 million Americans over 40, are a clouding of the eye’s naturally clear crystalline lens. Besides blurred vision, the symptoms include glare and difficulty driving at night. In cataract surgery, the entire cataract is removed and an artificial lens is implanted in its place; the capsule that held the cataract is left intact to provide support for the new lens. After surgery, patients may develop a condition called posterior capsular opacification, which is often referred to as a secondary cataract. “This is a misnomer,” Dr. Ciralsky said. “The cataract has not actually grown back.” Instead, she explained, in about 20 percent of patients, the capsule that once supported the cataract has become cloudy, or opacified. A simple laser procedure done in the office can treat the problem effectively. © 2013 The New York Times Company
Keyword: Vision
Link ID: 17979 - Posted: 04.02.2013
By John McCarthy Maybe this discovery is interesting because it sheds therapeutic light on the dreaded neurodegenerative diseases that killed Woody Guthrie and Lou Gehrig. Or maybe it’s fascination with healthy cells, and yet another unsuspected complexity in how they work. What’s discovered: a previously unknown energy source in nerve cells. It propels the molecular “motors” that drag neurotransmitters from the nucleus where they’re made. The “motors” are assemblies of molecules. They walk like clumsy robots, with a staggering gait, dragging a capsule of neurotransmitter “bullets” along microtubule “highways” between nucleus and synapses. They move by flinging their boot-like feet forward, a billionth of a meter at each step. (A superb animation of “motors” in action is XVIVO’s “Life of a Cell” (at ~1:15 of playing time)). When the cargo finally arrives at the synapses, neurotransmitters are loaded into compartments at the synapse’s interior face, like bullets into a magazine. They are ready to be “fired” across a synapse to signal an adjoining neuron. It’s this transport of neurotransmitter “bullets” that failed in Guthrie’s and Gehrig’s nerve cells. Their synapses had nothing to fire. What powers the flinging that moves those boots? Previously, the answer has been specialized molecules (acronym: ATP) spewed into the cell’s fluid interior by mitochondria. The boots, it was thought, powered each step by grabbing a floating ATP and blowing it up like a firecracker. © 2013 Scientific American
Keyword: Huntingtons
Link ID: 17978 - Posted: 04.02.2013
By DENISE GRADY A treatment that many people with multiple sclerosis had hoped would prove effective has failed its first rigorous test, according to a new study. The treatment uses balloons — the type commonly employed to open blocked arteries in people with heart disease — to widen veins in the head and neck. The technique is based on the unproven theory that narrowed veins cause multiple sclerosis by stopping blood from draining out of the brain properly, which is thought to damage nerves and the fatty sheath, myelin, that insulates them. A vascular surgeon from Italy, Dr. Paolo Zamboni, is the leading proponent of the idea. In recent years, 30,000 people around the world have flocked to clinics offering the balloon treatment, despite the lack of solid evidence for it. Many patients want it because the standard drug treatments have not helped. Multiple sclerosis is incurable and causes progressive disability that eventually forces many patients to use wheelchairs. Some people think the balloon treatment has helped them, and testimonials on the Internet have helped create a powerful demand for the procedure. Researchers at the University at Buffalo recruited 20 patients with the disease to test the theory. Half were picked at random to receive the treatment, and the other half underwent a “sham” procedure in which doctors did not actually use balloons. The patients did not know whether their veins had been expanded, and neither did the people who assessed them later. The patients were monitored for six months. There were no significant differences between the two groups in symptoms or tests used to measure the quality of life, the researchers reported last month at a meeting of the American Academy of Neurology in San Diego. In a few cases, brain lesions associated with the disease actually seemed to worsen after the treatment. © 2013 The New York Times Company
Keyword: Multiple Sclerosis
Link ID: 17977 - Posted: 04.02.2013
By Scicurious Much as we all like to think we’re modest, most of us really aren’t. We might try to be humble and say “we’re just some guy, you know?“, but most often, we actually think we’re better than average. Maybe we think we’re smarter, or better looking, or nicer, or maybe even all of the above. And it turns out that thinking we’re above average (even though, statistically, only half of us CAN be above average) is actually good for us. People who suffer from depression usually show a symptom called “depressive realism”. They actually see themselves MORE REALISTICALLY than other people do. And seeing yourself in the harsh light of reality…well it’s pretty depressing (you don’t really want to know how average you are in a sea of over 6 billion people. You don’t). Thinking that you are better than you actually are is sometimes called the Dunning-Kruger effect (though that usually refers specifically to how competent you think you are…when really you’re not), but in psychology it’s called the Superiority Illusion: the belief that you are better than average in any particular metric. But where does the superiority illusion come from? How do our brains give us this optimism bias? The authors of this study wanted to look at how our brain might give us the idea that we are better than the other guy. They were particularly interested in the connection between two areas of the brain, the frontal cortex, and the striatum. The frontal cortex does a lot of higher processing (things like sense of self), while the striatum is involved in things like feelings of reward. The connection between these two areas is called the fronto-striatal circuit. And the strength of that connection may mean something for how you think of yourself. While people who think well of themselves have relatively low connectivity in this circuit, people with depression have higher levels of connectivity. The two areas are MORE connected. © 2013 Scientific American
Keyword: Emotions
Link ID: 17976 - Posted: 04.02.2013