Most Recent Links
By Jesse Bering Disgust, in its most familiar form, is our response to something vile in the world—spoiled food, a dirty floor or rats cavorting in the subway. It is a contamination-avoidance mechanism that evolved to help us make biologically adaptive decisions in the heat of the moment. Yet disgust has also come to have powerful symbolic elements. When left unchecked, these symbolic qualities can have devastating impacts on our mental states. Consider, for example, the often dramatized, heartbreaking image of a woman crouched in the corner of a shower and frantically trying to scrub her body clean after being raped. Empirical evidence supports the characterization. Seventy percent of female victims of sexual assault report a strong impulse to wash afterward, and a quarter of these continue to wash excessively up to three months later. For women, simply imagining an unwanted advance can turn on this moral-cleansing effect. Psychiatrist Nichole Fairbrother of the University of British Columbia Hospital and her colleagues looked more closely at the phenomenon of mental pollution in a study published in 2005. Two groups of female participants were told to close their eyes and picture being kissed. The members of one group were instructed to imagine being aggressively cornered and kissed against their will. The members of the other group were asked to envision themselves in a consensual embrace. Only those women in the coercive condition chose to wash up after the study. In many cases, it seems as though a person's sense of self has become contaminated. © 2013 Scientific American
Keyword: Emotions
Link ID: 18811 - Posted: 10.19.2013
by Bruce Bower Thomas Jefferson defended the right to pursue happiness in the Declaration of Independence. But that’s so 237 years ago. Many modern societies champion everyone’s right to be happy pretty much all the time. Good luck with that, says psychologist Joseph Forgas of the University of New South Wales in Sydney. A lack of close friends, unfulfilled financial dreams and other harsh realities leave many people feeling lonely and forlorn a lot of the time. But there’s a mental and social upside to occasional downers that often goes unappreciated. “Bad moods are seen in our happiness-focused culture as representing a problem, but we need to be aware that temporary, mild negative feelings have important benefits,” Forgas says. Growing evidence suggests that gloomy moods improve key types of thinking and behavior, Forgas asserts in a new review paper aptly titled “Don’t worry, be sad!” For good evolutionary reasons, positive and negative moods subtly recruit thinking styles suited to either benign or troubling situations, he says. Each way of dealing with current circumstances generally works well, if imperfectly. New and recent studies described by Forgas in the June Current Directions in Psychological Science illustrate some of the ways in which periods of sadness spontaneously recruit a detail-oriented, analytical thinking style. Morose moods have evolved as early-warning signs of problematic or dangerous situations that demand close attention, these reports suggest. © Society for Science & the Public 2000 - 2013.
Keyword: Emotions; Depression
Link ID: 18810 - Posted: 10.19.2013
By Brian Palmer Myopia isn’t an infectious disease, but it has reached nearly epidemic proportions in parts of Asia. In Taiwan, for example, the percentage of 7-year-old children suffering from nearsightedness increased from 5.8 percent in 1983 to 21 percent in 2000. An incredible 81 percent of Taiwanese 15-year-olds are myopic. If you think that the consequences of myopia are limited to a lifetime of wearing spectacles—and, let’s be honest, small children look adorable in eyeglasses—you are mistaken. The prevalence of high myopia, an extreme form of the disorder, in Asia has more than doubled since the 1980s, and children who suffer myopia early in life are more likely to progress to high myopia. High myopia is a risk factor for such serious problems as retinal detachment, glaucoma, early-onset cataracts, and blindness. The explosion of myopia is a serious public health concern, and doctors have struggled to identify the source of the problem. Nearsightedness has a strong element of heritability, but the surge in cases shows that a child’s environment plays a significant role. A variety of risk factors has been linked to the disorder: frequent reading, participation in sports, television watching, protein intake, and depression. When each risk factor was isolated, however, its overall effect on myopia rates seemed to be fairly minimal. Researchers believe they are now closing in on a primary culprit: too much time indoors. In 2008 orthoptics professor Kathryn Rose found that only 3.3 percent of 6- and 7-year-olds of Chinese descent living in Sydney, Australia, suffered myopia, compared with 29.1 percent of those living in Singapore. The usual suspects, reading and time in front of an electronic screen, couldn’t account for the discrepancy. The Australian cohort read a few more books and spent slightly more time in front of the computer, but the Singaporean children watched a little more television. On the whole, the differences were small and probably canceled each other out. The most glaring difference between the groups was that the Australian kids spent 13.75 hours per week outdoors compared with a rather sad 3.05 hours for the children in Singapore. © 2013 The Slate Group, LLC.
Keyword: Vision; Development of the Brain
Link ID: 18809 - Posted: 10.19.2013
Daniel Cossins It may not always seem like it, but humans usually take turns speaking. Research published today in Current Biology shows that marmosets, too, wait for each other to stop calling before they respond during extended vocal exchanges. The discovery could help to explain how humans came to be such polite conversationalists. Taking turns is a cornerstone of human verbal communication, and is common across all languages. But with no evidence that non-human primates 'converse' similarly, it was not clear how such behaviour evolved. The widely accepted explanation, known as the gestural hypothesis, suggests that humans might somehow have taken the neural machinery underlying cooperative manual gestures such as pointing to something to attract another person's attention to it, and applied that to vocalization. Not convinced, a team led by Daniel Takahashi, a neurobiologist at Princeton University in New Jersey, wanted to see whether another primate species is capable of cooperative calling. The researchers turned to common marmosets (Callithrix jacchus) because, like humans, they are prosocial — that is, generally friendly towards each other — and they communicate using vocalizations. The team recorded exchanges between pairs of marmosets that could hear but not see each other, and found that the monkeys never called at the same time. Instead, they always waited for roughly 5 seconds after a caller had finished before responding. © 2013 Nature Publishing Group
Keyword: Language; Animal Communication
Link ID: 18808 - Posted: 10.19.2013
Sending up the alarm when a predator approaches seems like a good idea on the surface. But it isn’t always, because such warnings might help the predator pinpoint the location of its next meal. So animals often take their audience into account when deciding whether or not to warn it of impending danger. And a new study in Biology Letters finds that the vulnerability of that audience matters, at least when we’re talking about baby birds and their parents. Tonya Haff and Robert Magrath of Australian National University in Canberra studied a local species, the white-browed scrubwren, by setting up an experiment to see if parents' reactions to predators changed when the babies were more vulnerable. Baby birds are vulnerable pretty much all the time but more so when they’re begging for food. That whining noise can lead a predator right to them. But a parent’s alarm call can shut them right up. Haff and Magrath began by determining that parent scrubwrens would respond normally when they heard recordings of baby birds. (They used recordings because those are more reliable than getting little chicks to act on cue.) Then they played those recordings or one of background noise near scrubwren nests. The role of the predator was played by a taxidermied pied currawong, with a harmless fake crimson rosella (a kind of parrot) used as a control. The mama and papa birds called out their “buzz” alarm more often when the pied currawong was present and the baby bird recording was being played. They barely buzzed when the parrot was present or only background noise was played. The parents weren’t alarm calling more just to be heard over the noise, the researchers say. If that were the case, then a second type of call — a contact “chirp” that mamas and papas give when approaching a nest — should also have become more common, which it didn’t. © Society for Science & the Public 2000 - 2013.
Keyword: Animal Communication; Language
Link ID: 18807 - Posted: 10.19.2013
Sid Perkins One of the most complete early human skulls yet found suggests that what scientists thought were three hominin species may in fact be one. This controversial claim comes from a comparison of the anatomical features of a 1.8-million-year-old fossil skull with those of four other skulls from the same excavation site at Dmanisi, Georgia. The wide variability in their features suggests that Homo habilis, Homo rudolfensis and Homo erectus, the species so far identified as existing worldwide in that era, might represent a single species. The research is published in Science today. The newly described skull — informally known as 'skull 5' — was unearthed in 2005. When combined with a jawbone found five years before and less than 2 metres away, it “is the most complete skull of an adult from this date”, says Marcia Ponce de León, a palaeoanthropologist at the Anthropological Institute and Museum in Zurich, Switzerland, and one of the authors of the study. The volume of skull 5’s braincase is only 546 cubic centimetres, about one-third that of modern humans, she notes. Despite that low volume, the hominin’s face was relatively large and protruded more than the faces of the other four skulls found at the site, which have been attributed to H. erectus. Having five skulls from one site provides an unprecedented opportunity to study variation in what presumably was a single population, says co-author Christoph Zollikofer, a neurobiologist at the same institute as Ponce de León. All of the skulls excavated so far were probably deposited within a 20,000-year time period, he notes. © 2013 Nature Publishing Group
Keyword: Evolution
Link ID: 18806 - Posted: 10.19.2013
by Helen Thomson ONE moment you are alive. The next you are dead. A few hours later and you are alive again. Pharmacologists have discovered a mechanism that triggers Cotard's syndrome – the mysterious condition that leaves people feeling like they, or parts of their body, no longer exist. With the ability to switch the so-called walking corpse syndrome on and off comes the prospect of new insights into how conscious experiences are constructed. Acyclovir – also known by the brand name Zovirax – is a common drug used to treat cold sores and other herpes infections. It usually has no harmful side effects. However, about 1 per cent of people who take the drug orally or intravenously experience some psychiatric side effects, including Cotard's. These occur mainly in people who have renal failure. To investigate the potential link between acyclovir and Cotard's, Anders Helldén at Karolinska University Hospital in Stockholm and Thomas Lindén at the Sahlgrenska Academy in Gothenburg pooled data from Swedish drug databases along with hospital admissions. They identified eight people with acyclovir-induced Cotard's. One woman with renal failure began using acyclovir to treat shingles. She ran into a hospital screaming, says Helldén. After an hour of dialysis, she started to talk: she said the reason she was so anxious was that she had a strong feeling she was dead. After a few more hours of dialysis she said, "I'm not quite sure whether I'm dead any more but I'm still feeling very strange." Four hours later: "I'm pretty sure I'm not dead any more but my left arm is definitely not mine." Within 24 hours, the symptoms had disappeared. © Copyright Reed Business Information Ltd.
Keyword: Attention
Link ID: 18805 - Posted: 10.17.2013
by Denise Chow, LiveScience The discovery of a fossilized brain in the preserved remains of an extinct "mega-clawed" creature has revealed an ancient nervous system that is remarkably similar to that of modern-day spiders and scorpions, according to a new study. The fossilized Alalcomenaeus is a type of arthropod known as a megacheiran (Greek for "large claws") that lived approximately 520 million years ago, during a period known as the Lower Cambrian. The creature was unearthed in the fossil-rich Chengjiang formation in southwest China. Researchers studied the fossilized brain, the earliest known complete nervous system, and found similarities between the extinct creature's nervous system and the nervous systems of several modern arthropods, which suggest they may be ancestrally related. Living arthropods are commonly separated into two major groups: chelicerates, which include spiders, horseshoe crabs and scorpions, and a group that includes insects, crustaceans and millipedes. The new findings shed light on the evolutionary processes that may have given rise to modern arthropods, and also provide clues about where these extinct mega-clawed creatures fit in the tree of life. "We now know that the megacheirans had central nervous systems very similar to today's horseshoe crabs and scorpions," senior author Nicholas Strausfeld, a professor in the department of neuroscience at the University of Arizona in Tucson, said in a statement. "This means the ancestors of spiders and their kin lived side by side with the ancestors of crustaceans in the Lower Cambrian." © 2013 Discovery Communications, LLC.
Keyword: Evolution
Link ID: 18804 - Posted: 10.17.2013
By Christopher Wanjek and LiveScience Your liver could be "eating" your brain, new research suggests. People with extra abdominal fat are three times more likely than lean individuals to develop memory loss and dementia later in life, and now scientists say they may know why. It seems that the liver and the hippocampus (the memory center in the brain), share a craving for a certain protein called PPARalpha. The liver uses PPARalpha to burn belly fat; the hippocampus uses PPARalpha to process memory. In people with a large amount of belly fat, the liver needs to work overtime to metabolize the fat, and uses up all the PPARalpha — first depleting local stores and then raiding the rest of the body, including the brain, according to the new study. The process essentially starves the hippocampus of PPARalpha, thus hindering memory and learning, researchers at Rush University Medical Center in Chicago wrote in the study, published in the current issue of Cell Reports. Other news reports were incorrect in stating that the researchers established that obese individuals were 3.6 times more likely than lean individuals to develop dementia. That finding dates back to a 2008 study by researchers at the Kaiser Permanente Division of Research in Oakland, Calif. In another study, described in a 2010 article in the Annals of Neurology, researchers at Boston University School of Medicine found that the greater the amount of belly fat, the greater the brain shrinkage in old age. © 2013 Scientific American
Keyword: Learning & Memory; Development of the Brain
Link ID: 18803 - Posted: 10.17.2013
by Bob Holmes The great flowering of human evolution over the past 2 million years may have been driven not by the African savannahs, but by the lakes of that continent's Great Rift Valley. This novel idea, published this week, may explain why every major advance in the evolution of early humans, from speciation to the vast increase in brain size, appears to have taken place in eastern Africa. Anthropologists have surmised for several years that early humans, or hominins, might have evolved their unusually large, powerful brains to cope with an increasingly variable climate over the past few million years. However, studies testing this hypothesis have been equivocal, perhaps because most use global or continental-scale measures of climate, such as studying trends in the amount of airborne dust from dry earth that is blown into the ocean and incorporated into deep-sea sediments. Mark Maslin, a palaeoclimatologist at University College London, and his colleague Susanne Shultz at the University of Manchester, UK, have taken a local approach instead, by studying whether the presence or absence of lakes in the Rift Valley affected the hominins living there. Maslin's hunch is that relatively short periods of extreme variability 2.6, 1.8, and 1 million years ago – which are important periods for human evolution – corresponded to times of rapid change in the large lakes of the Great Rift Valley. Because the valley concentrates rainfall from a wide area into relatively small basins, these lakes are unusually sensitive to rainfall and swell or disappear depending on climate. © Copyright Reed Business Information Ltd.
Keyword: Evolution
Link ID: 18802 - Posted: 10.17.2013
By Jason G. Goldman Scientists love yawning. No, that’s not quite right. Scientists love doing research on yawning. It seems to be of interest to folks in fields ranging from primatology to developmental psychology to psychopathology to animal behavior. If the notion of scientifically investigating the purpose of yawning makes you, well, yawn, then you’re missing one of the more interesting debates in the social cognition literature. To understand why yawning is about more than feeling tired or bored, we have to go back a few years. Once upon a time, scientists thought that yawning might be a process through which the brain keeps itself cool. Yawning is associated with increases in blood pressure, and the consequential increase in blood flow might mean that the vascular system acts as a radiator, replacing the warm blood in the brain with cooler blood. It could also be that the deep inhalation of cold air during a yawn can, through convection, alter blood temperature which in turn could cool the brain. Even if it turns out that some yawns can be explained through purely physiological means, yawning is also contagious for humans and other species. If someone watches someone else yawning, they’ll be likely to yawn as well. That means that there is a social component to yawning, and it might be related to empathy. It turns out that there’s a correlation between a person’s self-reported empathy and their susceptibility to reacting to a yawn contagion, and those who are more skilled at theory of mind tasks are also more likely to yawn contagiously. © 2013 Scientific American
Keyword: Emotions; Evolution
Link ID: 18801 - Posted: 10.17.2013
Henry Astley In the Mark Twain story The Celebrated Jumping Frog of Calaveras County, a frog named Daniel Webster "could get over more ground at one straddle than any animal of his breed you ever see." Now, scientists have visited the real Calaveras County in hopes of learning more about these hopping amphibians. They’ve found that what they see in the lab doesn’t always match the goings-on in the real world. If you wanted to know how far the bullfrog Rana catesbeiana could jump, the scientific literature would give you one answer: 1.295 meters, published in Smithsonian Contributions to Zoology in 1978. If you looked at the Guinness Book of World Records, though, you'd find a different answer. In 1986, a bullfrog called Rosie the Ribeter covered 6.55 meters in three hops. If you divide by three, at least one of those hops had to be no shorter than 2.18 meters—about four bullfrog body lengths more than the number in the scientific paper. The disparity matters. If bullfrogs can hop only 1.3 meters, they have enough power in their muscles to pull off the jump without any other anatomical help. But if they can jump farther, they must also be using a stretchy tendon to power their hops—an ability that other frogs have but that researchers thought bullfrogs had lost. These particular amphibians, scientists speculated, might have made some kind of evolutionary tradeoff that shortened their jumps but enabled them to swim better in the water, where they spend much of their lives. © 2013 American Association for the Advancement of Science
Keyword: Miscellaneous
Link ID: 18800 - Posted: 10.17.2013
By Lary C. Walker Clumps of proteins twisted into aberrant shapes cause the prion diseases that have perplexed biologists for decades. The surprises just keep coming with a new report that the simple clusters of proteins responsible for Mad Cow and other prion diseases may, without help from DNA or RNA, be capable of changing form to escape the predations of drugs that target their eradication. Prion drug resistance could be eerily similar to that found in cancer and HIV—and may have implications for drug development for Alzheimer’s and Parkinson’s, neurodegenerative diseases also characterized by misfolded proteins. Prion diseases include scrapie, chronic wasting disease and bovine spongiform encephalopathy (mad cow disease) in nonhuman species, and Creutzfeldt-Jakob disease and fatal insomnia in humans. They are unusual in that they can arise spontaneously, as a result of genetic mutations, or, in some instances, through infection. Remarkably, the infectious agent is not a microbe or virus, but rather the prion itself, a clump of proteins without genetic material. The noxious agents originate when a normally generated protein – called the prion protein – mistakenly folds into a stable, sticky, and potentially toxic shape. When the misfolded protein contacts other prion protein molecules, they too are corrupted and begin to bind to one another. In the ensuing chain reaction, the prions grow, break apart, and spread; within the nervous system, they relentlessly destroy neurons, ultimately, and invariably, leading to death. © 2013 Scientific American
Keyword: Prions; Alzheimers
Link ID: 18799 - Posted: 10.17.2013
Anne Trafton, MIT News Office Schizophrenia patients usually suffer from a breakdown of organized thought, often accompanied by delusions or hallucinations. For the first time, MIT neuroscientists have observed the neural activity that appears to produce this disordered thinking. The researchers found that mice lacking the brain protein calcineurin have hyperactive brain-wave oscillations in the hippocampus while resting, and are unable to mentally replay a route they have just run, as normal mice do. Mutations in the gene for calcineurin have previously been found in some schizophrenia patients. Ten years ago, MIT researchers led by Susumu Tonegawa, the Picower Professor of Biology and Neuroscience, created mice lacking the gene for calcineurin in the forebrain; these mice displayed several behavioral symptoms of schizophrenia, including impaired short-term memory, attention deficits, and abnormal social behavior. In the new study, which appears in the Oct. 16 issue of the journal Neuron, Tonegawa and colleagues at the RIKEN-MIT Center for Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory recorded the electrical activity of individual neurons in the hippocampus of these knockout mice as they ran along a track. Previous studies have shown that in normal mice, “place cells” in the hippocampus, which are linked to specific locations along the track, fire in sequence when the mice take breaks from running the course. This mental replay also occurs when the mice are sleeping. These replays occur in association with very high frequency brain-wave oscillations known as ripple events.
Keyword: Schizophrenia
Link ID: 18798 - Posted: 10.17.2013
By MICHAEL WINES NEW HOLSTEIN, Wis. — Next to their white clapboard house on a rural road here, in long rows of cages set beneath the roofs of seven open-air sheds, Virginia and Gary Bonlander are raising 5,000 minks. Or were, anyway, until two Saturdays ago, when the police roused them from bed at 5 a.m. with a rap on their door. Outside, 2,000 minks were scampering away — up to 50 top-quality, full-length and, suddenly, free-range mink coats. “The backyard was full of mink. The driveway was full of mink,” Mrs. Bonlander recalled a few days ago. “Then, pshew” — she made a whooshing sound — “they were gone.” And not only in Wisconsin, the mink-raising capital of the United States. After something of a hiatus, the animal rights movement has resumed a decades-old guerrilla war against the fur industry with a vengeance — and hints of more to come. In New Holstein; in Grand Meadow, Minn.; in Coalville, Utah; in Keota, Iowa; and four other states, activists say, eight dark-of-night raids on mink farms have liberated at least 7,700 of the critters — more than $770,000 worth of pelts — just since late July. That is more such raids than in the preceding three years combined. Two more raids in Ontario and British Columbia freed 1,300 other minks and foxes during the same period, according to the North American Animal Liberation Press Office, which bills itself as a conduit for messages from anonymous animal rights activists. “What we’re seeing now is unprecedented,” Peter Young, a Santa Cruz, Calif., activist who was imprisoned in 2005 for his role in raids on six mink ranches, said in a telephone interview. Though still an outspoken defender of the animal rights movement and mink-ranch raids, Mr. Young says he has no contact with those who raid fur farms or commit other illegal acts and, in fact, does not know who they are. © 2013 The New York Times Company
Keyword: Animal Rights
Link ID: 18797 - Posted: 10.17.2013
by Colin Barras A part of all of us loves sums. Eavesdropping on the brain while people go about their daily activity has revealed the first brain cells specialised for numbers. Josef Parvizi and his colleagues at Stanford University in California enlisted the help of three people with epilepsy whose therapy involved placing a grid of electrodes on the surface of their brain that record activity. Neurons fired in a region called the intraparietal sulcus when the three volunteers performed arithmetic tests, suggesting they dealt with numbers. The team continued to monitor brain activity while the volunteers went about their normal activity in hospital. Comparing video footage of their stay with their brain activity revealed that the neurons remained virtually silent for most of the time, bursting into life only when the volunteers talked about numbers or numerical concepts such as "more than" or "less than". There is debate over whether some neural populations perform many functions or are involved in very precise tasks. "We show here that there is specialisation for numeracy," says Parvizi. Journal reference: Nature Communications, DOI: 10.1038/ncomms3528 © Copyright Reed Business Information Ltd.
Keyword: Attention
Link ID: 18796 - Posted: 10.16.2013
Ewen Callaway As a new study in the British Medical Journal reveals that 1 in 2000 people in the UK may harbour the infectious prion protein which causes variant Creutzfeldt–Jakob disease (vCJD), Nature explains what this means. The usually fatal condition is the human form of bovine spongiform encephalopathy — dubbed 'mad cow disease' in the UK after an outbreak of the disease in the 1980s. Both diseases are caused by misfolded proteins called prions, which induce other proteins in the brain to clump, eventually destroying neurons. Humans are thought to contract the disease by consuming beef containing infected bovine brain or other central nervous system tissue. But it also spreads through blood transfusions, and some worry that the prion disease is transmitted via contaminated surgical instruments. The BSE outbreak in the 1980s and 1990s led to a surge in British vCJD cases, and a total of 177 have been detected in the UK to date, with just one in the last two years. Cases of vCJD peaked in 2000, leading some scientists to speculate that the disease takes about a decade to develop. Yet other studies of different forms of CJD suggest its incubation time could be much longer — indicating that many Britons may be carrying the infection without symptoms. Studies have come to varying conclusions as to just how many people harbour the abnormal prion protein (PrP) that causes vCJD. Surveys of tens of thousands of appendices and tonsils, discarded after surgery, have come up with prevalence rates ranging from 1 in 4,000 to 1 in 10,000 to 0. © 2013 Nature Publishing Group
Keyword: Prions
Link ID: 18795 - Posted: 10.16.2013
Doug Greene, WVIT and NBC News staff Oreos are as addictive as cocaine, at least for lab rats, and just like us, they like the creamy center best. Eating the sugary treats activates more neurons in the brain’s “pleasure center” than drugs such as cocaine, the team at Connecticut College found. “Our research supports the theory that high-fat/high-sugar foods stimulate the brain in the same way that drugs do,” neuroscience assistant professor Joseph Schroeder says. “That may be one reason people have trouble staying away from them and it may be contributing to the obesity epidemic.” Schroeder’s neuroscience students put hungry rats into a maze. On one side went rice cakes. “Just like humans, rats don’t seem to get much pleasure out of eating them,” Schroeder said. On the other side went Oreos. Then the rats got the option of hanging out where they liked. They compared the results to a different test. In that one, rats on one side of the maze got an injection of saline while those on the other side got injections of cocaine or morphine. Rats seemed to like the cookies about as much as they liked the addictive drugs. When allowed to wander freely, they’d congregate on the Oreo side for about as much time as they would on the drug side. Oh, and just like most people - the rats eat the creamy center first.
Keyword: Drug Abuse; Obesity
Link ID: 18794 - Posted: 10.16.2013
by Simon Makin A drug similar to ketamine has been shown to work as an antidepressant, without the psychosis-like side effects associated with the party drug. In 2000, ketamine was seen to alleviate depression almost immediately in people for whom other treatments had failed. Larger clinical trials have since corroborated the findings. The drawback is that ketamine can cause hallucinations and other psychotic symptoms, making it unsuitable for use as a treatment. These effects also make it difficult to conduct randomised, placebo-controlled trials – the gold standard in clinical medicine – as it is obvious which participants have been given the drug. This meant that there was a possibility that the beneficial effects seen in previous trials were inflated. So a team led by Gerard Sanacora of Yale University and Mike Quirk of pharmaceutical firm AstraZeneca looked for an alternative compound. They decided to test lanicemine, a drug originally developed to treat epilepsy that targets the same brain receptors as ketamine. The team gave 152 people with moderate-to-severe depression and a history of poor response to antidepressants either lanicemine or a placebo three times a week, for three weeks. They were allowed to continue taking any medications they were already on. Before and after the trial the participants' level of depression was rated on a 60-point scale. After three weeks, those taking lanicemine were less depressed by an average of 13.5 points – 5.5 points better than those who took the placebo. The improvement was still statistically significant up to two weeks after the treatment ended. Dizziness was the only common side effect. © Copyright Reed Business Information Ltd.
Keyword: Depression; Drug Abuse
Link ID: 18793 - Posted: 10.16.2013
By Emilie Reas Think back to your first childhood beach vacation. Can you recall the color of your bathing suit, the softness of the sand, or the excitement of your first swim in the ocean? Early memories such as this often arise as faded snapshots, remarkably distinct from newer memories that can feel as real as the present moment. With time, memories not only lose their rich vividness, but they can also become distorted, as our true experiences tango with a fictional past. The brain’s ability to preserve or alter memories lies at the heart of our basic human experience. The you of today is molded not only by your personal history, but also by your mental visits to that past, prompting you to laugh over a joke heard yesterday, reminisce about an old friend or cringe at the thought of your awkward adolescence. When we lose those pieces of the past we lose pieces of our identity. But just where in the brain do those old memories go? Despite decades studying how the brain transforms memories over time, neuroscientists remain surprisingly divided over the answer. Some of the best clues as to how the brain processes memories have come from patients who can’t remember. If damage to a particular brain area results in memory loss, researchers can be confident that the region is important for making or recalling memories. Such studies have reliably shown that damage to the hippocampus, a region nestled deep inside the brain, prevents people from creating new memories. But a key question, still open to debate, is what happens to a memory after it’s made. Does it stay in the hippocampus or move out to other areas of the brain? To answer this, scientists have studied old memories formed before brain damage, only to discover a mix of inconsistent findings that have given rise to competing theories. © 2013 Scientific American
Keyword: Learning & Memory
Link ID: 18792 - Posted: 10.16.2013