Most Recent Links
By Lindsey Konkel For 28 years, Bill Gilmore lived in a New Hampshire beach town, where he surfed and kayaked. “I’ve been in water my whole life,” he said. “Before the ocean, it was lakes. I’ve been a water rat since I was four.” Now Gilmore can no longer swim, fish or surf, let alone button a shirt or lift a fork to his mouth. Earlier this year, he was diagnosed with amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease. In New England, medical researchers are now uncovering clues that appear to link some cases of the lethal neurological disease to people’s proximity to lakes and coastal waters. About five years ago, doctors at a New Hampshire hospital noticed a pattern in their ALS patients—many of them, like Gilmore, lived near water. Since then, researchers at Dartmouth-Hitchcock Medical Center have identified several ALS hot spots in lake and coastal communities in New England, and they suspect that toxic blooms of blue-green algae—which are becoming more common worldwide—may play a role. Now scientists are investigating whether breathing a neurotoxin produced by the algae may raise the risk of the disease. They have a long way to go, however: While the toxin does seem to kill nerve cells, no research, even in animals, has confirmed the link to ALS. As with all ALS patients, no one knows what caused Bill Gilmore’s disease. He was a big, strong guy—a carpenter by profession. One morning in 2011, his arms felt weak. “I couldn’t pick up my tools. I thought I had injured myself,” said Gilmore, 59, who lived half his life in Hampton and now lives in Rochester, N.H. © 2014 Scientific American
by Colin Barras It's not just great minds that think alike. Dozens of the genes involved in the vocal learning that underpins human speech are also active in some songbirds. And knowing this suggests that birds could become a standard model for investigating the genetics of speech production – and speech disorders. Complex language is a uniquely human trait, but vocal learning – the ability to pick up new sounds by imitating others – is not. Some mammals, including whales, dolphins and elephants, share our ability to learn new vocalisations. So do three groups of birds: the songbirds, parrots and hummingbirds. The similarities between vocal learning in humans and birds are not just superficial. We know, for instance, that songbirds have specialised vocal learning brain circuits that are similar to those that mediate human speech. What's more, a decade ago we learned that FOXP2, a gene known to be involved in human language, is also active in "area X" of the songbird brain – one of the brain regions involved in those specialised vocal learning circuits. Andreas Pfenning at the Massachusetts Institute of Technology and his colleagues have now built on these discoveries. They compared maps of genetic activity – transcriptomes – in brain tissue taken from the zebra finch, budgerigar and Anna's hummingbird, representing the three groups of vocal-learning birds. © Copyright Reed Business Information Ltd.
By Gail Sullivan Chemicals found in food and common household products have been linked to lower IQ in kids exposed to high levels during pregnancy. Previous research linked higher exposure to chemicals called “phthalates” to poor mental and motor development in preschoolers. This study was said to be the first to report a link between prenatal exposure to the chemicals and childhood development. Researchers from Columbia University’s Mailman School of Public Health studied exposure to five types of phthalates, which are sometimes referred to as “hormone disruptors” or “endocrine disruptors.” Among these, di-n-butyl phthalate (DnBP) is used in shower curtains, raincoats, hairspray, food wraps, vinyl and pill coating, among other things — but according to the EPA, the largest source of exposure may be seafood. Di-isobutyl phthalate (DiBP) and butylbenzyl phthalate (BBzP) are added to plastics to make them flexible. These chemicals may also be used in makeup, nail polish, lacquer and explosives. The researchers linked prenatal exposure to phthalates to a more than six-point drop in IQ score compared with kids with less exposure. The study, “Persistent Associations between Maternal Prenatal Exposure to Phthalates on Child IQ at Age 7 Years,” was published Wednesday in the journal PLOS ONE. “The magnitude of these IQ differences is troubling,” one of the study’s authors, Robin Whyatt, said in a press release. “A six- or seven-point decline in IQ may have substantial consequences for academic achievement and occupational potential.”
by Andy Coghlan To catch agile prey on the wing, dragonflies rely on the same predictive powers we use to catch a ball: that is, anticipating by sight where the ball will go and readying body and hand to snatch it from mid-air. Until now, dragonflies were thought to catch their prey without this predictive skill, instead blindly copying every steering movement made by their prey, which can include flies and bees. Now, sophisticated laboratory experiments have tracked the independent body and eye movements of dragonflies as they pursue prey, showing for the first time that dragonflies second guess where their prey will fly to next and then steer their flight accordingly. Throughout the pursuit, they lock on to their target visually while they orient their bodies and flight path for ultimate interception, rather than copying each little deviation in their prey's flight path in the hope of ultimately catching up with it. "The dragonfly lines up its body axis in the flight direction of the prey, but keeps the eyes in its head firmly fixed on the prey," says Anthony Leonardo of the Howard Hughes Medical Institute in Ashburn, Virginia. "It enables the dragonfly to catch the prey from beneath and behind, the prey's blind spot," he says. © Copyright Reed Business Information Ltd.
Link ID: 20412 - Posted: 12.13.2014
By Claudia Wallis Touch a hot frying pan and the searing message of pain sprints up to your brain and back down to your hand so fast that the impulse to withdraw your fingers seems instantaneous. That rapid-fire signal begins in a heat-sensing molecule called a TRPV1 channel. This specialized protein is abundant on the surface of sensory nerve cells in our fingers and elsewhere and is a shape-shifter that can take an open or closed configuration. Heat opens a central pore in the molecule, as do certain spider toxins and capsaicin—the substance that gives chili peppers their burn. Once the pore is open, charged ions of sodium and calcium flow into the nerve cell, triggering the pain signal. Ouch! As neuroscientist-journalist Stephani Sutherland explains in “Pain that Won’t Quit,” in the December Scientific American, researchers have long been interested in finding ways to moderate the action of this channel—and other ion channels—in patients who suffer from chronic pain. Shutting down the TRPV1 channel completely, however, is not an option because it plays a vital role in regulating body temperature. In two papers published in Nature in December 2013, investigators at the University of California, San Francisco, gave pain researchers a big leg up in understanding TRPV1. They revealed, in exquisite atomic detail, the structure of the channel molecule (from a rat) using an electron cryomicroscope, an instrument designed to explore the 3-D structure of molecules at very low temperatures. One of those investigators, Yifan Cheng, also created this colorful animation, showing how the molecule looks when the channel is open. © 2014 Scientific American
Keyword: Pain & Touch
Link ID: 20411 - Posted: 12.13.2014
By Dr. Mitesh Popat It’s common knowledge that eating better, exercising more, limiting alcohol intake and not smoking can lead to a healthier, longer life. For many, sustaining healthy behaviors is not easy. For diabetics, maintaining healthy behaviors is even more challenging, although it is critical. If well managed, the disease can be held in check; if not, it can be devastating, leading to kidney failure, blindness, stroke and even death. It may be a surprise that there is a strong association between depression, anxiety and diabetes. Not only can depression and anxiety seriously affect the ability to manage the disease, but there also is evidence that, for some, depression plays a role in actually causing diabetes. Research indicates that depression is unrecognized and untreated in approximately two-thirds of patients with diabetes. Whether cause or effect, the medical profession needs to do more to address the psychological issues associated with the disease. As a family medicine physician, I see the association on a daily basis. Some patients are so overwhelmed by the necessary daily self-care that comes with diabetes that they become highly anxious and depressed. Others who are suffering from complications or are having trouble managing their blood sugar levels may feel a loss of control and get anxious or depressed. These symptoms are often compounded in people who live in poverty, including the low-income Latinos, African Americans and seniors whom we care for at Marin Community Clinics. Diabetes has become an epidemic in these groups. Working three jobs and constantly worrying about making ends meet can trigger depression and anxiety in anyone. Add to that the need to adopt a disciplined healthy lifestyle, and it can be a real struggle.
Link ID: 20410 - Posted: 12.13.2014
By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, the Proceedings of the National Academy of Sciences published a report suggesting that playing action video games, Call of Duty: Black Ops II and the like, actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal, don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be. © 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014
By Nsikan Akpan Gut surgery is often the only option for life-threatening obesity and diabetes, but what if doctors could cut the pounds without using a knife? Scientists have engineered an antiobesity drug that rivals the dramatic benefits seen with surgery, dropping excess body weight by a third. Though the work was done only in rodents, the drug is the first to influence three obesity-related hormones in the gut at once. Bariatric surgery, including gastric bypass, typically involves limiting food intake by removing part of the stomach or intestines. Yet it does more than shrink the size of a patient’s stomach or intestines. It also changes the release of multiple gut-related hormones, explains clinical endocrinologist Stephen O'Rahilly of the University of Cambridge in the United Kingdom, who wasn’t involved with the study. That’s important, because years of eating a diet high in fat and sugar can throw a person’s metabolism into disarray. Cells undergo genetic reprogramming that negatively impacts how they process sugar and store fat, locking in obesity. This pattern makes it harder and harder to lose weight, even if a person changes their diet and begins exercising. Bariatric surgery interrupts that cycle by stimulating the production of several hormones that reduce blood sugar, burn fat, and curb appetite. (It may also change the composition of the gut’s microbes.) Three of these hormones are called glucagon-like peptide-1 (GLP-1), gastric inhibitory peptide (GIP), and glucagon. Cells in your gut release GLP-1 and GIP after a meal to keep your body’s blood sugar levels in a normal range. GLP-1 also curbs appetite, signaling to your brain that you are full. In type 2 diabetes, the body stops responding to GLP-1 and GIP, which contributes to hyperglycemia, or too much blood sugar. Hyperglycemia causes the devastating hallmarks of diabetes, such as kidney injury, cardiovascular disease, and nerve damage. © 2014 American Association for the Advancement of Science.
Link ID: 20408 - Posted: 12.10.2014
By ANDREW POLLACK It is either the most exciting new treatment for depression in years or it is a hallucinogenic club drug that is wrongly being dispensed to desperate patients in a growing number of clinics around the country. It is called ketamine — or Special K, in street parlance. While it has been used as an anesthetic for decades, small studies at prestigious medical centers like Yale, Mount Sinai and the National Institute of Mental Health suggest it can relieve depression in many people who are not helped by widely used conventional antidepressants like Prozac or Lexapro. And the depression seems to melt away within hours, rather than the weeks typically required for a conventional antidepressant. But some psychiatrists say the drug has not been studied enough to be ready for use outside of clinical trials, and they are alarmed that clinics are springing up to offer ketamine treatments, charging hundreds of dollars for sessions that must be repeated many times. “We don’t know what the long-term side effects of this are,” said Dr. Anthony J. Rothschild, a professor of psychiatry at the University of Massachusetts Medical School. Pharmaceutical companies hope to solve the problem by developing drugs that work like ketamine but without the side effects, which are often described as out-of-body experiences. © 2014 The New York Times Company
by Helen Thomson Zapping your brain might make you better at maths tests – or worse. It depends how anxious you are about taking the test in the first place. A recent surge of studies has shown that brain stimulation can make people more creative and better at maths, and can even improve memory, but these studies tend to neglect individual differences. Now, Roi Cohen Kadosh at the University of Oxford and his colleagues have shown that brain stimulation can have completely opposite effects depending on your personality. Previous research has shown that a type of non-invasive brain stimulation called transcranial direct current stimulation (tDCS) – which enhances brain activity using an electric current – can improve mathematical ability when applied to the dorsolateral prefrontal cortex, an area involved in regulating emotion. To test whether personality traits might affect this result, Cohen Kadosh's team tried the technique on 25 people who find mental arithmetic highly stressful, and 20 people who do not. They found that participants with high maths anxiety made correct responses more quickly and, after the test, showed lower levels of cortisol, an indicator of stress. On the other hand, individuals with low maths anxiety performed worse after tDCS. "It is hard to believe that all people would benefit similarly [from] brain stimulation," says Cohen Kadosh. He says that further research could shed light on how to optimise the technology and help to discover who is most likely to benefit from stimulation. © Copyright Reed Business Information Ltd.
Ian Sample, science editor Electrical brain stimulation equipment – which can boost cognitive performance and is easy to buy online – can have adverse effects, impairing brain functioning, research from scientists at Oxford University has shown. A steady stream of reports of stimulators being able to boost brain performance, coupled with the simplicity of the devices, has led to a rise in DIY enthusiasts who cobble the equipment together themselves, or buy it assembled on the web, then zap themselves at home. In science laboratories brain stimulators have long been used to explore cognition. The equipment uses electrodes to pass gentle electric pulses through the brain, to stimulate activity in specific regions of the organ. Roi Cohen Kadosh, who led the study, published in the Journal of Neuroscience, said: “It’s not something people should be doing at home at this stage. I do not recommend people buy this equipment. At the moment it’s not therapy, it’s an experimental tool.” The Oxford scientists used a technique called transcranial direct current stimulation (tDCS) to stimulate the dorsolateral prefrontal cortex in students as they did simple sums. The results of the test were surprising. Students who became anxious when confronted with sums became calmer and solved the problems faster than when they had sham stimulation (the stimulation itself lasted only 30 seconds of the half-hour study). The shock was that the students who did not fear maths performed worse with the same stimulation.
By Tina Rosenberg When Ebola ends, the people who have suffered, who have lost loved ones, will need many things. They will need ways to rebuild their livelihoods. They will need a functioning health system, which can ensure that future outbreaks do not become catastrophes. And they will need mental health care. Depression is the most important thief of productive life for women around the world, and the second-most important for men. We sometimes imagine it is a first-world problem, but depression is just as widespread, if not more so, in poor countries, where there is a good deal more to be depressed about. And it is more debilitating, as a vast majority of sufferers have no safety net. Health care for all must include mental health care. It’s hard to believe, but Liberia and Sierra Leone each have only a single psychiatrist. The Ebola crisis has exposed these countries’ malignant neglect of their health systems. People can’t get care for diarrhea and malaria. How will these countries take care of an epidemic of depression? This isn’t really a medical question. We know how to treat depression. What we don’t know yet is how to make effective treatment cheap, culturally appropriate, convenient and non-stigmatizing — all needed to get treatment out to millions and millions of people. But some researchers are finding out. They are doing so despite the fact that growing attention to this issue hasn’t been accompanied by money. The U.S. National Institute of Mental Health last year provided just $24.5 million for global mental health efforts, and the Canadian government’s Grand Challenges Canada, which is said to have the largest portfolio of mental health innovation in developing countries, has spent only $28 million on them since it began in 2010. © 2014 The New York Times Company
Link ID: 20404 - Posted: 12.08.2014
By Lenny Bernstein There are 60 million epileptics on the planet, and while advances in medication and implantable devices have helped them, the ability to better detect and even predict when they will have debilitating seizures would be a significant improvement in their everyday lives. Imagine, for example, if an epileptic knew with reasonable certainty that his next seizure would not occur for an hour or a day or a week. That might allow him to run to the market or go out for the evening or plan a short vacation with less concern. Computers and even dogs have been tested in the effort to do this, but now a group of organizations battling epilepsy is employing "big data" to help. They sponsored an online competition that drew 504 entrants who tried to develop algorithms that would detect and predict epileptic seizures. Instead of the traditional approach of asking researchers in a handful of labs to tackle the problem, the groups put huge amounts of data online that was recorded from the brains of dogs and people as they had seizures over a number of months. They then challenged anyone interested to use the information to develop detection and prediction models. "Seizure detection and seizure prediction," said Walter J. Koroshetz, deputy director of the National Institute of Neurological Disorders and Stroke (NINDS), are "two fundamental problems in the field that are poised to take significant advantage of large data computation algorithms and benefit from the concept of sharing data and generating reproducible results."
Link ID: 20403 - Posted: 12.08.2014
By Quassim Cassam Most people wonder at some point in their lives how well they know themselves. Self-knowledge seems a good thing to have, but hard to attain. To know yourself would be to know such things as your deepest thoughts, desires and emotions, your character traits, your values, what makes you happy and why you think and do the things you think and do. These are all examples of what might be called “substantial” self-knowledge, and there was a time when it would have been safe to assume that philosophy had plenty to say about the sources, extent and importance of self-knowledge in this sense. Not any more. With few exceptions, philosophers of self-knowledge nowadays have other concerns. Here’s an example of the sort of thing philosophers worry about: suppose you are wearing socks and believe you are wearing socks. How do you know that that’s what you believe? Notice that the question isn’t: “How do you know you are wearing socks?” but rather “How do you know you believe you are wearing socks?” Knowledge of such beliefs is seen as a form of self-knowledge. Other popular examples of self-knowledge in the philosophical literature include knowing that you are in pain and knowing that you are thinking that water is wet. For many philosophers the challenge is to explain how these types of self-knowledge are possible. This is usually news to non-philosophers. Most of them certainly imagine that philosophy tries to answer the Big Questions, and “How do you know you believe you are wearing socks?” doesn’t sound much like one of them. If knowing that you believe you are wearing socks qualifies as self-knowledge at all — and even that isn’t obvious — it is self-knowledge of the most trivial kind. Non-philosophers find it hard to figure out why philosophers would be more interested in trivial than in substantial self-knowledge. © 2014 The New York Times Company
Link ID: 20402 - Posted: 12.08.2014
By JOHN McWHORTER “TELL me, why should we care?” he asks. It’s a question I can expect whenever I do a lecture about the looming extinction of most of the world’s 6,000 languages, a great many of which are spoken by small groups of indigenous people. For some reason the question is almost always posed by a man seated in a row somewhere near the back. Asked to elaborate, he says that if indigenous people want to give up their ancestral language to join the modern world, why should we consider it a tragedy? Languages have always died as time has passed. What’s so special about a language? The answer I’m supposed to give is that each language, in the way it applies words to things and in the way its grammar works, is a unique window on the world. In Russian there’s no word just for blue; you have to specify whether you mean dark or light blue. In Chinese, you don’t say next week and last week but the week below and the week above. If a language dies, a fascinating way of thinking dies along with it. I used to say something like that, but lately I have changed my answer. Certainly, experiments do show that a language can have a fascinating effect on how its speakers think. Russian speakers are on average 124 milliseconds faster than English speakers at identifying when dark blue shades into light blue. A French person is a tad more likely than an Anglophone to imagine a table as having a high voice if it were a cartoon character, because the word is marked as feminine in his language. This is cool stuff. But the question is whether such infinitesimal differences, perceptible only in a laboratory, qualify as worldviews — cultural standpoints or ways of thinking that we consider important. I think the answer is no. Furthermore, extrapolating cognitive implications from language differences is a delicate business. 
In Mandarin Chinese, for example, you can express If you had seen my sister, you’d have known she was pregnant with the same sentence you would use to express the more basic If you see my sister, you know she’s pregnant. One psychologist argued some decades ago that this meant that Chinese makes a person less sensitive to such distinctions, which, let’s face it, is discomfitingly close to saying Chinese people aren’t as quick on the uptake as the rest of us. The truth is more mundane: Hypotheticality and counterfactuality are established more by context in Chinese than in English. © 2014 The New York Times Company
Link ID: 20401 - Posted: 12.08.2014
Carl Zimmer For thousands of years, fishermen knew that certain fish could deliver a painful shock, even though they had no idea how it happened. Only in the late 1700s did naturalists contemplate a bizarre possibility: These fish might release jolts of electricity — the same mysterious substance as in lightning. That possibility led an Italian physicist named Alessandro Volta in 1800 to build an artificial electric fish. He observed that electric stingrays had dense stacks of muscles, and he wondered if they allowed the animals to store electric charges. To mimic the muscles, he built a stack of metal disks, alternating between copper and zinc. Volta found that his model could store a huge amount of electricity, which he could unleash as shocks and sparks. Today, much of society runs on updated versions of Volta’s artificial electric fish. We call them batteries. Now a new study suggests that electric fish have anticipated other kinds of technology. The research, by Kenneth C. Catania, a biologist at Vanderbilt University, reveals a remarkable sophistication in the way electric eels deploy their shocks. Dr. Catania, who published the study on Thursday in the journal Science, found that the eels use short shocks like a remote control on their victims, flushing their prey out of hiding. And then they can deliver longer shocks that paralyze their prey at a distance, in precisely the same way that a Taser stops a person cold. “It shows how finely adapted eels are to attack prey,” said Harold H. Zakon, a biologist at the University of Texas at Austin, who was not involved in the study. He considered Dr. Catania’s findings especially impressive since scientists have studied electric eels for more than 200 years. © 2014 The New York Times Company
Link ID: 20400 - Posted: 12.06.2014
Kelly Servick Anesthesiologists and surgeons who operate on children have been dogged by a growing fear—that being under anesthesia can permanently damage the developing brain. Although the few studies of children knocked out for surgeries have been inconclusive, evidence of impaired development in nematodes, zebrafish, rats, guinea pigs, pigs, and monkeys given common anesthetics has piled up in recent years. Now, the alarm is reaching a tipping point. “Anything that goes from [the roundworm] C. elegans to nonhuman primates, I've got to worry about,” Maria Freire, co-chair of the U.S. Food and Drug Administration (FDA) science advisory board, told attendees at a meeting the agency convened here last month to discuss the issue. The gathering came as anesthesia researchers and regulators consider several moves to address the concerns: a clinical trial of anesthetics in children, a consensus statement about their possible risks, and an FDA warning label on certain drugs. But each step stirs debate. Many involved in the issue are reluctant to make recommendations to parents and physicians based on animal data alone. At the same time, more direct studies of anesthesia's risks in children are plagued by confounding factors, lack of funding, and ethical issues. “We have to generate—very quickly—an action item, because I don't think the status quo is acceptable,” Freire said at the 19 November meeting. “Generating an action item without having the data is where things become very, very tricky.” © 2014 American Association for the Advancement of Science
by Michael Slezak The elusive link between obesity and high blood pressure has been pinned down to the action of leptin in the brain, and we might be able to block it with drugs. We've known for more than 30 years that fat and high blood pressure are linked, but finding what ties them together has been difficult. One of the favourite candidates has been leptin – a hormone produced by fat cells. Under normal circumstances, when fat cells produce leptin, the hormone sends the message that you've had enough food. But in people with obesity, the body stops responding to this message, and large levels of leptin build up. Leptin is known to activate the regulatory network called the sympathetic nervous system, and it's the activation of sympathetic nerves on the kidneys that seem to be responsible for raising blood pressure. Leptin has thus been linked to blood pressure. However, conclusive evidence has been hard to come by. Michael Cowley of Monash University in Melbourne, Australia, and his colleagues have now conducted a string of experiments that provide some evidence. Through genetic and drug experiments in mice, they have pinpointed an area in the mouse brain that increases blood pressure when it is exposed to high leptin levels. This region is called the dorsomedial hypothalamus, and is thought to be involved in controlling energy consumption. Their findings show that high levels of leptin do indeed boost blood pressure, via this brain region. © Copyright Reed Business Information Ltd.
Link ID: 20398 - Posted: 12.06.2014
By Bret Stetka When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did. The hormone was oxytocin, often called our “love hormone” due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala—the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples—falls. The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign—like, say, a balloon popping—a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn’t subside and fear memories are more easily formed. To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks. © 2014 Scientific American
By Neuroskeptic | An important new study could undermine the concept of ‘endophenotypes’ – and thus derail one of the most promising lines of research in neuroscience and psychiatry. The findings are out now in Psychophysiology. Unusually, an entire special issue of the journal is devoted to presenting the various results of the study, along with commentary, but here’s the summary paper: Knowns and unknowns for psychophysiological endophenotypes by Minnesota researchers William Iacono, Uma Vaidyanathan, Scott Vrieze and Stephen Malone. In a nutshell, the researchers ran seven different genetic studies to try to find the genetic basis of a total of seventeen neurobehavioural traits, also known as ‘endophenotypes’. Endophenotypes are a hot topic in psychiatric neuroscience, although the concept is somewhat vague. The motivation behind interest in endophenotypes comes mainly from the failure of recent studies to pin down the genetic cause of most psychiatric syndromes. Essentially, an endophenotype is some trait, which could be almost anything, which is supposed to be related to (or part of) a psychiatric disorder or symptom, but which is “closer to genetics” or “more biological” than the disorder itself. Rather than thousands of genes all mixed together to determine the risk of a psychiatric disorder, each endophenotype might be controlled by only a handful of genes – which would thus be easier to find.