Most Recent Links



Links 11181 - 11200 of 28882

by Jennifer Couzin-Frankel Publish your data, or else we will—that's the stark warning to drug companies in a new proposal released today. Peter Doshi (shown right), a postdoctoral fellow at Johns Hopkins University in Baltimore, Maryland, and his colleagues are fed up that only about half of all clinical trials are published. They want to change that by convincing researchers and journals to print data that have been publicly released through other means, such as litigation and Freedom of Information Act requests, but, practically speaking, are sitting dormant in the filing cabinets or computers of individual scientists. The unusual proposal is called RIAT, for Restoring Invisible and Abandoned Trials. It was published today in BMJ and also endorsed by PLOS Medicine. Doshi, who studies comparative effectiveness research, came up with the idea when his colleague, Swaroop Vedula, was analyzing reporting biases involving the drug gabapentin. Gabapentin's maker, Pfizer, had been sued for the way in which it marketed the drug for unapproved indications. During litigation, Pfizer had released thousands of pages involving gabapentin trials, and Vedula was poring through them. (One of the authors of the RIAT paper, Kay Dickersin, served as an expert witness against Pfizer in gabapentin litigation.) Pfizer had published only 12 of its 20 gabapentin trials. But Doshi's center at Hopkins had the clinical study reports detailing the results of the other eight. At the time, "it just hits me," Doshi says. "Why are we still referring to these as unpublished trials? Why aren't we publishing them ourselves?" © 2010 American Association for the Advancement of Science

Keyword: Depression; Schizophrenia
Link ID: 18275 - Posted: 06.15.2013

Published by scicurious What do the overconsumption of food and Obsessive-Compulsive Disorder (OCD) have in common? At first, this sounds like a trick question. But deep in the brain, the molecules underlying our behavior may come together for these two conditions. The first is MC4R, a receptor for melanocortin. It binds hormones and affects feeding behavior; mutations in MC4R are associated with severe overconsumption of high fat, high calorie foods and with obesity. A mouse without an MC4R gene will become severely obese compared to its wildtype counterparts. SAPAP3 is a protein that is associated with synapses, the spaces between neurons. It can regulate things like receptor levels that determine how well a neuron responds to excitatory input. But a knockout of SAPAP3 in mice produces something very different: severe overgrooming, a model of OCD. All rodents groom themselves; it's necessary to keep clean. But SAPAP3 knockouts groom themselves far, far too much, to the point of creating terrible lesions on their skin. This has been proposed as a model of OCD, as many people with OCD become obsessed with cleanliness, and will do things like, say, washing their hands, to the point of severely damaging their skin. So a knockout of MC4R creates obese mice. A knockout of SAPAP3 creates overgrooming mice. You might think that if you combined the two knockouts, you would get severely obese mice that also overgroomed. But you don't. Instead, you get mice that, to all appearances, seem completely normal. No obesity. No overgrooming. Neurotic Physiology Copyright © 2013

Keyword: Obesity; OCD - Obsessive Compulsive Disorder
Link ID: 18274 - Posted: 06.15.2013

By E. Paul Zehr As an infant, the Man Of Steel escaped Krypton’s red sun in a rocket lovingly prepared for him by his parents. Kal-L (but more commonly known as Kal-El) arrived under our yellow sun in Smallville to eventually become Clark Kent. Since his debut in Action Comics #1 in June of 1938, Superman has accumulated a pretty long list of “super abilities”. For me, though, I really like the list of his abilities that comes from the 1940s radio serials. This was back when Superman was described as “faster than a speeding bullet, more powerful than a locomotive, and able to leap tall buildings in a single bound”. These descriptions all have to do with super-strength when you get right down to it. And with this summer’s “Man of Steel” Superman re-boot, super-strength is the focus of this post. I have to admit I’ve always found the explanation for Superman’s powers to be, well, a bit dubious. He has his powers because of our yellow sun. That is, because he was from a red sun planet (Krypton), somehow the yellow sun of Earth unleashes some inner super power mechanism that gives Superman all his…super-ness. Of course it’s a bit of pure escapist fun. But what if there actually was something to that? I don’t mean something to the “yellow sun / red sun” stuff. You can just check in with our “friendly neighborhood physics” professor Jim Kakalios and his book “Physics of Superheroes” for the real deal on that one. I mean rather the unleashing of some inner mechanism bit. What if something inside the human body could be unleashed—like removing the shackles from Hercules—and allow for dramatically increased strength? © 2013 Scientific American

Keyword: Muscles; Genes & Behavior
Link ID: 18273 - Posted: 06.15.2013

by Trisha Gura A rare genetic disease may be going to the dogs. About six in 100,000 babies are born with centronuclear myopathy, which weakens skeletal muscles so severely that children have trouble eating and breathing and often die before age 18. Now, by discovering a very similar condition in canines, researchers have a means to diagnose the disease, unravel its molecular intricacies, and target new therapies. The story began when Jocelyn Laporte, a geneticist at the Institute of Genetics and Molecular and Cellular Biology in Strasbourg, France, uncovered the genetic roots of an odd form of centronuclear myopathy that showed up in a Turkish family. Three children, two of them fraternal twins, were born normal. Then, at the age of 3-and-a-half, they grew progressively and rapidly ill. (Most forms of the illness do not come on so suddenly.) The twins died by the age of 9. Their younger brother recently reached the same age but is very ill. Investigators traced the problem to a mutation in a gene called BIN1, which makes a protein that helps shape the muscle so that it can respond to nerve signals that initiate muscle contraction. To find out how mutations in this gene could lead to such dire consequences, other researchers tried to genetically engineer mouse models. But deleting the BIN1 gene failed to recreate the disease in mice, so the researchers had to look elsewhere. Laporte's team joined with geneticist and veterinarian Laurent Tiret, at the Alfort School of Veterinary Medicine in Paris, to tap a network of vets in the United States, United Kingdom, Canada, Australia, and France. The idea was to track down and analyze dogs that had spontaneously acquired a similar condition. Because of their longer lifespans and larger size, the canines could model how the disease progresses and might respond to new therapies. © 2010 American Association for the Advancement of Science

Keyword: Muscles; Movement Disorders
Link ID: 18272 - Posted: 06.15.2013

By Elizabeth Landau, CNN Philadelphia (CNN) -- Martha Farah is leaning forward, furiously typing on her thin laptop in her spacious office at the University of Pennsylvania. Awards, paintings and posters lean against the walls on the floor as she puts the final touches on a grant proposal. "I hate it, but I love it!" she exclaims, in a voice that often rises melodically to stress words with enthusiasm. "The adrenaline!" Farah, 57, built a career that has taken many exciting turns. The scope of her work in the field is impressive: She has studied vision, brain-enhancing drugs and socioeconomic influences on the brain, among other topics. Currently, she is the founding director for Penn's Center for Neuroscience and Society. "One of the things that really drew me to her was her interest in applying the tools and insights of cognitive neuroscience to socially relevant questions," said Andrea Heberlein, a former postdoctoral fellow in Farah's lab and current lecturer at Boston College. "How can we make the world better, using these tools?" After completing her undergraduate education at Massachusetts Institute of Technology, Farah studied experimental psychology in the 1970s and '80s at Harvard University, where she earned her Ph.D. The prevailing idea among scientists at the time was that the mind is like computer software and the brain is like the hardware; software would explain "cognitive" phenomena such as memory, problem-solving and information processing. CNN© 2013 Cable News Network

Keyword: Development of the Brain; Learning & Memory
Link ID: 18271 - Posted: 06.15.2013

By LIZ ALDERMAN PARIS — On a recent day in the shadow of the Arc de Triomphe, a line of 20 people spilled onto the sidewalk of a trendy new boutique, eager to get a taste of its latest gourmet offerings. A sign in the window promoted piña colada as the store’s flavor of the month. A woman wearing a Chanel jacket said she wanted to try peach. But this was no temple of gastronomy. It was one of scores of electronic cigarette shops that have been springing up by the week in Paris as well as in numerous cities across Europe and the United States. Inside the ClopiNette boutique, shoppers can choose from among more than 60 flavors of nicotine liquid — including Marlboro and Lucky Strike flavors — all in varying strengths and arranged in color-coded rows. (ClopiNette is a play on “clope,” French slang for a cigarette.) “It’s like visiting a Nespresso store,” said Anne Stephan, a lawyer specializing in health issues at a nearby law firm. What drives her into the store is a desire shared by many: to give up smoking tobacco without kicking the smoking habit. After smoking 20 cigarettes daily for 25 years and failing to quit, Ms. Stephan said she had cut down to one a day in the three months since she began puffing on a so-called e-cig. Using technology that turns nicotine-infused propylene glycol into an inhalable vapor, e-cigarettes smoke almost like the real thing, without the ashtray odor. © 2013 The New York Times Company

Keyword: Drug Abuse
Link ID: 18270 - Posted: 06.13.2013

Alison Abbott A simple brain scan may offer a way to predict which people being treated for depression will respond to drugs, and which will respond to cognitive behavioural therapy. Neurologist Helen Mayberg from Emory University in Atlanta, Georgia, and her colleagues have run the first systematic, well-controlled study to identify the first potential biomarker that distinguishes between treatment responses. The work is published in JAMA Psychiatry1. Psychiatrists are desperate for such biomarkers, because fewer than 40% of people with depression go into remission after initial treatment. “It could be fabulous,” says Steven Zalcman, chief of clinical neuroscience research at the US National Institute of Mental Health (NIMH) in Bethesda, Maryland. But he cautions that the brain-scan biomarker still has to be validated in further trials — a process that could take a couple of years. Mayberg and her colleagues selected 82 people with untreated depression, and measured glucose metabolism in their brains using positron emission tomography (PET) scans. They then randomly assigned the subjects to treatment groups. One group received the common antidepressant drug escitalopram oxalate (a selective serotonin reuptake inhibitor, or SSRI) for 12 weeks. The other group received 16 sessions of cognitive behavioural therapy over the same period. © 2013 Nature Publishing Group

Keyword: Depression; Brain imaging
Link ID: 18269 - Posted: 06.13.2013

Comedian and writer Ruby Wax, a regular on British television, has clinical depression. In her book published last week, Sane New World (Hodder & Stoughton, 2013), she describes her struggles with different therapies and her fear of being ‘found out’. She is not alone. A 2010 survey in Europe revealed that 38% of people had a diagnosed mental disorder — including 7% with major depression. The proportion is likely to be similar in all populations, even in Africa, where psychiatric disease barely features on the health agenda. The stigma attached to such disorders means that many people do not admit to their illness. The same stigma discourages investment, so that research funding is not proportional to the distress these disorders cause. Why lobby for better treatments for depression or schizophrenia when there are ‘real’ diseases out there, such as cancer? Wax has been through the catalogue of available therapies and says that she has settled on an approach known as ‘mindfulness’, which helps to keep her depression under control. It may seem that the various therapies are inadequate, given that initial treatment of depression fails in 60% or more of cases. It is true that more treatment options are badly needed. Yet evidence-based cognitive behavioural therapies and drugs already developed by the pharmaceutical industry can work splendidly for long periods — if they are given to the right patients. How do you recognize the right patients? Treatment decisions tend to be based on the preferences of physicians or their patients, often with a missionary zeal that gives no credence to the idea that a personalized approach would be more appropriate. © 2013 Nature Publishing Group

Keyword: Depression
Link ID: 18268 - Posted: 06.13.2013

By NICHOLAS BAKALAR Hearing loss in older adults increases the risk for hospitalization and poor health, a new study has found, even taking into account other risk factors. Researchers analyzed data on 529 men and women over 70 with normal hearing, comparing them with 1,140 whose hearing was impaired, most with mild or moderate hearing loss. The data were gathered in a large national health survey in 2005-6 and again in 2009-10. The results appeared in The Journal of the American Medical Association. After adjusting for race, sex, education, hypertension, diabetes, stroke, cardiovascular disease and other risks, the researchers found that people with poor hearing were 32 percent more likely to be hospitalized, 36 percent more likely to report poor physical health and 57 percent more likely to report poor emotional or mental health. The authors acknowledge that this is an association only, and that there may be unknown factors that could have affected the result. “There has been a belief that hearing loss is an inconsequential part of aging,” said the lead author, Dr. Frank R. Lin, an associate professor of otolaryngology at Johns Hopkins. “But it’s probably not. Everyone knows someone with hearing loss, and as we think about health costs, we have to take its effects into account.” Copyright 2013 The New York Times Company

Keyword: Hearing
Link ID: 18267 - Posted: 06.13.2013

By Nathan Seppa Soccer players who hit the ball with their head a lot don’t score as well on a memory test as players who head the ball less often, a new study finds. Frequent headers are also associated with abnormalities in the white matter of the brain, researchers report June 11 in Radiology. “These changes are subtle,” says Inga Koerte, a radiologist at Harvard Medical School and Brigham and Women’s Hospital in Boston. “But you don’t need a concussive trauma to get changes in the microstructure of your brain.” While soccer players can get concussions from colliding with goal posts, the ground or each other, concussions are uncommon from heading the ball, even though it can move at 80 kilometers per hour, says coauthor Michael Lipton, a neuroradiologist at the Albert Einstein College of Medicine in New York City. He and his colleagues took magnetic resonance imaging scans of 28 men and nine women who played amateur soccer. The players, with an average age of 31, tallied up their games and practice sessions in the previous year and estimated how many headers they had done in each. Most players headed the ball hundreds of times; some hit thousands of headers. The MRIs revealed brain abnormalities in some players, mainly in the white matter of three regions of the brain. White matter coats nerve fibers, and bundles of fibers cross and converge in the three regions. But the areas aren’t associated with a single function, Lipton says. Attention, memory, sensory inputs and visual and spatial functions could all be processed there. © Society for Science & the Public 2000 - 2013

Keyword: Brain Injury/Concussion; Learning & Memory
Link ID: 18266 - Posted: 06.13.2013

A team of NIH-supported researchers is the first to show, in mice, an unexpected two-step process that happens during the growth and regeneration of inner ear tip links. Tip links are extracellular tethers that link stereocilia, the tiny sensory projections on inner ear hair cells that convert sound into electrical signals, and play a key role in hearing. The discovery offers a possible mechanism for potential interventions that could preserve hearing in people whose hearing loss is caused by genetic disorders related to tip link dysfunction. The work was supported by the National Institute on Deafness and Other Communication Disorders (NIDCD), a component of the National Institutes of Health. Stereocilia are bundles of bristly projections that extend from the tops of sensory cells, called hair cells, in the inner ear. Each stereocilia bundle is arranged in three neat rows that rise from lowest to highest like stair steps. Tip links are tiny thread-like strands that link the tip of a shorter stereocilium to the side of the taller one behind it. When sound vibrations enter the inner ear, the stereocilia, connected by the tip links, all lean to the same side and open special channels, called mechanotransduction channels. These pore-like openings allow potassium and calcium ions to enter the hair cell and kick off an electrical signal that eventually travels to the brain where it is interpreted as sound. The findings build on a number of recent discoveries in laboratories at NIDCD and elsewhere that have carefully plotted the structure and function of tip links and the proteins that comprise them. Earlier studies had shown that tip links are made up of two proteins — cadherin-23 (CDH23) and protocadherin-15 (PCDH15) — that join to make the link, with PCDH15 at the bottom of the tip link at the site of the mechanotransduction channel, and CDH23 on the upper end. Scientists assumed that the assembly was static and stable once the two proteins bonded.

Keyword: Hearing
Link ID: 18265 - Posted: 06.13.2013

by Alyssa Danigelis Next time you happen across an enormous cockroach, check to see whether it’s got a backpack on. Then look for the person controlling its movements with a phone. The RoboRoach has arrived. The RoboRoach is a system created by two University of Michigan grads with backgrounds in neuroscience, Greg Gage and Tim Marzullo. They came up with the cyborg roach idea as part of an effort to show students what real brain spiking activity looks like using off-the-shelf electronics. Essentially the RoboRoach involves taking a real live cockroach, putting it under anesthesia and placing wires in its antennae. Then the cockroach is outfitted with a special lightweight little backpack Gage and Marzullo developed that sends pulses to an antenna, causing the neurons to fire and the roach to think there’s a wall on one side. So it turns. The backpack connects to a phone via Bluetooth, enabling a human user to steer the cockroach through an app. Why? Why would anyone do this? ”We want to create neural interfaces that the general public can use,” the scientists say in a video. “Typically, to understand how these hardware devices and biological interfaces work, you’d have to go to graduate school in a neuro-engineering lab.” They added that the product is a learning tool, not a toy, and through it they hope to start a neuro-revolution. Currently the duo’s Backyard Brains startup is raising money through a Kickstarter campaign to develop more fine-tuned prototypes, make them more affordable, and extend battery life. The startup says it will make the RoboRoach hardware by hand in an Ann Arbor hacker space. © 2013 Discovery Communications, LLC

Keyword: Robotics
Link ID: 18264 - Posted: 06.12.2013

By Keith Payne It was a summer evening when Tony Cornell tried to make the residents of Cambridge, England see a ghost. He got dressed up in a sheet and walked through a public park waving his arms about. Meanwhile his assistants observed the bystanders for any hint that they noticed something strange. No, this wasn’t Candid Camera. Cornell was a researcher interested in the paranormal. The idea was first to get people to notice the spectacle, and then see how they understood what their eyes were telling them. Would they see the apparition as a genuine ghost or as something more mundane, like a bloke in a bed sheet? The plan was foiled when not a single bystander so much as raised an eyebrow. Several cows did notice, however, and they followed Cornell on his ghostly rambles. Was it just a fluke, or did people “not want to see” the besheeted man, as Cornell concluded in his 1959 report? Okay, that stunt was not a very good experiment, but twenty years later the eminent psychologist Ulric Neisser did a better job. He filmed a video of two teams of students passing a basketball back and forth, and superimposed another video of a girl with an umbrella walking right through the center of the screen. When he asked subjects in his study to count the number of times the ball was passed, an astonishing 79 percent failed to notice the girl with the umbrella. In the years since, hundreds of studies have backed up the idea that when attention is occupied with one thing, people often fail to notice other things right before their eyes. When you first learn about these studies they seem deeply strange. Is it really possible that we are constantly failing to notice things right in front of us? Is there some mysterious force screening what we see and what remains hidden? © 2013 Scientific American

Keyword: Attention
Link ID: 18263 - Posted: 06.12.2013

by Douglas Heaven Here's a game of join the dots. Hundreds of genes have been linked to autism, but to understand the roles they play – and the ones most likely to eventually lead to treatment – we need to work out how they are connected to one another. Now Caleb Webber at the University of Oxford and his colleagues have done that, creating the largest interacting network yet of genes linked to autism. The team looked at the DNA of 181 individuals with autism and found they often had either more or fewer copies of certain genes known to be important for the transmission of brain signals, a system thought to go awry with autism. By feeding these genes into a computer model and tracing their interactions, they were able to build a web-like network, with genes in the centre having many connections and those at the edge just a few. They found that many of the genes identified in the 181 people were connected to others that had previously been linked to autism. By turning certain genes on or off in the model, the team found that extra or missing copies of genes in the centre of the network were much more likely to disrupt brain signalling – and therefore be influential in autism – than those on the edge. © Copyright Reed Business Information Ltd.

Keyword: Autism; Genes & Behavior
Link ID: 18262 - Posted: 06.12.2013

By Felicity Muth I recently came across an article entitled ‘Advantages in exploring a new environment with the left eye in lizards’ and I couldn’t help but read more. In this study, conducted in Italy, scientists caught 44 wall lizards and glued eye patches on to them (using a paper glue that is harmless to the lizards as they can shed and renew their skin). Half the lizards had their left eye covered, and half had their right eye covered. The lizards were then let into a maze for 20 minutes to see how they fared with turning left and right. The ones that were allowed to use just their left eye were much faster than those that could just use their right eye at turning both left and right. In addition to this, they made fewer stops, seeming to be less hesitant and indecisive than the right-eyed individuals. However, this was only the case when the lizard had to make a choice between turning left or right, not when they only had the choice to turn one way. Why might this be the case? Well, like a lot of vertebrates, lizards have lateralized brains. This means that the brain is divided in two halves, and some functions are specialized to one half. The classic example of this in humans is Broca’s area (associated with speech), which is found in the left hemisphere of the brain in 95% of us. Similar to how humans on the whole prefer to use their right hand, it seems that lizards generally prefer to use their left eye. As with humans, lizard optic nerve fibres are crossed over, meaning that control of the left eye comes from the right hemisphere of the brain and vice versa. As these lizards predominantly use their left eye, this indicates that in this species, something in the right side of their brain is specialised in attending to spatial cues. © 2013 Scientific American

Keyword: Laterality; Vision
Link ID: 18261 - Posted: 06.12.2013

by Jennifer Viegas It goes a little something like this: A young male zebra finch, whose father taught him a song, shared that song with a brother, with the two youngsters then creating new tunes based on dad’s signature sound. The musical bird family, described in the latest Biology Letters, strengthens evidence that imitation between siblings and similar-aged youngsters facilitates vocal learning. The theory could help to explain why families with multiple same sex siblings, such as the Bee Gees and the Jackson 5, often form such successful musical groups. Co-author Sébastien Derégnaucourt told Discovery News that, among humans, “infants have a visual preference for peers of the same age, which may facilitate imitation.” He added that it’s also “known that children can have an impact on each other’s language acquisition, such as in the case of the emergence of creole languages, whether spoken or signed, among children exposed to pidgin (a grammatically simplified form of a language).” Pidgin in this case is more like pigeon, since the study focused on birds. Derégnaucourt, an associate professor at University Paris West, collaborated with Manfred Gahr of the Max Planck Institute for Ornithology. The two researchers studied how the young male zebra finch from a bird colony in Germany learned from his avian dad. © 2013 Discovery Communications, LLC.

Keyword: Language; Sexual Behavior
Link ID: 18260 - Posted: 06.12.2013

By Felicity Muth Pigs are one of the top animals consumed across the world. According to the US Census Bureau, around one hundred million metric tons of pork were consumed in 2010, with 10% of this being in the US (although it does seem that overall meat consumption is declining). With so many of us eating pork, you might think we’d know a bit more about these animals. A lot of people are surprised to hear about some of the cognitive abilities of the average pig. While it’s problematic to call an animal ‘intelligent’ or not, as the term is ill-defined and too often related to human cognition, pigs have shown us that they have a number of cognitive abilities tested across many different types of test. They have good learning and memory in many contexts (both short- and long-term), including episodic memory (memory for past events in their life), the ability to differentiate between familiar and unfamiliar pigs, and an inclination to explore novel objects. In addition to these behavioural feats, the pig brain is well-developed. For example, the volume of the prefrontal cortex is around 24% of the total neocortex and 10% of the total brain volume, comparable to primates including humans. I’m not sure why, despite this research, pigs have a reputation for being ‘stupid’. Similar to the ‘three-second memory’ myth with fish, I wonder if it’s perpetuated to make people not feel bad about eating these animals, or the conditions under which they are often reared. © 2013 Scientific American

Keyword: Intelligence; Evolution
Link ID: 18259 - Posted: 06.12.2013

by Satoshi Kanazawa in The Scientific Fundamentalist Drinking alcohol is evolutionarily novel, so the Hypothesis would predict that more intelligent people drink more alcohol than less intelligent people. The human consumption of alcohol probably originates from frugivory (consumption of fruits). Fermentation of sugars by yeast naturally present in overripe and decaying fruits produces ethanol, known to intoxicate birds and mammals. However, the amount of ethanol in such fruits ranges from trace to 5%, roughly comparable to light beer. (And you can't really get drunk on light beer.) It is nothing compared to the amount of alcohol present in regular beer (4-6%), wine (12-15%), and distilled spirits (20-95%). Human consumption of alcohol, however, was unintentional, accidental, and haphazard until about 10,000 years ago. The intentional fermentation of fruits and grain to yield ethanol arose only recently in human history. The production of beer, which relies on a large amount of grain, and that of wine, which similarly requires a large amount of grapes, could not have taken place before the advent of agriculture around 8,000 BC and the consequent agricultural surplus. Archeological evidence dates the production of beer and wine to Mesopotamia at about 6,000 BC. The origin of distilled spirits is far more recent, and is traced to the Middle East or China at about 700 AD. The word alcohol - al kohl - is Arabic in origin, like many other words that begin with "al," like algebra, algorithm, alchemy, and Al Gore. Human experience with concentrations of ethanol higher than the 5% attained by decaying fruits is therefore very recent. © Copyright 2002-2013 Sussex Directories, Inc.

Keyword: Drug Abuse; Intelligence
Link ID: 18258 - Posted: 06.12.2013

By Scicurious When I am stressed (and I’m stressed a lot of the time, as I bet a lot of you are as well), I turn to coffee. Not just to keep me going through the time when I need to get things done, but also for relaxation. For me, the smell and taste of coffee brings me thoughts of relaxing conversations with friends and other fun times. But what if the memories weren’t all the relaxing the caffeine was doing for me? What if the chronic caffeine consumption was keeping my stressful life at bay? It’s time to look at adenosine 2A receptors in the hippocampus. Don’t worry, the coffee will be back. First let’s talk about stress. Specifically, childhood stress. In small doses, stress exposure can actually be good for you, but in large, or prolonged, doses, it’s definitely not. There are effects immediately after stress, as well as long-term ones. When you suffer strong stressors in development, you can end up with changes all the way into adulthood, from cognitive deficits to predisposition to psychiatric disorders. Why is stress in development so important? During development, our brains are developing too, particularly our hippocampus. While the hippocampus is best known for its role in memory and spatial navigation, it’s also extremely important in emotional responses. Neuronal growth in the hippocampus can come from enriched environments or chronic antidepressants, and death of those neurons can come from chronic stress. Chronic stress also disrupts the hypothalamic-pituitary-adrenal axis (the HPA axis). And that’s just in adults! During development, animals are very susceptible to stress, and the hippocampus is still developing its connections. And we’re still figuring out what changes occur during early life stress and how they relate to behaviors in adulthood. © 2013 Scientific American

Keyword: Drug Abuse; Stress
Link ID: 18257 - Posted: 06.11.2013

By Sandra G. Boodman Through repeated painful experience, Shannon Bream had learned to keep her eyedrops close at hand wherever she went — even in the shower. Although they did little to quell the near-constant thrum of pain, the lubricating drops were better than nothing. She clutched the bottle while working out at the gym and kept extras in her purse, car and desk. At night, she set her alarm clock to ring every few hours so she could use them; failing to do so, she had discovered, meant waking up in pain that felt “like someone was stabbing me in the eye,” she said. “Daytime was okay, I could function, but nights had become an absolute nightmare,” said Bream, who covers the Supreme Court for Fox News. But a doctor’s suggestion that she was exaggerating her worsening misery, coupled with the bleak future presented on the Internet message boards she trolled night after night searching for help, plunged her into despair. “I didn’t think I could live like this for another 40 years,” she recalled thinking during her 18-month ordeal. Ironically, it was those same message boards that helped steer Bream to the doctor who provided a correct diagnosis and a satisfactory resolution. In the middle of one night in February 2010, Bream, then 39, awoke suddenly with pain in her left eye “so searing it sat me straight up in bed.” She stumbled to the bathroom, where she frantically rummaged through the medicine cabinet and grabbed various eyedrops, hoping to dull the pain. Her eye was tearing profusely; after about three hours, both the pain and tearing subsided. © 1996-2013 The Washington Post

Keyword: Pain & Touch; Vision
Link ID: 18256 - Posted: 06.11.2013