Most Recent Links




Sarah C. P. Williams

There's a reason people say "Calm down or you're going to have a heart attack." Chronic stress—such as that brought on by job, money, or relationship troubles—is suspected to increase the risk of a heart attack. Now, researchers studying harried medical residents and harassed rodents have offered an explanation for how, at a physiological level, long-term stress can endanger the cardiovascular system. It revolves around immune cells that circulate in the blood, they propose.

The new finding is "surprising," says physician and atherosclerosis researcher Alan Tall of Columbia University, who was not involved in the new study. "The idea has been out there that chronic psychosocial stress is associated with increased cardiovascular disease in humans, but what's been lacking is a mechanism," he notes.

Epidemiological studies have shown that people who face many stressors—from those who survive natural disasters to those who work long hours—are more likely to develop atherosclerosis, the accumulation of fatty plaques inside blood vessels. In addition to fats and cholesterols, the plaques contain monocytes and neutrophils, immune cells that cause inflammation in the walls of blood vessels. And when the plaques break loose from the walls where they're lodged, they can cause more extreme blockages elsewhere—leading to a stroke or heart attack.

Studying the effect of stressful intensive care unit (ICU) shifts on medical residents, biologist Matthias Nahrendorf of Harvard Medical School in Boston recently found that blood samples taken when the doctors were most stressed out had the highest levels of neutrophils and monocytes. To probe whether these white blood cells, or leukocytes, are the missing link between stress and atherosclerosis, he and his colleagues turned to experiments on mice.

© 2014 American Association for the Advancement of Science

Keyword: Stress
Link ID: 19761 - Posted: 06.23.2014

By Adam Carter, CBC News

Women who take antidepressants when they're pregnant could unknowingly predispose their kids to type 2 diabetes and obesity later on in life, new research out of McMaster University suggests. The study, conducted by associate professor of obstetrics and gynecology Alison Holloway and PhD student Nicole De Long, found a link between the antidepressant fluoxetine and increased risk of obesity and diabetes in children.

Holloway cautions that this is not a warning for all pregnant women to stop taking antidepressants, but rather to start a conversation about prenatal care and what works best on an individual basis. "There are a lot of women who really need antidepressants to treat depression. This is what they need," Holloway told CBC. "We're not saying you should necessarily take patients off antidepressants because of this — but women should have this discussion with their caregiver."

"Obesity and Type 2 diabetes in children is on the rise and there is the argument that it is related to lifestyle and availability of high calorie foods and reduced physical activity, but our study has found that maternal antidepressant use may also be a contributing factor to the obesity and diabetes epidemic."

According to a study out of Memorial University in St. John's, obesity rates in Canada tripled between 1985 and 2011. Canada also ranks poorly when it comes to its overall number of cases of diabetes, according to an international report from the Organization for Economic Co-operation and Development, released last year.

© CBC 2014

Keyword: Depression; Obesity
Link ID: 19760 - Posted: 06.23.2014

Nicola Davis

The old adage that we eat with our eyes appears to be correct, according to research that suggests diners rate an artistically arranged meal as more tasty – and are prepared to pay more for it. The team at Oxford University tested the idea by gauging the reactions of diners to food presented in different ways.

Inspired by Wassily Kandinsky's "Painting Number 201", Franco-Colombian chef and one of the authors of the study, Charles Michel, designed a salad resembling the abstract artwork to explore how the presentation of food affects the dining experience. "A number of chefs now are realising that they are being judged by how their foods photograph – be it in the fancy cookbooks [or], more often than not, when diners instagram their friends," explains Professor Charles Spence, experimental psychologist at the University of Oxford and a co-author of the study.

Thirty men and 30 women were each presented with one of three salads containing identical ingredients, arranged either to resemble the Kandinsky painting, a regular tossed salad, or a "neat" formation where each component was spaced away from the others. Seated alone at a table mimicking a restaurant setting, and unaware that other versions of the salad were on offer, each participant was given two questionnaires asking them to rate various aspects of the dish on a 10-point scale, before and after tucking into the salad.

Before participants sampled their plateful, the Kandinsky-inspired dish was rated higher for complexity, artistic presentation and general liking. Participants were prepared to pay twice as much for the meal as for either the regular or "neat" arrangements.

© 2014 Guardian News and Media Limited

Keyword: Chemical Senses (Smell & Taste); Attention
Link ID: 19759 - Posted: 06.23.2014

By ANDREW POLLACK

It is a tantalizingly simple idea for losing weight: Before meals, swallow a capsule that temporarily swells up in the stomach, making you feel full. Now, some early results for such a pill are in. And they are only partly fulfilling.

People who took the capsule lost 6.1 percent of their weight after 12 weeks, compared with 4.1 percent for those taking a placebo, according to results presented Sunday at an endocrinology meeting in Chicago. Gelesis, the company developing the capsule, declared the results a triumph and said it would start a larger study next year aimed at winning approval for the product, called Gelesis100.

"I'm definitely impressed, absolutely," Dr. Arne V. Astrup, head of the department of nutrition, exercise and sports at the University of Copenhagen in Denmark and the lead investigator in the study, said in an interview. He said the physical mode of action could make the product safer than many existing diet drugs, which act chemically on the brain to influence appetite.

But Dr. Daniel H. Bessesen, an endocrinologist at the University of Colorado who was not involved in the study, said weight loss of 2 percent beyond that provided by a placebo was "very modest." "It doesn't look like a game changer," he said.

Gelesis, a privately held company based in Boston, is one of many trying to come up with a product that can provide significant weight loss without bariatric surgery. Two new drugs — Qsymia from Vivus, and Belviq from Arena Pharmaceuticals and Eisai — have had disappointing sales since their approvals in 2012. Reasons include modest effectiveness, safety concerns, lack of insurance reimbursement and a belief among some doctors and overweight people that obesity is not a disease.

© 2014 The New York Times Company

Keyword: Obesity
Link ID: 19758 - Posted: 06.23.2014

by Frank Swain

When it comes to personal electronics, it's difficult to imagine iPhones and hearing aids in the same sentence. I use both and know that hearing aids have a well-deserved reputation as deeply uncool lumps of beige plastic worn mainly by the elderly. Apple, on the other hand, is the epitome of cool consumer electronics. But the two are getting a lot closer. The first "Made for iPhone" hearing aids have arrived, allowing users to stream audio and data between smartphones and the device. It means hearing aids might soon be desirable, even to those who don't need them.

A Bluetooth wireless protocol developed by Apple last year lets the prostheses connect directly to Apple devices, streaming audio and data while using a fraction of the power consumption of conventional Bluetooth. LiNX, made by ReSound, and Halo hearing aids made by Starkey – both international firms – use the iPhone as a platform to offer users new features and added control over their hearing aids. "The main advantage of Bluetooth is that the devices are talking to each other, it's not just one way," says David Nygren, UK general manager of ReSound.

This is useful as hearing aids have long suffered from a restricted user interface – there's not much room for buttons on a device the size of a kidney bean. This is a major challenge for hearing-aid users, because different environments require different audio settings. Some devices come with preset programmes, while others adjust automatically to what their programming suggests is the best configuration. This is difficult to get right, and often devices calibrated in the audiologist's clinic fall short in the real world.

© Copyright Reed Business Information Ltd.

Keyword: Hearing
Link ID: 19757 - Posted: 06.23.2014

Carl Zimmer

A novelist scrawling away in a notebook in seclusion may not seem to have much in common with an NBA player doing a reverse layup on a basketball court before a screaming crowd. But if you could peer inside their heads, you might see some striking similarities in how their brains were churning.

That's one of the implications of new research on the neuroscience of creative writing. For the first time, neuroscientists have used fMRI scanners to track the brain activity of both experienced and novice writers as they sat down — or, in this case, lay down — to turn out a piece of fiction. The researchers, led by Martin Lotze of the University of Greifswald in Germany, observed a broad network of regions in the brain working together as people produced their stories. But there were notable differences between the two groups of subjects. The inner workings of the professionally trained writers in the bunch, the scientists argue, showed some similarities to people who are skilled at other complex actions, like music or sports.

The research is drawing strong reactions. Some experts praise it as an important advance in understanding writing and creativity, while others criticize the research as too crude to reveal anything meaningful about the mysteries of literature or inspiration.

Dr. Lotze has long been intrigued by artistic expression. In previous studies, he has observed the brains of piano players and opera singers, using fMRI scanners to pinpoint regions that become unusually active in the brain. Needless to say, that can be challenging when a subject is singing an aria. Scanners are a lot like 19th-century cameras: They can take very sharp pictures, if their subject remains still. To get accurate data, Dr. Lotze has developed software that can take into account fluctuations caused by breathing or head movements.

© 2014 The New York Times Company

Keyword: Language; Brain imaging
Link ID: 19756 - Posted: 06.21.2014

Karen Ravn

To the west, the skies belong to the carrion crow. To the east, the hooded crow rules the roost. In between, in a narrow strip running roughly north to south through central Europe, the twain have met, and mated, for perhaps as long as 10,000 years. But although the crows still look very different — carrion crows are solid black, whereas hooded crows are grey — researchers have found that they are almost identical genetically.

The taxonomic status of carrion crows (Corvus corone) and hooded crows (Corvus cornix) has been debated ever since Carl Linnaeus, the founding father of taxonomy, declared them to be separate species in 1758. A century later, Darwin called any such classification impossible until the term 'species' had been defined in a generally accepted way. But the definition is still contentious, and many believe it always will be. The crows are known to cross-breed and produce viable offspring, so lack the reproductive barriers that some biologists consider essential to the distinction of a species, leading to proposals that they are two subspecies of carrion crow.

In fact, evolutionary biologist Jochen Wolf from Uppsala University in Sweden and his collaborators have now found that the populations living in the cross-breeding zone are so similar genetically that the carrion crows there are more closely related to hooded crows than to the carrion crows farther west [1]. Only a small part of the genome — less than 0.28% — differs between the populations, the team reports in this week's Science [1]. This section is located on chromosome 18, in an area associated with pigmentation, visual perception and hormonal regulation. It is no coincidence, the researchers suggest, that the main differences between carrion and hooded crows are in colouring, mating preferences (both choose mates whose colouring matches theirs), and hormone-influenced social behaviours (carrion crows lord it over hooded ones).

© 2014 Nature Publishing Group

Keyword: Sexual Behavior; Evolution
Link ID: 19755 - Posted: 06.21.2014

By Gary Stix

James DiCarlo: We all have this intuitive feel for what object recognition is. It's the ability to discriminate your face from other faces, a car from other cars, a dog from a camel, that ability we all intuitively feel. But making progress in understanding how our brains are able to accomplish that is a very challenging problem, and part of the reason is that it's challenging to define what it is and isn't. We take this problem for granted because it seems effortless to us. However, a computer vision person would tell you that this is an extremely challenging problem, because each object presents an essentially infinite number of images to your retina, so you essentially never see the same image of each object twice.

SA: It seems like object recognition is actually one of the big problems both in neuroscience and in the computational science of machine learning?

DiCarlo: That's right, and not only in machine learning but also in psychology or cognitive science, because the objects that we see are the sources in the world of what we use to build higher cognition, things like memory and decision-making. Should I reach for this, should I avoid it? Our brains can't do what you would call higher cognition without these foundational elements that we often take for granted.

SA: Maybe you can talk about what's actually happening in the brain during this process.

DiCarlo: It's been known for several decades that there's a portion of the brain, the temporal lobe down the sides of our head, that, when lost or damaged in humans and non-human primates, leads to deficits of recognition. So we had clues that that's where these algorithms for object recognition are living. But just saying that part of your brain solves the problem is not really specific. It's still a very large piece of tissue. Anatomy tells us that there's a whole network of areas that exist there, and now the tools of neurophysiology and still more advanced tools allow us to go in and look more closely at the neural activity, especially in non-human primates. We can then begin to decipher the actual computations to the level that an engineer might, for instance, in order to emulate what's going on in our heads.

© 2014 Scientific American

Keyword: Vision; Attention
Link ID: 19754 - Posted: 06.21.2014

By Indre Viskontas

He might be fictional. But the gigantic Hodor, a character in the blockbuster Game of Thrones series, nonetheless sheds light on something very much in the realm of fact: how our ability to speak emerges from a complex ball of neurons, and how certain brain-damaged patients can lose very specific aspects of that ability. According to George R.R. Martin, who wrote the epic books that inspired the HBO show, the 7-foot-tall Hodor could only say one word—"Hodor"—and everyone therefore tended to assume that was his name. Here's one passage about Hodor from the first novel in Martin's series:

Theon Greyjoy had once commented that Hodor did not know much, but no one could doubt that he knew his name. Old Nan had cackled like a hen when Bran told her that, and confessed that Hodor's real name was Walder. No one knew where "Hodor" had come from, she said, but when he started saying it, they started calling him by it. It was the only word he had.

Yet it's clear that Hodor can understand much more than he can say; he's able to follow instructions, anticipate who needed help, and behave in socially appropriate ways (mostly). Moreover, he says this one word in many different ways, implying very different meanings.

So what might be going on in Hodor's brain? Hodor's combination of impoverished speech production with relatively normal comprehension is a classic, albeit particularly severe, presentation of expressive aphasia, a neurological condition usually caused by a localized stroke in the front of the brain, on the left side. Some patients, however, have damage to that part of the brain from other causes, such as a tumor, or a blow to the head.

©2014 Mother Jones

Keyword: Language
Link ID: 19753 - Posted: 06.21.2014

Heidi Ledford

If shown to be possible in humans, addiction to the Sun could help explain why some tanners continue to seek out sunlight despite being well aware of the risks.

The lure of a sunny day at the beach may be more than merely the promise of fun and relaxation. A study published today reports that mice exposed to ultraviolet (UV) rays exhibit behaviours akin to addiction. The researchers found that mice exposed repeatedly to UV light produced an opioid called β-endorphin, which numbs pain and is associated with addiction to drugs. When they were given a drug that blocks the effect of opioids, the mice also showed signs of withdrawal — including shaky paws and chattering teeth.

If the results hold true in humans, they would suggest an explanation for why many tanners continue to seek out sunlight, despite the risks — and, in some cases, even after being diagnosed with skin cancer. "This offers a clear potential mechanism for how UV radiation can be rewarding and, in turn, potentially addictive," says Bryon Adinoff, an addiction psychiatrist at the University of Texas Southwestern Medical Center in Dallas, who was not involved with the study. "That's a big deal."

Oncologist David Fisher of the Massachusetts General Hospital in Boston and his colleagues became interested in sunlight addiction after studying the molecular mechanisms of pigment production in the skin after UV light exposure. In the new study published today in Cell [1], they show that in mice, some skin cells also synthesize β-endorphin in response to chronic, low doses of UV light.

© 2014 Nature Publishing Group

Keyword: Drug Abuse
Link ID: 19752 - Posted: 06.21.2014

By Robert Dudley

When we think about the origins of agriculture and crop domestication, alcohol isn't necessarily the first thing that comes to mind. But our forebears may well have been intentionally fermenting fruits and grains in parallel with the first Neolithic experiments in plant cultivation. Ethyl alcohol, the product of fermentation, is an attractive and psychoactively powerful inebriant, but fermentation is also a useful means of preserving food and of enhancing its digestibility. The presence of alcohol prolongs the edibility window of fruits and gruels, and can thus serve as a means of short-term storage for various starchy products. And if the right kinds of bacteria are also present, fermentation will stabilize certain foodstuffs (think cheese, yogurt, sauerkraut, and kimchi, for example). Whoever first came up with the idea of controlling the natural yeast-based process of fermentation was clearly on to a good thing.

Using spectroscopic analysis of chemical residues found in ceramic vessels unearthed by archaeologists, scientists know that the earliest evidence for intentional fermentation dates to about 7000 BCE. But if we look deeper into our evolutionary past, alcohol was a component of our ancestral primate diet for millions of years. In my new book, The Drunken Monkey, I suggest that alcohol vapors and the flavors produced by fermentation stimulate modern humans because of our ancient tendencies to seek out and consume ripe, sugar-rich, and alcohol-containing fruits. Alcohol is present because of particular strains of yeasts that ferment sugars, and this process is most common in the tropics where fruit-eating primates originated and today remain most diverse.

© 1986-2014 The Scientist

Keyword: Drug Abuse; Evolution
Link ID: 19751 - Posted: 06.21.2014

by Colin Barras

The Neanderthals knew how to make an entrance: teeth first. Our sister species' distinctive teeth were among the first unique aspects of their anatomy to evolve, according to a study of their ancestors. These early Neanderthals may have used their teeth as a third hand, gripping objects that they then cut with tools.

The claim comes from a study of fossils from Sima de los Huesos in northern Spain. This "pit of bones" may be an early burial site, and 28 near-complete skeletons have been pulled from it, along with a large hand-axe that might be a funeral gift. The hominins in the pit look like Neanderthals, but are far too old. That suggests they are forerunners of the Neanderthals, and if that is the case they can tell us how the species evolved.

To find out, Juan Luis Arsuaga Ferreras at the UCM-ISCIII Joint Centre for Research into Human Evolution and Behaviour in Madrid, Spain, and colleagues studied 17 of the skulls. They found that the brain case was still the same shape as in older species. But the skulls' protruding faces and small molar teeth were much more Neanderthal-like. This suggests the earliest Neanderthals used their jaws in a specialised way. It's not clear how, but it probably wasn't about food, says Ferreras. "There are no indications of any dietary specialisation in the Neanderthals and their ancestors. They were basically carnivores."

© Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 19750 - Posted: 06.21.2014

By Elizabeth Norton

A single dose of a century-old drug has eliminated autism symptoms in adult mice with an experimental form of the disorder. Originally developed to treat African sleeping sickness, the compound, called suramin, quells a heightened stress response in neurons that researchers believe may underlie some traits of autism. The finding raises the hope that some hallmarks of the disorder may not be permanent, but could be correctable even in adulthood.

That hope is bolstered by reports from parents who describe their autistic children as being caught behind a veil. "Sometimes the veil parts, and the children are able to speak and play more normally and use words that didn't seem to be there before, if only for a short time during a fever or other stress," says Robert Naviaux, a geneticist at the University of California, San Diego, who specializes in metabolic disorders.

Research also shows that the veil can be parted. In 2007, scientists found that 83% of children with autism disorders showed temporary improvement during a high fever. The timing of a fever is crucial, however: A fever in the mother can confer a higher risk for the disorder in the unborn child.

As a specialist in the cell's life-sustaining metabolic processes, Naviaux was intrigued. Autism is generally thought to result from scrambled signals at synapses, the points of contact between nerve cells. But given the specific effects of something as general as a fever, Naviaux wondered if the problem lay "higher up" in the cell's metabolism.

© 2014 American Association for the Advancement of Science.

Keyword: Autism
Link ID: 19749 - Posted: 06.19.2014

by Helen Thomson

Kullervo Hynynen is preparing to cross neuroscience's final frontier. In July he will work with a team of doctors in the first attempt to open the blood-brain barrier in humans – the protective layer around blood vessels that shields our most precious organ against threats from the outside world. If successful, the method would be a huge step in the treatment of pernicious brain diseases such as cancer, Parkinson's and Alzheimer's, by allowing drugs to pass into the brain.

The blood-brain barrier (BBB) keeps toxins in the bloodstream away from the brain. It consists of a tightly packed layer of endothelial cells that wrap around every blood vessel throughout the brain. It prevents viruses, bacteria and any other toxins passing into the brain, while simultaneously ushering in vital molecules such as glucose via specialised transport mechanisms.

The downside of this is that the BBB also completely blocks the vast majority of drugs. Exceptions include some classes of fat and lipid-soluble chemicals, but these aren't much help as such drugs penetrate every cell in the body – resulting in major side effects. "Opening the barrier is really of huge importance. It is probably the major limitation for innovative drug development for neurosciences," says Bart De Strooper, co-director of the Leuven Institute for Neuroscience and Disease in Belgium.

© Copyright Reed Business Information Ltd.

Keyword: Glia
Link ID: 19748 - Posted: 06.19.2014

By Brady Dennis

Government warnings a decade ago about the risks associated with children and adolescents taking antidepressants appear to have backfired, causing an increase in suicide attempts and discouraging many depressed young people from seeking treatment, according to a study published Wednesday in the academic journal BMJ. Researchers said their findings underscore how even well-intentioned public health warnings can produce unintended consequences, particularly when they involve widespread media attention and sensitive topics such as depression and suicide.

In 2003 and 2004, the Food and Drug Administration issued a series of warnings based on data that pointed to an increase in suicidal thinking among some children and adolescents prescribed a class of antidepressants known as selective serotonin reuptake inhibitors, or SSRIs. They included such drugs as Paxil and Zoloft. In late 2004, the agency directed manufacturers to include a "black box" warning on their labels notifying consumers and doctors about the increased risk of suicidal thoughts and behaviors in youths being treated with these medications.

The FDA warnings received a flood of media coverage that researchers said focused more on the tiny percentage of patients who had experienced suicidal thinking due to the drugs than on the far greater number who benefited from them. "There was a huge amount of publicity," said Stephen Soumerai, professor of population medicine at Harvard Medical School and a co-author of Wednesday's study. "The media concentrated more on the relatively small risk than on the significant upside."

Keyword: Depression
Link ID: 19747 - Posted: 06.19.2014

by Lauren Hitchings

Our brain's ability to rapidly interpret and analyse new information may lie in the musical hum of our brainwaves. We continuously take in information about the world, but establishing new neural connections and pathways – the process thought to underlie memory formation – is too slow to account for our ability to learn rapidly. Evan Antzoulatos and Earl Miller at the Massachusetts Institute of Technology decided to see if brainwaves – the surges of electricity produced by individual neurons firing en masse – play a role.

They used EEG to observe patterns of electrical activity in the brains of monkeys as they taught the animals to categorise patterns of dots into two distinct groups. At first, the monkeys memorised which dots went where, but as the task became harder, they shifted to learning the rules that defined the categories.

Humming brainwaves

The researchers found that, initially, brainwaves of different frequencies were being produced independently by the prefrontal cortex and the striatum – two brain regions involved in learning. But as the monkeys made sense of the game, the waves began to synchronise and "hum" at the same frequency – with each category of dots having its own frequency.

Miller says the synchronised brainwaves indicate the formation of a communication circuit between the two brain regions. He believes this happens before anatomical changes in brain connections take place, giving our minds time to think through various options when presented with new information before the right one gets laid down as a memory. Otherwise, the process is too time-consuming to account for the flexibility and speed of the human mind, says Miller.

© Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 19746 - Posted: 06.19.2014

Migraines have been diagnosed in about eight per cent of Canadians, a quarter or more of whom say the severe headaches affect day-to-day activities such as getting a good night's sleep or driving, Statistics Canada says. The federal agency on Wednesday released its first report on the prevalence of migraine, saying an estimated 2.7 million Canadians, or 8.3 per cent, reported they had been diagnosed with the severe headaches in 2010-2011. Chronic migraines are frequent, severe, pulsating headaches accompanied by nausea, vomiting, and sensitivity to light and sound.

"I think the key finding that was quite interesting was the impact of migraine," said report author Pamela Ramage-Morin, a senior analyst in Ottawa. "For three-quarters to say that it had an impact on their getting a good night sleep, over half said it prevented them from driving on some occasions, even people feeling left out of things because of their condition. There's some social isolation that could be occurring. It may be limiting on people's education and employment opportunities. That can have a long-term effect."

The sleep findings are important given lack of sleep can impact other aspects of life, Ramage-Morin said, noting how the effects can extend beyond the individual to the larger community. For both men and women surveyed, migraines were most common at ages 30 to 49, a group that represents 12 per cent of the population and covers the prime working years.

© CBC 2014

Keyword: Pain & Touch
Link ID: 19745 - Posted: 06.19.2014

by Laura Sanders

Some brain cells need a jolt of stress to snap to attention. Cells called astroglia help regulate blood flow, provide energy to nearby cells and even influence messages' movement between nerve cells. Now, scientists report June 18 in Neuron that astroglia can be roused by the stress molecule norepinephrine, an awakening that may help the entire brain jump into action.

As mice were forced to walk on a treadmill, an activity that makes them alert, astroglia in several parts of their brains underwent changes in calcium levels, a sign of activity, neuroscientist Dwight Bergles of Johns Hopkins University School of Medicine and colleagues found. Norepinephrine, which acts as a fight-or-flight hormone in the body and a neural messenger in the brain, seemed to cause the cell activity boost. When researchers depleted norepinephrine, treadmill walking no longer activated astroglia.

It's not clear whether astroglia in all parts of the brain heed this wake-up call, nor is it clear whether this activation influences behavior. Norepinephrine might help shift brain cells, both neurons and astroglia, into a state of heightened vigilance, the authors write.

© Society for Science & the Public 2000 - 2013.

Keyword: Stress; Glia
Link ID: 19744 - Posted: 06.19.2014

By PAM BELLUCK

Cindy Wachenheim was someone people didn't think they had to worry about. She was a levelheaded lawyer working for the State Supreme Court, a favorite aunt who got down on the floor to play with her nieces and nephews, and, finally, in her 40s, the mother she had long dreamed of becoming. But when her baby was a few months old, she became obsessed with the idea that she had caused him irrevocable brain damage. Nothing could shake her from that certainty, not even repeated assurances from doctors that he was normal.

"I love him so much, but it's obviously a terrible kind of love," she agonized in a 13-page handwritten note. "It's a love where I can't bear knowing he is going to suffer physically and mentally/emotionally for much of his life."

Ms. Wachenheim's story provides a wrenching case study of one woman's experience with maternal mental illness in its most extreme and rare form. It also illuminates some of the surprising research findings that are redefining the scientific understanding of such disorders: that they often develop later than expected and include symptoms not just of depression, but of other psychiatric illnesses.

Now these mood disorders, long hidden in shame and fear, are coming out of the shadows. Many women have been afraid to admit to terrifying visions or deadened emotions, believing they should be flush with maternal joy or fearing their babies would be taken from them. But now, advocacy groups on maternal mental illness are springing up, and some mothers are blogging about their experiences with remarkable candor. A dozen states have passed laws encouraging screening, education and treatment. And celebrities, including Brooke Shields, Gwyneth Paltrow and Courteney Cox, have disclosed their postpartum depression.

© 2014 The New York Times Company

Keyword: Depression; Hormones & Behavior
Link ID: 19743 - Posted: 06.17.2014

by Bethany Brookshire

When a cartoon character gets an idea, you know it. A lightbulb goes on over Wile E. Coyote's head, or a ding sounds as Goofy puts two and two together. While the lightbulb and sound effects are the stuff of cartoons, scientists can, in a way, watch learning in action. In a new study, a learning task in rats was linked to increases in activity patterns in groups of brain cells. The results might help scientists pin down what learning looks like at the nerve cell level, and give us a clue about how memories are made.

Different areas of the brain communicate with each other, transferring information from one area to another for processing and interpretation. Brain cell meets brain cell at connections called synapses. But to transfer information between areas often takes more than one neuron firing a lonely signal. It takes cortical oscillations — networks of brain cells sending electrical signals in concert — over and over again for a message to transmit from one brain area to another. Changes in electrical fields increase the probability that neurons in a population will fire.

These cortical oscillations are like a large crowd chanting. Not all voices may be yelling at once, some people may be ahead or behind, some may even be whispering, but you still hear an overwhelming "USA! USA!" Cortical oscillations can occur within a single brain area, or they can extend from one area to another. "The oscillation tells you what the other brain area is likely to 'see' when it gets that input," explains Leslie Kay, a neuroscientist at the University of Chicago. Once the receiving area 'sees' the incoming oscillation, it may synchronize its own population firing, joining in the chant. "A synchronized pattern of oscillations in two separate brain regions serves to communicate between the two regions," says Kei Igarashi, a neuroscientist at the Norwegian University of Science and Technology in Trondheim.

© Society for Science & the Public 2000 - 2013

Keyword: Learning & Memory
Link ID: 19742 - Posted: 06.17.2014