Most Recent Links
Pagan Kennedy In 2011, Ben Trumble emerged from the Bolivian jungle with a backpack containing hundreds of vials of saliva. He had spent six weeks following indigenous men as they tramped through the wilderness, shooting arrows at wild pigs. The men belonged to the Tsimane people, who live as our ancestors did thousands of years ago — hunting, foraging and farming small plots of land. Dr. Trumble had asked the men to spit into vials a few times a day so that he could map their testosterone levels. In return, he carried their kills and helped them field-dress their meat — a sort of roadie to the hunters. Dr. Trumble wanted to find out whether the hunters who successfully shot an animal would be rewarded with a spike in testosterone. (They were.) As a researcher with the Tsimane Health and Life History Project, he had joined a long-running investigation into human well-being and aging in the absence of industrialization. That day when he left the jungle, he stumbled across a new and more urgent question about human health. He dropped his backpack, called his mom and heard some terrible news: His 64-year-old uncle had learned he had dementia, probably Alzheimer’s. In just a few short years, his uncle, a vibrant former lawyer, would stop speaking, stop eating and die. “I couldn’t help my uncle,” Dr. Trumble said, but he was driven to understand the disease that killed him. He wondered: Do the Tsimane suffer from Alzheimer’s disease like we do? And if not, what can we learn from them about treating or preventing dementia? “There is really no cure yet for Alzheimer’s,” Dr. Trumble told me. “We have nothing that can undo the damage already done.” Why, he wondered, had billions of dollars and decades of research yielded so little? Perhaps major clues were being missed. Dr. Trumble was trained as an anthropologist, and his field — evolutionary medicine — taught him to see our surroundings as a blip in the timeline of human history. 
He thinks it’s a problem that medical research focuses almost exclusively on “people who live in cities like New York or L.A.” Scientists often refer to these places as “WEIRD” — Western, educated, industrialized, rich and democratic — and point out that our bodies are still adapted to the non-WEIRD environment in which our species evolved. Yet we know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming. Studying the Tsimane, Dr. Trumble believes, could shed light on this modern plague. © 2017 The New York Times Company
Keyword: Alzheimers
Link ID: 23839 - Posted: 07.15.2017
Brandie Jefferson It wasn't long ago that there were no treatments for multiple sclerosis. In the 1970s, some doctors used chemotherapy to treat the degenerative neurological disease. Since then, more than a dozen drugs have been developed or approved, including infusions, oral medications and self-administered shots. None of these is a magic bullet for a disease that can be disabling and deadly. But now there is a new drug, Ocrevus, that looks like a game-changer. It takes a novel approach to blocking the inflammation that drives the disease and appears to be spectacularly effective. It also costs $65,000 a year. I have MS. Should I take Ocrevus? That, I discovered, is not a simple question to answer. But because I'm an MS patient and a science journalist, I was determined to try to figure it out. In March, the FDA approved Ocrevus (ocrelizumab) for the treatment of relapsing-remitting multiple sclerosis, the most common form of the disease. People with RRMS tend to have flare-ups when their symptoms worsen, followed by periods of remission and, in some cases, a full or partial recovery. In two clinical trials sponsored by the drug's eventual manufacturer, F. Hoffmann-La Roche, RRMS patients who were given ocrelizumab had about 50 percent fewer relapses and up to 95 percent fewer new lesions on the brain and spinal cord than those who were given Rebif, a common therapy. MS is an autoimmune disease, meaning the body attacks itself. The body's nerve fibers and the fatty tissue that coats them, called myelin, bear the brunt of the immune system's attacks. As a result, the central nervous system has difficulty communicating with the nerves, leading to a disease that manifests itself in different ways, such as pain, fatigue, disability and slurred speech. © 2017 npr
Keyword: Multiple Sclerosis
Link ID: 23838 - Posted: 07.15.2017
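The trial figures quoted above ("about 50 percent fewer relapses") are relative reductions against the Rebif comparator. A minimal sketch of that arithmetic, using hypothetical annualized relapse rates chosen to mirror the reported effect size (not the trial's actual data):

```python
def relative_reduction(control_rate, treatment_rate):
    """Fraction by which the treatment event rate is lower than the control rate."""
    return (control_rate - treatment_rate) / control_rate

# Hypothetical annualized relapse rates, picked only to illustrate the
# "about 50 percent fewer relapses" claim; not the published results.
rebif_rate = 0.29
ocrelizumab_rate = 0.15

print(round(relative_reduction(rebif_rate, ocrelizumab_rate), 2))  # → 0.48
```

Note that a relative reduction says nothing about absolute risk: halving a rare event is a smaller absolute benefit than halving a common one.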
By Aaron Reuben, Jonathan Schaefer Most of us know at least one person who has struggled with a bout of debilitating mental illness. Despite their familiarity, however, these kinds of episodes are typically considered unusual, and even shameful. New research from our lab and from others around the world suggests mental illnesses are so common that almost everyone will develop at least one diagnosable mental disorder at some point in their lives. Most of these people will never receive treatment, and their relationships, job performance and life satisfaction will likely suffer. Meanwhile, the few individuals who never seem to develop a disorder may offer psychology a new avenue of study, allowing researchers to ask what it takes to be abnormally, enduringly, mentally well. Epidemiologists have long known that, at any given point in time, roughly 20 to 25 percent of the population suffers from a mental illness, which means they experience psychological distress severe enough to impair functioning at work, school or in their relationships. Extensive national surveys, conducted from the mid-1990s through the early 2000s, suggested that a much higher percentage, close to half the population, would experience a mental illness at some point in their lives. These surveys were large, involving thousands of participants representative of the U.S. in age, sex, social class and ethnicity. They were also, however, retrospective, which means they relied on survey respondents’ accurate recollection of feelings and behaviors months, years and even decades in the past. Human memory is fallible, and modern science has demonstrated that people are notoriously inconsistent reporters about their own mental health history, leaving the final accuracy of these studies up for debate. Of further concern, up to a third of the people contacted by the national surveys failed to enroll in the studies. 
Follow-up tests suggested that these “nonresponders” tended to have worse mental health. © 2017 Scientific American
Keyword: Schizophrenia; Depression
Link ID: 23837 - Posted: 07.15.2017
Hannah Devlin Science correspondent Brash, brawny and keen to impose their will on anyone who enters their sphere of existence: the alpha male in action is unmistakable. Now scientists claim to have pinpointed the biological root of domineering behaviour. New research has located a brain circuit that, when activated in mice, transformed timid individuals into bold alpha mice that almost always prevailed in aggressive social encounters. In some cases, the social ranking of the subordinate mice soared after the scientists’ intervention, hinting that it might be possible to acquire “alphaness” simply by adopting the appropriate mental attitude. Or as Donald Trump might put it: “My whole life is about winning. I almost never lose.” Prof Hailan Hu, a neuroscientist at Zhejiang University in Hangzhou, China, who led the work, said: “We stimulate this brain region and we can make lower ranked mice move up the social ladder.” The brain region, called the dorsal medial prefrontal cortex (dmPFC), was already known to light up during social interactions involving decisions about whether to be assertive or submissive with others. But brain imaging alone could not determine whether the circuit was ultimately controlling how people behave. The latest findings answer the question, showing that when the circuit was artificially switched on, low-ranking mice were immediately emboldened. “It’s not aggressiveness per se,” Hu said. “It increases their perseverance, motivational drive, grit.” © 2017 Guardian News and Media Limited
Keyword: Aggression; Sexual Behavior
Link ID: 23836 - Posted: 07.14.2017
Susan Milius Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful for working a puzzle box and getting food later. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science. “The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle, of the University of Cambridge, said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science. In the wild, ravens cache some of their food, but that apparent foresight could be more of a specific adaptation that evolved with diet instead of as some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.
Keyword: Intelligence; Evolution
Link ID: 23835 - Posted: 07.14.2017
By Ryan Cross Can you imagine watching 20,000 videos, 16 minutes apiece, of fruit flies walking, grooming, and chasing mates? Fortunately, you don’t have to, because scientists have designed a computer program that can do it faster. Aided by artificial intelligence, researchers have made 100 billion annotations of behavior from 400,000 flies to create a collection of maps linking fly mannerisms to their corresponding brain regions. Experts say the work is a significant step toward understanding how both simple and complex behaviors can be tied to specific circuits in the brain. “The scale of the study is unprecedented,” says Thomas Serre, a computer vision expert and computational neuroscientist at Brown University. “This is going to be a huge and valuable tool for the community,” adds Bing Zhang, a fly neurobiologist at the University of Missouri in Columbia. “I am sure that follow-up studies will show this is a gold mine.” With a mere 100,000 neurons, compared with our 86 billion, the fly brain is a good place to pick apart the inner workings of neurobiology. Yet scientists are still far from being able to understand a fly’s every move. To conduct the new research, computer scientist Kristin Branson of the Howard Hughes Medical Institute in Ashburn, Virginia, and colleagues acquired 2204 different genetically modified fruit fly strains (Drosophila melanogaster). Each enables the researchers to control different, but sometimes overlapping, subsets of the brain by simply raising the temperature to activate the neurons. © 2017 American Association for the Advancement of Science.
Keyword: Brain imaging
Link ID: 23834 - Posted: 07.14.2017
By BENEDICT CAREY Keith Conners, whose work with hyperactive children established the first standards for diagnosing and treating what is now known as attention deficit hyperactivity disorder, or A.D.H.D. — and who late in life expressed misgivings about how loosely applied that label had become — died on July 5 in Durham, N.C. He was 84. His wife, Carolyn, said the cause was heart failure. The field of child psychiatry was itself still young when Dr. Conners joined the faculty of the Johns Hopkins University School of Medicine in the early 1960s as a clinical psychologist. Children with emotional and behavioral problems often got a variety of diagnoses, depending on the clinic, and often ended up being given strong tranquilizers as treatment. Working with Dr. Leon Eisenberg, a prominent child psychiatrist, Dr. Conners focused on a group of youngsters who were chronically restless, hyperactive and sometimes aggressive. Doctors had recognized this type — “hyperkinesis,” it was called, or “minimal brain dysfunction” — but Dr. Conners combined existing descriptions and, using statistical analysis, focused on the core symptoms. The 39-item questionnaire he devised, called the Conners Rating Scale, quickly became the worldwide standard for assessing the severity of such problems and measuring improvement. It was later abbreviated to 10 items, giving child psychiatry a scientific foothold and anticipating by more than a decade the kind of checklists that would come to define all psychiatric diagnosis. He used his scale to study the effects of stimulant drugs on hyperactive children. Doctors had known since the 1930s that amphetamines could, paradoxically, calm such youngsters; a Rhode Island doctor, Charles Bradley, had published a well-known report detailing striking improvements in attention and academic performance among many children at a children’s inpatient home he ran near Providence. But it was a series of rigorous studies by Dr. 
Conners, in the 1960s and ’70s, that established stimulants — namely Dexedrine and Ritalin — as the standard treatments. © 2017 The New York Times Company
Keyword: ADHD
Link ID: 23833 - Posted: 07.14.2017
By PAM BELLUCK How we look at other people’s faces is strongly influenced by our genes, scientists have found in new research that may be especially important for understanding autism because it suggests that people are born with neurological differences that affect how they develop socially. The study, published on Wednesday in the journal Nature, adds new pieces to the nature-versus-nurture puzzle, suggesting that genetics underlie how children seek out formative social experiences like making eye contact or observing facial expressions. Experts said the study may also provide a road map for scientists searching for genes linked to autism. “These are very convincing findings, novel findings,” said Charles A. Nelson III, a professor of pediatrics and neuroscience at Harvard Medical School and Boston Children’s Hospital, who was not involved in the research. “They seem to suggest that there’s a genetic underpinning that leads to different patterns of brain development, that leads some kids to develop autism.” Dr. Nelson, an expert in child development and autism who was an independent reviewer of the study for Nature, said that while autism is known to have a genetic basis, how specific genes influence autism’s development remains undetermined. The study provides detailed data on how children look at faces, including which features they focus on and when they move their eyes from one place to another. The information, Dr. Nelson said, could help scientists “work out the circuitry that controls these eye movements, and then we ought to be able to work out which genes are being expressed in that circuit.” “That would be a big advance in autism,” he said. In the study, scientists tracked the eye movements of 338 toddlers while they watched videos of motherly women as well as of children playing in a day care center. 
The toddlers, 18 months to 24 months old, included 250 children who were developing normally (41 pairs of identical twins, 42 pairs of nonidentical twins and 84 children unrelated to each other). There were also 88 children with autism. © 2017 The New York Times Company
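The twin design described above (identical vs. nonidentical pairs) is what lets researchers attribute gaze behavior to genes: identical twins share roughly twice the segregating genetic material of fraternal twins, so a classic back-of-envelope heritability estimate is Falconer's formula, h² = 2(r_MZ − r_DZ). A sketch with hypothetical correlations (not the study's reported values):

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of heritability from twin correlations:
    h^2 = 2 * (r_identical - r_fraternal)."""
    return 2 * (r_mz - r_dz)

# Hypothetical within-pair correlations for a gaze measure such as
# 'proportion of time spent looking at the eyes'; illustrative only.
print(round(falconer_h2(r_mz=0.90, r_dz=0.45), 2))  # → 0.9
```

If identical twins resemble each other much more than fraternal twins do on a trait, the formula attributes the gap to genetic influence; equal correlations yield an estimate of zero.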
Carina Storrs In the late 1960s, a team of researchers began doling out a nutritional supplement to families with young children in rural Guatemala. They were testing the assumption that providing enough protein in the first few years of life would reduce the incidence of stunted growth. It did. Children who got supplements grew 1 to 2 centimetres taller than those in a control group. But the benefits didn't stop there. The children who received added nutrition went on to score higher on reading and knowledge tests as adolescents, and when researchers returned in the early 2000s, women who had received the supplements in the first three years of life completed more years of schooling and men had higher incomes [1]. “Had there not been these follow-ups, this study probably would have been largely forgotten,” says Reynaldo Martorell, a specialist in maternal and child nutrition at Emory University in Atlanta, Georgia, who led the follow-up studies. Instead, he says, the findings made financial institutions such as the World Bank think of early nutritional interventions as long-term investments in human health. Since the Guatemalan research, studies around the world — in Brazil, Peru, Jamaica, the Philippines, Kenya and Zimbabwe — have all associated poor or stunted growth in young children with lower cognitive test scores and worse school achievement [2]. A picture slowly emerged that being too short early in life is a sign of adverse conditions — such as poor diet and regular bouts of diarrhoeal disease — and a predictor for intellectual deficits and mortality. But not all stunted growth, which affects an estimated 160 million children worldwide, is connected with these bad outcomes. Now, researchers are trying to untangle the links between growth and neurological development. Is bad nutrition alone the culprit? What about emotional neglect, infectious disease or other challenges? © 2017 Macmillan Publishers Limited
Keyword: Development of the Brain
Link ID: 23831 - Posted: 07.13.2017
By Jane C. Hu In English the sky is blue, and the grass is green. But in Vietnamese there is just one color category for both sky and grass: xanh. For decades cognitive scientists have pointed to such examples as evidence that language largely determines how we see color. But new research with four- to six-month-old infants indicates that long before we learn language, we see up to five basic categories of hue—a finding that suggests a stronger biological component to color perception than previously thought. The study, published recently in the Proceedings of the National Academy of Sciences USA, tested the color-discrimination abilities of more than 170 British infants. Researchers at the University of Sussex in England measured how long babies spent gazing at color swatches, a metric known as looking time. First the team showed infants one swatch repeatedly until their looking time decreased—a sign they had grown bored with it. Then the researchers showed them a different swatch and noted their reaction. Longer looking times were interpreted to mean the babies considered the second swatch to be a new hue. Their cumulative responses showed that they distinguished among five colors: red, green, blue, purple and yellow. The finding “suggests we’re all working from the same template,” explains lead author Alice Skelton, a doctoral student at Sussex. “You come prepackaged to make [color] distinctions, but given your culture and language, certain distinctions may or may not be used.” For instance, infants learning Vietnamese most likely see green and blue, even if their native language does not use distinct words for the two colors. © 2017 Scientific American
Keyword: Vision; Development of the Brain
Link ID: 23830 - Posted: 07.13.2017
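The looking-time method in the infant color study above reduces to a simple decision rule: after an infant habituates to one swatch, a test swatch counts as a perceptually "new" category if looking time rebounds well above the habituated baseline. A minimal sketch with made-up looking times (all numbers, and the 1.5x recovery threshold, are hypothetical):

```python
def is_novel_category(habituation_times, test_time, recovery_factor=1.5):
    """Treat the test swatch as a new hue category if looking time
    rebounds well above the end-of-habituation baseline."""
    baseline = sum(habituation_times[-3:]) / 3  # mean of the last three trials
    return test_time > recovery_factor * baseline

# Looking times (seconds) across repeated presentations of one green swatch:
# the infant gradually loses interest (habituates).
habituation = [12.0, 9.5, 7.0, 5.0, 4.2, 4.0]

print(is_novel_category(habituation, test_time=9.0))  # blue swatch: interest rebounds → True
print(is_novel_category(habituation, test_time=4.5))  # another green: still bored → False
```

Real analyses model looking time statistically across many infants rather than thresholding single trials, but the logic of habituation and dishabituation is the same.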
By Alice Klein FOR decades, new parents have been warned against sharing a bed with their babies. While snuggling up with your newborn may seem like the most natural thing in the world, prevailing medical advice says this increases the risk of sudden infant death syndrome (SIDS), sometimes called cot death. Instead, doctors say your little ones should sleep in a separate crib in your bedroom. On the other side of the argument are anthropologists and proponents of “attachment parenting”, who believe that infant-parent separation is unnatural and at odds with our evolutionary history. They favour not just room-sharing but bed-sharing – putting them in direct conflict with paediatric advice. This debate was recently reignited by a study suggesting that room-sharing for up to nine months reduces a baby’s sleep, which in theory could have future health consequences. So what’s a sleep-deprived parent to do? Our ancestors slept in direct contact with their young in order to protect them, just as other primates do today, says Helen Ball at Durham University, UK. “Babies respond to close contact – their breathing, blood oxygen and heart rate are on a more even keel.” In Asia and Africa, most babies still share their parents’ beds (see map). But in the West, bed-sharing fell during the industrial revolution as increased wealth let people afford separate rooms and value was placed on teaching early independence. © Copyright New Scientist Ltd.
Keyword: Sleep
Link ID: 23829 - Posted: 07.13.2017
By Giorgia Guglielmi Semen has something in common with the brains of Alzheimer’s sufferers: Both contain bundles of protein filaments called amyloid fibrils. But although amyloid accumulation appears to damage brain cells, these fibrils may be critical for reproduction. A new study suggests that semen fibrils immobilize subpar sperm, ensuring that only the fittest ones make it to the egg. “I’m sure that from the very first time scientists described semen fibrils, they must have been speculating what their natural function was,” says Daniel Otzen, an expert in protein aggregates at Aarhus University in Denmark, who did not participate in the research. “This seems to be the smoking gun.” Researchers discovered semen fibrils in 2007. At first, they seemed like mostly bad news. Scientists showed that the fibrils, found in the seminal fluid together with sperm cells and other components, can bind to HIV, helping it get inside cells. But the fibrils are found in most primates, notes Nadia Roan, a mucosal biologist at the University of California, San Francisco. “If fibrils didn’t serve some beneficial purpose, they would have been eliminated over evolutionary time.” Because the way HIV fuses to cells is reminiscent of the way a sperm fuses to the egg, she wondered whether the fibrils facilitated fertilization. © 2017 American Association for the Advancement of Science.
Keyword: Alzheimers; Sexual Behavior
Link ID: 23828 - Posted: 07.12.2017
By Linda Geddes Many dangers stalk the bushlands of Tanzania while members of the Hadza people sleep, yet no one keeps watch. There is no need because it seems that natural variation in sleep means there’s rarely a moment when someone isn’t alert enough to raise the alarm. That’s the conclusion of a study that sheds new light on why teenagers sleep late while grandparents are often up at the crack of dawn. Fifty years ago, psychologist Frederick Snyder proposed that animals who live in groups stay vigilant during sleep, by having some stay awake while others rest. However, no one had tested this sentinel hypothesis in humans until now. One way of maintaining this constant vigilance might be by the evolution of different chronotypes – individual differences in when we tend to sleep. This changes as we age, with teenagers shifting towards later bedtimes, and older people towards earlier bedtimes. Would such variability be enough to keep a community safe at night? To investigate, David Samson, then at the University of Toronto in Canada, and his colleagues turned to the Hadza, a group of hunter-gatherers in northern Tanzania. The Hadza sleep in grass huts, each containing one or two adults and often several children. They live in camps of around 30 adults, although several other camps may be close by. Samson recruited 33 adults from two nearby groups of 22 huts and asked them to wear motion-sensors on their wrists to monitor sleep, for 20 days. “It turned out that it was extremely rare for there to be synchronous sleep,” says Samson, now at Duke University in Durham, North Carolina. © Copyright New Scientist Ltd.
Keyword: Sleep; Development of the Brain
Link ID: 23827 - Posted: 07.12.2017
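The sentinel hypothesis in the Hadza study above boils down to a coverage statistic over the wrist-sensor records: in what fraction of time intervals was at least one adult awake? A toy sketch of that computation (the data below are fabricated, not the study's actigraphy):

```python
def sentinel_coverage(awake_matrix):
    """awake_matrix[i][t] is True if adult i is awake during epoch t.
    Returns the fraction of epochs in which at least one adult is awake."""
    n_epochs = len(awake_matrix[0])
    covered = sum(any(adult[t] for adult in awake_matrix) for t in range(n_epochs))
    return covered / n_epochs

# Three adults over six epochs of one night; staggered chronotypes
# (early-to-bed vs. late-riser) leave only one fully unguarded epoch.
adults = [
    [True,  True,  False, False, False, False],  # early-to-bed
    [False, False, True,  False, False, False],
    [False, False, False, False, True,  True ],  # late riser
]

print(round(sentinel_coverage(adults), 2))  # → 0.83 (5 of 6 epochs covered)
```

With dozens of adults and epochs of a minute or so, even modest variation in sleep timing pushes this fraction very close to 1, which is the study's core observation.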
By Jessica Wright, Spectrum on July 11, 2017 Treatment with the hormone oxytocin improves social skills in some children with autism, suggest results from a small clinical trial. The results appeared today in the Proceedings of the National Academy of Sciences [1]. Oxytocin, dubbed the ‘love hormone,’ enhances social behavior in animals. This effect makes it attractive as a potential autism treatment. But studies in people have been inconsistent: Some small trials have shown that the hormone improves social skills in people with autism, and others have shown no benefit. This may be because only a subset of people with autism respond to the treatment. In the new study, researchers tried to identify this subset. The same team showed in 2014 that children with relatively high blood levels of oxytocin have better social skills than do those with low levels [2]. In their new work, the researchers examined whether oxytocin levels in children with autism alter the children’s response to treatment with the hormone. They found that low levels of the hormone prior to treatment are associated with the most improvement in social skills. “We need to be thinking about a precision-medicine approach for autism,” says Karen Parker, associate professor of psychiatry at Stanford University in California, who co-led the study. “There’s been a reasonable number of failed [oxytocin] trials, and the question is: Could they have failed because all of the kids, by blind, dumb luck, had really high baseline oxytocin levels?” The study marks the first successful attempt to find a biological marker that predicts response to the therapy. © 2017 Scientific American.
Keyword: Autism; Hormones & Behavior
Link ID: 23826 - Posted: 07.12.2017
By Ryan Cross Whether caused by a car accident that slams your head into the dashboard or repeated blows to your cranium from high-contact sports, traumatic brain injury can be permanent. There are no drugs to reverse the cognitive decline and memory loss, and any surgical interventions must be carried out within hours to be effective, according to the current medical wisdom. But a compound previously used to enhance memory in mice may offer hope: Rodents who took it up to a month after a concussion had memory capabilities similar to those that had never been injured. The study “offers a glimmer of hope for our traumatic brain injury patients,” says Cesario Borlongan, a neuroscientist who studies brain aging and repair at the University of South Florida in Tampa. Borlongan, who reviewed the new paper, notes that its findings are especially important in the clinic, where most rehabilitation focuses on improving motor—not cognitive—function. Traumatic brain injuries, which cause cell death and inflammation in the brain, affect 2 million Americans each year. But the condition is difficult to study, in part because every fall, concussion, or blow to the head is different. Some result in bleeding and swelling, which must be treated immediately by drilling into the skull to relieve pressure. But under the microscope, even less severe cases appear to trigger an “integrated stress response,” which throws protein synthesis in neurons out of whack and may make long-term memory formation difficult. © 2017 American Association for the Advancement of Science.
Keyword: Learning & Memory; Brain Injury/Concussion
Link ID: 23825 - Posted: 07.11.2017
Tina Hesman Saey How well, not how much, people sleep may affect Alzheimer’s disease risk. Healthy adults built up Alzheimer’s-associated proteins in their cerebrospinal fluid when prevented from getting slow-wave sleep, the deepest stage of sleep, researchers report July 10 in Brain. Just one night of deep-sleep disruption was enough to increase the amount of amyloid-beta, a protein that clumps into brain cell–killing plaques in people with Alzheimer’s. People in the study who slept poorly for a week also had more of a protein called tau in their spinal fluid than they did when well rested. Tau snarls itself into tangles inside brain cells of people with the disease. These findings support a growing body of evidence that lack of Zs is linked to Alzheimer’s and other neurodegenerative diseases. Specifically, “this suggests that there’s something special about deep, slow-wave sleep,” says Kristine Yaffe, a neurologist and psychiatrist at the University of California, San Francisco who was not involved in the study. People with Alzheimer’s are notoriously poor sleepers, but scientists aren’t sure if that is a cause or a consequence of the disease. Evidence from recent animal and human studies suggests the problem goes both ways, Yaffe says. Lack of sleep may make people more prone to brain disorders. And once a person has the disease, disruptions in the brain may make it hard to sleep. Still, it wasn’t clear why not getting enough shut-eye promotes Alzheimer’s disease.
Keyword: Sleep; Alzheimers
Link ID: 23824 - Posted: 07.11.2017
Nicola Davis People who drink coffee have a lower risk of dying from a host of causes, including heart disease, stroke and liver disease, research suggests – but experts say it’s unclear whether the health boost is down to the brew itself. The connection, revealed in two large studies, was found to hold regardless of whether the coffee was caffeinated or not, with the effect higher among those who drank more cups of coffee a day. But scientists say that the link might just be down to coffee-drinkers having healthier behaviours. “It is plausible that there is something else behind this that is causing this relationship,” said Marc Gunter, a co-author of one of the studies, from the International Agency for Research on Cancer. But, he added, based on the consistency of the results he would be surprised if coffee itself didn’t play a role in reducing the risk of death. About 2.25bn cups of coffee are consumed worldwide every day. While previous studies have suggested coffee might have health benefits, the latest research involves large and diverse cohorts of participants. The first study looked at coffee consumption among more than 185,000 white and non-white participants, recruited in the early 1990s and followed up for an average of over 16 years. The results revealed that drinking one cup of coffee a day was linked to a 12% lower risk of death at any age, from any cause, while those drinking two or three cups a day had an 18% lower risk, with the association not linked to ethnicity. © 2017 Guardian News and Media Limited
Keyword: Drug Abuse; Stroke
Link ID: 23823 - Posted: 07.11.2017
Dean Burnett Antidepressants: the go-to treatment for depression or generalised anxiety. It’s incredible when you think about it, the fact that you can have a debilitating mood disorder, take a few pills, and feel better. It’s unbelievable that medical science has progressed so far that we now fully understand how the human brain produces moods and other emotions, so can manipulate them with designer drugs. That’s right, it is unbelievable. Because it isn’t the case. The fact that antidepressants are now so common is something of a mixed blessing. On one hand, anything that helps reduce stigma and lets those afflicted know they aren’t alone can only be helpful. Depression is incredibly common, so this awareness can literally save many lives. On the other hand, familiarity does not automatically mean understanding. Nearly everyone has a smartphone these days, but how many people, if pushed, could construct a touchscreen? Not many, I’d wager. And so it is with depression and antidepressants. For all the coverage and opinion pieces produced about them, the details around how they work remain somewhat murky and elusive. Actually, in the case of antidepressants, it’s more a question of why they work, rather than how. Most antidepressants, from the earliest tricyclics and monoamine oxidase inhibitors, to the ubiquitous modern-day selective serotonin reuptake inhibitors (SSRIs), work by increasing the levels of specific neurotransmitters in the brain, usually by preventing them from being broken down and reabsorbed into the neurons, meaning they linger in the synapses longer, causing more activity, so “compensating” for the reduced overall levels. Antidepressants make the remaining neurotransmitters work twice as hard, so overall activity is more “normal”, so to speak. © 2017 Guardian News and Media Limited
Keyword: Depression
Link ID: 23822 - Posted: 07.11.2017
By Jennifer Ouellette Are brain-training games any better at improving your ability to think, remember and focus than regular computer games? Possibly not, if the latest study is anything to go by. Joseph Kable at the University of Pennsylvania and his colleagues have tested the popular Lumosity brain-training program from Lumos Labs in San Francisco, California, against other computer games and found no evidence that it is any better at improving your thinking skills. Brain-training is a booming market. It’s based on the premise that our brains change in response to learning challenges. Unlike computer games designed purely for entertainment, brain-training games are meant to be adaptive, adjusting challenge levels in response to a player’s changing performance. The thinking is that this should improve a player’s memory, attention, focus and multitasking skills. But there are questions over whether brain-training platforms can enhance cognitive function in a way that is meaningful for wider life. Last year, Lumos Labs paid $2 million to settle a charge from the US Federal Trade Commission for false advertising. Advertising campaigns had claimed that the company’s memory and attention games could reduce the effects of age-related dementia and stave off Alzheimer’s disease. Most studies on the effects of brain-training games have been small and had mixed results. For this study, Kable and his colleagues recruited 128 young healthy adults for a randomised controlled trial. © Copyright New Scientist Ltd.
Keyword: Learning & Memory; Alzheimers
Link ID: 23821 - Posted: 07.11.2017
Aimee Cunningham An expectant mom might want to think twice about quenching her thirst with soda. The more sugary beverages a mom drank during mid-pregnancy, the heavier her kids were in elementary school compared with kids whose mothers consumed less of the drinks, a new study finds. At age 8, boys and girls weighed approximately 0.25 kilograms more — about half a pound — with each serving mom added per day while pregnant, researchers report online July 10 in Pediatrics. “What happens in early development really has a long-term impact,” says Meghan Azad, an epidemiologist at the University of Manitoba in Canada, who was not involved in the study. A fetus’s metabolism develops in response to the surrounding environment, including the maternal diet, she says. The new findings come out of a larger project that studies the impact of pregnant moms’ diets on their kids’ health. “We know that what mothers eat during pregnancy may affect their children’s health and later obesity,” says biostatistician Sheryl Rifas-Shiman of Harvard Medical School and Harvard Pilgrim Health Care Institute in Boston. “We decided to look at sugar-sweetened beverages as one of these factors.” Sugary drinks are associated with excessive weight gain and obesity in studies of adults and children. Rifas-Shiman and colleagues included 1,078 mother-child pairs in the study. Moms filled out a questionnaire in the first and second trimesters of their pregnancy about what they were drinking — soda, fruit drinks, 100 percent fruit juice, diet soda or water — and how often. Soda and fruit drinks were considered sugar-sweetened beverages. A serving was defined as a can, glass or bottle of a beverage. |© Society for Science & the Public 2000 - 2017
Keyword: Obesity; Development of the Brain
Link ID: 23820 - Posted: 07.11.2017