Most Recent Links
By Lydia Denworth The demands of parenthood are so considerable that it’s fair to wonder why any adult takes on the challenge. Mammalian babies are especially helpless—and among mammals, only humans can see beyond individual sacrifice to understand that a species’s survival depends on caring for its young. Yet there is remarkable consistency in the way all mammals change their behavior upon becoming parents. Suddenly they are motivated to care for their young and know how to feed and shelter, nurture and protect new babies. Parents also give up a lot of adult social interaction, whether it is mating with other mice or going barhopping with friends. “What this means is that there is this instinctive or genetically programmed aspect to the drive to take care of offspring,” says neuroscientist Catherine Dulac of Harvard University. But if a complicated and variable behavior like parenting is hardwired, how would that work? Reporting in Nature this week, Dulac, also a Howard Hughes Medical Institute investigator, and her colleagues have provided a wiring diagram of the brain-wide circuit that coordinates parenting behavior in mice. The study marks the first deconstruction of the architecture of a brain circuit underlying a complex social behavior. The circuit they describe resembles the hub-and-spoke flight-routing system used by airlines and relies on a type of neuron that expresses the signaling molecule galanin. A relatively small number of these galanin neurons form a parenting command center—the medial preoptic area (MPOA)—in the hypothalamus, a brain structure responsible for controlling everything from appetite to sex drive. Responding to sensory input received from all over the brain, the neurons at the hub send distinct messages to at least 20 downstream subsets of galanin neurons. Like an airport terminal serving passengers according to their destinations, these subsets of cells, which the researchers dub “pools,” handle different facets of parenting behavior, such as motor control of grooming or the motivation to parent at all. © 2018 Scientific American
Keyword: Sexual Behavior
Link ID: 24859 - Posted: 04.12.2018
By Anna Azvolinsky In recent years, scientists have accomplished what was previously reserved for miracle workers: they have given blind patients the ability to see better. In 2017, the vision field saw an enormous advance with the approval of Luxturna, the first gene therapy to correct vision loss in certain patients with childhood-onset blindness. And just last week, researchers reported that a retinal implant allowed a 69-year-old woman with macular degeneration to more than double the number of letters she could identify on a vision chart. “It’s early data but very promising, including one patient with impressive vision gains, for a disease where we don’t have any treatment options,” says Thomas Albini of the University of Miami’s Bascom Palmer Eye Institute, who was not involved in the study. The implant, given to five patients with dry age-related macular degeneration (AMD), is a single sheet of retinal pigment epithelial (RPE) cells derived from human embryonic stem cells. Other teams across the globe are developing their own forms of RPE implants, and this type of approach is just one of a plethora of modalities being tested to either slow down or reverse various forms of blindness. © 1986-2018 The Scientist
Keyword: Vision
Link ID: 24858 - Posted: 04.12.2018
Phil Plait I've said this before and I will no doubt say it many times hence: I love optical illusions. For one thing, they're just fun. They warp our sense of reality, and it's really pretty cool to see how easy it is to fool our senses. Some illusions do this in complicated ways that make it hard to understand where our perception goes wrong (and yes, there are entire fields of psychology devoted to figuring out how our brain and eyes physically react to illusions). But others are surprisingly simple, yet remarkably difficult not to see. One of the best is the Müller-Lyer illusion, and the easiest way to describe it is to just show it to you: The two horizontal lines are the same length, but the lower one looks longer thanks to the Müller-Lyer illusion. Despite what your eyes are telling you, those two horizontal lines are the same size! Measure them if you don't believe me. It's an utterly convincing illusion, but an illusion all the same. But still, it's simple, right? Well, visual artist Gianni Sarcone decided to play with this illusion, and created an animated version of it that is simply stunning.
Keyword: Vision
Link ID: 24857 - Posted: 04.12.2018
by Robby Berman She was wide awake and it was nearly two in the morning. When asked if everything was alright, she said, “Yes.” Asked why she couldn’t get to sleep she said, “I don’t know.” Neuroscientist Russell Foster of Oxford might suggest she was exhibiting “a throwback to the bi-modal sleep pattern.” Research suggests we used to sleep in two segments with a period of wakefulness in between. A. Roger Ekirch, a historian at Virginia Tech, uncovered our segmented sleep history in his 2005 book At Day’s Close: Night in Times Past. There’s very little direct scientific research on sleep done before the 20th century, so Ekirch spent years going through early literature, court records, diaries, and medical records to find out how we slumbered. He found over 500 references to first and second sleep going all the way back to Homer’s Odyssey. “It’s not just the number of references—it is the way they refer to it as if it was common knowledge,” Ekirch tells the BBC. “He knew this, even in the horror with which he started from his first sleep, and threw up the window to dispel it by the presence of some object, beyond the room, which had not been, as it were, the witness of his dream.” — Charles Dickens, Barnaby Rudge (1840) Here’s a suggestion for dealing with depression from the English ballad “Old Robin of Portingale”: “And at the wakening of your first sleepe/You shall have a hott drinke made/And at the wakening of your next sleepe/Your sorrowes will have a slake.” Two-part sleep was practiced into the 20th century by people in Central America and Brazil and is still practiced in areas of Nigeria. © 2007-2018 Big Think
Keyword: Sleep
Link ID: 24856 - Posted: 04.12.2018
By Kerry Grens A variant in the gene for a certain hormone is tied to people eating more carbs. Yet a new study of 451,000 people finds that the allele doesn’t universally mean poorer health. Researchers reported yesterday (April 10) in Cell Reports that those with the sweet-tooth variant actually have lower body fat than others, and no higher risk for type 2 diabetes. They did, however, find a link between the allele and high blood pressure and a thicker waistline. “This goes against the current perception that eating sugar is bad for health. It may reduce body fat because the same allele also results in a lower consumption of protein and fat in the diet,” study coauthor Timothy Frayling, a molecular geneticist at the University of Exeter Medical School in the U.K., says in a press release. “But whilst this version of the gene lowers body fat, it also redistributes fat to the upper body, where it’s more likely to cause harm, including higher blood pressure.” The gene of interest here is FGF21, which encodes fibroblast growth factor 21, a hormone involved in alcohol and sugar consumption and insulin sensitization. The authors note that it’s a target of weight loss interventions. People with a particular allele of FGF21—20 percent of Europeans are homozygous for the variant—tend to consume relatively more sugar and alcohol than those without the allele. To see what consequences this might have on people’s health, Frayling and his colleagues collected data on 451,000 people whose genetic and health information is part of the UK Biobank. © 1986-2018 The Scientist
Keyword: Obesity; Genes & Behavior
Link ID: 24855 - Posted: 04.12.2018
By NICHOLAS BAKALAR Morning people may live longer than night owls, a new study suggests. Researchers studied 433,268 people, aged 38 to 73, who defined themselves as either “definite morning” types, “moderate morning” types, “moderate evening” types or “definite evening” types. They followed their health for an average of six-and-a-half years, tracking cause of death with death certificates. The study is in Chronobiology International. After controlling for age and sex, smoking, body mass index, sleep duration and other variables, they found that compared with “definite morning” types, “definite evening” types had a 10 percent increased risk of dying from any cause. Each increase from “morningness” to “eveningness” was associated with an increased risk for disease. Night owls were nearly twice as likely as early risers to have a psychological disorder and 30 percent more likely to have diabetes. Their risk for respiratory disease was 23 percent higher and for gastrointestinal disease 22 percent higher. The lead author, Kristen L. Knutson, an associate professor of neurology at Northwestern University, said that while being a night owl is partly genetic, people can make adjustments — gradually making bedtime earlier, avoiding using smartphones before bed, and eventually moving themselves out of the “night owl zone.” Although the reasons for their increased mortality remain unclear, she said, “Night owls should know that there may be some health consequences.” © 2018 The New York Times Company
Keyword: Biological Rhythms
Link ID: 24854 - Posted: 04.12.2018
By HEATHER MURPHY A battle over pleasure has broken out. On Twitter and in the pages of scientific journals, psychologists, neurologists and neuroscientists are forging alliances over the question of whether the pleasure we get from art is somehow different from the pleasure we get from candy, sex or drugs. The debate was ignited by an opinion piece titled “Pleasure Junkies All Around!” published last year in the journal Proceedings of the Royal Society B. In it, Julia F. Christensen, a neuroscientist at the Warburg Institute at the University of London who studies people’s responses to dance choreography, argued that many of us have been turned into “mindless pleasure junkies, handing over our free will for the next dopamine shoot” provided by social media, pornography and sugar. She offered up an unconventional solution: art, which she says engages us in ways these other pleasures do not and can “help overwrite the detrimental effects of dysfunctional urges and craving.” The paper struck a nerve with some of her fellow art and pleasure researchers, who published a rebuttal last month in the same journal. The idea that the way art engages the brain is somehow special has been around for far too long, and it is time to kill it off once and for all, they insist. “Christensen has recently argued that the pleasure induced by art is different to the pleasure induced by food, sex, sports, or drugs. Her argument, however, is contradicted by plenty of evidence showing that the pleasure from art is no different in genesis and function to the pleasure induced by food, drugs, and sex,” wrote Marcos Nadal, a psychologist at the University of the Balearic Islands who studies people’s responses to curvilinear lines in architecture and art, and Martin Skov, a neuroscientist at the Danish Research Centre for Magnetic Resonance, who studies decision-making. Their comment spurred others to rally to Dr. Christensen’s defense. The arguments over Dr. Christensen’s paper pointed to disputes within the emerging field of neuroaesthetics, the study of the neural processes underlying our appreciation and production of beautiful objects and artworks. © 2018 The New York Times Company
Keyword: Drug Abuse; Attention
Link ID: 24853 - Posted: 04.11.2018
Jason Murugesu We all daydream, whether about marrying Rihanna, discovering a sudden ability to sing opera or never having to answer another email again. Yet it is only in the last few decades that the science behind daydreaming, or mind-wandering as it is termed in most academic literature, has transitioned from the realms of pseudoscience to the cutting edge of cognitive neuroscience. At its most basic, daydreaming is your mind wandering from the here and now. Traditionally, daydreaming was considered to be a single psychological state of mind. This, however, caused conflict in the academic literature, and the resulting confusion is the reason why you might read that daydreaming is linked to happiness in one paper, but to depression in the next. Different types of mind-wandering have been conflated. Using neuroimaging techniques, a study conducted last year by the University of York found that different types of daydreams – for example, those which are fantastical, autobiographical, future oriented or past oriented – were built up of different neuronal activation patterns, and therefore could not be considered a single psychological construct. Nevertheless, if we consider all these types of mind-wandering together, you would be surprised by how much of our waking time we spend daydreaming. In 2008, Professor Matthew Killingsworth, then at Harvard University, used an app that contacted a large group of people at random points of the day to find out how often they were daydreaming. The app would ask its users what they were doing, and whether they were thinking about something else entirely. They found that 46.9 per cent of the time, the user was mind-wandering.
Keyword: Attention
Link ID: 24852 - Posted: 04.11.2018
By Gary Stix Self-help books often extol the value of resilience. Last year one such primer—Bounce: Overcoming Adversity, Building Resilience and Finding Joy—proclaimed: “By strengthening your inner power, your ability to handle stressful situations and your skill in persevering after setbacks threaten to fell you, you’ll develop real resilience—you’ll develop grit.” This implies that weathering adverse life events is a character trait to be cultivated. Exercising, eating right and giving yourself mental pep talks certainly may help. But neuroscientists are learning the story is not quite so simple, and that some people are likely better equipped from birth to deal with adversity. During the last 15 years discoveries about why some brains excel at resisting stress have initiated a search for new drugs to treat depression and post-traumatic stress disorder by enhancing psychological resilience. One of these compounds has now entered early-stage clinical trials. If the drug is safe and works, it will undoubtedly encounter strong demand; depression—the world’s leading cause of mental disability—never enters full remission in more than half the patients treated with psychotherapies and existing antidepressants. But depression does not affect everyone, and the molecular biology of resilience to psychiatric disorders can be clearly seen by inspecting the brains of lab animals. About a third of mice exposed to severe stress (in the form of aggressive attacks by other rodents) seem to breeze through these assaults without developing the social withdrawal, listlessness or other depression- and trauma-like symptoms displayed by most of their rodent lab-mates. © 2018 Scientific American
Keyword: Depression
Link ID: 24851 - Posted: 04.11.2018
By Ashley Yeager Mothers who take selective serotonin reuptake inhibitors (SSRIs), a class of commonly used antidepressants, while pregnant have babies with distinct structural changes to their brains, researchers report today (April 9) in JAMA Pediatrics. MRI scans of the babies’ brains revealed that exposure to the drugs in the womb increased the volumes of the babies’ amygdalae and insular cortices—regions that play a role in processing emotions. “Hopefully these results highlight the fact that something could be going on here,” study coauthor Claudia Lugo-Candelas, a postdoctoral research fellow at Columbia University, tells Time. “They point to the fact that there is a signal—we don’t know what it means, or don’t know how long it might last. But we know it’s worth studying.” The number of women using SSRIs while pregnant is increasing, but not much is known about how the medication might affect the brains of developing babies. Studies in animals suggest exposure to SSRIs can change the offspring’s brain circuitry and lead to depressive-like behaviors and anxiety later in life. In the new study, Lugo-Candelas and her colleagues studied the brains of 98 human infants: 16 whose mothers were treated for depression with SSRIs during pregnancy, 21 whose mothers had depression but were not treated, and 61 whose mothers had no history of depression. © 1986-2018 The Scientist
Keyword: Depression; Development of the Brain
Link ID: 24850 - Posted: 04.11.2018
Nicola Davis A man who took part in a chilli pepper eating contest ended up with more than he bargained for when he took on the hottest pepper in the world. After eating a Carolina Reaper pepper, the 34-year-old started dry heaving before developing a pain in his neck that turned into a series of thunderclap headaches: sudden and severe episodes of excruciating pain that peak within a minute. The Carolina Reaper, which can top 2.2 million units on the Scoville heat scale, was the world’s hottest pepper at the time of the incident in 2016 – although new breeds called Pepper X and Dragon’s Breath have since reportedly surpassed it. The details, published in the journal BMJ Case Reports, reveal the pain was so terrible the man went to the emergency room at Bassett Medical Center in Cooperstown, a village in New York State. “[A thunderclap headache] lasts for a few minutes and it might be associated with dry-heaving, nausea, vomiting – and then it gets better on its own. But it keeps coming back,” said Dr Kulothungan Gunasekaran of the Henry Ford Health System in Detroit, a co-author of the report, adding that thunderclap headaches can be caused by a number of problems, including bleeding inside the brain or blood clots. CT and MRI scans of the man’s brain were taken but showed nothing out of the ordinary. What’s more, the man did not report having any speech or vision problems. © 2018 Guardian News and Media Limited
Keyword: Pain & Touch; Stroke
Link ID: 24849 - Posted: 04.11.2018
By GRETCHEN REYNOLDS If you give a mouse a running wheel, it will run. But it may not burn many additional calories, because it will also start to move differently when it is not on the wheel, according to an interesting new study of the behaviors and metabolisms of exercising mice. The study, published in Diabetes, involved animals, but it could have cautionary implications for people who start exercising in the hopes of losing weight. In recent years, study after study examining exercise and weight loss among people and animals has concluded that, by itself, exercise is not an effective way to drop pounds. In most of these experiments, the participants lost far less weight than would have been expected, mathematically, given how many additional calories they were burning with their workouts. Scientists involved in this research have suspected and sometimes shown that exercisers, whatever their species, tend to become hungrier and consume more calories after physical activity. They also may grow more sedentary outside of exercise sessions. Together or separately, these changes could compensate for the extra energy used during exercise, meaning that, over all, energy expenditure doesn’t change and a person’s or rodent’s weight remains stubbornly the same. Proving that possibility has been daunting, though, in part because it is difficult to quantify every physical movement someone or something makes, and how their movements do or do not change after exercise. Mice, for instance, skitter, dart, freeze, groom, eat, roam, defecate and otherwise flit about in frequent fits and starts. But recently, animal researchers hit upon the idea of using infrared light beams to track how animals move at any given moment in their cages. Sophisticated software then can use that information to map daily patterns of physical activity, showing, second-by-second, when, where and for how long an animal roams, sits, runs or otherwise spends its time. © 2018 The New York Times Company
Keyword: Obesity
Link ID: 24848 - Posted: 04.11.2018
By Alex Therrien Health reporter, BBC News People who suffer brain injuries are at increased risk of dementia later in life, a large study suggests. An analysis of 2.8 million people found those who had one or more traumatic brain injuries were 24% more likely to get dementia than those who had not. The risk was greatest in people who had the injuries in their 20s, who were 63% more likely to get the condition at some point in their life. But independent experts said other lifestyle factors were more important. Dementia, a category of brain diseases that includes Alzheimer's, affects some 47 million people worldwide - a number expected to double in the next 20 years. Previous research has suggested a link between brain injuries - leading causes of which include falls, motor vehicle accidents, and assaults - and subsequent dementia, but evidence has been mixed. This new study, which followed people in Denmark over a 36-year period, found those who had experienced even one mild TBI (concussion) were 17% more likely to get dementia, with the risk increasing with the number of TBIs and the severity of injury. Sustaining the injury at a younger age appeared to further increase the risk of getting the condition, the research found. Those who suffered a TBI in their 30s were 37% more likely to develop dementia later in life, while those who had the injury in their 50s were only 2% more likely to get the condition. © 2018 BBC
Keyword: Brain Injury/Concussion; Alzheimers
Link ID: 24847 - Posted: 04.11.2018
Jon Hamilton An international coalition of brain researchers is suggesting a new way of looking at Alzheimer's. Instead of defining the disease through symptoms like memory problems or fuzzy thinking, the scientists want to focus on biological changes in the brain associated with Alzheimer's. These include the plaques and tangles that build up in the brains of people with the disease. But they say the new approach is intended only for research studies, and isn't yet ready for use by most doctors who treat Alzheimer's patients. If the new approach is widely adopted, it would help researchers study patients whose brain function is still normal, but are likely to develop dementia caused by Alzheimer's. "There is a stage of the disease where there are no symptoms and we need to have some sort of a marker," says Eliezer Masliah, who directs the Division of Neuroscience at the National Institute on Aging. The new approach would be a dramatic departure from the traditional way of looking at Alzheimer's, says Clifford Jack, an Alzheimer's researcher at Mayo Clinic Rochester. In the past, "a person displayed a certain set of signs and symptoms and it was expected that they had Alzheimer's pathology," says Jack, who is the first author of the central paper describing the proposed new "research framework." © 2018 npr
Keyword: Alzheimers
Link ID: 24846 - Posted: 04.10.2018
Ian Sample Science editor Modern humans might never have raised a quizzical eyebrow had Homo sapiens not lost the thick, bony brows of its ancient ancestors in favour of smoother facial features, a new study suggests. Researchers at the University of York believe early humans bore prominent brow ridges as a mark of physical dominance, and as the human face evolved to become smaller and flatter, it became a canvas on which the eyebrows could portray a much richer range of emotions. “We traded dominance or aggression for a wider palette of expression,” said Paul O’Higgins, a professor of anatomy and lead author on the study. “As the face became smaller and the forehead flattened, the muscles in the face could move the eyebrows up and down and we could express all these subtler feelings.” The York team stress their conclusions are speculative, but if they are right, the evolution of smaller, flatter faces may have unleashed the social power of the eyebrow, allowing humans to communicate at a distance in more complex and nuanced ways. “We moved from a position where we wanted to compete, where looking more intimidating was an advantage, to one where it was better to get on with people, to recognise each other from afar with an eyebrow flash, and to sympathise and so on,” said Penny Spikins, a palaeolithic archaeologist at York and co-author on the study, published in Nature Ecology & Evolution. © 2018 Guardian News and Media Limited
Keyword: Emotions; Evolution
Link ID: 24845 - Posted: 04.10.2018
Marisa Taylor, Melissa Bailey By the time Ann Marie Owen, 61, turned to marijuana to treat her pain, she was struggling to walk and talk. She was also hallucinating. For four years, her doctor prescribed a wide range of opioids for transverse myelitis, a debilitating disease that caused pain, muscle weakness and paralysis. The drugs not only failed to ease her symptoms, they hooked her. When her home state of New York legalized marijuana for the treatment of select medical ailments, Owen decided it was time to swap pills for pot. But her doctors refused to help. "Even though medical marijuana is legal, none of my doctors were willing to talk to me about it," she says. "They just kept telling me to take opioids." Although 29 states have legalized marijuana to treat pain and other ailments, the growing number of Americans like Owen who use marijuana and the doctors who treat them are caught in the middle of a conflict in federal and state laws — a predicament that is only worsened by thin scientific data. Because the federal government considers marijuana a Schedule 1 drug, research on marijuana or its active ingredients is highly restricted and even discouraged in some cases. Underscoring the federal government's position, Health and Human Services Secretary Alex Azar recently pronounced that there was "no such thing as medical marijuana." © 2018 npr
Keyword: Pain & Touch; Drug Abuse
Link ID: 24844 - Posted: 04.10.2018
By Matthew Hutson As artificial intelligence (AI) allows machines to become more like humans, will they experience similar psychological quirks such as hallucinations or depression? And might this be a good thing? Last month, New York University in New York City hosted a symposium called Canonical Computations in Brains and Machines, where neuroscientists and AI experts discussed overlaps in the way humans and machines think. Zachary Mainen, a neuroscientist at the Champalimaud Centre for the Unknown, a neuroscience and cancer research institute in Lisbon, speculated that we might expect an intelligent machine to suffer some of the same mental problems people do. Q: Why do you think AIs might get depressed and hallucinate? A: I’m drawing on the field of computational psychiatry, which assumes we can learn about a patient who’s depressed or hallucinating from studying AI algorithms like reinforcement learning. If you reverse the arrow, why wouldn’t an AI be subject to the sort of things that go wrong with patients? Q: Might the mechanism be the same as it is in humans? A: Depression and hallucinations appear to depend on a chemical in the brain called serotonin. It may be that serotonin is just a biological quirk. But if serotonin is helping solve a more general problem for intelligent systems, then machines might implement a similar function, and if serotonin goes wrong in humans, the equivalent in a machine could also go wrong. © 2018 American Association for the Advancement of Science
Keyword: Robotics; Intelligence
Link ID: 24843 - Posted: 04.10.2018
By BENEDICT CAREY and ROBERT GEBELOFF Victoria Toline would hunch over the kitchen table, steady her hands and draw a bead of liquid from a vial with a small dropper. It was a delicate operation that had become a daily routine — extracting ever tinier doses of the antidepressant she had taken for three years, on and off, and was desperately trying to quit. “Basically that’s all I have been doing — dealing with the dizziness, the confusion, the fatigue, all the symptoms of withdrawal,” said Ms. Toline, 27, of Tacoma, Wash. It took nine months to wean herself from the drug, Zoloft, by taking increasingly smaller doses. “I couldn’t finish my college degree,” she said. “Only now am I feeling well enough to try to re-enter society and go back to work.” Long-term use of antidepressants is surging in the United States, according to a new analysis of federal data by The New York Times. Some 15.5 million Americans have been taking the medications for at least five years. The rate has almost doubled since 2010, and more than tripled since 2000. Nearly 25 million adults, like Ms. Toline, have been on antidepressants for at least two years, a 60 percent increase since 2010. The drugs have helped millions of people ease depression and anxiety, and are widely regarded as milestones in psychiatric treatment. Many, perhaps most, people stop the medications without significant trouble. But the rise in longtime use is also the result of an unanticipated and growing problem: Many who try to quit say they cannot because of withdrawal symptoms they were never warned about. Some scientists long ago anticipated that a few patients might experience withdrawal symptoms if they tried to stop — they called it “discontinuation syndrome.” Yet withdrawal has never been a focus of drug makers or government regulators, who felt antidepressants could not be addictive and did far more good than harm. © 2018 The New York Times Company
Keyword: Depression
Link ID: 24842 - Posted: 04.09.2018
by Kevin Sheth Recently, I cared for an 82-year-old grandfather who was having some trouble opening a jar of jelly. Twenty minutes later, the fork he was using fell out of his hand. Feeling tired, he lay down, and on waking four hours later, he and his wife discovered that his arm was flaccid. That’s when they called 911 and he was taken to a local hospital. The hospital wasn’t a specialized stroke center and transferred him to Yale New Haven Hospital, where I work and where he arrived two hours after his original emergency response call — and almost seven hours from when his symptoms first started. That was too late to prevent permanent disability. As a neurologist, every single day I am left unable to help victims of stroke, despite an effective treatment in hand, simply because they arrived too late. The blood clots in the brain that cause strokes irreversibly change who we are and burden our families. Strokes strike nearly 800,000 Americans each year, killing 140,000, at a cost to society of $34 billion annually, according to the Centers for Disease Control and Prevention. For over two decades, neurologists and emergency providers have had a drug available that can restore blood flow to the brain, limiting damage, but only 4 percent of stroke patients receive the medication. The drug, known as tissue plasminogen activator (tPA), is a potent blood thinner and was approved as an effective clot-busting treatment by the Food and Drug Administration in 1996. The rub is that patients must receive the medication in the first few hours after experiencing a stroke for it to work. © 1996-2018 The Washington Post
Keyword: Stroke
Link ID: 24841 - Posted: 04.09.2018
By Edith Sheffer PALO ALTO, Calif. — My son’s school, David Starr Jordan Middle School, is being renamed. A seventh grader exposed the honoree, Stanford University’s first president, as a prominent eugenicist of the early 20th century who championed sterilization of the “unfit.” This sort of debate is happening all over the country, as communities fight over whether to tear down Confederate monuments and whether Andrew Jackson deserves to remain on the $20 bill. How do we decide whom to honor and whom to disavow? There are some straightforward cases: Hitler Squares were renamed after World War II; Lenin statues were hauled away after the collapse of the Soviet Union. But other, less famous monsters of the past continue to define our landscape and language. I have spent the past seven years researching the Nazi past of Dr. Hans Asperger. Asperger is credited with shaping our ideas of autism and Asperger syndrome, diagnoses given to people believed to have limited social skills and narrow interests. The official diagnosis of Asperger disorder has recently been dropped from the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders because clinicians largely agreed it wasn’t a separate condition from autism. But Asperger syndrome is still included in the World Health Organization’s International Classification of Diseases, which is used around the globe. Moreover, the name remains in common usage. It is an archetype in popular culture, a term we apply to loved ones and an identity many people with autism adopt for themselves. Most of us never think about the man behind the name. But we should. © 2018 The New York Times Company
Keyword: Autism
Link ID: 24840 - Posted: 04.09.2018