Most Recent Links

Links 5421 - 5440 of 29412

By David Kohn Each year, thousands of Americans suffer a traumatic brain injury. In 2013, about 2.8 million TBI-related emergency department visits, hospitalizations and deaths occurred in the United States, according to the Centers for Disease Control and Prevention. Most of these are what are called mild traumatic brain injuries, or mTBIs — head injuries that don’t cause a coma. People with an mTBI typically get better within a few weeks, but for as many as 20 percent, problems can linger for months or years. Many of these patients find themselves stuck with depression, cognitive problems, headaches, fatigue and other symptoms. Known as post-concussion syndrome, this phenomenon is often difficult to treat. Antidepressants can lift moods, painkillers can ease headaches and physical therapy may ease dizziness, but most researchers agree that these remedies don’t heal the injury within the brain. Could oxygen do the trick? A growing group of scientists and physicians say that hyperbaric treatment, which exposes patients to pure oxygen at higher-than-normal air pressure, may work. “These patients don’t have enough oxygen to heal the injured parts of their brains,” said Shai Efrati, a researcher and physician at Tel Aviv University in Israel and a leading hyperbaric scientist. “Hyperbaric treatment massively increases the amount of oxygen available to the brain.” But other researchers believe that the treatment has no merit and should not be recommended. © 1996-2018 The Washington Post

Keyword: Brain Injury/Concussion; Stroke
Link ID: 24583 - Posted: 01.29.2018

Sarah Varney Two-year-old Maverick Hawkins sits on a red plastic car in his grandmother's living room in the picturesque town of Nevada City, Calif., in the foothills of the Sierra Nevada mountains. His playmate Delilah Smith, a fellow 2-year-old, snacks on hummus and cashews and delights over the sounds of her Princess Peppa stuffie. It's playtime for the kids of the provocatively named Facebook group "Pot Smoking Moms Who Cuss Sometimes." Maverick's mother, Jenna Sauter, started the group after he was born. "I was a new mom, a young mom — I was 22 — and I was just feeling really lonely in the house, taking care of him," she says. She wanted to reach out to other mothers but didn't want to hide her marijuana use. "I wanted friends who I could be open with," Sauter says — "like I enjoy going to the river and I like to maybe smoke a joint at the river." There are nearly 2,600 members now in the Facebook group. Marijuana, which became legal for recreational use in California earlier this month, is seen by many group members as an all-natural and seemingly harmless remedy for everything from morning sickness to postpartum depression. Delilah Smith's mom, Andria, is 21 and a week away from her due date with her second child. She took umbrage when an emergency room physician recently suggested she take "half a Norco" — a pill akin to Vicodin, an opioid-based painkiller — for her excruciating back pain. © 2018 npr

Keyword: Development of the Brain; Drug Abuse
Link ID: 24582 - Posted: 01.29.2018

Emily Willingham How many times did I say it – to myself, out loud alone or out loud to others, throughout my childhood? ‘I wish I were a boy.’ The words were mine, a fervent and frequent wish. They were not born of a feeling of mismatch between external expectations and internal signals. Except for a lifelong tension with society’s mixed messages about what it means to be a woman, I’m comfortable identifying as the gender assigned to me. But I wished for boyness because the boys did so many things I wanted to do and was excluded from doing because I was a girl. My body and my brain mapped to each other just fine, but my body didn’t map at all to what society told these boys – and me – I was allowed to do. As many a woman can attest, this feeling of belonging in male spaces that lock you out doesn’t end with teenhood, adulthood, careerhood or parenthood. An aficionado of adventure stories, I couldn’t – still can’t – help but notice that the places men can go are often No Women’s Lands for someone like me. Not because I lack the physicality, strength or stamina to traverse them but because the mere presentation of being female is itself dangerous. Realistically, it invites violence, exclusion and violation in too many ways to be considered anything but a liability. And then there are the less wild places, just boys’ clubs and men’s clubs, de facto or tacit, where being a girl or woman means being viewed as an intruder or, as women have always known, being subject to harassment or worse. Every day, I see men circle their masculinity like musk oxen, protective and exclusionary, in my professions of academia and journalism. Even in the virtual world of social media, they reflexively exclude women who are their peers in expertise and competence while readily engaging men who are neither. I am wryly amused when people committed to the idea that men and women are cognitively different throw women the double-edged bone of being ‘better at verbal expression’. (Look, we’re good at a thing! That you’ll also use to make fun of us chatty, chatty Cathies!) I read that and think of who receives most of the major book awards and other writing accolades. Hint: it’s men. I’ll wager that the social factors involved in the latter contribute to the assumptions underlying the former. © Aeon Media Group Ltd. 2012-2018.

Keyword: Sexual Behavior
Link ID: 24581 - Posted: 01.29.2018

By Jennifer Hassan The first time sleep paralysis struck me was in the winter of 2012. My grandfather had recently died, and I was spending time at my grandmother’s house. After 60 years of marriage, she wasn’t used to being alone or to the sadness an empty home can bring. Determined to help her in any way I could, I moved into her spare bedroom. As night came, I tucked her into bed and turned out the light — a task she had done for me on countless occasions growing up. The role reversal saddened me but also gave me an overwhelming urge to protect one of the most important women in my life. I lay down in the next bedroom and listened to her muffled sobs. I woke up a few hours later, feeling cold. As I went to pull the blankets up around me, I realized I couldn’t move. I began to panic. What was happening to me? Why was my body paralyzed? I tried to lift my arms: Nothing. My head was cemented to the pillow, my body embedded, frozen. Then the pressure came, pushing against my chest. The more I panicked, the harder it became to breathe. Like something out of a bad horror movie, I tried to scream, but no words came out. Unable to move my eyes, I had no option but to stare upward into the darkness. I couldn’t see anyone else, but for some reason it felt as if I had company. There was a hidden presence and it was tormenting me, refusing to let me go. After what felt like hours but was probably just a few minutes, I was able to move again. Shaking, I switched the bedroom light on and sat upright in bed until morning came. © 1996-2018 The Washington Post

Keyword: Sleep
Link ID: 24580 - Posted: 01.29.2018

Sara Reardon Superconducting computing chips modelled after neurons can process information faster and more efficiently than the human brain. That achievement, described in Science Advances on 26 January, is a key benchmark in the development of advanced computing devices designed to mimic biological systems. And it could open the door to more natural machine-learning software, although many hurdles remain before it could be used commercially. Artificial intelligence software has increasingly begun to imitate the brain. Algorithms such as Google’s automatic image-classification and language-learning programs use networks of artificial neurons to perform complex tasks. But because conventional computer hardware was not designed to run brain-like algorithms, these machine-learning tasks require orders of magnitude more computing power than the human brain does. “There must be a better way to do this, because nature has figured out a better way to do this,” says Michael Schneider, a physicist at the US National Institute of Standards and Technology (NIST) in Boulder, Colorado, and a co-author of the study. NIST is one of a handful of groups trying to develop ‘neuromorphic’ hardware that mimics the human brain in the hope that it will run brain-like software more efficiently. In conventional electronic systems, transistors process information at regular intervals and in precise amounts — either 1 or 0 bits. But neuromorphic devices can accumulate small amounts of information from multiple sources, alter it to produce a different type of signal and fire a burst of electricity only when needed — just as biological neurons do. As a result, neuromorphic devices require less energy to run. © 2018 Macmillan Publishers Limited
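The event-driven behavior described above, accumulating small inputs and firing only when a threshold is crossed, is essentially a leaky integrate-and-fire neuron. Below is a minimal Python sketch of that idea for illustration only; it is not NIST's superconducting design, and the function name, constants and inputs are invented for the example.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron: accumulate input over time
    and emit a spike (1) only when the potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x  # accumulate small inputs, with leak
        if potential >= threshold:
            spikes.append(1)              # fire a burst only when needed...
            potential = 0.0               # ...then reset
        else:
            spikes.append(0)              # otherwise stay silent
    return spikes

# Output is sparse and event-driven, not a 1-or-0 result on every clock tick:
print(lif_neuron([0.3, 0.4, 0.5, 0.1, 0.0, 0.6, 0.6]))  # [0, 0, 1, 0, 0, 0, 1]
```

The contrast with a clocked transistor is the point: a spiking element produces output only at threshold crossings, which is one reason neuromorphic hardware can be more energy-efficient.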

Keyword: Robotics; Learning & Memory
Link ID: 24579 - Posted: 01.27.2018

Carl Zimmer For centuries, people have drawn the line between nature and nurture. In the nineteenth century, the English polymath Francis Galton cast nature-versus-nurture in scientific terms. He envisioned a battle between heredity and experience that shapes each of us. “When nature and nurture compete for supremacy…the former proves the stronger,” Galton wrote in 1874. Today, scientists can do something Galton couldn’t imagine: they can track the genes we inherit from our parents. They are gaining clues to how that genetic legacy influences many aspects of our experience, from our risk of developing cancer to our tendency to take up smoking. But determining exactly how any particular variation in DNA shapes the course of our life is proving far trickier than Galton would have guessed. There is no clean line between nature and nurture: How a particular variant acts, if at all, may depend on your environment. A study published on Thursday offers a striking new demonstration of this complexity. Genes may help determine how long children stay in school, the researchers found, but some of those genes operate at a distance — by influencing parents. The study was published in Science. The authors go on to coin a new phrase for this effect: “genetic nurture.” To scientists accustomed to tracing the links between the genes you carry and the traits they govern, it’s a head-spinning idea. A genetic variant may shape you not because it directly influences you, but because it changes those around you, noted Paige Harden, a psychologist at the University of Texas who co-authored a commentary on the new study: “Something is happening outside your own skin.” © 2018 The New York Times Company

Keyword: Development of the Brain; Genes & Behavior
Link ID: 24578 - Posted: 01.27.2018

By Shawna Williams When the late organic chemist John Daly was on the hunt for poisonous frogs, he employed an unadvisable method: “It involved touching the frog, then sampling it on the tongue. If you got a burning sensation, then you knew this was a frog you ought to collect,” he once told a National Institutes of Health (NIH) newsletter writer. Daly survived to gather frogs from South America, Madagascar, Australia, and Thailand, and he extracted more than 500 compounds from their skin (many of which the frogs in turn had harvested from their insect diets). One of these compounds, the toxin epibatidine, turned out to have an analgesic effect 200 times more potent than morphine in rodents, Daly and his colleagues reported in 1992 (J Am Chem Soc, 114:3475-78, 1992); and rather than working through opioid receptors, epibatidine bound to nicotinic receptors. “To have a drug that works as well [as opioids] but is actually targeting a completely independent receptor system is really one of those holy grails of the drug industry,” says Daniel McGehee, who studies nicotinic receptors at the University of Chicago. But an epibatidine-related compound tested by Abbott Labs as an analgesic in the late 2000s caused uncontrollable vomiting, McGehee says. Although research on nicotinic receptors continues, he’s not aware of any epibatidine analogs currently in the drug development pipeline. But frogs may yet hold clues to killing pain. At least one frog does deploy an opioid: the waxy monkey tree frog (Phyllomedusa sauvagii), whose skin is laced with the peptide dermorphin. Although the compound does not appear to be a toxin that wards off predators, dermorphin has about 40 times the potency of morphine in a guinea-pig ileum assay, but it doesn’t effectively cross the blood-brain barrier, says pharmacologist Tony Yaksh of the University of California, San Diego. Dermorphin also boasts an unusual chemical property: the inclusion of a D-amino acid in its sequence. Almost all amino acids found in natural compounds are L-isomers, and dermorphin’s stereochemistry makes it resistant to metabolism and “certainly renders it more potent,” Yaksh writes in an email to The Scientist. © 1986-2018 The Scientist

Keyword: Pain & Touch; Drug Abuse
Link ID: 24577 - Posted: 01.27.2018

By Eli Meixler Friday’s Google Doodle celebrates the birthday of Wilder Penfield, a scientist and physician whose groundbreaking contributions to neuroscience earned him the designation “the greatest living Canadian.” Penfield would have turned 127 today. Later celebrated as a pioneering researcher and a humane clinical practitioner, Penfield pursued medicine at Princeton University, believing it to be “the best way to make the world a better place in which to live.” He was drawn to the field of brain surgery, studying neuropathology as a Rhodes scholar at Oxford University. In 1928, Penfield was recruited by McGill University in Montreal, where he also practiced at Royal Victoria Hospital as the city’s first neurosurgeon. Penfield founded the Montreal Neurological Institute with support from the Rockefeller Foundation in 1934, the same year he became a Canadian citizen. Penfield pioneered a treatment for epilepsy that allowed patients to remain fully conscious while a surgeon used electric probes to pinpoint areas of the brain responsible for setting off seizures. The experimental method became known as the Montreal Procedure, and was widely adopted. But Wilder Penfield’s research led him to another discovery: that physical areas of the brain were associated with different duties, such as speech or movement, and stimulating them could generate specific reactions — including, famously, conjuring a memory of the smell of burnt toast. Friday’s animated Google Doodle features an illustrated brain and burning toast. © 2017 Time Inc.

Keyword: Miscellaneous
Link ID: 24576 - Posted: 01.27.2018

By Carly Ledbetter Two years after “the dress” divided people over its color, the internet is back with another puzzling wardrobe question. What color are these shoes? Some people think these Vans sneakers look gray and mint (or teal), while others see pink and white. For some, the color changes the more they stare at the shoes, while others are dead-set on the color they see. Twitter user @dolansmalik explained one theory about why the shoes look like different colors to some people: “THE REAL SHOE IS PINK & WHITE OKAY?!” she wrote on Twitter. “The second pic was with flash & darkened, so it looks teal & gray. (depends on what lighting ur in).” Bevil Conway is an investigator with the National Eye Institute who helped contribute to a study on the differences in color perception for the famous “dress” controversy two years ago. He told HuffPost how and why our eyes play tricks on us in situations like “the dress” and the shoes above. “This is related to the famous dress insofar as both are related to issues of color constancy,” he explained. “Basically your visual system is constantly trying to color correct the images projected on the retina, to remove the color contamination introduced by the spectral bias in the light source.” Conway explained just how and why some people see turquoise in the shoes, while others see pink. “In that manipulated photograph there is a lot of the turquoise cast over the whole image. When you first look at it, after having looked at the pink version, your visual system is still adapted to the lighting conditions of the pink version and so you see the turquoise in the other version, and you attribute this to the shoe itself,” he said. “But after a while, your visual system adapts to the turquoise across the whole of that image and interprets it as part of the light source, eventually discounting it and restoring the shoe to the original pink version (or at least pinker).” ©2018 Oath Inc.
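Conway's notion of the visual system "discounting" the light source can be illustrated with a toy white-balance calculation. The sketch below uses the gray-world heuristic (assume the scene averages to neutral, so the per-channel mean estimates the illuminant) as a rough photographic stand-in; the pixel values are invented, and the heuristic is an assumption for illustration, not a model of human color constancy.

```python
import numpy as np

# Toy 2x2 "photo" of a pink-and-white shoe under a teal-tinted light.
# RGB values in [0, 1]; all numbers are invented for illustration.
img = np.array([[[0.55, 0.75, 0.70], [0.60, 0.80, 0.75]],   # white areas, tinted teal
                [[0.70, 0.55, 0.50], [0.75, 0.60, 0.55]]])  # pink areas, tinted teal

# Gray-world estimate: the per-channel mean approximates the light source color.
illuminant = img.reshape(-1, 3).mean(axis=0)

# "Discount" the illuminant by rescaling each channel (a von Kries-style
# correction), analogous to the color correction Conway describes.
corrected = np.clip(img * (illuminant.mean() / illuminant), 0, 1)
print(corrected.round(2))
```

After the correction the teal cast shrinks in every pixel, which is the photographic analogue of the adaptation Conway describes: once the tint is attributed to the light source rather than the shoe, the shoe reads as pink again.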

Keyword: Vision
Link ID: 24575 - Posted: 01.26.2018

By Katarina Zimmer Cellular senescence, the process by which cells cease to divide in response to stress, may be a double-edged sword. In addition to being an important anti-cancer mechanism, recent studies show it may also contribute to age-related tissue damage and inflammation. A study published in Cell Reports yesterday (January 23) suggests that cellular senescence could be a factor underlying neurodegeneration in sporadic forms of Parkinson’s disease. “I think the proposition that cellular senescence drives neurodegeneration in Parkinson’s disease and other ageing-related neurodegenerative diseases . . . has a great deal of merit,” writes D. James Surmeier, a physiologist at Northwestern University, to The Scientist in an email. “To my knowledge, [this study] is the first strong piece of evidence for this model.” Cellular senescence may be the basis by which the herbicide and neurotoxin paraquat, which has been previously linked to Parkinson’s disease, can contribute to the disease, the researchers propose. The vast majority of Parkinson’s disease cases are sporadic, rather than inherited, and caused by a combination of environmental and genetic factors. Julie Andersen, a neuroscientist at the Buck Institute for Research on Aging, says her laboratory decided to focus on paraquat based on epidemiological evidence linking it to the condition in humans and on lab work showing that mice treated with the chemical suffer a loss of dopaminergic neurons in the same region that is affected in humans. It is an acutely toxic chemical—capable of causing death—and was banned in the E.U. in 2007 over safety concerns, but is still used extensively by American farmworkers. © 1986-2018 The Scientist

Keyword: Parkinsons; Glia
Link ID: 24574 - Posted: 01.26.2018

by Ariana Eunjung Cha A new class of epilepsy medications based on an ingredient derived from marijuana could be available as soon as the second half of 2018 in the United States, pending Food and Drug Administration approval. Officials from GW Pharmaceuticals, the company that developed the drug, on Wednesday announced promising results from a study on 171 patients randomized into treatment and placebo groups. Members of the group, ages 2 to 55, have a condition called Lennox-Gastaut syndrome and were suffering from seizures that were not being controlled by existing drugs. On average they had tried and discontinued six anti-seizure treatments and were experiencing 74 “drop” seizures per month. Drop seizures involve the entire body, trunk or head and often result in a fall or other type of injury. The results, published in the Lancet, show that over a 14-week treatment period, 44 percent of patients taking the drug, called Epidiolex, saw a significant reduction in seizures, compared with 22 percent of the placebo group. Moreover, more of the patients who got the drug experienced a 50 percent or greater reduction in drop seizures. Elizabeth Thiele, director of pediatric epilepsy at Massachusetts General Hospital and lead author of the study, said the results varied depending on the patient. “For some, it does not do a whole lot. But for the people it does work in, it is priceless,” she said. “One child who comes to mind had multiple seizures a day. She had been on every medication possible,” said Thiele, a professor of neurology at Harvard Medical School. Then the patient tried the cannabis-based treatment and has been seizure-free for almost four years. “She is now talking about college options. She would have never had that conversation before. It has been life-changing.” © 1996-2018 The Washington Post

Keyword: Epilepsy; Drug Abuse
Link ID: 24573 - Posted: 01.26.2018

Emmarie Huetteman Dr. Andrey Ostrovsky's family did not discuss what killed his uncle in 2015. The man was young, not quite two weeks past his 45th birthday, when he died, and had lost touch with loved ones in his final months. At the time, Ostrovsky wondered if his uncle had perhaps killed himself. Almost two years later, Ostrovsky was Medicaid's chief medical officer, grappling professionally with an opioid crisis that kills about 115 Americans each day, when he learned the truth: His uncle had died of a drug overdose. Family members knew the uncle's life had been turbulent for a while before his death; they'd watched as he divorced his wife and became estranged from his 4-year-old daughter and eventually lost his job as a furniture store manager. But Ostrovsky wanted to better understand what had happened to the man — his stepfather's younger brother. So last fall, when he found himself in southeastern Florida, where his uncle had died, Ostrovsky contacted one of the uncle's friends for what he expected would be a quick cup of coffee. Instead the friend "let loose," revealing that he and Ostrovsky's uncle had been experimenting with a variety of drugs the night of the death. It was the tragic culmination of more than a decade of substance abuse — a pattern of behavior much of the family knew nothing about. An autopsy showed there were opiates and cocaine in his uncle's system, Ostrovsky later learned. © 2018 npr

Keyword: Drug Abuse
Link ID: 24572 - Posted: 01.26.2018

By Shawna Williams When the Voyager I spacecraft left Earth in 1977, it carried with it a “Golden Record” containing audio recordings of messages meant for any intelligent life that might cross its path. It bore sounds from around the world, including greetings in 55 languages, Chuck Berry’s “Johnny B. Goode,” and a fussy baby being soothed by its mother. According to Marc Bornstein, a developmental psychologist at Eunice Kennedy Shriver National Institute of Child Health and Human Development, Carl Sagan and other members of the committee who decided what to include on the record were spot on in picking the latter track. “Infant cry is . . . the very first communication between an infant and a caregiver,” Bornstein says. Crying is infants’ best tool for ensuring they get the care they need, but Bornstein and his research collaborators wondered about the caregivers’ responses: to what extent were those innate versus learned? To investigate, they enrolled 684 new mothers and their babies from 11 countries around the world and put cameras in their homes. Each time a baby began crying, the researchers recorded what the mother did in the next five seconds. Did she pick the baby up? Kiss or stroke it? Talk to it? Try to distract it with a toy? “Within five seconds, the predominant kinds of responses are picking up and holding and talking to the baby,” says Bornstein (PNAS, 114:E9465-73, 2017). The degree of uniformity surprised him. “People in Kenya and Cameroon . . . the mothers are growing up and have been reared in wildly different circumstances than mothers in Brazil and Argentina or the United States, or certainly than Japan or South Korea.” © 1986-2018 The Scientist

Keyword: Sexual Behavior; Language
Link ID: 24571 - Posted: 01.26.2018

Ewen Callaway The oldest human fossils ever found outside Africa suggest that Homo sapiens might have spread to the Arabian Peninsula around 180,000 years ago — much earlier than previously thought. The upper jaw and teeth, found in an Israeli cave and reported in Science on 25 January, pre-date other human fossils from the same region by at least 50,000 years. But scientists say that it is unclear whether the fossils represent a brief incursion or a more-lasting expansion of the species. Researchers originally thought that H. sapiens emerged in East Africa 200,000 years ago then moved out to populate the rest of the world. Until discoveries in the past decade countered that story, scientists thought that a small group left Africa some 60,000 years ago and that signs of earlier travels, including 80,000–120,000 year-old skulls and other remains from Israel discovered in the 1920s and 1930s, were from failed migrations. However, recent discoveries have muddied that simple narrative. Some H. sapiens-like fossils from Morocco that are older than 300,000 years, reported last year, have raised the possibility that humans evolved earlier and perhaps elsewhere in Africa. Teeth from southern China, described in 2015, hint at long-distance migrations some 120,000 years ago. And genome studies have sown more confusion, with some comparisons of global populations pointing to just one human migration from Africa, and others suggesting multiple waves. © 2018 Macmillan Publishers Limited

Keyword: Evolution
Link ID: 24570 - Posted: 01.26.2018

Laura Sanders People tend to think of memories as deeply personal, ephemeral possessions — snippets of emotions, words, colors and smells stitched into our unique neural tapestries as life goes on. But a strange series of experiments conducted decades ago offered a different, more tangible perspective. The mind-bending results have gained unexpected support from recent studies. In 1959, James Vernon McConnell, a psychologist at the University of Michigan in Ann Arbor, painstakingly trained small flatworms called planarians to associate a shock with a light. The worms remembered this lesson, later contracting their bodies in response to the light. One weird and wonderful thing about planarians is that they can regenerate their bodies — including their brains. When the trained flatworms were cut in half, they regrew either a head or a tail, depending on which piece had been lost. Not surprisingly, worms that kept their heads and regrew tails retained the memory of the shock, McConnell found. Astonishingly, so did the worms that grew replacement heads and brains. Somehow, these fully operational, complex arrangements of brand-spanking-new nerve cells had acquired the memory of the painful shock, McConnell reported. In subsequent experiments, McConnell went even further, attempting to transfer memory from one worm to another. He tried grafting the head of a trained worm onto the tail of an untrained worm, but he couldn’t get the head to stick. He injected trained planarian slurry into untrained worms, but the recipients often exploded. Finally, he ground up bits of the trained planarians and fed them to untrained worms. Sure enough, after their meal, the untrained worms seemed to have traces of the memory — the cannibals recoiled at the light. The implications were bizarre, and potentially profound: Lurking in that pungent planarian puree must be a substance that allowed animals to literally eat one another’s memories. © Society for Science & the Public 2000 - 2017.

Keyword: Learning & Memory
Link ID: 24569 - Posted: 01.25.2018

By Lenny Bernstein Advanced brain imaging technology may give doctors an additional 10 hours or more to respond to some strokes, researchers said Wednesday, a development that may soon bring major changes to the way hospitals treat one of the leading causes of disability and death. The research is upending doctors’ long-held belief that they have just six hours to save threatened brain tissue from lack of blood flow when a major vessel to the brain is blocked. The new findings suggest they may have as long as 16 hours in many cases; a study published three weeks ago with a different group of stroke victims put the outer limit at 24 hours for some. Both studies showed such dramatic results that they were cut short to speed up reporting of the information to physicians. In response to the studies, new stroke treatment guidelines were released Wednesday. “The big news is that we were all wrong in how we were thinking about how strokes evolve,” said Gregory W. Albers, a professor of neurology at Stanford University Medical Center and lead author of the new paper. While some brain tissue dies quickly after a stroke begins, in most patients, collateral blood vessels usually take over feeding a larger area of the brain that is also starved for blood and oxygen, giving doctors many more hours to save that tissue than they previously believed, Albers said. So the age-old medical belief that “time is brain” — that millions of neurons die each minute after a stroke — must be reconsidered, he said. “We are quadrupling the stroke treatment window today,” Albers said. “It’s going to have a massive impact on how stroke is triaged and assessed.” © 1996-2018 The Washington Post

Keyword: Stroke
Link ID: 24568 - Posted: 01.25.2018

By Bret Stetka Fossil records can tell us a lot about our evolutionary past: what our ancestors looked like, how they walked, what they ate. But what bits of bone don’t typically reveal is why humans evolved the way we did—why, compared with all other known species, we wound up capable of such complex thought, emotion and behavior. A team of researchers has now used a novel technique to form a hypothesis on the origins of our rich cognitive abilities. They did so by profiling the chemicals buzzing around our brains. These compounds, known as neurotransmitters, are the signaling molecules responsible for key brain functions. Their research reveals that in comparison with other higher primates, our brains have unique neurotransmitter profiles that probably resulted in our enhanced cognition. The authors of the new study—a multicenter effort led by Kent State University anthropologists C. Owen Lovejoy and Mary Ann Raghanti and published January 22 in PNAS—began by measuring neurotransmitter levels in brain samples from humans, chimpanzees, gorillas, baboons and monkeys, all of whom had died of natural causes. Specifically, they tested levels in the striatum, a brain region involved in social behaviors and interactions. Compared with the other species tested, humans had markedly increased striatal dopamine activity. Among other functions, dopamine helps drive reward activity and social behaviors. In the striatum in particular it contributes to uniquely human abilities and behaviors like complicated social group formation and, in part, speech and language. © 2018 Scientific American,

Keyword: Evolution
Link ID: 24567 - Posted: 01.25.2018

Bruce Bower Big brains outpaced well-rounded brains in human evolution. Around the time of the origins of our species 300,000 years ago, the brains of Homo sapiens had about the same relatively large size as they do today, new research suggests. But rounder noggins rising well above the forehead — considered a hallmark of human anatomy — didn’t appear until between about 100,000 and 35,000 years ago, say physical anthropologist Simon Neubauer and his colleagues. Using CT scans of ancient and modern human skulls, the researchers created digital brain reconstructions, based on the shape of the inner surface of each skull’s braincase. Human brains gradually evolved from a relatively flatter and elongated shape — more like that of Neandertals’ — to a globe shape thanks to a series of genetic tweaks to brain development early in life, the researchers propose January 24 in Science Advances. A gradual transition to round brains may have stimulated considerable neural reorganization by around 50,000 years ago. That cognitive reworking could have enabled a blossoming of artwork and other forms of symbolic behavior among Stone Age humans, the team suspects. Other researchers have argued, however, that abstract and symbolic thinking flourished even before H. sapiens emerged (SN: 12/27/14, p. 6). Ancient DNA studies indicate that genes involved in brain development changed in H. sapiens following a split from Neandertals more than 600,000 years ago (SN Online: 3/14/16). “Those genetic changes might be responsible for differences in neural wiring and brain growth that led to brain [rounding] in modern humans, but not in Neandertals,” says Neubauer of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. © Society for Science & the Public 2000 - 2017

Keyword: Evolution
Link ID: 24566 - Posted: 01.25.2018

By Diana Kwon Search for “pheromones products” on the internet, and dozens of sprays and perfume additives will appear—many claiming to be able to increase your attractiveness to the opposite sex. Some companies, such as the Athena Institute, which, according to its founder, Winnifred Cutler, published its 108th consecutive ad in The Atlantic this month, assert that scientific studies back up their claims. While there have been several experiments examining the effects of compounds extracted from people’s armpits, much of the data on sex-related behaviors, The Scientist has found, go back more than a decade and were met then—and still now—with skepticism from pheromone researchers. “I am not compelled by any studies that are out there that say there is an active steroid component from the underarm that causes [sexual attraction],” says George Preti, an organic chemist at the Monell Chemical Senses Center in Philadelphia who conducted some of the early human pheromone trials. Within the scientific community, pheromones are broadly defined as chemical signals released by an animal that induce specific effects on other members of the same species. Although these substances are typically associated with sexual attraction, researchers have found they can have a broader range of influence, such as prompting aggression or modifying parental behaviors. © 1986-2018 The Scientist

Keyword: Chemical Senses (Smell & Taste); Sexual Behavior
Link ID: 24565 - Posted: 01.25.2018

Dean Burnett Please, do not just abandon your medication. If you’ve been prescribed drugs to treat an illness, suddenly dropping it altogether – for whatever reason – is invariably a very bad move. And this is as true for things like antidepressants as it is for insulin or antibiotics. Antidepressant withdrawal syndrome is a real problem. It doesn’t affect everyone equally, but that’s always been the case with antidepressants. The effects can be really profound and debilitating though, including a spike in anxiety, alarming “brain zaps”, and more. For better or worse, if you’ve been taking antidepressants for a number of weeks, your brain has slowly adapted to the new chemical levels and balance that they have brought about. A sudden cessation will rapidly alter this again, potentially causing all manner of problems, and possibly cause the underlying issue (i.e. depression) they’re supposed to be treating to return with a vengeance. This is why the specific dose and schedule of antidepressants is very carefully considered. If you do genuinely want to come off your antidepressants, please speak to your GP or medical expert, work out a system for gradual reduction and cessation and make plans and preparations for what could happen, as it could leave you unable to function normally. If you’re going to do it, please do it carefully, and thoroughly. © 2018 Guardian News and Media Limited

Keyword: Depression
Link ID: 24564 - Posted: 01.25.2018