Most Recent Links
By Jane E. Brody

I’ve long thought the human body was not meant to run on empty, that fasting was done primarily for religious reasons or political protest. Otherwise we needed a reliably renewed source of fuel to function optimally, mentally and emotionally as well as physically. Personal experience reinforced that concept; I’m not pleasant to be around when I’m hungry. There’s even an official name for that state of mind, confirmed by research: Hangry!

But prompted by recent enthusiasm for fasting among people concerned about their health, weight or longevity, I looked into the evidence for possible benefits — and risks — of what researchers call intermittent fasting. Popular regimens range from ingesting few if any calories all day every other day or several times a week to fasting for 16 hours or more every day. A man I know in his early 50s said he had lost 12 pounds in about two months on what he calls the 7-11 diet: He eats nothing from 7 p.m. until 11 a.m. the next day.

I was skeptical, but it turns out there is something to be said for practicing a rather prolonged diurnal fast, preferably one lasting at least 16 hours. Mark P. Mattson, a neuroscientist at the National Institute on Aging and Johns Hopkins University School of Medicine, explained that the liver stores glucose, which the body uses preferentially for energy before it turns to burning body fat. “It takes 10 to 12 hours to use up the calories in the liver before a metabolic shift occurs to using stored fat,” Dr. Mattson told me. After meals, glucose is used for energy and fat is stored in fat tissue, but during fasts, once glucose is depleted, fat is broken down and used for energy.
Most people trying to lose weight should strive for 16 calorie-free hours, he said, adding that “the easiest way to do this is to stop eating by 8 p.m., skip breakfast the next morning and then eat again at noon the next day.” (Caffeine-dependent people can have sugar-free black coffee or tea before lunch.) But don’t expect to see results immediately; it can take up to four weeks to notice an effect, he said. © 2020 The New York Times Company
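The arithmetic behind these schedules is easy to check. A minimal sketch (the function name and example hours are illustrative, not from the article's sources):

```python
def fasting_hours(stop_eating_hour, resume_eating_hour):
    """Length of a daily fast, in hours, for a window that crosses midnight.

    Hours are on a 24-hour clock; the modulo handles the wrap past midnight.
    """
    return (resume_eating_hour - stop_eating_hour) % 24

# Dr. Mattson's suggestion: stop by 8 p.m. (20:00), eat again at noon (12:00).
print(fasting_hours(20, 12))  # 16

# The "7-11 diet" described above: 7 p.m. (19:00) to 11 a.m. (11:00).
print(fasting_hours(19, 11))  # 16
```

Both schedules land on the same 16 calorie-free hours, which is why they are interchangeable in practice.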
Keyword: Obesity
Link ID: 27045 - Posted: 02.18.2020
By Tam Hunt

Strangely, modern science was long dominated by the idea that to be scientific means to remove consciousness from our explanations, in order to be “objective.” This was the rationale behind behaviorism, a now-dead theory of psychology that took this trend to a perverse extreme. Behaviorists like John Watson and B.F. Skinner scrupulously avoided any discussion of what their human or animal subjects thought, intended or wanted, and focused instead entirely on behavior. They thought that because thoughts in other people’s heads, or in animals, are impossible to know with certainty, we should simply ignore them in our theories. We can only be truly scientific, they asserted, if we focus solely on what can be directly observed and measured: behavior.

Erwin Schrödinger, one of the key architects of quantum mechanics in the early part of the 20th century, labeled this approach the “principle of objectivation” in his philosophical 1958 book Mind and Matter and expressed it clearly:

“By [the principle of objectivation] I mean … a certain simplification which we adopt in order to master the infinitely intricate problem of nature. Without being aware of it and without being rigorously systematic about it, we exclude the Subject of Cognizance from the domain of nature that we endeavor to understand. We step with our own person back into the part of an onlooker who does not belong to the world, which by this very procedure becomes an objective world.”

Schrödinger did, however, identify both the problem and the solution. He recognized that “objectivation” is just a simplification that is a temporary step in the progress of science in understanding nature. © 2020 Scientific American
Keyword: Consciousness
Link ID: 27044 - Posted: 02.18.2020
Shannon Bond

If you're having a hard time falling asleep, that sleep tracker on your wrist might be to blame. And there's a name for this new kind of insomnia of the digital age: orthosomnia. It's "when you just really become fixated on having this perfect sleep via tracker," said Seema Khosla, medical director at the North Dakota Center for Sleep. "And then you start worrying about it, and you wind up giving yourself insomnia."

Sleep trackers have become increasingly popular. They come in the form of watches, wristbands, rings and even mattresses. The gadgets measure how you breathe, how fast your heart is beating, how much you're tossing and turning. They crunch that data to produce a sleep score, usually through a smartphone app. But in an irony of our digital lifestyles, for some people, perfecting that sleep score becomes an end unto itself — so much so that they can lose sleep over it.

Khosla sees this in her own practice as a sleep doctor. Stressed-out patients complain they are aiming for a sleep score of 100 but are getting only 80. It keeps them up at night. She has a simple solution. "I'll ask them just to put their tracker away for a couple of weeks. And honestly, sometimes you can just see the relief on their faces," she said.

Kathrin Hamm experienced this problem firsthand. She was traveling around the world as an economist for the World Bank, and getting good sleep was a priority. © 2020 npr
Keyword: Sleep
Link ID: 27043 - Posted: 02.18.2020
Blake Richards Despite billions of dollars spent and decades of research, computation in the human brain remains largely a mystery. Meanwhile, we have made great strides in the development of artificial neural networks, which are designed to loosely mimic how brains compute. We have learned a lot about the nature of neural computation from these artificial brains and it’s time to take what we’ve learned and apply it back to the biological ones. Neurological diseases are on the rise worldwide, making a better understanding of computation in the brain a pressing problem. Given the ability of modern artificial neural networks to solve complex problems, a framework for neuroscience guided by machine learning insights may unlock valuable secrets about our own brains and how they can malfunction. Our thoughts and behaviours are generated by computations that take place in our brains. To effectively treat neurological disorders that alter our thoughts and behaviours, like schizophrenia or depression, we likely have to understand how the computations in the brain go wrong. However, understanding neural computation has proven to be an immensely difficult challenge. When neuroscientists record activity in the brain, it is often indecipherable. In a paper published in Nature Neuroscience, my co-authors and I argue that the lessons we have learned from artificial neural networks can guide us down the right path of understanding the brain as a computational system rather than as a collection of indecipherable cells. © 2010–2020, The Conversation US, Inc.
Keyword: Brain imaging; Robotics
Link ID: 27042 - Posted: 02.14.2020
By Jade Wu

3 Anxiety-Related Disorders You Might Not Know About

Most people know what it’s like to feel anxious. That tension in your muscles, those butterflies in your stomach, and the drumming of your heart tell you that you’re not calm. And this is totally normal. Where would we be if genuinely dangerous situations like dark alleys at night didn’t give us the heebie-jeebies? And would we take important tasks very seriously if we didn’t get nervous in the spotlight, like when giving a wedding toast?

Sometimes, anxiety goes too far and gets in the way of our everyday functioning. It can mess up our health, relationships, work, and fun. It’s not hard to imagine the pain of being plagued by non-stop worries or feeling so shy as to have trouble with dating. But sometimes, anxiety and anxiety-related processes can show up in more unusual ways, even ways that don’t seem at first to have anything to do with emotions.

The Diagnostic and Statistical Manual of Mental Disorders, 5th Edition, is the American Psychiatric Association’s official list of psychological disorders. It’s a huge bible detailing everything that’s considered a disorder and how it’s categorized. It takes experts years to update it in response to ongoing scientific findings.

The Anxiety Disorders section got a big makeover in the last update, which came out in 2013. It’s now split into a few different sections, including Trauma- and Stressor-Related Disorders and Obsessive-Compulsive and Related Disorders. Some of the less common disorders got shuffled around, some got new names, but experts still agree that the line between categories is blurry at best. Overlapping and related to some of the most common anxiety disorders, such as generalized anxiety disorder and social anxiety disorder, are some that are less well-known.
Keyword: Stress; Emotions
Link ID: 27041 - Posted: 02.14.2020
By Pallab Ghosh, Science correspondent, BBC News, Seattle

US researchers are developing a better understanding of the human brain by studying tissue left over from surgery. They say that their research is more likely to lead to new treatments than studies based on mouse and rat models. Dr Ed Lein, who leads the initiative at the Allen Institute, has set up a scheme with local doctors to study leftover tissue just hours after surgery. He gave details at the American Association for the Advancement of Science meeting in Seattle.

"It is a little bit crazy that we have such a huge field where we are trying to solve brain diseases and there is very little understanding of the human brain itself," said Dr Lein. "The field as a whole is largely assuming that the human brain is similar to those of animal models without ever testing that view. But the mouse brain is a thousand times smaller, and any time people look, they find significant differences."

Dr Lein and his colleagues at the Allen Institute in Seattle set up the scheme with local neurosurgeons to study brain tissue just hours after surgery - with the consent of the patient. The tissue functions as if it were still inside the brain for up to 48 hours after it has been removed. So Dr Lein and his colleagues have to drop everything, often working through the night, once they hear that brain tissue has become available. © 2020 BBC
Keyword: Brain imaging; Epilepsy
Link ID: 27040 - Posted: 02.14.2020
Alison Abbott The use of animals in scientific research seems to be declining in the European Union, according to statistics gathered by the European Commission. The figures come from the first report on the state of animal research in the bloc since the introduction of tougher regulations 7 years ago. The report — published on 6 February — reviews the impact of an animal-research directive, legislation that was designed to reduce the use of animals in research and minimize their suffering. The directive, which came into effect in 2013, is widely considered to be one of the world’s toughest on animal research. According to the report, 9.39 million animals were used for scientific purposes in 2017 — the most recent year for which data have been collated — compared with 9.59 million in 2015. From 2015 to 2016, however, there was a slight increase, to 9.82 million. The report acknowledges that this prevents the confirmation of a clear decrease. But it concludes that, when compared with figures from before the directive came into force, the numbers suggest “a clear positive development”. In 2017, more than two-thirds of animals were used in basic or applied research (45% and 23%, respectively), and around one-quarter (23%) were involved in the testing of drugs and other chemicals to meet regulatory requirements. Other uses included the routine production of biological agents such as vaccines; teaching; and forensic investigations (see ‘Animals in science’). More than 60% of the animals used in 2017 were mice, 12% were rats, 13% were fish and 6% were birds. Dogs, cats and non-human primates made up just 0.3% of the total. © 2020 Springer Nature Limited
Keyword: Animal Rights
Link ID: 27039 - Posted: 02.14.2020
By Gina Kolata

The study aimed to show that Alzheimer’s disease could be stopped if treatment began before symptoms emerged. The participants were the best candidates that scientists could find: still healthy, but with a rare genetic mutation that guaranteed they would develop dementia. For five years, on average, the volunteers received monthly infusions or injections of one of two experimental drugs, along with annual blood tests, brain scans, spinal taps and cognitive tests.

Now, the verdict is in: The drugs did nothing to slow or stop cognitive decline in these subjects, dashing the hopes of scientists. Dr. Randall Bateman, a neurologist at Washington University in St. Louis and principal investigator of the study, said he was “shocked” when he first saw the data: “It was really crushing.”

The results are a deep disappointment, scientists said — but not a knockout punch. The drugs did not work, but the problems may be fixable: perhaps the doses were too low, or they should have been given to patients much younger. Few experts want to give up on the hypothesis that amyloid plaques in the brain are intimately involved in Alzheimer’s disease.

The data from this international study, called DIAN-TU, are still being analyzed and are to be presented at scientific conferences in Vienna in April and in Amsterdam in July. The trial was sponsored by Washington University in St. Louis; two drug companies that supplied the drugs — Eli Lilly and Roche, with its subsidiary Genentech; the National Institutes of Health; and philanthropies, including the Alzheimer’s Association. © 2020 The New York Times Company
Keyword: Alzheimers
Link ID: 27038 - Posted: 02.13.2020
By Elizabeth Pennisi Scientists seeking the origins of sleep may have uncovered important clues in the Australian bearded dragon. By tracing sleep-related neural signals to a specific region of the lizard’s brain—and linking that region to a mysterious part of the mammalian brain—a new study suggests complex sleep evolved even earlier in vertebrate evolution than researchers thought. The work could ultimately shed light on the mechanisms behind sleep—and pave the way for studies that may help humans get a better night’s rest. “Answers to the questions raised and reframed by this research seem extremely likely to be significant in many ways, including clinically,” says Stephen Smith, a neuroscientist at the Allen Institute who was not involved with the new study. Mammals and birds have two kinds of sleep. During rapid eye movement (REM) sleep, eyes flutter, electrical activity moves through the brain, and, in humans, dreaming occurs. In between REM episodes is “slow wave” sleep, when brain activity ebbs and electrical activity synchronizes. This less intense brain state may help form and store memories, a few studies have suggested. In 2016, Gilles Laurent, a neuroscientist at the Max Planck Institute for Brain Research, discovered that reptiles, too, have both kinds of sleep. Every 40 seconds, central bearded dragons (Pogona vitticeps) switch between the two sleep states, he and his colleagues reported. © 2019 American Association for the Advancement of Science
Keyword: Sleep; Evolution
Link ID: 27037 - Posted: 02.13.2020
By Brian Platzer Three years ago I wrote an essay for Well about the chronic dizziness that had devastated my life. In response, I received thousands of letters, calls, tweets, emails and messages from Times readers who were grateful to see a version of their own story made public. Their symptoms varied. While some experienced a constant disequilibrium and brain fog that were similar to mine, others had become accustomed to a pattern of short periods of relative health alternating with longer periods of vertigo. Most of them, like me, felt that family and friends often didn’t understand how dizziness could be so debilitating. They told me that the combination of the loneliness and feelings of uselessness that come from an inability to work or spend time with family led to despair and depression. And, most commonly, they felt that the medical system made them feel responsible for their own suffering. “Doctors began to suggest that anxiety or depression were the cause of my symptoms,” a young woman from Connecticut wrote. “I eventually gave up on the quest for answers, as their attitudes added stress to an already stressful reality.” “Have been to so many doctors that keep saying, ‘It’s all in your head. There’s nothing wrong with you,’” wrote an older woman from Ohio. “Mostly been told there is nothing they can find,” wrote a middle-aged woman from Illinois. Her doctor told her it was probably just depression and anxiety. Dizziness is among the most common reasons people visit their doctor in the United States. When patients first experience prolonged dizziness, they may go to an emergency room or to see their primary care physician. That’s what I did. And I heard what most patients hear: “People get dizzy for all sorts of reasons, and it should resolve itself soon.” It’s true that dizziness often is a temporary symptom. 
The most common causes of dizziness are benign paroxysmal positional vertigo (caused by displaced pieces of small bone-like calcium in the inner ear), and vestibular neuritis (dizziness attributed to a viral infection or tiny stroke of the vestibular nerve), both of which typically last only weeks or months. © 2020 The New York Times Company
Keyword: Miscellaneous
Link ID: 27036 - Posted: 02.13.2020
By Veronique Greenwood When you look at a reconstruction of the skull and brain of Neoepiblema acreensis, an extinct rodent, it’s hard to shake the feeling that something’s not quite right. Huddled at the back of the cavernous skull, the brain of the South American giant rodent looks really, really small. By some estimates, it was around three to five times smaller than scientists would expect from the animal’s estimated body weight of about 180 pounds, and from comparisons to modern rodents. In fact, 10 million years ago the animal may have been running around with a brain weighing half as much as a mandarin orange, according to a paper published Wednesday in Biology Letters. The glory days of rodents, in terms of the animals’ size, were quite a long time ago, said Leonardo Kerber, a paleontologist at Universidade Federal de Santa Maria in Brazil and an author of the new study. Today rodents are generally dainty, with the exception of larger creatures like the capybara that can weigh as much as 150 pounds. But when it comes to relative brain size, N. acreensis, represented in this study by a fossil skull unearthed in the 1990s in the Brazilian Amazon, seems to be an extreme. The researchers used an equation that relates the body and brain weight of modern South American rodents to get a ballpark estimate for N. acreensis, then compared that with the brain weight implied by the volume of the cavity in the skull. The first method predicted a brain weighing about 4 ounces, but the volume suggested a dinky 1.7 ounces. Other calculations, used to compare the expected ratio of the rodent’s brain and body size with the actual fossil, suggested that N. acreensis’ brain was three to five times smaller than one would expect. © 2020 The New York Times Company
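The comparison described above — an allometric body-to-brain equation versus the volume implied by the fossil skull — can be sketched roughly. The scaling coefficients below are placeholders, not the study's fitted values; only the 4-ounce and 1.7-ounce figures come from the article:

```python
OZ_TO_G = 28.35  # grams per ounce

def expected_brain_mass_g(body_mass_g, a=0.05, b=0.77):
    """Allometric scaling of the form brain = a * body**b.

    a and b are hypothetical placeholders; the study fit its own
    coefficients to modern South American rodents.
    """
    return a * body_mass_g ** b

# Figures reported in the article: ~4 oz predicted from body size vs.
# ~1.7 oz implied by the skull cavity of Neoepiblema acreensis.
expected_g = 4.0 * OZ_TO_G
measured_g = 1.7 * OZ_TO_G
shortfall = expected_g / measured_g
print(round(shortfall, 2))  # ≈ 2.35x smaller than predicted
```

The raw ratio of the two reported masses is about 2.4x; the article's "three to five times" range comes from additional calculations not detailed in the excerpt.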
Keyword: Evolution; Brain imaging
Link ID: 27035 - Posted: 02.13.2020
Jon Hamilton Scientists have taken a small step toward personalizing treatment for depression. A study of more than 300 people with major depression found that brain wave patterns predicted which ones were most likely to respond to the drug sertraline (Zoloft), a team reported Monday in the journal Nature Biotechnology. If the approach pans out, it could offer better care for the millions of people in the U.S. with major depression. "This is definitely a step forward," says Michele Ferrante, who directs the computational psychiatry and computational neuroscience programs at the National Institute of Mental Health. He was not a part of the study. Right now, "one of our great frustrations is that when a patient comes in with depression we have very little idea what the right treatment for them is," says Dr. Amit Etkin, an author of the study and a professor of psychiatry at Stanford University. "Essentially, the medications are chosen by trial and error." Etkin is also the CEO of Alto Neuroscience, a Stanford-backed start-up developing computer-based approaches to diagnosing mental illness and selecting treatments. In the study, researchers used artificial intelligence to analyze the brainwave patterns in more than 300 patients who'd been diagnosed with major depression. Then they looked to see what happened when these same patients started treatment with sertraline. And one pattern of electrical activity seemed to predict how well a patient would do. "If the person scores particularly high on that, the recommendation would be to get sertraline," Etkin says. © 2020 npr
Keyword: Depression
Link ID: 27034 - Posted: 02.11.2020
Rachel Patton McCord, Rebecca A. Prosser

Have you ever slipped when trying to avoid sugar, quit smoking, or break another habit or addiction? Usually that one piece of cake or one cigarette won’t ruin your whole plan, but for people struggling with cocaine addiction, one slip can undo months of hard work.

Cocaine consumption is increasing, with 2.2 million people in the U.S. admitting to recent cocaine use in 2017. In 2014, the National Survey on Drug Use and Health estimated that nearly 1 million Americans were addicted to cocaine. The effect of cocaine on the brain and body is so powerful that, even after state-of-the-art treatments, many people trying to quit cocaine relapse within a year.

What if cocaine could be made less euphoric, so that a single use by a recovering addict doesn’t result in a full-blown relapse? Scientists at the Mayo Clinic recently published progress toward making this idea a reality – a gene therapy that would treat cocaine addiction by making cocaine less rewarding.

We are a molecular biologist and a neurobiologist who are interested in understanding and treating human disease, including neurological disorders such as cocaine addiction. As University of Tennessee faculty members leading basic biomedical research, we have worked for years on how genes are turned on and off in people and on the effects of cocaine on mice, respectively. So we were excited to see a promising convergence of novel gene therapy and cocaine addiction therapy.

Beginning more than 20 years ago, scientists have worked to engineer a new version of a human protein that could break down cocaine so quickly that it doesn’t produce an addictive high. We all have the normal human protein butyrylcholinesterase (BChE), which helps regulate neurotransmitters and can slowly break down cocaine. Targeted mutations in BChE can turn it into a super-CocH – a protein that can quickly break down cocaine.
When this CocH is injected into the bloodstream, it breaks down cocaine very fast – before the user can experience the pleasurable effects – so a dose of cocaine is less rewarding. Being less rewarding means it is easier to stop using cocaine. © 2010–2020, The Conversation US, Inc.
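The idea behind "breaks down cocaine very fast" can be pictured with a simple first-order decay model. The half-lives below are made-up numbers chosen only to contrast a slow and a fast enzyme, not measured values for BChE or CocH:

```python
def remaining_fraction(half_life_min, t_min):
    """Fraction of a dose left after t_min minutes of first-order breakdown."""
    return 0.5 ** (t_min / half_life_min)

# Hypothetical comparison, 10 minutes after a dose:
slow = remaining_fraction(60.0, 10.0)  # sluggish native enzyme
fast = remaining_fraction(1.0, 10.0)   # engineered fast enzyme
print(round(slow, 3), round(fast, 6))  # → 0.891 0.000977
```

Under this toy model, the fast enzyme leaves essentially none of the dose circulating by the time the drug would otherwise act, which is the mechanism the researchers are aiming for.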
Keyword: Drug Abuse; Neuroimmunology
Link ID: 27033 - Posted: 02.11.2020
Catherine Offord The first time Kees van Heeringen met Valerie, the 16-year-old girl had just jumped from a bridge. It was the 1980s and van Heeringen was working as a trainee psychiatrist at the physical rehabilitation unit at Ghent University Hospital in Belgium. As he got to know Valerie, who’d lost both legs in the jump and spent several months at the hospital, he pieced together the events leading up to the moment the teenager tried to end her life, including stressful interactions with people around her and a steady accumulation of depression symptoms. Van Heeringen, who would later describe the experience in his 2018 book The Neuroscience of Suicidal Behavior, says Valerie’s story left a permanent impression on him. “I found it very difficult to understand,” he tells The Scientist. He asked himself why anyone would do “such a horrible thing,” he recalls. “It was the first stimulus for me to start studying suicidal behavior.” In 1996, van Heeringen founded the Ghent University Unit for Suicide Research. He’s been its director ever since, helping to drive scientific research into the many questions he and others have about suicide. Many of the answers remain as elusive as they seemed that day in the rehabilitation unit. Suicide rates are currently climbing in the US and many other countries, and suicide is now the second leading cause of death among young people globally, after traffic accidents. The World Health Organization recently estimated that, worldwide, one person ends their own life every 40 seconds. © 1986–2020 The Scientist.
Keyword: Depression
Link ID: 27032 - Posted: 02.11.2020
By Randi Hutter Epstein

It was a staple of medical thinking dating to the 1910s that stress was the body’s alarm system, switching on only when terrible things happened, often leaving a person with an either-or choice: fight or flight. The neuroscientist Bruce S. McEwen trailblazed a new way of thinking about stress. Beginning in the 1960s, he redefined it as the body’s way of constantly monitoring daily challenges and adapting to them.

Dr. McEwen, who died on Jan. 2 at 81, described three forms of stress: good stress — a response to an immediate challenge with a burst of energy that focuses the mind; transient stress — a response to daily frustrations that resolve quickly; and chronic stress — a response to a toxic, unrelenting barrage of challenges that eventually breaks down the body.

It was Dr. McEwen’s research into chronic stress that proved groundbreaking. He and his research team at Rockefeller University in Manhattan discovered in 1968 that stress hormones had a profound effect on the brain. In studies using animals (five rats in the initial one), Dr. McEwen and his colleagues demonstrated that toxic stress atrophied neurons near the hippocampus, the brain’s memory and learning center, while expanding neurons near the amygdala, an area known for vigilance toward threats. Describing the burden of continuing stress, he coined the term “allostatic load” (derived from allostasis, the process by which the body seeks to regain stability, or homeostasis, in response to stressors).

Their discoveries, first published in the journal Nature in 1968, ignited a new field of research, one that would reveal how stress hormones and other mediators change the brain, alter behavior and impact health, in some cases accelerating disease. At the time, only a few scientists were asserting that the brain remains malleable throughout life, challenging the dogma that the brain stops changing after adolescence. Dr. McEwen’s studies documenting how hormones alter neurons lent credence to this emerging idea. © 2020 The New York Times Company
Keyword: Stress; Hormones & Behavior
Link ID: 27031 - Posted: 02.11.2020
By Perri Klass, M.D. Whenever I write about attention deficit hyperactivity disorder — whether I’m writing generally about the struggles facing these children and their families or dealing more specifically with medications — I know that some readers will write in to say that A.D.H.D. is not a real disorder. They say that the rising numbers of children taking stimulant medication to treat attentional problems are all victims, sometimes of modern society and its unfair expectations, sometimes of doctors, and most often of the rapacious pharmaceutical industry. I do believe that A.D.H.D. is a valid diagnosis, though a diagnosis that has to be made with care, and I believe that some children struggle with it mightily. Although medication should be neither the first nor the only treatment used, some children find that the stimulants significantly change their educational experiences, and their lives, for the better. Dr. Mark Bertin, a developmental pediatrician in Pleasantville, N.Y., who is the author of “Mindful Parenting for A.D.H.D.,” said, “On a practical level, we know that correctly diagnosed A.D.H.D. is real, and we know that when they’re used properly, medications can be both safe and effective.” The choice to use medications can be a difficult one for families, he said, and is made even more difficult by “the public perception that they’re not safe, or that they fundamentally change kids.” He worries, he says, that marketing is really effective, and wants to keep it “at arm’s length,” far away from his own clinical decisions, not allowing drug reps in the office, not accepting gifts — but acknowledging, all the same, that it’s probably not possible to avoid the effects of marketing entirely. 
Still, he said, when it comes to stimulants, “the idea that we’re only using them because of the pharmaceutical industry is totally off base,” and can make it much harder to talk with parents about the potential benefits — and the potential problems — of treating a particular child with a particular medication. “When it comes to A.D.H.D. in particular, it’s a hard enough thing for families to be dealing with without all the fear and judgment added on.” © 2020 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 27030 - Posted: 02.10.2020
By Sabrina Stierwalt (Everyday Einstein)

People from all cultures laugh, although we may laugh at different things. (I once interviewed for a job in the Netherlands and none of my jokes landed. I didn’t get that job.) Apes also laugh. We know this because there are scientists whose job it is to tickle animals. I’m not even kidding. What a life!

Humans start laughing as early as 3 months into life, even before we can speak. This is true even for babies who are deaf or blind. Peekaboo, it turns out, is a particular global crowd-pleaser. And we know this because studying baby laughter is an actual job, too. So the ubiquitous nature of laughter suggests that it must serve a purpose, but what? Why do we laugh? Here are a few scientific reasons.

Laughter clearly serves a social function. It is a way for us to signal to another person that we wish to connect with them. In fact, in a study of thousands of examples of laughter, the speakers in a conversation were found to be 46 percent more likely to laugh than the listeners. We’re also 30 times more likely to laugh in a group. Young children between the ages of 2.5 and 4 were found to be eight times more likely to laugh at a cartoon when they watched it with another child, even though they were just as likely to report that the cartoon was funny whether alone or not.

Evolutionarily speaking, this signal of connection likely played an important role in survival. Upon meeting a stranger, we want to know: What are your intentions with me? And who else are you aligned with? © 2020 Scientific American
Keyword: Emotions
Link ID: 27029 - Posted: 02.10.2020
By Jane E. Brody

Climate change is not the only source of dire projections for the coming decade. Perhaps just as terrifying from both a health and an economic perspective is a predicted continued rise in obesity, including severe obesity, among American adults.

A prestigious team of medical scientists has projected that by 2030, nearly one in two adults will be obese, and nearly one in four will be severely obese. The estimates are thought to be particularly reliable, as the team corrected for current underestimates of weight given by individuals in national surveys. In as many as 29 states, the prevalence of obesity will exceed 50 percent, with no state having less than 35 percent of residents who are obese, they predicted. Likewise, the team projected, in 25 states the prevalence of severe obesity will be higher than one adult in four, and severe obesity will become the most common weight category among women, non-Hispanic black adults and low-income adults nationally.

Given the role obesity plays in fostering many chronic, disabling and often fatal diseases, these are dire predictions indeed. Yet, as with climate change, the powers that be in this country are doing very little to head off the potentially disastrous results of expanding obesity, obesity specialists say. Well-intentioned efforts like limiting access to huge portions of sugar-sweetened soda, the scientists note, are effectively thwarted by well-heeled industries able to dwarf the impact of educational efforts by health departments that have minuscule budgets by comparison. With rare exceptions, the sugar and beverage industries have blocked nearly every attempt to add an excise tax to sugar-sweetened beverages. Claims that such a tax is regressive and unfairly targets low-income people are shortsighted, according to Zachary J. Ward, a public health specialist at Harvard and the lead author of the new report, published in The New England Journal of Medicine in December.
© 2020 The New York Times Company
Keyword: Obesity
Link ID: 27028 - Posted: 02.10.2020
By Chris Woolston
Sometimes it takes multitudes to reveal scientific truth. Researchers followed more than 7,000 subjects to show that a Mediterranean diet can lower the risk of heart disease. And the Women’s Health Initiative enlisted more than 160,000 women to show, among other findings, that postmenopausal hormone therapy put women at risk of breast cancer and stroke.
But meaningful, scientifically valid insights don’t always have to come from studies of large groups. A growing number of researchers around the world are taking a singular approach to pain, nutrition, psychology and other highly personal health issues. Instead of looking for trends in many people, they’re designing studies for one person at a time.
A study of one person — also called an N of 1 trial — can uncover subtle, important results that would be lost in a large-scale study, says geneticist Nicholas Schork of the Translational Genomics Research Institute in Phoenix. The results, he says, can be combined to provide insights for the population at large. But with N of 1 studies, the individual matters above all. “People differ at fundamental levels,” says Schork, who discussed the potential of N of 1 studies in a 2017 issue of the Annual Review of Nutrition. And the only way to understand individuals is to study them.
Case studies of individuals in odd circumstances have a long history in medical literature. But the concept of an N of 1 study in clinical medicine that gathers the same level of information as a large study goes back to an article published in The New England Journal of Medicine in 1986. Hundreds of N of 1 studies have been published since then, and the approach is gaining momentum, says Suzanne McDonald, N of 1 research coordinator at the University of Queensland in Brisbane, Australia.
Keyword: Genes & Behavior; Schizophrenia
Link ID: 27027 - Posted: 02.10.2020
By Laura Sanders
Immune cells in the brain chew up memories, a new study in mice shows. The finding, published in the Feb. 7 Science, points to a completely new way that the brain forgets, says neuroscientist Paul Frankland of the Hospital for Sick Children Research Institute in Toronto, who wasn’t involved in the study. That may sound like a bad thing, but forgetting is just as important as remembering. “The world constantly changes,” Frankland says, and getting rid of unimportant memories — such as a breakfast menu from two months ago — allows the brain to collect newer, more useful information.
Exactly how the brain stores memories is still debated, but many scientists suspect that connections between large groups of nerve cells are important (SN: 1/24/18). Forgetting likely involves destroying or changing these large webs of precise connections, called synapses, other lines of research have suggested. The new result shows that microglia, immune cells that can clear debris from the brain, “do exactly that,” Frankland says.
Microglia are master brain gardeners that trim extra synapses away early in life, says Yan Gu, a neuroscientist at Zhejiang University School of Medicine in Hangzhou, China. Because synapses have a big role in memory storage, “we started to wonder whether microglia may induce forgetting by eliminating synapses,” Gu says.
Gu’s team first gave mice an unpleasant memory: mild foot shocks, delivered in a particular cage. Five days after the shocks, the mice would still freeze in fear when they were placed in the cage. But 35 days later, they had begun to forget and froze less often in the cage.
© Society for Science & the Public 2000–2020
Keyword: Learning & Memory; Neuroimmunology
Link ID: 27026 - Posted: 02.07.2020