Most Recent Links



Links 5301 - 5320 of 29412

By Dina Fine Maron Millions of Americans who suffer from bipolar disorder depend on lithium. The medication has been prescribed for half a century to help stabilize patients’ moods and prevent manic or depressive episodes. Yet what it does in the brain—and why it does not work for some people—has remained largely mysterious. But last year San Diego–based researchers uncovered new details about how lithium may alter moods, thanks to an approach recently championed by a small number of scientists studying mental illness: The San Diego team used established lab techniques to reprogram patients’ skin cells into stem cells capable of becoming any other kind—and then chemically coaxed them into becoming brain cells. This process is now providing the first real stand-ins for brain cells from mentally ill humans, allowing for unprecedented direct experiments. Proponents hope studying these lab-grown neurons and related cells will eventually lead to more precise and effective treatment options for a variety of conditions. The San Diego team has already used this technique to show some bipolar cases may have more to do with protein regulation than genetic errors. And another lab discovered the activity of glial cells (a type of brain cell that supports neuron function) likely helps fuel schizophrenia—upending the theory that the disorder results mainly from faulty neurons. This new wave of research builds on Shinya Yamanaka’s Nobel-winning experiments on cellular reprogramming from a decade ago. His landmark findings about creating induced pluripotent stem cells (iPSCs) have only recently been applied to studying mental illness as the field has matured. “What’s really sparked that move now has been the ability to make patient-specific stem cells—and once you can do that, then all sorts of diseases become amenable to investigation,” says Steven Goldman, who specializes in cellular and gene therapy at the University of Rochester Medical Center. © 2018 Scientific American,

Keyword: Schizophrenia; Stem Cells
Link ID: 24703 - Posted: 02.27.2018

Lauren Smith As a shark biologist, I enjoy nothing more than going scuba diving with sharks in the wild. However, I realise it’s an immense privilege to do this as part of my work – and that for the vast majority of people, experiencing the underwater world in such a way is simply not possible. Nevertheless, even without the aid of an air tank, humans interact with fish on many levels and in greater numbers than they do with mammals and birds. A 2014 review published in the journal Animal Cognition by Culum Brown, an associate professor at Macquarie University, Sydney, explains that fish are one of the vertebrate taxa most highly utilised by humans. But despite the fact that they are harvested from wild stocks as part of global fishing industries, grown under intensive aquaculture conditions, are the most common pet and are widely used for scientific research, fish are seldom afforded the same level of compassion or welfare as warm-blooded vertebrates. As Brown highlights in his review, part of the problem is the large gap between people’s perception of fish intelligence and the scientific reality. This is an important issue because public perception guides government policy. The perception of an animal’s intelligence often drives our decision on whether or not to include them in our moral circle. From a welfare perspective, most researchers would suggest that if an animal is sentient, then it can most likely suffer and should therefore be offered some form of formal protection.

Keyword: Consciousness; Evolution
Link ID: 24702 - Posted: 02.27.2018

By Aaron E. Carroll I remember the first time my daughter discovered her hand. The look of amazement on her face was priceless. It wasn’t long before she was putting that discovery to use, trying to put everything she could find into her mouth. Babies want to feed themselves. It sometimes feels as if parents spend more time trying to stop them than encouraging them. Over the last few years, however, some people have begun to ask if we are doing the right thing. Baby-led weaning is an approach to feeding that encourages infants to take control of their eating. It’s based on the premise that infants might be better self-regulators of their food consumption. It has even been thought that baby-led weaning might lead to reductions in obesity. While babies have been spoon-fed for a long time, the explosion of commercial foods for them might be making it too easy to overfeed them, an idea that the results from a cohort study in 2015 seemed to hint at. Those weaned in a baby-led approach seemed to be more responsive to being sated and were less likely to be overweight. A case-control study from 2012 also argued that baby-led weaning was associated with a lower body mass index (B.M.I.). Such observational studies cannot establish causality, however, and may be confounded in unmeasured ways. A recent randomized controlled trial accomplished what previous work could not. Pregnant women in New Zealand were recruited before they gave birth and randomly assigned to one of two groups. Both got standard midwifery and child care. But one group received eight more contacts, from pregnancy to the newborn’s ninth month. Five of these were with a lactation consultant, who encouraged the mothers to prolong breast-feeding and delay the introduction of solid foods until 6 months of age. The three other contacts were with research staffers who encouraged parents to read hunger and fullness cues from their infants and provide their babies (starting at 6 months) with foods that were high in energy and iron — easy to grab but hard to choke on. © 2018 The New York Times Company

Keyword: Obesity; Development of the Brain
Link ID: 24701 - Posted: 02.27.2018

Mike Shooter Sian was just 14, brought by her misery to the edge of self-harm, when I met her in a cafe at the top end of one of the old mining valleys. Neutral ground. She told me about her rugby-playing older brother and her bright little sister who had lots of pets and wanted to be a vet. She felt that her parents doted on them and that there could be no room in anyone’s heart for her. She told me about her only friend, who had been killed in a road accident just as they went up to big school. About the recent death of her grandmother, who had been the only person she could confide in. And about the GP who had said she was depressed and given her a course of pills. I thought about Sian again this week. The newspaper headlines across the world were welcoming a major study that confirmed the value of antidepressant medication in the treatment of depression in adults. And so did I. Depression was validated at long last as an illness every bit as serious as physical conditions, that could cause untold human suffering and economic devastation, but could be helped with a course of antidepressant pills. First things first, I heartily agree with what that survey was saying about adult treatment. After all, I have a recurrent depression myself that has needed frequent treatment over the years. I talked about it openly when I was president of the Royal College of Psychiatrists and have continued to do so from the public platform, in the media, and to anyone who will listen. I do this in the hope that it will help to dispel the stigma that surrounds mental illness and prevents people from seeking therapy until it is too late. The diagnosis made sense of what I was going through. It wasn’t my fault. And I was grateful for the medication.

Keyword: Depression; Development of the Brain
Link ID: 24700 - Posted: 02.27.2018

By MAYA SALAM and LIAM STACK President Trump said Thursday that violent video games and movies may play a role in school shootings, a claim that has been made — and rejected — many times since the increase in such attacks in the past two decades. Movies are “so violent,” Mr. Trump said at a meeting on school safety one day after he gathered with survivors of school shootings, including some from last week’s massacre at Marjory Stoneman Douglas High School, where, the authorities say, a former student, Nikolas Cruz, killed 17 people with a semiautomatic rifle. “We have to look at the internet because a lot of bad things are happening to young kids and young minds and their minds are being formed,” Mr. Trump said, “and we have to do something about maybe what they’re seeing and how they’re seeing it. And also video games. I’m hearing more and more people say the level of violence on video games is really shaping young people’s thoughts.” “And then you go the further step and that’s the movies,” he added. “You see these movies, they’re so violent, and yet a kid is able to see the movie if sex isn’t involved, but killing is involved.” A neighbor of Mr. Cruz’s told The Miami Herald that he played video games, often violent ones, for up to 15 hours a day. Media scholars say the claim — a common one in the wake of mass shootings — does not hold up to scrutiny. Mr. Trump is far from the first leader to argue that violence in video games or movies can lead to violence in the real world. A similar claim was made in the 1940s, when Mayor Fiorello La Guardia of New York argued that pinball — which was illegal in the city for over 30 years — was “dominated by interests heavily tainted with criminality.” © 2018 The New York Times Company

Keyword: Aggression
Link ID: 24699 - Posted: 02.26.2018

By Alexandra Rosati The shift to a cooked-food diet was a decisive point in human history. The main topic of debate is when, exactly, this change occurred. All known human societies eat cooked foods, and biologists generally agree cooking could have had major effects on how the human body evolved. For example, cooked foods tend to be softer than raw ones, so humans can eat them with smaller teeth and weaker jaws. Cooking also increases the energy they can get from the food they eat. Starchy potatoes and other tubers, eaten by people across the world, are barely digestible when raw. Moreover, when humans try to eat more like chimpanzees and other primates, we cannot extract enough calories to live healthily. Up to 50 percent of women who exclusively eat raw foods develop amenorrhea, or lack of menstruation, a sign the body does not have enough energy to support a pregnancy—a big problem from an evolutionary perspective. Such evidence suggests modern humans are biologically dependent on cooking. But at what point in our evolutionary history was this strange new practice adopted? Some researchers think cooking is a relatively recent innovation—at most 500,000 years old. Cooking requires control of fire, and there is not much archaeological evidence for hearths and purposefully built fires before this time. The archaeological record becomes increasingly fragile farther back in time, however, so others think fire may have been controlled much earlier. Anthropologist Richard Wrangham has proposed cooking arose before 1.8 million years ago, an invention of our evolutionary ancestors. If the custom emerged this early, it could explain a defining feature of our species: the increase in brain size that occurred around this time. © 2018 Scientific American,

Keyword: Evolution
Link ID: 24698 - Posted: 02.26.2018

Amelia Hill For a serious examination of the devastating and incurable disability that is narcolepsy, Henry Nicholls’s book, Sleepy Head, is a surprisingly funny account. There is the obvious, if somewhat cruel, humour to be found in stories of people falling asleep in surprising places: in a small boat sailing around the Farne Islands, with the freezing North Sea cascading over the gunwale; while scuba diving; on a rollercoaster; at the dentist’s; on the back of a horse; on a surfboard. But there are other extremely funny insights that Nicholls gives into the crepuscular world that narcoleptics inhabit: his laconic fretting over the etiquette of attending a CBT group for insomniacs, which he discovers he also suffers from while researching the book. “A narcoleptic attending an insomnia clinic could be seen as the height of insensitivity,” he deadpans. Then there’s the attempt to solve sleep apnoea by learning the didgeridoo. (Didgetherapy, since you ask. It involves acrylic didgeridoos and is, apparently, quite effective.) Misjudging his tone entirely, I arrive at our interview expecting a garrulous chat. I’m particularly excited that I opened Nicholls’s book thinking I was pretty special to be able to share with him the fact that my father also had narcolepsy – and close his book having realised that five of my closest family members (including myself) have had diagnosable sleep disorders ranging from sleep apnoea to night terrors to – my own thrilling self-realisation – an episode of hypnagogic hallucination and sleep paralysis. © 2018 Guardian News and Media Limited

Keyword: Narcolepsy; Sleep
Link ID: 24697 - Posted: 02.26.2018

By Natalie Crockett BBC News Older people in Wales are being urged to think about donating their brains after they die to help scientists researching dementia. Researchers at Cardiff University are not actively recruiting at the moment but are still keen to hear from people aged over 85 without a diagnosis. While they also recruit donors with dementia, healthy brains are needed for comparisons. Donor Ken Baxter, 75, said: "When I'm finished, it isn't any use to me." Since 2009, 460 people in Wales have signed up, with 79 successful donations made to the Brains for Dementia Research project so far. They are recruited through its team at the university, which is working to identify which genes contribute to a person's susceptibility to developing Alzheimer's disease. It is hoped they will then be able to predict which people are more likely to get it. But to do this they need to study human brain tissue, as looking at the distribution of protein deposits on the brain is the only way to get a definitive diagnosis of the disease. While donors who have dementia often find out about brain donation from medical professionals, it can be harder to attract those with healthy brains. Mr Baxter is one such donor and decided to donate his brain after seeing how dementia affected a friend. He saw it as a way to help others but admitted he does not always get a positive reaction to his plans. He said: "'[People say] are you sure? It's not something I want to do'. And some people are horrified when you tell them - I can't see a reason why but a lot of people take it the wrong way. "They think 'I've never thought of that' - but you're helping someone. If we can overcome these diseases, so much the better." © 2018 BBC.

Keyword: Alzheimers
Link ID: 24696 - Posted: 02.26.2018

Rhik Samadder The results of a comprehensive, six-year study confirmed last week what I’ve known a long time: antidepressants work. I know this because half the people I know are on them – and that’s only the half I know about. Antidepressants saved my life, they tell me, and I believe them. I don’t say: “The only thing you’ve swallowed is propaganda, mate, straight from Big Pharma’s chalky teat.” I would have to be a maniac to do that. And I’m not a maniac. At least, not in that way. I’ve been on antidepressants at various points in my life. And I’ve always been one of the 80% who come off them within a month, looking for another way. I quickly tire of the tweaking of drugs and dosages required to find the appropriate prescription. I freak out at the initial side-effects – the flaccidness in my brain, the lack of ideas in my underpants. More than that, I’ve always been uncomfortable accepting there is something medically wrong with me. To some extent, I stand by that. Our social structures perpetuate inequality, our media feeds feelings of inferiority, while our politics is an accelerated zoetrope of horror. I feel unnerved when I meet someone who isn’t depressed. What’s wrong with you, I want to ask. Still, while it’s not wrong to feel viscerally offended by many aspects of the modern world, when the strength of those feelings stops you living your life, it’s not a solution, either. What struck me from that study, below the headline, was another of its findings: that talking therapies are equally effective at treating moderate to severe depression.

Keyword: Depression
Link ID: 24695 - Posted: 02.26.2018

By JAMES GORMAN Recently someone (my boss, actually) mentioned to me that I wrote more articles about dogs than I did about cats and asked why. My first thought, naturally, was that it had nothing to do with the fact that I have owned numerous dogs and no cats, but rather reflected the amount of research done by scientists on the animals. After all, I’ll write about any interesting findings, and I like cats just fine, even if I am a dog person. Two of my adult children have cats, and I would hate for them to think I was paying them insufficient attention. (Hello Bailey! Hello Tawny! — Those are the cats, not the children.) But I figured I should do some reporting, so I emailed Elinor Karlsson at the Broad Institute and the University of Massachusetts. She is a geneticist who owns three cats, but does much of her research on dogs — the perfect unbiased observer. Her research, by the way, is about dog genomes. She gets dog DNA from owners who send in their pets’ saliva samples. The research I have been interested in and writing about involves evolution, domestication, current genetics and behavior. And the questions are of the What-is-a-dog-really? variety. Dogs and cats have also been used as laboratory animals in invasive experiments, but I wasn’t asking about which animal is more popular for those. I had gotten to know Dr. Karlsson a bit while reporting on research she was doing on wolves. I asked her whether there was indeed more research on dogs than cats, and if so, why? “Ooo, that is an interesting question!” she wrote back. “Way more interesting than the various grant-related emails that are filling up my inbox. “The research has lagged behind in cats. I think they’re taken less seriously than dogs, probably to do with societal biases. I have a vet in my group who thinks that many of the cancers in cats may actually be better models for human cancer, but there has been almost no research into them.” Better models than cancers in dogs, that is. Dogs do get many of the same cancers as humans, but in dogs the risk for these cancers often varies by breed, which narrows the target down when looking for the cause of a disease. © 2018 The New York Times Company

Keyword: Animal Rights
Link ID: 24694 - Posted: 02.26.2018

Emma Marris Neanderthals painted caves in what is now Spain before their cousins, Homo sapiens, even arrived in Europe, according to research published today in Science. The finding suggests that the extinct hominids, once assumed to be intellectually inferior to humans, may have been artists with complex beliefs. Ladder-like shapes, dots and handprints were painted and stenciled deep in caves at three sites in Spain. Their precise meaning may forever be unknowable, says Alistair Pike, an archaeologist at the University of Southampton, UK, who co-authored the study, but they were almost certainly meaningful to our lost kin. “It wasn’t simply decorating your living space,” Pike says. “People were making journeys into the darkness.” Humans are thought to have arrived in Europe from Africa around 40,000–45,000 years ago. The three caves in different parts of Spain yielded artworks that are at least 65,000 years old, according to uranium-thorium dating of calcium carbonate that had formed on top of the art. These mineral deposits develop slowly, as water containing calcium comes into contact with cave surfaces. The water also contains trace levels of uranium from the rock. After the calcium carbonate has precipitated out of the water, a clock of sorts begins to tick, as uranium decays into thorium at a steady, known rate. Uranium-thorium dating has been used in geology for decades, but has seldom been employed to estimate the age of cave art. Some archaeologists are sceptical of the approach. They suggest that the calcium carbonate could have dissolved and re-crystallized after it was first formed — a process that could have also washed away some uranium, making a sample of the mineral appear older than it is. © 2018 Macmillan Publishers Limited,
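
The decay clock described here can be made concrete. Below is a minimal sketch, not taken from the article or the underlying Science paper, of the simplified uranium-thorium age equation. It assumes the calcite crust started with no thorium, that ²³⁴U is in secular equilibrium with ²³⁸U, and a ²³⁰Th half-life of roughly 75,400 years (published values vary slightly); the example activity ratio is hypothetical.

```python
import math

# Simplified U-Th age equation (zero initial 230Th, 234U/238U in equilibrium):
#   (230Th/238U)_activity = 1 - exp(-lambda_230 * t)
# Solving for t gives the age of the calcium carbonate crust.

HALF_LIFE_TH230_YEARS = 75_400  # assumed value; published estimates differ slightly
LAMBDA_230 = math.log(2) / HALF_LIFE_TH230_YEARS

def u_th_age(th230_u238_activity_ratio: float) -> float:
    """Return age in years from a measured (230Th/238U) activity ratio."""
    if not 0 <= th230_u238_activity_ratio < 1:
        raise ValueError("ratio must lie in [0, 1) under these assumptions")
    return -math.log(1 - th230_u238_activity_ratio) / LAMBDA_230

# Hypothetical crust with a measured activity ratio of 0.45
print(f"{u_th_age(0.45):,.0f} years")  # roughly 65,000 years
```

Under these assumptions, a measured activity ratio of 0.45 corresponds to an age of about 65,000 years, the same order as the minimum ages reported for the Spanish caves.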

Keyword: Evolution
Link ID: 24693 - Posted: 02.23.2018

Barbara J. King When humans talk to each other or walk alongside each other, we tend to match each other's subtle movements. Called interpersonal movement synchrony in the science literature and mirroring in the popular media, it's an often-unconscious process during which we match our gestures and pace to that of our social partner of the moment. Writing in the March issue of the journal Animal Cognition, Charlotte Duranton, Thierry Bedossa, and Florence Gaunet note that this process is "evolutionarily adaptive" for us: "It contributes to communication between individuals by signaling the convergence of their inner states and fostering social cohesion." Then, these three researchers present evidence to show that dogs synchronize their walking pace with their humans in a way that may also reflect an evolutionary adaptation. In an experiment, 36 pet dogs were brought to an open area in Maisons-Laffitte, France, with their owners. After a 15-minute free period, the owner-dog pairs experienced three testing conditions presented in random order. These were: stay-still (owner didn't move for 10 seconds), normal-walk (owners walked at normal speed for 10 seconds), and fast-walk (owner walked fast for 10 seconds). Importantly, the dogs were off-leash and, thus, not tethered in any way to the speed of the owners. The owners were told not to look at, or talk to, their dogs — or to show any evident emotion. The experimenters filmed the trials as they occurred. The dogs synchronized their pace closely with their owners, speeding up when the owners walked at an unnaturally fast pace. (The dogs in their regular routines were used to walking at a normal pace, with the owners often pausing to chat with other people). © 2018 npr

Keyword: Animal Communication; Emotions
Link ID: 24692 - Posted: 02.23.2018

By BENEDICT CAREY President Trump called again on Thursday for the opening of more mental hospitals to help prevent mass murders like the one at Marjory Stoneman Douglas High School in Parkland, Fla. Yet ramping up institutional care, experts say, likely would not have prevented most of the spree killings regularly making headlines in this country. “We’re going to be talking about mental institutions. And when you have some person like this, you can bring them into a mental institution, and they can see what they can do. But we’ve got to get them out of our communities,” the president said during a meeting at the White House with state and local officials. In the 1960s, states across the country began to close or shrink mental hospitals after a series of court decisions that limited the powers of state and local officials to commit people. The decline continued for decades, in part because of cuts in both state and federal budgets for mental health care. Those institutions housed people with severe mental disorders, like schizophrenia, who were deemed unable to care for themselves. And while spree killers may be angry and emotionally disordered, few have had the sorts of illnesses that would have landed them in hospital custody. The latest school shooter, Nikolas Cruz, 19, was clearly troubled and making threats, and he was stockpiling weapons. But he had no mental diagnosis. He has been described as angry, possibly depressed, perhaps isolated — not so different from millions of other teenagers. A full psychiatric evaluation, if he’d had one, might have resulted in a temporary commitment at best, but not full-time institutionalization, experts said. The idea that more such institutions would prevent this kind of violence “is ridiculous, because you can’t put half the people in the country with a mental disturbance in mental hospitals,” said Dr. Michael Stone, a forensic psychiatrist at Columbia University who has studied mass killers. © 2018 The New York Times Company

Keyword: Aggression; Schizophrenia
Link ID: 24691 - Posted: 02.23.2018

It was disappointing to read such an uncritical description of the latest analysis of antidepressant trials that does not address doubts about the widespread use of these drugs (The drugs do work, says study of antidepressants, 22 February). The analysis consists of comparing “response” rates between people on antidepressants and those on placebo. But “response” is an artificial category that has been arbitrarily constructed out of the data actually collected, which consists of scores on depression rating scales. Analysing categories inflates differences. When scores are compared, differences are trivial, and unlikely to be clinically relevant. Moreover, even these small differences are easily accounted for by the fact that antidepressants produce more or less subtle mental and physical alterations (eg nausea, dry mouth, drowsiness and emotional blunting) irrespective of whether or not they treat depression. These enable participants to guess whether they have been allocated to antidepressant or placebo, thus enhancing the placebo effect of the active drugs. This may explain why antidepressants that cause the most noticeable alterations, such as amitriptyline, appeared to be the most effective. “Real world” studies show that people treated with antidepressants have poor outcomes and fare worse than depressed people who do not receive antidepressants. Increased prescribing will do more harm than good. Adverse effects include sexual dysfunction, which may occasionally persist after the drugs are stopped, agitation, suicidal and aggressive behaviour among younger users, prolonged and severe withdrawal effects and foetal abnormalities. The costs of encouraging more people to consider themselves as flawed and diseased are hard to quantify. © 2018 Guardian News and Media Limited

Keyword: Depression
Link ID: 24690 - Posted: 02.23.2018

Kerri Smith Cole Skinner was hanging from a wall above an abandoned quarry when he heard a car pull up. He and his friends bolted, racing along a narrow path on the quarry’s edge and hopping over a barbed-wire fence to exit the grounds. The chase is part of the fun for Skinner and his friend Alex McCallum-Toppin, both 15 and pupils at a school in Faringdon, UK. The two say that they seek out places such as construction sites and disused buildings — not to get into trouble, but to explore. There are also bragging rights to be earned. “It’s just something you can say: ‘Yeah, I’ve been in an abandoned quarry’,” says McCallum-Toppin. “You can talk about it with your friends.” Science has often looked at risk-taking among adolescents as a monolithic problem for parents and the public to manage or endure. When Eva Telzer, a neuroscientist at the University of North Carolina in Chapel Hill, asks family, friends, undergraduates or researchers in related fields about their perception of teenagers, “there’s almost never anything positive”, she says. “It’s a pervasive stereotype.” But how Alex and Cole dabble with risk — considering its social value alongside other pros and cons — is in keeping with a more complex picture emerging from neuroscience. Adolescent behaviour goes beyond impetuous rebellion or uncontrollable hormones, says Adriana Galván, a neuroscientist at the University of California, Los Angeles. “How we define risk-taking is going through a shift.” Adolescents do take more risks than adults, and the consequences can include injury, death, run-ins with the law and even long-term health problems. But lab studies in the past decade have revealed layers of nuance in how young people assess risks. In some situations, teenagers can be more risk-averse than their older peers. And they navigate a broader range of risks than has typically been considered in the lab, including social risks and positive risks — such as trying out for a sports team. These types of behaviour seem to have different effects on the brain. © 2018 Macmillan Publishers Limited,

Keyword: Development of the Brain; Emotions
Link ID: 24689 - Posted: 02.22.2018

Jason Beaubien The blind have descended in droves on the Bisidimo Hospital in Eastern Ethiopia. The Himalayan Cataract Project is hosting a mass cataract surgery campaign at the medical compound that used to be a leper colony. For one week a team from the nonprofit has set up seven operating tables in four operating rooms and they're offering free cataract surgery to anyone who needs it. On the first day of the campaign it's clear that the need is great. "We have like 700 or 800 patients already in the compound and many more appointed for tomorrow and the day after and the day after that," says Teketel Mathiwos, the Ethiopian program coordinator for the Himalayan Cataract Project. People hoping to get their sight restored are jammed into the compound's main courtyard. Others spill out of an office where optometrists are prepping patients for surgery. The line to get into the actual operating theater extends all the way out of the building, up along a covered walkway and then loops around the corner of another medical building. More still are standing outside the hospital gates. Mathiwos says some patients may have to wait a day or two for the procedure. "They have tents here," Mathiwos says. "We give them the food to eat and we try to take care of them as best as we can." Some of the patients at the Bisidimo Hospital have only one milky eye. Others are blind in both eyes. These patients underwent surgery as part of a campaign run by Himalayan Cataract Project at the Bisidimo Hospital in Ethiopia. The bandages are removed the day after the procedure. Surgeons performed more than 1,600 cataract surgeries during a six-day event in December. © 2018 npr

Keyword: Vision
Link ID: 24688 - Posted: 02.22.2018

Sarah Boseley Health editor Antidepressants work – some more effectively than others – in treating depression, according to authors of a groundbreaking study which doctors hope will finally put to rest doubts about the controversial medicine. Millions more people around the world should be prescribed pills or offered talking therapies, which work equally well for moderate to severe depression, say the doctors, noting that just one in six people receive proper treatment in the rich world – and one in 27 in the developing world. If cancer or heart patients suffered this level of under-treatment, there would be a public outcry, they say. “Depression is the single largest contributor to global disability that we have – a massive challenge for humankind,” said John Geddes, professor of epidemiological psychiatry at Oxford University. It affects around 350 million people worldwide and instances rose almost 20% from 2005-2015. “Antidepressants are an effective tool for depression. Untreated depression is a huge problem because of the burden to society,” said Andrea Cipriani of the NIHR Oxford Health Biomedical Research Centre, who led the study. In the UK, Geddes said “it is likely that at least one million more people per year should have access to effective treatment for depression, either drugs or psychotherapy. The choice will need to be made by doctor and patient.” © 2018 Guardian News and Media Limited

Keyword: Depression
Link ID: 24687 - Posted: 02.22.2018

Ingfei Chen In a white lab coat and blue latex gloves, Neda Vishlaghi peers through a light microscope at six milky-white blobs. Each is about the size of a couscous grain, bathed in the pale orange broth of a petri dish. With tweezers in one hand and surgical scissors in the other, she deftly snips one tiny clump in half. When growing human brains, sometimes you need to do some pruning. The blobs are 8-week-old bits of brainlike tissue. While they wouldn’t be mistaken for Lilliputian-sized brains, some of their fine-grained features bear a remarkable resemblance to the human cerebral cortex, home to our memories, decision making and other high-level cognitive powers. Vishlaghi created these “minibrains” at the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA, where she’s a research assistant. First she immersed batches of human pluripotent stem cells — which can morph into any cell type in the body — in a special mix of chemicals. The free-floating cells multiplied and coalesced into itty-bitty balls of neural tissue. Nurtured with meticulously timed doses of growth-supporting ingredients, the cell clumps were eventually transferred to petri dishes of broth laced with Matrigel, a gelatin-like matrix of proteins. On day 56, the blobs display shadowy clusters of neural “rosettes.” Under a laser scanning microscope, razor-thin slices of those rosettes reveal loose-knit layers of a variety of dividing neural stem cells and the nerve cells, or neurons, they give rise to. The layered structures look similar to the architecture of a human fetal brain at 14 weeks of gestation. © Society for Science & the Public 2000 - 2018

Keyword: Development of the Brain
Link ID: 24686 - Posted: 02.21.2018

By ANAHAD O’CONNOR Anyone who has ever been on a diet knows that the standard prescription for weight loss is to reduce the amount of calories you consume. But a new study, published Tuesday in JAMA, may turn that advice on its head. It found that people who cut back on added sugar, refined grains and highly processed foods while concentrating on eating plenty of vegetables and whole foods — without worrying about counting calories or limiting portion sizes — lost significant amounts of weight over the course of a year. The strategy worked for people whether they followed diets that were mostly low in fat or mostly low in carbohydrates. And their success did not appear to be influenced by their genetics, a finding that casts doubt on the increasingly popular idea that different diets should be recommended to people based on their DNA makeup. The research lends strong support to the notion that diet quality, not quantity, is what helps people lose and manage their weight most easily in the long run. It also suggests that health authorities should shift away from telling the public to obsess over calories and instead encourage Americans to avoid processed foods that are made with refined starches and added sugar, like bagels, white bread, refined flour and sugary snacks and beverages, said Dr. Dariush Mozaffarian, a cardiologist and dean of the Friedman School of Nutrition Science and Policy at Tufts University. “This is the road map to reducing the obesity epidemic in the United States,” said Dr. Mozaffarian, who was not involved in the new study. “It’s time for U.S. and other national policies to stop focusing on calories and calorie counting.” The new research was published in JAMA and led by Christopher D. Gardner, the director of nutrition studies at the Stanford Prevention Research Center. It was a large and expensive trial, carried out on more than 600 people with $8 million in funding from the National Institutes of Health, the Nutrition Science Initiative and other groups. © 2018 The New York Times Company

Keyword: Obesity
Link ID: 24685 - Posted: 02.21.2018

Sarah Webb Four years ago, scientists from Google showed up on neuroscientist Steve Finkbeiner’s doorstep. The researchers were based at Google Accelerated Science, a research division in Mountain View, California, that aims to use Google technologies to speed scientific discovery. They were interested in applying ‘deep-learning’ approaches to the mountains of imaging data generated by Finkbeiner’s team at the Gladstone Institute of Neurological Disease in San Francisco, also in California. Deep-learning algorithms take raw features from an extremely large, annotated data set, such as a collection of images or genomes, and use them to create a predictive tool based on patterns buried inside. Once trained, the algorithms can apply that training to analyse other data, sometimes from wildly different sources. The technique can be used to “tackle really hard, tough, complicated problems, and be able to see structure in data — amounts of data that are just too big and too complex for the human brain to comprehend”, Finkbeiner says. He and his team produce reams of data using a high-throughput imaging strategy known as robotic microscopy, which they had developed for studying brain cells. But the team couldn’t analyse its data at the speed it acquired them, so Finkbeiner welcomed the opportunity to collaborate. “I can’t honestly say at the time that I had a clear grasp of what questions might be addressed with deep learning, but I knew that we were generating data at about twice to three times the rate we could analyse it,” he says. © 2018 Macmillan Publishers Limited,
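
The train-then-apply workflow the excerpt describes can be sketched briefly. The following is a hypothetical illustration in PyTorch, not the Gladstone or Google pipeline: a small network is fitted to an annotated image set (random tensors stand in for labelled microscope images), then reused to score new, unlabelled data.

```python
import torch
from torch import nn

# Stand-ins for a large annotated data set: 64x64 single-channel "images"
# with binary labels (say, diseased vs. healthy cells). A real pipeline
# would load millions of annotated microscope images instead.
images = torch.randn(512, 1, 64, 64)
labels = torch.randint(0, 2, (512,))

# A tiny convolutional network. Models used in practice are far larger,
# but the training loop has the same overall shape.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(4),
    nn.Flatten(),
    nn.Linear(8 * 16 * 16, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: the network learns patterns linking raw pixels to labels.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

# Once trained, the same model can score new, unlabelled images,
# the "apply that training to analyse other data" step in the text.
new_images = torch.randn(10, 1, 64, 64)
with torch.no_grad():
    predictions = model(new_images).argmax(dim=1)
print(predictions)
```

The point of the sketch is the division of labour the article emphasises: the expensive step is assembling and annotating the data, after which the trained model can be applied to fresh images far faster than humans can analyse them.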

Keyword: Learning & Memory; Robotics
Link ID: 24684 - Posted: 02.21.2018