Most Recent Links
Serge Rivest The signals transmitted between neurons through synaptic connections are responsible for most, if not all, brain functions, from learning to decision-making. During brain development, synapses that are stimulated less often than others are eliminated through a process called pruning, whereas those that are highly stimulated are retained. This refines the brain’s ability to respond to stimuli and environmental cues. Microglia, the brain’s innate immune cells, have a key role in pruning — they engulf and digest synapses through a process called phagocytosis. But the mechanism that determines which synapses they avoid has been unclear. Writing in Neuron, Lehrman et al. [1] describe a ‘don’t eat me’ signal, involving a protein called cluster of differentiation 47 (CD47), that prevents inappropriate synaptic pruning by microglia. About a decade ago, it was shown that synapses requiring elimination send an ‘eat me’ signal to microglia [2] (Fig. 1a). This signal involves the proteins C1q and CR3, which are part of the complement cascade — a complex series of interactions that is best known for activating cells of the innate immune system to eliminate disease-causing organisms and damaged cells. ‘Don’t eat me’ signals act to limit the effects of ‘eat me’ signals in the immune system, but it was not known whether the same process occurs during synaptic pruning in the developing brain. CD47 is a cell-surface protein that has many immune functions, including acting as a ‘don’t eat me’ signal for macrophages [3], microglia’s sister cells, which exist outside the brain. Lehrman et al. analysed whether CD47 is expressed in the dorsal lateral geniculate nucleus (dLGN), a region of the brain involved in vision. This region receives inputs from neurons called retinal ganglion cells (RGCs) that originate in the retina. The authors demonstrated in mice that, at five days after birth, synapses from RGCs to other neurons in the dLGN are being pruned at high levels.
© 2018 Springer Nature Limited
Keyword: Development of the Brain; Neuroimmunology
Link ID: 25629 - Posted: 10.31.2018
Laura Sanders Taking a monthlong break from pot helps clear away young people’s memory fog, a small study suggests. The results show that not only does marijuana impair teenagers’ and young adults’ abilities to take in information, but that this memory muddling may be reversible. Scientists have struggled to find clear answers about how marijuana affects the developing brain, in part because it’s unethical to ask children to begin using a drug for a study. But “you can do the opposite,” says neuropsychologist Randi Schuster. “You can get kids who are currently using, and pay them to stop.” For a study published October 30 in the Journal of Clinical Psychiatry, Schuster and her colleagues did just that. The team recruited 88 Boston-area youngsters ages 16 to 25 who reported using marijuana at least once a week, and offered 62 of them money to quit for a month. Participants were paid more money as the experiment went along, with top earners banking $535 for their month without pot. The money “worked exceptionally well,” says Schuster, of Massachusetts General Hospital in Boston and Harvard Medical School. Urine tests showed that 55 of the 62 participants stopped using marijuana for the 30 days of the experiment. Along with regular drug tests, participants underwent attention and memory tests. Tricky tasks that required close monitoring of number sequences and the directions and locations of arrows revealed that, over the month, young people’s ability to pay attention didn’t seem to be affected by their newfound abstinence. © Society for Science & the Public 2000 - 2018
Keyword: Drug Abuse; Learning & Memory
Link ID: 25628 - Posted: 10.31.2018
The transmission speed of neurons fluctuates in the brain to achieve an optimal flow of information required for day-to-day activities, according to a National Institutes of Health study. The results, appearing in PNAS, suggest that brain cells called astrocytes alter the transmission speed of neurons by changing the thickness of myelin, an insulation material, and the width of gaps in myelin called nodes of Ranvier, which amplify signals. “Scientists used to think that myelin could not be thinned except when destroyed in demyelinating diseases, such as multiple sclerosis,” said R. Douglas Fields, Ph.D., senior author and chief of the Section on Nervous System Development and Plasticity at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “Our study suggests that under normal conditions, the myelin sheath and structure of the nodes of Ranvier are dynamic, even in adults.” The brain is composed of neurons, which have extensions called axons that can stretch for long distances. Axons are wrapped by layers of myelin, which serve as insulation to increase the speed of signals relayed by neurons. Gaps between segments of myelin are called nodes of Ranvier, and the number and width of these gaps can also regulate transmission speed. “Myelin can be located far from the neuron’s synapse, where signals originate,” said NICHD’s Dipankar Dutta, Ph.D., the lead author of the study. “We wanted to understand how myelin, and the cells that regulate it, help synchronize signals that come from different areas of the brain.”
Keyword: Learning & Memory; Glia
Link ID: 25627 - Posted: 10.31.2018
By Sam Rose One of neuroscience’s foundational experiments wasn’t performed in a Nobel laureate’s lab, but occurred in a railyard in 1848 when an accidental explosion sent a tamping iron through 25-year-old Phineas Gage’s forehead. Gage survived, but those studying his history detailed distinct personality changes resulting from the accident. He went from even-tempered to impulsive and profane. The case is likely the earliest—and most famous—example of using a “lesion” to link a damaged brain region to its function. In the ensuing decades, to study the brain was to study lesions. Lesion cases fed most of the era’s knowledge of the brain. One might think that modern neuroscience, with its immense toolkit of experimental techniques, no longer needs lesions like Gage’s to parse the brain’s inner workings. Lesion studies, though, seem to be having a revival. A new method called lesion network mapping is clearing the cobwebs off the lesion study and uniting it with modern brain connectivity data. The results are revealing surprising associations between brain regions and disorders. Thankfully, most lesions aren’t a tamping iron through the forehead. Strokes, hemorrhages, or tumors make up most lesion cases. 19th-century neurologists like Paul Broca made foundational discoveries by studying patients with peculiar symptoms resulting from these common neurological insults. Broca and his contemporaries synthesized a theory of the brain from lesions: that the brain is segmented. Different regions control different functions. Lesion studies lend a lawyerly logic to the brain: if region X is destroyed and function Y no longer occurs, then region X must control function Y. © 2018 Scientific American
Keyword: Stroke
Link ID: 25626 - Posted: 10.31.2018
Anna Nowogrodzki On a cold morning in Minneapolis last December, a man walked into a research centre to venture where only pigs had gone before: into the strongest magnetic resonance imaging (MRI) machine built to scan the human body. First, he changed into a hospital gown, and researchers made sure he had no metal on his body: no piercings, rings, metal implants or pacemakers. Any metal could be ripped out by the immensely powerful, 10.5-tesla magnet — weighing almost 3 times more than a Boeing 737 aeroplane and a full 50% more powerful than the strongest magnets approved for clinical use. Days earlier, he had passed a check-up that included a baseline test of his sense of balance to make sure that any dizziness from exposure to the magnets could be assessed properly. In the MRI room at the University of Minnesota’s Center for Magnetic Resonance Research, he lay down inside a 4-metre-long tube, surrounded by 110 tonnes of magnet and 600 tonnes of iron shielding, for an hour’s worth of imaging of his hips, whose thin cartilage would test the limits of the machine’s resolution. The centre’s director, Kamil Ugurbil, had been waiting for years for this day. The magnet faced long delays because the liquid helium needed to fill it was in short supply. After the machine was finally delivered, on a below-freezing day in 2013, it took four years of animal testing and ramping up the field strength before Ugurbil and his colleagues were comfortable sending in the first human. Even then, they didn’t quite know what they’d see. But it was worth the wait: when the scan materialized on screen, the fine resolution revealed intricate details of the wafer-thin cartilage that protects the hip socket. “It was extremely exciting and very rewarding,” Ugurbil says. © 2018 Springer Nature Limited
Keyword: Brain imaging
Link ID: 25625 - Posted: 10.31.2018
Jon Hamilton An ancient part of the brain long ignored by the scientific world appears to play a critical role in everything from language and emotions to daily planning. It's the cerebellum, which is found in fish and lizards as well as people. But in the human brain, this structure is wired to areas involved in higher-order thinking, a team led by researchers from Washington University in St. Louis reports Thursday in the journal Neuron. "We think that the cerebellum is acting as the brain's ultimate quality control unit," says Scott Marek, a postdoctoral research scholar and the study's first author. The finding adds to the growing evidence that the cerebellum "isn't only involved in sensory-motor function, it's involved in everything we do," says Dr. Jeremy Schmahmann, a neurology professor at Harvard and director of the ataxia unit at Massachusetts General Hospital. Schmahmann, who wasn't involved in the new study, has been arguing for decades that the cerebellum plays a key role in many aspects of human behavior, as well as mental disorders such as schizophrenia. But only a handful of scientists have explored functions of the cerebellum beyond motor control. "It's been woefully understudied," says Dr. Nico Dosenbach, a professor of neurology at Washington University whose lab conducted the study. Even now, many scientists think of the cerebellum as the part of the brain that lets you pass a roadside sobriety test. It helps you do things like walk in a straight line or stand on one leg or track a moving object — if you're not drunk. © 2018 npr
Keyword: Attention; Language
Link ID: 25624 - Posted: 10.27.2018
By Alex Williams It’s hard to say the precise moment when CBD, the voguish cannabis derivative, went from being a fidget spinner alternative for stoners to a mainstream panacea. Maybe it was in January, when Mandy Moore, hours before the Golden Globes, told Coveteur that she was experimenting with CBD oil to relieve the pain from wearing high heels. “It could be a really exciting evening,” she said. “I could be floating this year.” Maybe it was in July, when Willie Nelson introduced a line of CBD-infused coffee beans called Willie’s Remedy. “It’s two of my favorites, together in the perfect combination,” he said in a statement. Or maybe it was earlier this month, when Dr. Sanjay Gupta gave a qualified endorsement of CBD on “The Dr. Oz Show.” “I think there is a legitimate medicine here,” he said. “We’re talking about something that could really help people.” So the question now becomes: Is this the dawning of a new miracle elixir, or does all the hype mean we have already reached Peak CBD? Either way, it would be hard to script a more of-the-moment salve for a nation on edge. With its proponents claiming that CBD treats ailments as diverse as inflammation, pain, acne, anxiety, insomnia, depression, post-traumatic stress and even cancer, it’s easy to wonder if this all natural, non-psychotropic and widely available cousin of marijuana represents a cure for the 21st century itself. The ice caps are melting, the Dow teeters, and a divided country seems headed for divorce court. Is it any wonder, then, that everyone seems to be reaching for the tincture? “Right now, CBD is the chemical equivalent to Bitcoin in 2016,” said Jason DeLand, a New York advertising executive and a board member of Dosist, a cannabis company in Santa Monica, Calif., that makes disposable vape pens with CBD. 
“It’s hot, everywhere and yet almost nobody understands it.” With CBD popping up in nearly everything — bath bombs, ice cream, dog treats — it is hard to overstate the speed at which CBD has moved from the Burning Man margins to the cultural center. © 2018 The New York Times Company
Keyword: Drug Abuse; Depression
Link ID: 25623 - Posted: 10.27.2018
Wenyao Xu Your brain is an inexhaustible source of secure passwords – but you might not have to remember anything. Passwords and PINs with letters and numbers are relatively easily hacked, hard to remember and generally insecure. Biometrics are starting to take their place, with fingerprints, facial recognition and retina scanning becoming common even in routine logins for computers, smartphones and other common devices. They’re more secure because they’re harder to fake, but biometrics have a crucial vulnerability: A person only has one face, two retinas and 10 fingerprints. They represent passwords that can’t be reset if they’re compromised. Like usernames and passwords, biometric credentials are vulnerable to data breaches. In 2015, for instance, the database containing the fingerprints of 5.6 million U.S. federal employees was breached. Those people shouldn’t use their fingerprints to secure any devices, whether for personal use or at work. The next breach might steal photographs or retina scan data, rendering those biometrics useless for security. Our team has been working with collaborators at other institutions for years, and has invented a new type of biometric that is both uniquely tied to a single human being and can be reset if needed. When a person looks at a photograph or hears a piece of music, her brain responds in ways that researchers or medical professionals can measure with electrical sensors placed on her scalp. We have discovered that every person’s brain responds differently to an external stimulus, so even if two people look at the same photograph, readings of their brain activity will be different. © 2010–2018, The Conversation US, Inc.
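The enrollment-and-matching idea behind such a "brain password" can be sketched in a few lines: record a user's response to a stimulus once as a template, then accept a login attempt only if a fresh response is similar enough to that template. This is only an illustrative toy, not the researchers' actual system; the user names, readings, and similarity threshold below are all invented for the example.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length response vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical enrolled "brainprint" templates: averaged sensor readings
# recorded while each user viewed the same photograph (values made up).
templates = {
    "alice": [0.9, 0.1, 0.4, 0.7],
    "bob":   [0.2, 0.8, 0.5, 0.1],
}

def authenticate(claimed_user, fresh_reading, threshold=0.95):
    """Accept only if the fresh response closely matches the template.
    'Resetting' the credential just means re-enrolling a new template
    recorded against a different stimulus."""
    return cosine(templates[claimed_user], fresh_reading) >= threshold

print(authenticate("alice", [0.88, 0.12, 0.41, 0.69]))  # close match: True
print(authenticate("alice", [0.2, 0.8, 0.5, 0.1]))      # Bob-like response: False
```

The key property the article describes falls out of the last line of `authenticate`: unlike a fingerprint, the template is tied to a stimulus as well as a person, so a stolen template can be invalidated by enrolling against a new image or song.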
Keyword: Brain imaging; Robotics
Link ID: 25622 - Posted: 10.27.2018
By George Musser The forest is still—until, out of the corner of my eye, I notice a butterfly flutter into view. At first it is barely perceptible, but as I watch the butterfly more intently, the trees around it darken and the insect grows brighter. The more I marvel at it, the more marvelous it becomes, making it impossible for me to look away. Before long the entire forest recedes, and the butterfly explodes into a red starburst, like a fireworks display. Everything goes dark. Then, dozens of white dots swarm around me. On my left, they are just dots. On my right, they leave long trails of spaghetti-like light. The contrast makes me acutely conscious that the present is never experienced as a mathematical instant; it has some duration, and the perception of that can vary with context. The sensation evaporates as soon as I take off my headset. This immersive virtual-reality (VR) experience was a preliminary look at Beholder, an art installation at the Victoria and Albert Museum in London in September that sought to recreate how autistic people perceive the world. It is now on display at the gallery that commissioned it, Birmingham Open Media. The project’s creator, Matt Clark, has a severely autistic 15-year-old son, Oliver. “He can’t talk; his behaviors are extremely challenging,” says Clark, creative director of United Visual Artists, an art and design group based in London. Clark built Beholder so he and others could see the world through his son’s eyes. He collaborated with artists who either are on the spectrum or have family members who are. © 2018 American Association for the Advancement of Science
By Karen Weintraub The stresses of everyday life may start taking a toll on the brain in relatively early middle age, new research shows. The study of more than 2,000 people, most of them in their 40s, found those with the highest levels of the stress-related hormone cortisol performed worse on tests of memory, organization, visual perception and attention. Higher cortisol levels, measured in subjects’ blood, were also found to be associated with physical changes in the brain that are often seen as precursors to Alzheimer’s disease and other forms of dementia, according to the study published Wednesday in Neurology. The link between high cortisol levels and low performance was particularly strong for women, the study found. But it remains unclear whether women in midlife are under more stress than men or simply more likely to have their stress manifested in higher cortisol levels, says lead researcher Sudha Seshadri. A professor of neurology, she splits her time between Boston University and The University of Texas Health Science Center at San Antonio, where she is the founding director of the Glenn Biggs Institute for Alzheimer's & Neurodegenerative Diseases. Working on the study “made me more stressed about not being less stressed,” Seshadri says, laughing. But, she adds, the bottom line is serious: “An important message to myself and others is that when challenges come our way, getting frustrated is very counterproductive—not just to achieving our aims but perhaps to our capacity to be productive.” © 2018 Scientific American
Keyword: Stress; Learning & Memory
Link ID: 25620 - Posted: 10.26.2018
By Daniel Engber Two decades ago, in late summer 1998, the journal Nature came out with an outrageous claim: Both women and men, a research paper argued, prefer faces with more “girlish” features. The authors of the study, based in Scotland and Japan, had expected the opposite result—that square-jawed, hunky faces, more Harrison Ford than Leonardo DiCaprio, would be deemed more attractive. “Our team has been working on this study for four years,” one of the scientists, Ian Penton-Voak, told the New York Times in advance of publication. “When it was found early on that there was a preference for feminized male faces, nobody believed it, so we did it again, and again. The preference for a feminized face keeps coming up.” Could it really be the case that everyone prefers a man with a gentle nose and a low-T brow? If so, then why are (or were) Harrison Ford and Leonardo DiCaprio both considered highly sexy? And what about the other sexy ’90s dyads of George Clooney and Jude Law, and Johnny Depp and Nick Nolte? The Nature data were no less perplexing for evolutionary psychologists like Penton-Voak. From that field’s perspective, manly features are indicative of a male’s reproductive fitness. Given this assumption, one might guess that women have evolved to find those traits the most appealing, since they help identify the sort of men with whom you could make the strongest, most immunocompetent children. What would women get from delicate men? A year later, in the summer of ’99, Penton-Voak and colleagues offered the beginnings of an explanation. For a second study, also out in Nature, and also drawn from research done in Scotland and Japan, they once again asked young women to evaluate male faces that had been digitally feminized to varying degrees—only now they had the women do so twice, at different points during their menstrual cycles. 
They found that a woman’s predilection for men with girlish features waxed and waned throughout the lunar month: When she looked at faces in the days leading up to ovulation, her tastes would tend a bit more masculine; later on she’d flip back the other way. © 2018 The Slate Group LLC
Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 25619 - Posted: 10.26.2018
By Charles F. Zorumski One minute you’re enjoying a nice buzz, the next your brain stops recording events that are taking place. The result can mean having vague or no memory of a time period ranging anywhere from a few minutes up to several hours. Scary—isn’t it? Unfortunately, alcohol-induced blackouts aren’t a rarity, either. A 2015 survey of English teenagers who drank showed 30 percent of 15-year-olds and 75 percent of 19-year-olds suffered alcohol-induced blackouts. In medical terms this memory loss is a form of temporary anterograde amnesia, a condition where the ability to form new memories is, for a limited time, impaired. That means you can’t remember a stretch of time because your brain was unable to record and store memories in the first place. Neuroscientists do not fully understand how blackouts occur. Researchers long assumed alcohol impairs memory because it kills brain cells. Indeed, long-standing alcohol abuse can damage nerve cells and permanently impact memory and learning. It is unlikely, however, that brain damage is behind acute blackouts. It is clear that processes in the hippocampus—the area of brain involved in the formation, storage and retrieval of new memories—are disturbed. Specifically, it appears alcohol impairs the so-called long-term potentiation of synapses at the pyramidal cells in the hippocampus. Alcohol alters the activity of certain glutamate receptors, thereby boosting the production of specific steroid hormones. This in turn slows the long-term potentiation of hippocampal synapses. Normally this mechanism, responsible for strengthening the synaptic transfer of information between neurons, is the basis of memory formation. © 2018 Scientific American
Keyword: Drug Abuse; Learning & Memory
Link ID: 25618 - Posted: 10.26.2018
By Anil Seth, Michael Schartner, Enzo Tagliazucchi, Suresh Muthukumaraswamy, Robin Carhart-Harris, Adam Barrett It’s not easy to strike the right balance when taking new scientific findings to a wider audience. In a recent opinion piece, Bernard Kastrup and Edward F. Kelly point out that media reporting can fuel misleading interpretations through oversimplification, sometimes abetted by the scientists themselves. Media misinterpretations can be particularly contagious for research areas likely to pique public interest—such as the exciting new investigations of the brain basis of altered conscious experience induced by psychedelic drugs. Unfortunately, Kastrup and Kelly fall foul of their own critique by misconstruing and oversimplifying the details of the studies they discuss. This leads them towards an anti-materialistic view of consciousness that has nothing to do with the details of the experimental studies—ours or others. Take, for example, their discussion of our recent study reporting increased neuronal “signal diversity” in the psychedelic state. In this study, we used “Lempel-Ziv” complexity—a standard algorithm used to compress data files—to measure the diversity of brain signals recorded using magnetoencephalography (MEG). Diversity in this sense is related to, though not entirely equivalent to, “randomness.” The data showed widespread increased neuronal signal diversity for three different psychedelics (LSD, psilocybin and ketamine), when compared to a placebo baseline. This was a striking result since previous studies using this measure had only reported reductions in signal diversity, in global states generally thought to mark “decreases” in consciousness, such as (non-REM) sleep and anesthesia. © 2018 Scientific American
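The Lempel-Ziv measure the authors describe can be illustrated with a short sketch: binarize a signal around its mean, then count how many new "phrases" a left-to-right scan encounters; more diverse signals yield more novel phrases. This is a simplified phrase-counting variant for illustration only; the study itself analyzed MEG recordings and, as is typical, such analyses normalize the raw count (for example against length-matched shuffled data).

```python
def lz_complexity(seq):
    """Count distinct 'phrases' seen while scanning seq left to right
    (a simplified Lempel-Ziv parse): extend the current phrase until it
    is one we have not seen before, record it, and start a new phrase."""
    phrases, phrase = set(), ""
    for symbol in seq:
        phrase += symbol
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    # a trailing, already-seen fragment still counts as one phrase
    return len(phrases) + (1 if phrase else 0)

# Binarize a toy "signal" around its mean, then count phrases.
signal = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.4, 0.6]
mean = sum(signal) / len(signal)
bits = "".join("1" if x > mean else "0" for x in signal)
print(bits, lz_complexity(bits))  # → 01010101 5
print(lz_complexity("00000000"))  # → 4 (less diverse, fewer phrases)
```

The intuition matches the article: a constant or highly regular signal compresses well (few phrases), while the more diverse activity reported under psychedelics would keep generating phrases the parser has not seen before.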
Keyword: Consciousness; Drug Abuse
Link ID: 25617 - Posted: 10.26.2018
By Nicholas Bakalar People with high blood levels of cortisol, the “stress hormone,” may have poorer memory and thinking skills than those with lower levels. Cortisol is produced by the adrenal glands and is involved in regulating blood sugar levels, reducing inflammation, controlling salt and water balance and other body functions. Researchers gave tests for memory, abstract reasoning, visual perception and attention to 2,231 people, average age 49 and free of dementia. They recorded blood levels of cortisol and did M.R.I. examinations to assess brain volume. The study, in Neurology, controlled for age, sex, education, body mass index, blood pressure and many other variables, and found that compared with people with average levels of cortisol, those with the highest levels had lower scores on the cognitive tests. In women, but not in men, higher cortisol was also associated with reduced brain volume. There was no association of the lowest cortisol levels with either cognitive test scores or brain size. The lead author, Dr. Justin B. Echouffo-Tcheugui, an assistant professor of medicine at Johns Hopkins, said that the study suggests that even in people without symptoms, higher cortisol levels can be significant. Still, he said, “This is an initial study. The next step is a prospective study before we jump to the conclusion that this is really important. It’s premature now to consider intervention.” © 2018 The New York Times Company
Keyword: Stress; Learning & Memory
Link ID: 25616 - Posted: 10.26.2018
By Victoria Gill Science correspondent, BBC News Clever, tool-using crows have surprised scientists once again with remarkable problem-solving skills. In a task designed to test their tool-making prowess, New Caledonian crows spontaneously put together two short, combinable sticks to make a longer "fishing rod" to reach a piece of food. The findings are published in the journal Scientific Reports. Scientists say the demonstration is a "window into how another animal's mind works". How do you test a bird's tool-making skills? New Caledonian crows are known to spontaneously use tools in the wild. This task, designed by scientists at the Max Planck Institute for Ornithology in Seewiesen, Germany, and the University of Oxford, presented the birds with a novel problem that they needed to make a new tool in order to solve. It involved a "puzzle box" containing food behind a door that left a narrow gap along the bottom. With the food deep inside the box and only short sticks - too short to reach the food - the crows were left to work out what to do. The sticks were designed to be combinable - one was hollow to allow the other to slot inside. And with no demonstration or help, four out of the eight crows inserted one stick into another and used the resulting longer tool to fish for and extract the food from the box. "They have never seen this compound tool, but somehow they can predict its properties," explained one of the lead researchers, Prof Alex Kacelnik. "So they can predict what something that does not yet exist would do if they made it. Then they can make it and they can use it. © 2018 BBC
Keyword: Intelligence; Evolution
Link ID: 25615 - Posted: 10.25.2018
Weeks before they took their first breaths, two babies had their spinal cords delicately repaired by surgeons in the first operations of their kind in the UK. The spina bifida surgeries were successfully performed by a team at University College hospital in London this summer on two babies while they were still in the womb. Spina bifida is usually treated after birth, but research shows repairing the spine earlier can stop the loss of spinal fluid and lead to better long-term health and mobility outcomes. A 30-strong team carried out the two operations, coordinated by the UCL professor Anna David, who had worked for three years to bring the procedure to patients in the UK. She said mothers previously had to travel to the US, Belgium or Switzerland. “It’s fantastic. Women now don’t have to travel out of the UK,” David said. “They can have their family with them. There are less expenses. So all good things.” The surgery team from University College London hospitals (UCLH) and Great Ormond Street hospital travelled to Belgium to train at a facility in Leuven, where more than 40 of the operations have been carried out. Spina bifida is a condition that develops during pregnancy when the bones of the spine do not form properly, creating a gap that leaves the spinal cord unprotected. It can cause a baby’s spinal fluid to leak and affect brain development, potentially leading to long-term health and mobility problems. © 2018 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 25614 - Posted: 10.25.2018
Sukanya Charuchandra L. Wu et al., “Human ApoE isoforms differentially modulate brain glucose and ketone body metabolism: Implications for Alzheimer’s disease risk reduction and early intervention,” J Neurosci, 38:6665–81, 2018. Humans carry three different isoforms of the ApoE gene, which affects Alzheimer’s risk. Liqin Zhao of the University of Kansas and her colleagues previously found that the gene plays a role in brain metabolism when expressed in mice; in a new study, they looked for the pathways involved. LEAVING AN IMPRESSION Zhao’s team engineered female mice to express the human versions of either ApoE2, ApoE3, or ApoE4, and analyzed expression of 43 genes involved in energy metabolism in their cortical tissue. BLOCKADE Mice with ApoE2 showed higher levels of proteins needed for glucose uptake and metabolism in their brains relative to animals harboring the most common isoform in humans, ApoE3. Mice with ApoE4 had lower levels of such proteins. The brain tissue’s glucose transport efficiency also varied across the genotypes, and levels of a key glucose-metabolizing enzyme, hexokinase, were reduced in ApoE4 brains. However, ApoE2 and ApoE4 brains contained similar levels of proteins involved in using ketone bodies, a secondary source of energy, while ApoE3 brains had lower levels of those proteins. “Brain glycolytic function may serve as a significant mechanism underlying the differential impact of ApoE genotypes,” Zhao says. © 1986 - 2018 The Scientist.
Keyword: Obesity; Alzheimers
Link ID: 25613 - Posted: 10.25.2018
Laura Sanders Researchers have found a new link between gut and brain. By signaling to nerve cells in the brain, certain microbes in the gut slow a fruit fly’s walking pace, scientists report. Fruit flies missing those microbes — and that signal — turn into hyperactive speed walkers. With the normal suite of gut microbes, Drosophila melanogaster fruit flies on foot cover an average of about 2.4 millimeters a second. But fruit flies without any gut microbes zip along at about 3.5 millimeters a second, Catherine Schretter, a biologist at Caltech, and her colleagues report October 24 in Nature. These flies with missing microbes also take shorter breaks and are more active during the day. “Our work suggests that microbes assist in maintaining a certain level of locomotion,” Schretter says. An enzyme made by Lactobacillus brevis bacteria normally serves as the brakes, the researchers found. When researchers supplied the enzyme, called xylose isomerase, to flies lacking bacteria, the flies began walking at a slower, more normal pace. Xylose isomerase acts on a sugar that’s thought to influence nerve cells in fruit flies’ brains that control walking. For still mysterious reasons, the bacterial influence on walking speed occurred only in female fruit flies, not males. Studying that difference will be “a very interesting potential direction for this work,” Schretter says. © Society for Science & the Public 2000 - 2018
Keyword: Obesity
Link ID: 25612 - Posted: 10.25.2018
By Kelly Servick WASHINGTON, D.C.—A hand-size monkey called Callithrix jacchus—the common marmoset—is in great demand in labs and yet almost nowhere to be found. Marmosets’ small size, fast growth, and sophisticated social life were already enough to catch the eye of neuroscientists. They’ve now been genetically engineered to make their brains easier to image and to serve as models for neurological disorders such as autism and Parkinson’s. The problem: “There are just no monkeys,” says Cory Miller, a neuroscientist at the University of California, San Diego. At a meeting here this week, convened by the National Academies of Sciences, Engineering, and Medicine’s (NASEM’s) Institute for Laboratory Animal Research, neuroscientist Jon Levine, who directs the Wisconsin National Primate Research Center at the University of Wisconsin in Madison, likened the surge in demand to “a 10-alarm fire that’s about to be set.” In response, the National Institutes of Health (NIH) plans to launch funding to expand marmoset research. And established marmoset researchers, including Miller, are working together to help new labs get animals. When Miller’s lab started to work with marmosets in 2009, many colleagues who studied macaques—the most popular genus of research monkey—didn’t even know that marmosets were monkeys, he remembers. “They were like, ‘Is it those chipmunks that were in the Rocky Mountains?’” (They were thinking of marmots.) © 2018 American Association for the Advancement of Science
Keyword: Animal Rights; Autism
Link ID: 25611 - Posted: 10.24.2018
By JoAnna Klein Lavender bath bombs; lavender candles; deodorizing lavender sachets for your shoes, car or underwear drawer; lavender diffusers; lavender essential oils; even lavender chill pills for humans and dogs. And from Pinterest: 370 recipes for lavender desserts. Take a deep breath. Release. People like lavender. We’ve been using this violet-capped herb since at least medieval times. It smells nice. But Google “lavender” and results hint at perhaps the real fuel for our obsession: “tranquillity,” “calm,” “relaxation,” “soothing,” and “serenity.” Lavender has purported healing powers for reducing stress and anxiety. But are these effects more than just folk medicine? Yes, said Hideki Kashiwadani, a physiologist and neuroscientist at Kagoshima University in Japan — at least in mice. “Many people take the effects of ‘odor’ with a grain of salt,” he said in an email. “But among the stories, some are true based on science.” In a study published Tuesday in the journal Frontiers in Behavioral Neuroscience, he and his colleagues found that sniffing linalool, an alcohol component of lavender odor, was kind of like popping a Valium. It worked on the same parts of a mouse’s brain, but without all the dizzying side effects. And it didn’t target parts of the brain directly from the bloodstream, as was thought. Relief from anxiety could be triggered just by inhaling through a healthy nose. Their findings add to a growing body of research demonstrating anxiety-reducing qualities of lavender odors and suggest a new mechanism for how they work in the body. Dr. Kashiwadani believes this new insight is a key step in developing lavender-derived compounds like linalool for clinical use in humans. Dr. Kashiwadani and his colleagues became interested in how linalool might reduce anxiety while testing its effects on pain relief in mice. In this earlier study, they noticed that the presence of linalool seemed to calm mice. © 2018 The New York Times Company
Keyword: Emotions; Chemical Senses (Smell & Taste)
Link ID: 25610 - Posted: 10.24.2018



