Most Recent Links
By JANE E. BRODY I hope you’re not chomping on a bagel or, worse, a doughnut while you read about what is probably the most serious public health irony of the last half century in this country: As one major killer — smoking — declined, another rose precipitously to take its place: obesity. Many cancer deaths were averted after millions quit lighting up, but they are now rising because even greater numbers are unable to keep their waistlines in check. Today, obesity and smoking remain the two leading causes of preventable deaths in this country. Reviewing more than 1,000 studies, the International Agency for Research on Cancer and the Centers for Disease Control and Prevention linked the risk of developing 13 kinds of cancer to overweight and obesity, especially cancers that are now being diagnosed in increasing numbers among younger people. Included are cancers of the esophagus, liver, gallbladder, colon and rectum, upper stomach, pancreas, uterus, ovary, kidney and thyroid; breast cancer in postmenopausal women; meningioma and multiple myeloma. Only for colorectal cancers has the overall incidence declined, primarily the result of increased screening and removal of precancerous polyps. In most cases, the studies revealed, cancer risk rose in direct proportion to the degree of excess weight. In other words, the heavier you are, the more likely you will be to develop one of these often fatal cancers. From 2005 to 2014, the C.D.C. reported in October, there was a 1.4 percent annual increase in cancers related to overweight and obesity among people aged 20 to 49, and a 0.4 percent rise in these cancers among people 50 to 64. © 2017 The New York Times Company
Keyword: Obesity
Link ID: 24322 - Posted: 11.13.2017
Jon Hamilton The goal is simple: a drug that can relieve chronic pain without causing addiction. But achieving that goal has proved difficult, says Edward Bilsky, a pharmacologist who serves as the provost and chief academic officer at Pacific Northwest University of Health Sciences in Yakima, Wash. "We know a lot more about pain and addiction than we used to," says Bilsky. "But it's been hard to get a practical drug." Bilsky is moderating a panel on pain, addiction and opioid abuse at the Society for Neuroscience meeting in Washington, D.C., this week. Brain scientists have become increasingly interested in pain and addiction as opioid use has increased. About 2 million people in the U.S. now abuse opioids, according to the Centers for Disease Control and Prevention. But at least 25 million people suffer from chronic pain, according to an analysis by the National Institutes of Health. That means they have experienced daily pain for more than three months. The question is how to cut opioid abuse without hurting people who live with pain. And brain scientists think they are getting closer to an answer. One approach is to find drugs that decrease pain without engaging the brain's pleasure and reward circuits the way opioids do, Bilsky says. So far, these drugs have been hampered by dangerous side effects or proved less effective than opioids at reducing pain. But substances related to snail venom look promising, Bilsky says. © 2017 npr
Keyword: Pain & Touch
Link ID: 24321 - Posted: 11.13.2017
The ready availability of technology may make the children of today faster at configuring a new smartphone, but does all of that screen time affect the development of their eyes? While conventional wisdom dictates that children should do less up-close viewing, sit farther from the television and perhaps even wear their eyeglasses less, we have found in recent studies that another factor may be at play: Kids need to go outside and, if not play, at least get some general exposure to outdoor light. To our surprise, more time outdoors had a protective effect and reduced the chances that a child would go on to need myopic refractive correction. The size of the effect was impressive. What causes nearsightedness? Myopia, or nearsightedness, is a condition in which you can’t see far away but can see up close without glasses or contact lenses. It typically starts during the early elementary-school years. Because kids don’t know how other kids see, they often think their blurry vision is normal, so regular eye examinations are important. With myopia, the eye is growing, but growing too long for distant rays of light to focus accurately on the back of the eye. A blurry image results. For children, eyeglasses or contact lenses move the focus back to the retina, and a clear image is formed. The too-long eye cannot be shrunk, so refractive correction is then a lifelong necessity. In adulthood, surgery is an option. © 1996-2017 The Washington Post
Keyword: Vision; Development of the Brain
Link ID: 24320 - Posted: 11.13.2017
Summary: A vast effort by a team of Janelia Research Campus scientists is rapidly increasing the number of fully traced neurons in the mouse brain. Researchers everywhere can now browse and download the 3-D data. Inside the mouse brain, individual neurons zigzag across hemispheres, embroider branching patterns, and, researchers have now shown, often spool out spindly fibers nearly half a meter long. Scientists can see and explore these wandering neural traces in 3-D, in the most extensive map of mouse brain wiring yet attempted. The map – the result of an ongoing effort by an eclectic team of researchers at the Janelia Research Campus – reconstructs the entire shape and position of more than 300 of the roughly 70 million neurons in the mouse brain. Previous efforts to trace the path of individual neurons had topped out in the dozens. "Three hundred neurons is just the start," says neuroscientist Jayaram Chandrashekar, who leads the Janelia project team, called MouseLight for its work illuminating the circuitry of the mouse brain. He and colleagues expect to trace hundreds more neurons in the coming months – and they're sharing all the data with the neuroscience community. The team released their current dataset and an analysis tool, called the MouseLight NeuronBrowser, on October 27, 2017, and will report the work in November at the annual Society for Neuroscience meeting in Washington, D.C. They hope that the findings will help scientists ask, and begin to answer, questions about how neurons are organized, and how information flows through the brain. © 2017 Howard Hughes Medical Institute
Keyword: Brain imaging
Link ID: 24319 - Posted: 11.11.2017
By Jef Akst | After Nelson Dellis’s grandmother passed away from Alzheimer’s disease in the summer of 2009, he became obsessed with memory. “I had seen her whole decline, so brain health was on my mind,” he says. He found out about annual memory competitions that tested people’s ability to remember large volumes of data—for example, the exact order of 104 playing cards in two decks—and began to learn the strategies so-called “memory athletes” used to pull off these incredible feats. “I found the techniques worked, and with a bit of practice, you can do a lot more than you ever thought you could,” Dellis says. He entered the 2010 USA Memory Championship in New York City and came in third. The next two years in a row, he took first. A mistake in the finals cost him the championship in 2013, but he regained the crown in 2014 and won again in 2015, making him the first and only four-time USA Memory Champion. And all it took was “a bit of practice.” Dellis says there are several strategies memory athletes use, but they’re all based on the same principle: “You want to turn information you’re trying to memorize into something that your brain naturally prefers to absorb”—typically, an image. “Once you have that picture, the next step is to store it somewhere—somewhere in your mind you can safely store it and retrieve it later.” This place is known as a “memory palace,” and it can be any place that’s familiar to you, such as your house. You can then place the images you’ve chosen along a particular path through the memory palace, and “the path, which you know very well, preserves the order.”
Keyword: Learning & Memory
Link ID: 24318 - Posted: 11.11.2017
By George Paxinos. Being an atlas maker, I have an image problem. I recently introduced myself to a lady at a Society for Neuroscience meeting who had used the first edition of The Rat Brain in Stereotaxic Coordinates for her PhD thesis 35 years earlier. With surprise written on her face, she said, "George Paxinos, I thought you were dead." On another occasion, I was giving a talk in Munich and one girl asked another, "Did you see Paxinos?" The other girl replied, "Yes, it is on my shelf." The idea of constructing an atlas came to me while on a sabbatical at Cambridge. There, I used acetylcholinesterase (AChE) as a proxy (a poor one at that) for acetylcholine. Looking at the rat brain stained for AChE was like looking at a coloring book that was already colored. I was convinced immediately that I would be able to construct a better atlas of the rat brain than the then-popular atlas of Konig and Klippel (1963). The Konig and Klippel atlas did not display the pons, medulla, cerebellum, olfactory bulbs, spinal cord, horizontal sections or the point of bregma, the most frequently used reference point in stereotaxic surgery. Further, it was based on 150 g female rats, while most neuroscientists used 300 g male rats. However, my greatest difficulty with this atlas dated from my time as an undergraduate in psychology at Berkeley, when I was to be instructed by my professor in stereotaxic surgery; unfortunately, the rat resisted going under the anesthetic. Trying to anesthetize the rat consumed the available time, and my professor left, telling me to read the coordinates and implant the electrode in the hypothalamus. In my rush to implant the electrode before the rat came out of the anesthetic, I failed to read the introduction of the atlas, which stated clearly that the stereotaxic zero point of the atlas is not (repeat "not") the zero point of the stereotaxic instrument but, for convenience, 4.9 mm above the true stereotaxic zero. So, in targeting the hypothalamus, I missed the brain by 4.9 mm. I thought any psychologist would have been able to design a better atlas than that. The only problem I had in constructing the rat brain atlas was that I did not know anatomy. © 2017 Elsevier
Keyword: Brain imaging
Link ID: 24317 - Posted: 11.11.2017
Paula Span Medical researchers and government health policymakers, a cautious lot, normally take pains to keep expectations modest when they're discussing some new finding or treatment. They warn about studies' limitations. They point out what isn't known. They emphasize that correlation doesn't mean causation. So it's startling to hear prominent experts sound positively excited about a new shingles vaccine that an advisory committee to the Centers for Disease Control and Prevention approved last month. "This really is a sea change," said Dr. Rafael Harpaz, a veteran shingles researcher at the C.D.C. Dr. William Schaffner, a preventive medicine specialist at the Vanderbilt University School of Medicine, said, "This vaccine has spectacular initial protection rates in every age group. The immune system of a 70- or 80-year-old responds as if the person were only 25 or 30." "This really looks to be a breakthrough in vaccinating older adults," agreed Dr. Jeffrey Cohen, a physician and researcher at the National Institutes of Health. What's causing the enthusiasm: Shingrix, which the pharmaceutical firm GlaxoSmithKline intends to begin shipping this month. Large international trials have shown that the vaccine prevents more than 90 percent of shingles cases, even at older ages. The currently available shingles vaccine, called Zostavax, prevents only about half of shingles cases in those over age 60 and has demonstrated far less effectiveness among elderly patients. Yet those are the people most at risk for this blistering disease, with its often intense pain, its threat to vision and the associated nerve pain that sometimes lasts months, even years, after the initial rash fades. © 2017 The New York Times Company
Keyword: Pain & Touch
Link ID: 24316 - Posted: 11.11.2017
By Ann Gibbons Ever since Alex Pollen was a boy talking with his neuroscientist father, he wanted to know how evolution made the human brain so special. Our brains are bigger, relative to body size, than other animals', but it's not just size that matters. "Elephants and whales have bigger brains," notes Pollen, now a neuroscientist himself at the University of California, San Francisco. Comparing anatomy or even genomes of humans and other animals reveals little about the genetic and developmental changes that sent our brains down such a different path. Geneticists have identified a few key differences in the genes of humans and apes, such as a version of the gene FOXP2 that allows humans to form words. But specifically how human variants of such genes shape our brain in development—and how they drove its evolution—have remained largely mysterious. "We've been a bit frustrated working so many years with the traditional tools," says neurogeneticist Simon Fisher, director of the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands, who studies FOXP2. Now, researchers are deploying new tools to understand the molecular mechanisms behind the unique features of our brain. At a symposium at the American Society of Human Genetics meeting here last month, they reported zooming in on the genes expressed in a single brain cell, as well as panning out to understand how genes foster connections among far-flung brain regions. Pollen and others also are experimenting with brain "organoids," tiny structured blobs of lab-grown tissue, to detail the molecular mechanisms that govern the folding and growth of the embryonic human brain. "We used to be just limited to looking at sequence data and cataloging differences from other primates," says Fisher, who helped organize the session. "Now, we have these exciting new tools that are helping us to understand which genes are important." © 2017 American Association for the Advancement of Science.
Keyword: Development of the Brain
Link ID: 24315 - Posted: 11.10.2017
Laura Sanders The human brain is teeming with diversity. By plucking out delicate, live tissue during neurosurgery and then studying the resident cells, researchers have revealed a partial cast of neural characters that give rise to our thoughts, dreams and memories. So far, researchers with the Allen Institute for Brain Science in Seattle have described the intricate shapes and electrical properties of about 100 nerve cells, or neurons, taken from the brains of 36 patients as they underwent surgery for conditions such as brain tumors or epilepsy. To reach the right spot, surgeons had to remove a small hunk of brain tissue, which is usually discarded as medical waste. In this case, the brain tissue was promptly packed up and sent — alive — to the researchers. Once there, the human tissue was kept on life support for several days as researchers analyzed the cells’ shape and function. Some neurons underwent detailed microscopy, which revealed intricate branching structures and a wide array of shapes. The cells also underwent tiny zaps of electricity, which allowed researchers to see how the neurons might have communicated with other nerve cells in the brain. The Allen Institute released the first publicly available database of these neurons on October 25. A neuron called a pyramidal cell, for instance, has a bushy branch of dendrites (orange in 3-D computer reconstruction, above) reaching up from its cell body (white circle). Those dendrites collect signals from other neural neighbors. Other dendrites (red) branch out below. The cell’s axon (blue) sends signals to other cells that spur them to action. |© Society for Science & the Public 2000 - 2017.
Keyword: Brain imaging
Link ID: 24314 - Posted: 11.10.2017
Richard Gonzales The Boston researcher who examined the brain of former football star Aaron Hernandez says it showed the most damage her team had seen in an athlete so young. Hernandez, whose on-field performance for the New England Patriots earned him a $40 million contract in 2012, hanged himself in a prison cell earlier this year while serving a life sentence for murder. He was 27 years old. Dr. Ann McKee, a neuropathologist who directs research on chronic traumatic encephalopathy, or CTE, at Boston University, said her research team found Hernandez had Stage 3 CTE and that they had never seen such severe damage in a brain younger than 46 years old. McKee announced her findings at a medical conference on Thursday in Boston, where she spoke publicly about the case for the first time. Among the lingering questions in the sports world and among brain researchers is, why did a young man with wealth, fame and a potentially bright athletic career ahead of him kill a friend and wind up in prison? © 2017 npr
Keyword: Brain Injury/Concussion
Link ID: 24313 - Posted: 11.10.2017
Laurel Hamers Light-sensitive cells in the eyes of some fish do double-duty. In pearlsides, cells that look like rods — the stars of low-light vision — actually act more like cones, which only respond to brighter light, researchers report November 8 in Science Advances. It’s probably an adaptation to give the deep-sea fish acute vision at dawn and dusk, when they come to the surface of the water to feed. Rods and cones studding the retina can work in tandem to give an animal good vision in a wide variety of light conditions. Some species that live in dark environments, like many deep-sea fish, have dropped cones entirely. But pearlside eyes have confused scientists: The shimmery fish snack at the water’s surface at dusk and dawn, catching more sun than fish that feed at night. Most animals active at these times of day use a mixture of rods and cones to see, but pearlside eyes appear to contain only rods. “That’s actually not the case when you look at it in more detail,” says study coauthor Fanny de Busserolles, a sensory biologist at the University of Queensland in Australia. She and her colleagues investigated which light-responsive genes those rod-shaped cells were turning on. The cells were making light-sensitive proteins usually found in cones, the researchers found, rather than the rod-specific versions of those proteins. |© Society for Science & the Public 2000 - 2017.
Keyword: Vision; Evolution
Link ID: 24312 - Posted: 11.10.2017
By VERONIQUE GREENWOOD When people tell you, “wake up and smell the roses,” they might be giving you bad advice. Your sense of smell may fluctuate in sensitivity over the course of 24 hours, in tune with our circadian clocks, with your nose best able to do its job during the hours before you go to sleep, according to a study published last month. The work, reported in the journal Chemical Senses, is part of a larger push to explore whether adolescents’ senses of taste and smell influence obesity. Rachel Herz, a sensory researcher at Brown University, and her colleagues designed this study to see if there might be times of day when the sense of smell was more powerful — perhaps making food smell particularly inviting. For the experiment, 37 adolescents ranging in age from 12 to 15 came into a lab for a very long sleepover party. For nine days, they followed a strict schedule to allow researchers to focus on the circadian clock, which helps control wake and sleep, but also influences other processes in the body, including metabolism. While more research is needed to test whether the results fully apply to adults, Dr. Herz says that as you grow up, the makeup of the smell receptors inside your nose doesn’t seem to change, although there is evidence your body clock may. The team kept track of where the teenagers were in their circadian cycle by measuring their saliva’s levels of melatonin, a hormone that rises and falls regularly over the course of the day. Every few hours, the children took a scent test, sniffing different concentrations of a chemical that smells like roses. The researchers recorded the lowest concentration they could detect at each time point. When the results were tallied up, the researchers saw a range of responses. “Nobody has the same nose,” Dr. Herz said. Some adolescents had only very mild changes in sensitivity, while sensitivity altered dramatically in others. Averaged together, however, the results showed that overall the circadian clock does affect smell, and that the times when the children’s noses were most sensitive tended to correspond to the evening, with an average peak of 9 p.m. © 2017 The New York Times Company
Keyword: Chemical Senses (Smell & Taste); Biological Rhythms
Link ID: 24311 - Posted: 11.09.2017
By Amanda B. Keener On a fall day in 2015 at Sunnybrook hospital in Toronto, a dozen people huddled in a small room peering at a computer screen. They were watching brain scans of a woman named Bonny Hall, who lay inside an MRI machine just a few feet away. Earlier that day, Hall, who had been battling a brain tumor for eight years, had received a dose of the chemotherapy drug doxorubicin. She was then fitted with an oversized, bowl-shape helmet housing more than 1,000 transducers that delivered ultrasound pulses focused on nine precise points inside her brain. Just before each pulse, her doctors injected microscopic air bubbles into a vein in her hand. Their hope was that the microbubbles would travel to the capillaries of the brain and, when struck by the sound waves, oscillate. This would cause the blood vessels near Hall’s tumor to expand and contract, creating gaps that would allow the chemotherapy drug to escape from the bloodstream and seep into the neural tissue. Finally, she received an injection of a contrast medium, a rare-earth metal called gadolinium that lights up on MRI scans. Now, doctors, technicians, and reporters crowded around to glimpse a series of bright spots where the gadolinium had leaked into the targeted areas, confirming the first noninvasive opening of a human’s blood-brain barrier (BBB). “It was very exciting,” says radiology researcher Nathan McDannold, who directs the Therapeutic Ultrasound Lab at Brigham and Women’s Hospital in Boston and helped develop the technique that uses microbubbles and ultrasound to gently disturb blood vessels. Doctors typically depend on the circulatory system to carry a drug from the gut or an injection site to diseased areas of the body, but when it comes to the brain and central nervous system (CNS), the vasculature switches from delivery route to security system. The blood vessels of the CNS are unlike those throughout the rest of the body. © 1986-2017 The Scientist
Keyword: Brain imaging
Link ID: 24310 - Posted: 11.09.2017
By Lena H. Sun Experts who work on the mosquito-borne West Nile virus have long known that it can cause serious neurological symptoms, such as memory problems and tremors, when it invades the brain and spinal cord. Now researchers have found physical evidence of brain damage in patients years after their original infection, the first such documentation using magnetic resonance imaging, or MRI. Brain scans revealed damage or shrinkage in different parts of the cerebral cortex, the outer part of the brain that handles higher-level abilities such as memory, attention and language. “Those areas correlated exactly with what we were seeing on the neurological exams,” said Kristy Murray, an associate professor of pediatric tropical medicine at Texas Children’s Hospital and Baylor College of Medicine and lead author of the study. “The thought is that the virus enters the brain and certain parts are more susceptible, and where those susceptibilities are is where we see the shrinkage occurring.” Results of the study, which has not yet been published, were presented Tuesday at the annual meeting of the American Society of Tropical Medicine and Hygiene. The 10-year study of 262 West Nile patients is one of the largest assessments studying the long-term health problems associated with West Nile infections. Most people who are infected do not develop symptoms. About 20 percent will develop fever, and less than 1 percent have the most severe type of infection that causes inflammation of the brain or surrounding tissues. © 1996-2017 The Washington Post
Keyword: Miscellaneous
Link ID: 24309 - Posted: 11.09.2017
By STEPHEN HEYMAN “For years, science has relegated our love to this basic instinct, almost like an addiction that has no redeeming value.” These are not the words of some New Age evangelist preaching from the mount at a couples retreat in Arizona but of Stephanie Cacioppo, a neuroscientist at the University of Chicago who has spent much of her career mapping the dynamics of love in the brain. Her research and some of the theories she has developed put her at odds with other scientists who have described romantic love as an emotion, a primitive drive, even a drug. Using neuroimaging, Dr. Cacioppo has collected data that could suggest that this kind of love activates not only the emotional brain, but also regions that are involved in higher-level intellectual activities and cognition. “This means that it’s possible that love has a real function — not only to connect with people emotionally but also to improve our behavior,” she said. Dr. Cacioppo attributes all kinds of mental and physical benefits to being in love. She says it can help you think faster, to better anticipate other people’s thoughts and behavior, or to bounce back more quickly from an illness. “The empirical tests I’ve done in my lab suggest that, in many ways, when you’re in love, you can be a better person,” she said. Talk to Dr. Cacioppo for long enough and you will be struck by how optimistic her views on traditional romance seem, especially in a world where divorce is commonplace, marriage rates are down, and polyamory and other forms of unconventional relationships are in the news. While she acknowledges that many types of relationships can be healthy, she believes that we are all searching for a “true love” to complete us, that humans are hard-wired for monogamy and that there is indirect biological evidence for fairy-tale tropes like love at first sight. © 2017 The New York Times Company
Keyword: Emotions; Sexual Behavior
Link ID: 24308 - Posted: 11.09.2017
By Elly Vintiadis The prevailing wisdom today is that addiction is a disease. This is the main line of the medical model of mental disorders with which the National Institute on Drug Abuse (NIDA) is aligned: addiction is a chronic and relapsing brain disease in which drug use becomes involuntary despite its negative consequences. The idea here is, roughly, that addiction is a disease because drug use changes the brain and, as a result of these changes, drug use becomes compulsive, beyond the voluntary control of the user. In other words, the addict has no choice and his behavior is resistant to long term change. This way of viewing addiction has its benefits: if addiction is a disease then addicts are not to blame for their plight, and this ought to help alleviate stigma and to open the way for better treatment and more funding for research on addiction. This is the main rationale of a recent piece in the New York Times, which describes addiction as a disease that is plaguing the U.S. and stresses the importance of talking openly about addiction in order to shift people’s understanding of it. And it seems like a welcome change from the blame attributed by the moral model of addiction, according to which addiction is a choice and, thus, a moral failing—addicts are nothing more than weak people who make bad choices and stick with them. Yet, though there are positive aspects to seeing addiction in this light, it seems unduly pessimistic and, though no one will deny that every behavior has neural correlates and that addiction changes the brain, this is not the same as saying that, therefore, addiction is pathological and irreversible. And there are reasons to question whether this is, in fact, the case. © 2017 Scientific American
Keyword: Drug Abuse
Link ID: 24307 - Posted: 11.09.2017
Mariah Quintanilla Emma Watson, Jake Gyllenhaal, journalist Fiona Bruce and Barack Obama all walk into a sheep pen. No, this isn’t the beginning of a baaa-d joke. By training sheep using pictures of these celebrities, researchers from the University of Cambridge discovered that the animals are able to recognize familiar faces from 2-D images. Given a choice, the sheep picked the familiar celebrity’s face over an unfamiliar face the majority of the time, the researchers report November 8 in Royal Society Open Science. Even when a celeb’s face was slightly tilted rather than face-on, the sheep still picked the image more often than not. That means the sheep were not just memorizing images, demonstrating for the first time that sheep have advanced face-recognition capabilities similar to those of humans and other primates, say neurobiologist Jennifer Morton and her colleagues. Sheep have been known to pick out pictures of individuals in their flock, and even familiar handlers (SN: 10/6/12, p. 20). But it’s been unclear whether the skill was real recognition or simple memorization. Sheep now join other animals, including horses, dogs, rhesus macaques and mockingbirds, that are able to distinguish between individuals of other species. Over a series of four training sessions, the sheep’s ability to choose a familiar face, represented by one of the four celebrities, over a completely unfamiliar face improved. |© Society for Science & the Public 2000 - 2017.
Keyword: Attention; Evolution
Link ID: 24306 - Posted: 11.08.2017
By Sara B. Linker, Tracy A. Bedrosian, and Fred H. Gage For years, neurons in the brain were assumed to all carry the same genome, with differences in cell type stemming from epigenetic, transcriptional, and posttranscriptional differences in how that genome was expressed. But in the past decade, researchers have recognized an incredible amount of genomic diversity, in addition to other types of cellular variation that can affect function. Indeed, the human brain contains approximately 100 billion neurons, and we now know that there may be almost as many unique cell types. Our interest in this incredible diversity emerged from experiments that we initially labeled as failures. In 1995, we (F.H.G. and colleagues) found that a protein called fibroblast growth factor 2 (FGF2) is important for maintaining adult neural progenitor cells (NPCs) in a proliferative state in vitro. We could only expand NPCs by culturing them at high density, however, so we could not generate homogeneous populations of cells.1 Five years later, we identified a glycosylated form of the protein cystatin C (CCg) that, combined with FGF2, allowed us to isolate and propagate a very homogeneous population of NPCs—cells that would uniformly and exclusively differentiate into neurons.2 We compared gene expression of this homogeneous population of cells to that of rat stem cells and the oligodendrocytes, astroglia, and neurons derived from the NPCs. To our surprise and disappointment, the top nine transcripts that were unique to the NPC-derived population were all expressed components of long interspersed nuclear element-1, also known as LINE-1 or L1— an abundant retrotransposon that makes up about 20 percent of mammalian genomes. © 1986-2017 The Scientist
Keyword: Development of the Brain; Epigenetics
Link ID: 24305 - Posted: 11.08.2017
James Gorman Dogs have evolved to be friendly and tolerant of humans and one another, which might suggest they would be good at cooperative tasks. Wolves are known to cooperate in hunting and even in raising one another’s pups, but they can seem pretty intolerant of one another when they are snapping and growling around a kill. So researchers at the Wolf Science Center at the University of Vienna decided to compare the performance of wolves and dogs on a classic behavioral test. To get a food treat, two animals have to pull ropes attached to different ends of a tray. The trick is that they have to pull both ropes at the same time. Chimps, parrots, rooks and elephants have all succeeded at the task. When Sarah Marshall-Pescini, Friederike Range and colleagues put wolves and dogs to the test, wolves did very well and dogs very poorly. In recordings of the experiments, the pairs of wolves look like experts, while the dogs seem, well, adorable and confused. The researchers reported their findings in the Proceedings of the National Academy of Sciences. With no training, five of seven wolf pairs succeeded in mastering the task at least once. Only one of eight dog pairs did. © 2017 The New York Times Company
Keyword: Attention; Evolution
Link ID: 24304 - Posted: 11.08.2017
April Fulton In the wake of the massacre at a small-town Texas church on Sunday, many people are asking why. A large portion of the mass shootings in the U.S. in recent years has roots in domestic violence against partners and family members. Depending on how you count, it could be upwards of 50 percent. We know the Texas gunman, Devin Patrick Kelley, was court-martialed for assaulting his wife and their young child in 2012, although this information apparently was not included in the formal government database that tracks such things. There are laws on the books preventing convicted domestic violence offenders from obtaining weapons. So why does this keep happening? There are no easy answers. NPR's Alison Kodjak recently talked with Daniel Webster, director of the Johns Hopkins Center for Gun Policy and Research in Baltimore, Md., about the complexities of gun violence, mass shootings, and the difficulty we have in understanding the people who commit these crimes. While perpetrators of domestic violence account for only about 10 percent of all gun violence, they accounted for 54 percent of mass shootings between 2009 and 2016, according to the advocacy group Everytown for Gun Safety, so there is a disproportionate link, Webster tells Kodjak. "Generally, it fits a pattern of easy access to firearms of individuals who have very controlling kind of relationships with their intimate partners and are greatly threatened when their control is challenged," he says. © 2017 npr
Keyword: Aggression
Link ID: 24303 - Posted: 11.08.2017