Most Recent Links
By Jeannine Stamatakis There is no denying the high you feel after a run in the park or a swim at the beach. Exercise not only boosts your physical health--as one can easily see by watching a marathon or a boxing match--but it also improves mental health. According to a recent study, every little bit helps. People who engaged in even a small amount of exercise reported better mental health than others who did none. Another study, from the American College of Sports Medicine, indicated that six weeks of bicycle riding or weight training eased stress and irritability in women who had received an anxiety disorder diagnosis. To see how much exercise is required to relieve stress, researchers at the National Institute of Mental Health observed how prior exercise changed the interactions between aggressive and reserved mice. When placed in the same cage, stronger mice tend to bully the meeker ones. In this study, the small mice that did not have access to running wheels and other exercise equipment before cohabitating with the aggressive mice were extremely stressed and nervous, cowering in dark corners or freezing when placed in unfamiliar territory. Yet meek rodents that had a chance to exercise before encountering their bullies exhibited resistance to stress. They were submissive while living with the aggressive mice but bounced back when they were alone. The researchers concluded that even a small amount of exercise gave the meeker mice emotional resilience. The scientists looked at the brain cells of these so-called stress-resistant mice and found that the rodents exhibited more activity in their medial prefrontal cortex and their amygdala, both of which are involved in processing emotions. The mice that did not exercise before moving in with the aggressive mice showed less activity in these parts of the brain. © 2012 Scientific American
Keyword: Emotions; Stress
Link ID: 16958 - Posted: 06.25.2012
By Daisy Yuhas At right is a picture of someone’s brain as seen through functional magnetic resonance imaging or fMRI. This particular subject is taxing his neurons with a working memory task—those sunny orange specks represent brain activity related to the task. fMRI images show the brain according to changes in blood oxygen level, a proxy for degree of mental activity. It’s a pretty amazing tool; it has validated a lot of assumptions about brain regions and helped us make comparisons between groups of people, shedding light on addiction, development and disease. Some scientists believe it can help us read minds (more on that later) or even predict the future. But fMRI doesn’t actually provide detail at the level of a cell. The 3-dimensional image it provides is built up in units called voxels. Each one represents a tidy cube of brain tissue—a 3-D image building block analogous to the 2-D pixel of computer screens, televisions or digital cameras. Each voxel can represent a million or so brain cells. Those orange blobs in the image above are actually clusters of voxels—perhaps tens or hundreds of them. fMRI is also too slow to capture all of the changes in the brain. Each scan requires a second or two, enough time for a neuron to fire more than a hundred times. That means it can’t provide a clear sense of precisely when things happen. Trying to explain whether activity in one spot causes activity in another is not possible through fMRI alone. Furthermore, you have to be careful with your conclusions. Just because voxels corresponding to one region ‘light up’ when your subject sees a terrifying tiger doesn’t mean that every time this region appears active, your subject is frightened. Many of the brain’s regions are quite complex and involved in multiple processes. © 2012 Scientific American
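The spatial and temporal scales described here can be made concrete with a quick back-of-the-envelope calculation. The sketch below is only illustrative: the 3 mm voxel size and the cortical neuron density are assumed round numbers, not figures from the article, but they show how a single voxel comes to stand for "a million or so" cells and how many times a neuron could fire during one scan.

```python
# Back-of-the-envelope scale check for fMRI, using illustrative round numbers.
# Assumptions (not from the article): 3 mm isotropic voxels and a cortical
# neuron density on the order of 40,000 neurons per cubic millimetre.

voxel_edge_mm = 3.0                # assumed voxel edge length
neuron_density_per_mm3 = 40_000    # assumed order-of-magnitude density

voxel_volume_mm3 = voxel_edge_mm ** 3
neurons_per_voxel = voxel_volume_mm3 * neuron_density_per_mm3
print(f"~{neurons_per_voxel:,.0f} neurons per voxel")  # ~1,080,000: "a million or so"

# Temporal resolution: one scan every ~2 seconds, while a neuron can spike at ~100 Hz.
scan_time_s = 2.0
firing_rate_hz = 100
print(f"~{scan_time_s * firing_rate_hz:.0f} possible spikes per neuron per scan")  # ~200
```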
Keyword: Brain imaging
Link ID: 16957 - Posted: 06.23.2012
An intervention in which adults actively engaged the attention of preschool children with autism by pointing to toys and using other gestures to focus their attention results in a long-term increase in language skills, according to researchers supported by the National Institutes of Health. At age 8, children with autism who received therapy centered on sharing attention and play when they were 3 or 4 years old had stronger vocabularies and more advanced language skills than did children who received standard therapy. All of the children in the study attended preschool for 30 hours each week. “Some studies have indicated that such pre-verbal interactions provide the foundation for building later language skills,” said Alice Kau, Ph.D., of the Intellectual and Developmental Disabilities Branch of the Eunice Kennedy Shriver NICHD. “This study confirms that intensive therapy to engage the attention of young children with autism helps them acquire language faster and build lasting language skills.” The study findings appear in the Journal of the American Academy of Child and Adolescent Psychiatry. The 40 children who participated in the study were 8 and 9 years old. Five years earlier, they had been diagnosed with an autism spectrum disorder and received the intensive therapy program or standard intervention, as part of a separate study.
Keyword: Autism; Attention
Link ID: 16956 - Posted: 06.23.2012
By ALEX STONE PINCH a coin at its edge between the thumb and first fingers of your right hand and begin to place it in your left palm, without letting go. Begin to close the fingers of the left hand. The instant the coin is out of sight, extend the last three digits of your right hand and secretly retract the coin. Make a fist with your left — as if holding the coin — as your right hand palms the coin and drops to the side. You’ve just performed what magicians call a retention vanish: a false transfer that exploits a lag in the brain’s perception of motion, called persistence of vision. When done right, the spectator will actually see the coin in the left palm for a split second after the hands separate. This bizarre afterimage results from the fact that visual neurons don’t stop firing once a given stimulus (here, the coin) is no longer present. As a result, our perception of reality lags behind reality by about one one-hundredth of a second. Magicians have long used such cognitive biases to their advantage, and in recent years scientists have been following in their footsteps, borrowing techniques from the conjurer’s playbook in an effort not to mystify people but to study them. Magic may seem an unlikely tool, but it’s already yielded several widely cited results. Consider the work on choice blindness — people’s lack of awareness when evaluating the results of their decisions. In one study, shoppers in a blind taste test of two types of jam were asked to choose the one they preferred. They were then given a second taste from the jar they picked. Unbeknown to them, the researchers swapped the flavors before the second spoonful. The containers were two-way jars, lidded at both ends and rigged with a secret compartment that held the other jam on the opposite side — a principle that’s been used to bisect countless showgirls. This seems like the sort of thing that wouldn’t scan, yet most people failed to notice that they were tasting the wrong jam, even when the two flavors were fairly dissimilar, like grapefruit and cinnamon-apple. © 2012 The New York Times Company
Keyword: Attention
Link ID: 16955 - Posted: 06.23.2012
By ARIEL KAMINER YOU could drive past the hulking warehouse on the rough patch of waterfront in Sunset Park, Brooklyn, several times without ever figuring it for the latest frontier of neurological thrill-seeking. But that’s where Yehuda Duenyas, 38, who calls himself “a creator of innovative experiences,” was camped out last week, along with his team of scrappy young technical wizards and a quarter-million dollars’ worth of circuitry, theatrical lighting and optimism called “The Ascent.” Part art installation, part adventure ride, part spiritual journey, “The Ascent” claims to let users harness their brain’s own electrical impulses, measured through EEG readings, to levitate themselves. During its brief stay in New York, it welcomed representatives from cultural organizations like PS 122 and Lincoln Center, event promoters and friends of the team. In the shadowy vastness of the warehouse, “The Ascent” looked spare and heroic, like the setting for the final showdown between good and evil. Up high, a large circular track of lights and equipment hung from the ceiling. Down on the floor, another circle mirrored the one above, with incandescent bulbs illuminating transient puffs of smoke and casting the apparatus in a ghostly light. In the 30 feet between the lights above and the lights below, the air seemed heavy with magic and danger. An assistant outfitted me with a harness around my middle and a couple of EEG sensors across my forehead. Another assistant led me to the center of the circle and snapped me into the two hanging cables. For one long and mysterious moment, I stood alone in silence. Then the fun began. © 2012 The New York Times Company
Keyword: Attention
Link ID: 16954 - Posted: 06.23.2012
By Deborah Kotz, Globe Staff I get occasional migraines, and the only good thing about the throbbing pain, nausea, and depressed mood is the sense of euphoria that comes when the pain finally lifts. For some headache sufferers, however, the pain never goes away -- for months, years, or even decades. I received a call recently from a relative whose teenage son developed a headache one day that’s lasted two months and counting, causing him to miss his final months of high school. His diagnosis: new daily persistent headache, a wastebasket term given when everything else has been ruled out. Dr. Elizabeth Loder, chief of the division of headache and pain at Brigham and Women’s/Faulkner Hospital, estimated that about 5 percent of the patients she sees at her clinic have new daily persistent headache. More commonly, patients come in with chronic migraines that result from medication overuse or because a particular drug isn’t working for them or has been prescribed at too low a dose. With new daily persistent headache, or NDPH, however, none of the array of migraine medications seems to work, even when prescribed at optimal doses. There’s no known cause such as a head injury, tumor, or seizure condition. And, unlike the typical headache sufferer, those with NDPH can name the exact day when their headache began -- even what they were doing when it started -- because they’ve never before had a problem with headaches and suddenly they’re in pain all the time with no relief in sight. © 2012 NY Times Co.
Keyword: Pain & Touch
Link ID: 16953 - Posted: 06.23.2012
By John de Dios Derek, left, and Zachary Francis spend most of their time together. However, this fall the twins will be attending separate universities - the first time for them to be separated for an extended period of time. Twins Derek and Zachary Francis sit across from each other in Caffe Luce, a popular coffee shop near the University of Arizona campus. Their faces, still showing signs of youthful hormones, are nearly identical. Their hairstyles, their fashion styles and even their mannerisms are almost mirror images. Derek, the older brother, has a wider jaw and short hair. A red string adorns his left wrist as he writes left-handed. Zachary, with his short hair coifed similar to his brother’s and a small birthmark behind his neck, is more reserved, listening to headphones while he scrolls through his computer with his right hand. The brothers have spent 99 percent of their 19 years of life at each other’s side. They also share an even rarer bond than your typical identical twin: The Francis twins are also mirror-image twins. © 2012 Scientific American
Keyword: Development of the Brain; Genes & Behavior
Link ID: 16952 - Posted: 06.23.2012
by Elizabeth Preston It's 20 million years ago in the forests of Argentina, and Homunculus patagonicus is on the move. The monkey travels quickly, swinging between tree branches as it goes. Scientists have a good idea of how Homunculus got around thanks to a new fossil analysis of its ear canals and those of 15 other ancient primates. These previously hidden passages reveal some surprises about the locomotion of extinct primates—including hints that our own ancestors spent their lives moving at a higher velocity than today's apes. Wherever skeletons of ancient primates exist, anthropologists have minutely analyzed arm, leg, and foot bones to learn about the animals' locomotion. Some of these primates seem to have bodies built for leaping. Others look like they moved more deliberately. But in species such as H. patagonicus, there's hardly anything to go on aside from skulls. That's where the inner ear canals come in. "The semicircular canals function essentially as angular accelerometers for the head," helping an animal keep its balance while its head jerks around, says Timothy Ryan, an anthropologist at Pennsylvania State University, University Park. In the new study, he and colleagues used computed tomography scans to peer inside the skulls of 16 extinct primates, spanning 35 million years of evolution, and reconstruct the architecture of their inner ears. © 2010 American Association for the Advancement of Science
Keyword: Hearing; Evolution
Link ID: 16951 - Posted: 06.23.2012
By Jesse Bering Once, while in a drowsy, altitude-induced delirium 35,000 feet somewhere over Iceland, I groped mindlessly for the cozy blue blanket poking out beneath my seat, only to realize—to my unutterable horror—that I was in fact tugging soundly on a wriggling, sock-covered big toe. Now, with a temperament such as mine, life tends to be one awkward conversation after the next, so when I turned around, smiling, to apologize to the owner of this toe, my gaze was met by a very large man whose grunt suggested that he was having some difficulty in finding the humor in this incident. Unpleasant, sure, but I now call this event serendipitous. As I rested my head back against that sanitation-paper-covered airline pillow, my midflight mind lit away to a much happier memory, one involving another big toe, yet this one belonging to a noticeably more good-humored animal than the one sitting behind me. This other toe—which felt every bit as much as its overstuffed human equivalent did, I should add—was attached to a 450-pound western lowland gorilla, with calcified gums, named King. When I was 20 and he was 27, I spent much of the summer of 1996 with my toothless friend King, listening to Frank Sinatra and the Three Tenors, playing chase from one side of his exhibit to the other, and tickling his toes. He'd lean back in his night house, stick out one huge ashen-gray foot through the bars of his cage and leave it dangling there in anticipation, erupting in shoulder-heaving guttural laughter as I'd grab hold of one of his toes and gently give it a palpable squeeze. He almost couldn't control himself when, one day, I leaned down to act as though I were going to bite on that plump digit. If you've never seen a gorilla in a fit of laughter, I'd recommend searching out such a sight before you pass from this world. It's something that would stir up cognitive dissonance in even the heartiest of creationists. © 2012 Scientific American
Keyword: Emotions; Evolution
Link ID: 16950 - Posted: 06.23.2012
By Jason G. Goldman Yogi Bear always claimed that he was smarter than the average bear, but the average bear appears to be smarter than once thought. Psychologists Jennifer Vonk of Oakland University and Michael J. Beran of Georgia State University have taken a testing methodology commonly used for primates and shown not only that the methodology can be more widely used, but also that bears can distinguish among differing numerosities. Numerical cognition is perhaps the best understood of the core building blocks of the mind. Decades of research have provided evidence for the numerical abilities of gorillas, chimpanzees, rhesus, capuchin, and squirrel monkeys, lemurs, dolphins, elephants, birds, and fish. Pre-linguistic human infants share the same mental modules for representing and understanding numbers as those non-human animal species. Each of these species is able to precisely count sets of objects up to three, but after that, they can only approximate the number of items in a set. Even human adults living in cultures whose languages have not developed an explicit count list must rely on approximation rather than precision for quantities larger than three. For this reason, it is easier for infants and animals to distinguish thirty from sixty than it is to distinguish thirty from forty, since the 1:2 ratio (30:60) is smaller than the 3:4 ratio (30:40). As the ratios increase, the difference between the two sets becomes smaller, making it more difficult to discriminate between them without explicit counting. Given that species as divergent as humans and mosquitofish represent number in the same ways, subject to the same (quantity-based and ratio-based) limits and constraints, it stands to reason that the ability to distinguish among two quantities is evolutionarily-ancient. © 2012 Scientific American
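The ratio rule in the preceding paragraph can be illustrated with a short sketch. This is a toy example of the general idea only; the function and the use of a simple smaller-to-larger ratio are illustrative choices, not anything taken from the bear study itself.

```python
# Toy illustration of ratio-limited (Weber-like) numerosity discrimination:
# the smaller the ratio of the smaller to the larger set, the easier the two
# quantities are to tell apart without explicit counting.

def discrimination_ratio(a: int, b: int) -> float:
    """Ratio of the smaller to the larger quantity; lower means easier to discriminate."""
    small, large = sorted((a, b))
    return small / large

for a, b in [(30, 60), (30, 40)]:
    print(f"{a} vs {b}: ratio {discrimination_ratio(a, b):.2f}")
# 30 vs 60 -> ratio 0.50 (easier); 30 vs 40 -> ratio 0.75 (harder),
# matching the article's point that approximation breaks down as ratios grow.
```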
Keyword: Intelligence; Evolution
Link ID: 16949 - Posted: 06.21.2012
By Saswato R. Das One of the items high on the big science project to-do list is to devise a wiring diagram for the human brain. Its 100 billion neurons and the hundreds of trillions of connections among these cells consign this goal and the specifics of achieving it to the long-term bin. A first step, though, is a complete diagram of the mouse brain. Scientists at Cold Spring Harbor Laboratory (CSHL) in Long Island, N.Y., have started making public detailed images of mouse brain circuitry, releasing on June 1 the first installment of about 500 terabytes. The goal of the effort, called the Mouse Brain Architecture Project (MBA), is an entire rodent brain wiring plan that would represent the first such mapping of the circuits of a vertebrate brain. "Current knowledge of brain circuitry is incomplete," says Jonathan Pollock, chief of genetics and molecular neurobiology research at the National Institute on Drug Abuse. "The lack of knowledge about neural circuitry has led to recognition by the scientific community of the need to map the brain at the macro-, meso- and microscopic scale." The MBA complements other efforts, such as the National Institutes of Health's Human Connectome Project and the ALLEN Brain Connectivity Atlas. Pollock says that because the mouse serves as a general model for mammal genetics, the knowledge gleaned could help in the study of diseases such as Alzheimer's, autism, schizophrenia, depression and addiction. In recent years researchers focusing on mammalian brains have placed much attention on individual synapses, connection points between neurons, using electron microscopy. This approach is too complex and currently impractical for application to the whole mouse brain. © 2012 Scientific American
Keyword: Development of the Brain
Link ID: 16948 - Posted: 06.21.2012
Children exposed to HIV in the womb may be more likely to experience hearing loss by age 16 than are their unexposed peers, according to scientists in a National Institutes of Health research network. The researchers estimated that hearing loss affects 9 to 15 percent of HIV-infected children and 5 to 8 percent of children who did not have HIV at birth but whose mothers had HIV infection during pregnancy. Study participants ranged from 7 to 16 years old. The researchers defined hearing loss as a sound-detection threshold, averaged over four frequencies important for speech understanding (500, 1000, 2000, and 4000 hertz), that was 20 decibels or more above the normal hearing level for adolescents or young adults in either ear. “Children exposed to HIV before birth are at higher risk for hearing difficulty, and it's important for them—and the health providers who care for them—to be aware of this,” said George K. Siberry, M.D., of the Pediatric, Adolescent, and Maternal AIDS Branch of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), the NIH institute that leads the research network. Compared to national averages for other children their age, children with HIV infection were about 200 to 300 percent more likely to have a hearing loss. Children whose mothers had HIV during pregnancy but who themselves were born without HIV were 20 percent more likely to have hearing loss. The study was published online in The Pediatric Infectious Disease Journal.
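The threshold criterion in this summary can be restated as a small calculation: average the detection thresholds at the four speech frequencies and flag a loss when that average sits 20 decibels or more above the normative level. The sketch below assumes a 0 dB normative reference and uses a made-up audiogram; neither value comes from the study itself.

```python
# Minimal sketch of the hearing-loss criterion as described above.
# The 0 dB normative reference and the example thresholds are assumptions.

SPEECH_FREQS_HZ = (500, 1000, 2000, 4000)   # frequencies named in the summary
NORMAL_LEVEL_DB = 0.0                       # assumed normative reference level
LOSS_MARGIN_DB = 20.0                       # criterion: 20 dB or more above normal

def has_hearing_loss(thresholds_db):
    """True if the four-frequency average threshold meets the loss criterion."""
    avg = sum(thresholds_db[f] for f in SPEECH_FREQS_HZ) / len(SPEECH_FREQS_HZ)
    return avg >= NORMAL_LEVEL_DB + LOSS_MARGIN_DB

# Hypothetical thresholds (dB hearing level) for one ear.
example_ear = {500: 15.0, 1000: 20.0, 2000: 25.0, 4000: 30.0}
print(has_hearing_loss(example_ear))  # True: the average (22.5 dB) exceeds the criterion
```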
Keyword: Hearing; Development of the Brain
Link ID: 16947 - Posted: 06.21.2012
by Andy Coghlan Newborn babies have revealed to the world when they start seeing in three dimensions. Babies were thought to begin seeing in stereo at about four months after their due date. They actually learn to do it four months after they are exposed to light, even if they are born early. Ilona Kovács at Budapest University of Technology and Economics in Hungary and her colleagues gave 15 premature and 15 full-term babies goggles that filtered out red or green light. Once a month for eight months, the team sat the babies in a dark room and got them to stare at patterns of dots on a screen. The goggles made the dots invisible unless viewed in 3D. Sensors placed on each baby's head picked up electrical signals that revealed whether they could see the dots. If they could, the sensor registered pulses of 1.875 hertz; if not, there was only a background signal. The babies began to see stereo images about four months after they were born, whether they were premature or full term, showing that the environment, not an internal clock, is the likely trigger for the development of this ability in the brain. Journal reference: Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.1203096109 © Copyright Reed Business Information Ltd.
Keyword: Vision; Development of the Brain
Link ID: 16946 - Posted: 06.21.2012
Daniel H. Geschwind & Genevieve Konopka The decoding of the human and chimpanzee genomes was heralded as an opportunity to truly understand how changes in DNA resulted in the evolution of our cognitive features. However, more than a decade and much detective work later, the functional consequences of such changes have proved elusive, with a few exceptions [1, 2]. Now, writing in Cell, Dennis et al. [3] and Charrier et al. [4] describe the evolutionary history and function of the human gene SRGAP2 and provide evidence for molecular and cellular mechanisms that may link the gene's evolution with that of our brain. It was already known that SRGAP2 is involved in brain development [5] and that humans have at least three similar copies of the gene, whereas non-human primates carry only one [6]. However, the study of duplicated, or very similar, segments of DNA is hampered by the fact that most human cells carry two sets of chromosomes (one inherited from each parent), which makes it difficult to distinguish duplicated copies from the different parental forms of the gene. To circumvent this problem, Dennis et al. [3] searched for copies of SRGAP2 in the genome of a hydatidiform mole — an abnormal, non-viable human embryo that results from the fusion of a sperm with an egg that has lost its genetic material; it therefore has chromosomes derived from a single parent. The authors showed that humans carry four non-identical copies (named A–D) of SRGAP2 at different locations on chromosome 1. By comparing the genes' sequences with that of the SRGAP2 gene from the orang-utan and chimpanzee, the authors estimated that SRGAP2 was duplicated in the human lineage about 3.4 million years ago, resulting in SRGAP2A (the ancestral version that we share with other primates) and SRGAP2B. Further duplications of SRGAP2B gave rise to SRGAP2C about 2.4 million years ago and to SRGAP2D about 1 million years ago (Fig. 1a). © 2012 Nature Publishing Group
Keyword: Genes & Behavior; Evolution
Link ID: 16945 - Posted: 06.21.2012
John von Radowitz A protein needed to re-grow injured nerves in limbs has been identified, raising the prospect of new treatments. The findings, in mice, have implications for helping patients recover from peripheral nerve injuries. They also open up new pathways for investigating how to regenerate neurons in the spinal cord and brain. Peripheral nerves provide the sense of touch and drive the muscles that move the arms, legs and feet. Unlike central nervous system nerves of the spinal cord, they can regrow after being cut or crushed. But how this happens is still not well understood. Scientists conducting the new research, reported in the journal Neuron, identified a signalling protein that helps switch on the regeneration process. The molecule, called dual leucine zipper kinase (DLK), regulates signals that tell a nerve cell it has been injured, often communicating over distances of several feet. Mice lacking DLK were unable to regrow severed nerves. Lead researcher Professor Aaron DiAntonio, from Washington University in St Louis, US, said: "DLK is a key molecule linking an injury to the nerve's response to that injury, allowing the nerve to regenerate." © independent.co.uk
Keyword: Regeneration; Trophic Factors
Link ID: 16944 - Posted: 06.21.2012
Ewen Callaway Ten years ago, psychiatrist David Skuse met a smart, cheery five-year-old boy whose mother was worried because her son had trouble following conversations with other kids at school. He struggled to remember names and often couldn’t summon the words for simple things such as toothpaste. Skuse is an expert on language development at the Institute of Child Health at University College London, but he had never encountered anything like the boy’s condition. His scientific curiosity was piqued when the mother, who is bilingual, mentioned her own difficulties remembering words in English, her native tongue. Her mother, too, had trouble recounting what had happened in television shows she had just seen. “The family history of this word-finding problem needs further investigation,” Skuse noted at the time. About half the members of this family, dubbed JR, share similar language deficits and brain abnormalities. These deficits seem to be inherited across at least four generations, Skuse and his colleagues report today in Proceedings of the Royal Society B1. Identifying the genetic basis of the family’s unique trait — which they call the ‘family problem’ — could help to explain how our brains link words to objects, concepts and ideas. “It’s like that tip-of-the-tongue moment; you’re struggling to find a word,” says Josie Briscoe, a cognitive psychologist at the University of Bristol, UK, and a study co-author. The researchers tested eight JR family members on a number of language and memory tasks to better understand their deficits. © 2012 Nature Publishing Group,
Keyword: Language; Genes & Behavior
Link ID: 16943 - Posted: 06.20.2012
by Moheb Costandi Researchers have yet to understand how genes influence intelligence, but a new study takes a step in that direction. An international team of scientists has identified a network of genes that may boost performance on IQ tests by building and insulating connections in the brain. Intelligence runs in families, but although scientists have identified about 20 genetic variants associated with intelligence, each accounts for just 1% of the variation in IQ scores. Because the effects of these genes on the brain are so subtle, neurologist Paul Thompson of the University of California, Los Angeles, devised a new large-scale strategy for tackling the problem. In 2009, he co-founded the ENIGMA Network, an international consortium of researchers who combine brain scanning and genetic data to study brain structure and function. Earlier this year, Thompson and his colleagues reported that they had identified genetic variants associated with head size and the volume of the hippocampus, a brain structure that is crucial for learning and memory. One of these variants was also weakly associated with intelligence. Those carrying it scored on average 1.29 points better on IQ tests than others, making it one of the strongest candidate intelligence genes so far. The researchers have now used the same strategy to identify more genetic variants associated with brain structure and IQ. In the new study, they analyzed brain images and whole-genome data from 472 Australians, including 85 pairs of identical twins, 100 pairs of nonidentical twins, and their nontwin siblings. They identified 24 genetic variations within six different genes, all of which were linked to differences in the structural integrity of major brain pathways. © 2010 American Association for the Advancement of Science
Keyword: Genes & Behavior; Intelligence
Link ID: 16942 - Posted: 06.20.2012
By ANAHAD O'CONNOR A new study adds to growing evidence that the complications of diabetes may extend to the brain, causing declines in memory, attention and other cognitive skills. The new research showed that over the course of about a decade, elderly men and women with diabetes — primarily Type 2, the form of the disease related to obesity and inactivity — had greater drops in cognitive test scores than other people of a similar age. The more poorly managed their disease, the greater the deterioration in mental function. And the declines were seen not just in those with advanced diabetes. The researchers found that people who did not have diabetes at the start of the study but developed it later on also deteriorated to a greater extent than those without the disease. “What we’ve shown is a clear association with diabetes and cognitive aging in terms of the slope and the rate of decline on these cognitive tests,” said Dr. Kristine Yaffe, a professor of psychiatry and neurology at the University of California, San Francisco. “That’s very powerful.” While correlation does not equal causation and the relationship between diabetes and brain health needs further study, the findings, if confirmed, could have significant implications for a large segment of the population. Nationwide, nearly a third of Americans over the age of 65, or roughly 11 million people, have diabetes. By 2034, about 15 million Medicare-eligible Americans are expected to have the disease. Previous studies have shown that Type 2 diabetes correlates with a higher risk of Alzheimer’s disease and dementia later in life. But how one leads to the other has not been well understood. Copyright 2012 The New York Times Company
Keyword: Obesity; Learning & Memory
Link ID: 16941 - Posted: 06.20.2012
A diet high in cholesterol may help people with a fatal genetic disease which damages the brain, according to early studies in mice. Patients with Pelizaeus-Merzbacher disease struggle to produce a fatty sheath around their nerves, which is essential for function. A study, published in Nature Medicine, showed that a high-cholesterol diet could increase production. The authors said the mice "improved dramatically". Pelizaeus-Merzbacher disease (PMD) is one of many leukodystrophies in which patients struggle to produce the myelin sheath. It protects nerve fibres and helps messages pass along the nerves. Without the sheath, messages do not travel down the nerve - resulting in a range of problems including movement and cognition. Researchers at the Max Planck Institute of Experimental Medicine, in Germany, performed a trial on mice with the disease and fed them a high cholesterol diet. The first tests were on mice when they were six weeks old, after signs of PMD had already emerged. Those fed a normal diet continued to get worse, while those fed a cholesterol-enriched diet stabilised. BBC © 2012
Keyword: Multiple Sclerosis; Genes & Behavior
Link ID: 16940 - Posted: 06.20.2012
By Ferris Jabr In the 18th century Carl Linnaeus named them lemurs, after the Latin lemures—spirits of the dead, wandering ghosts. He knew the primates roamed Madagascar’s forests at night, their large eyes brimming with moonlight, their shrill cries crashing through the treetops. One of the smallest lemurs on the island, the fat-tailed dwarf lemur, resembled a phantom in another way: it completely vanished for seven months each year. For a long time, no one understood where the fat-tailed dwarf lemur went—a remote part of the island? the spirit world?—or what it was doing all that time, but scientists had a hunch. Perhaps the lemur was hibernating. If so, it would be the only primate in the world—and one of the only tropical mammals—to do so. Given Madagascar’s climate, however, it made sense that a lemur might hibernate to survive annual periods of drought. In general, Madagascar has two seasons: the hot, wet season from November to April, and the cooler, dry season from April through October. The deciduous forests on the west coast, where many fat-tailed dwarf lemurs live, offer no open sources of water during the dry season and only fibrous fruits bereft of sugar. Perhaps, scientists reasoned, the fat-tailed dwarf lemur hunkered down and waited for the rains to return, slowing its metabolism and dropping its body temperature. It could survive off of nutrients stored in its tail, which always grew plumper as the dry season drew closer. © 2012 Scientific American
Keyword: Sleep
Link ID: 16939 - Posted: 06.20.2012