Most Recent Links




By R. Douglas Fields Imagine if your biggest health problem could be solved with the flip of a switch. Deep-brain stimulation (DBS) offers such a dramatic recovery for a range of neurological illnesses, including Parkinson's disease, epilepsy and major depression. Yet the metal electrodes implanted in the brain are too bulky to tap into intricate neural circuitry with precision and corrode in contact with tissue, so their performance degrades over time. Now neurophysiologists have developed a method of DBS that avoids these problems by using microscopic magnets to stimulate neurons. In experiments published in June 2012 in Nature Communications, neurophysiologist John T. Gale of the Cleveland Clinic and his colleague Giorgio Bonmassar, a physicist at Harvard Medical School and an expert on brain imaging, tested whether micromagnets (which are half a millimeter in diameter) could induce neurons from rabbit retinas to fire. They found that when they electrically energized a micromagnet positioned next to a neuron, it fired. In contrast to the electric currents induced by DBS, which excite neurons in all directions, magnetic fields follow organized pathways from pole to pole, like the magnetic field that surrounds the earth. The researchers found that they could direct the stimulus precisely to individual neurons, and even to particular areas of a neuron, by orienting the magnetic coil appropriately. “That may help us avoid the side effects we see in DBS,” Gale says, referring to, for instance, the intense negative emotions that are sometimes accidentally triggered when DBS is used to relieve motor problems in Parkinson's. © 2013 Scientific American

Keyword: Brain imaging; Parkinsons
Link ID: 17747 - Posted: 02.02.2013

by Elizabeth Devitt Birds may not have big brains, but they know how to navigate. They wing around town and across continents with amazing accuracy, while we watch and wonder. Biologists believe that sight, smell, and an internal compass all contribute to avian orienteering. But none of these skills completely explains how birds fly long distances or return home from places they've never been. A new study proposes that the animals use infrasound—low-level background noise in our atmosphere—to fly by "images" they hear. These acoustical maps may also explain how other creatures steer. Scientists have long considered infrasound as a navigational cue for birds. But until U.S. Geological Survey geophysicist Jonathan Hagstrum in Menlo Park, California, became intrigued by the unexplained loss of almost 60,000 pigeons during a race from France to England in 1997, no one had pinpointed how the process worked. The race went bust when the birds' flight route crossed that of a Concorde jet, and Hagstrum wanted to know why. "When I realized the birds in that race were on the same flight path as the Concorde, I knew it had to be infrasound," he says. The supersonic plane laid down a sonic boom when most of the animals were flying across the English Channel. Normally, infrasound is generated when deep ocean waves send pressure waves reverberating into the land and atmosphere. Infrasound can come from other natural causes, such as earthquakes, or human-made events, such as the acceleration of the Concorde. The long, slow waves move across vast distances. Although humans can't hear them, birds and other animals are able to tune in. © 2010 American Association for the Advancement of Science

Keyword: Hearing; Animal Migration
Link ID: 17746 - Posted: 02.02.2013

Emotional behaviour in childhood may be linked with heart disease in middle age, especially in women, research suggests. A study found being prone to distress at the age of seven was associated with a significantly higher risk of cardiovascular disease in later life. Conversely, children who were better at paying attention and staying focused had reduced heart risk when older. The US researchers said more work was needed to understand the link. Their study looked at 377 adults who had taken part in research as children. At seven they had undergone several tests to look at emotional behaviour. They compared the results from this with a commonly used risk score for cardiovascular disease of participants now in their early 40s. After controlling for other factors which might influence heart disease risk, they found that high levels of distress at age seven were associated with a 31% increased risk of cardiovascular disease in middle-aged women. For men with high levels of distress in childhood - which included being easily frustrated and quick to anger - the increased risk of cardiovascular disease was 17%. For 40-year-olds who had been prone to distress as a child, the chances of having a heart attack or stroke in the next 10 years increased from 3.2% to 4.2% for women and 7.3% to 8.5% for men. The researchers also looked at positive emotional factors such as having a good attention span and found this was linked with better cardiovascular health, although to a lesser degree. Other studies have linked adversity in childhood with cardiovascular disease in adults. BBC © 2013
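A quick arithmetic check on those figures: the absolute 10-year risks quoted above line up reasonably well with the stated relative increases (the published estimates were adjusted for other risk factors, so exact agreement isn't expected). A minimal sketch, using only the numbers in the summary:

```python
# Numbers taken from the summary above (10-year risk of heart attack or stroke, %).
def relative_increase(baseline, elevated):
    """Percentage increase of `elevated` over `baseline`."""
    return 100.0 * (elevated - baseline) / baseline

women = relative_increase(3.2, 4.2)
men = relative_increase(7.3, 8.5)

print(f"Women: {women:.0f}% relative increase")  # ~31%, matching the quoted figure
print(f"Men:   {men:.0f}% relative increase")    # ~16%, close to the quoted 17%
```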

Keyword: Stress; Development of the Brain
Link ID: 17745 - Posted: 02.02.2013

Wray Herbert The Invisible Gorilla is part of the popular culture nowadays, thanks largely to a widely read 2010 book of that title. In that book, authors and cognitive psychologists Dan Simons and Christopher Chabris popularized a phenomenon of human perception—known in the jargon as “inattentional blindness”—which they had demonstrated in a study some years before. In the best-known version of the experiment, volunteers were told to keep track of how many times some basketball players tossed a basketball. While they did this, someone in a gorilla suit walked across the basketball court, in plain view, yet many of the volunteers failed even to notice the beast. What the invisible gorilla study shows is that, if we are paying very close attention to one thing, we often fail to notice other things in our field of vision—even very obvious things. We all love these quirks of human perception. It’s entertaining to know that our senses can play tricks on us. And that’s no doubt the extent of most people’s familiarity with this psychological phenomenon. But what if this perceptual quirk has serious implications—even life-threatening implications? A new study raises that disturbing possibility. Three psychological scientists at Brigham and Women’s Hospital in Boston—Trafton Drew, Melissa Vo and Jeremy Wolfe—wondered if expert observers are also subject to this perceptual blindness. The subjects in the classic study were “naïve”—untrained in any particular domain of expertise and performing a task nobody does in real life. But what about highly trained professionals who make their living doing specialized kinds of observations? The scientists set out to explore this, and in an area of great importance to many people—cancer diagnosis. © Association for Psychological Science

Keyword: Attention
Link ID: 17744 - Posted: 02.02.2013

By Stephani Sutherland If you have trouble sleeping, laptop or tablet use at bedtime might be to blame, new research suggests. Mariana Figueiro of the Lighting Research Center at Rensselaer Polytechnic Institute and her team showed that two hours of iPad use at maximum brightness was enough to suppress people's normal nighttime release of melatonin, a key hormone in the body's clock, or circadian system. Melatonin tells your body that it is night, helping to make you sleepy. If you delay that signal, Figueiro says, you could delay sleep. Other research indicates that “if you do that chronically, for many years, it can lead to disruption of the circadian system,” sometimes with serious health consequences, she explains. The dose of light is important, Figueiro says; the brightness and exposure time, as well as the wavelength, determine whether it affects melatonin. Light in the blue-and-white range emitted by today's tablets can do the trick—as can laptops and desktop computers, which emit even more of the disrupting light but are usually positioned farther from the eyes, which ameliorates the light's effects. The team designed light-detector goggles and had subjects wear them during late-evening tablet use. The light dose measurements from the goggles correlated with hampered melatonin production. On the bright side, a morning shot of screen time could be used as light therapy for seasonal affective disorder and other light-based problems. Figueiro hopes manufacturers will “get creative” with tomorrow's tablets, making them more “circadian friendly,” perhaps even switching to white text on a black screen at night to minimize the light dose. Until then, do your sleep schedule a favor and turn down the brightness of your glowing screens before bed—or switch back to good old-fashioned books. © 2013 Scientific American

Keyword: Biological Rhythms; Sleep
Link ID: 17743 - Posted: 02.02.2013

by Carrie Arnold Studying the links between brain and behavior may have just gotten easier. For the first time, neuroscientists have found a way to watch neurons fire in an independently moving animal. Though the study was done in fish, it may hold clues to how the human brain works. "This technique will really help us understand how we make sense of the world and why we behave the way we do," says Martin Meyer, a neuroscientist at King's College London who was not involved in the work. The study was carried out in zebrafish, a popular animal model because they're small and easy to breed. More important, zebrafish larvae are transparent, which gives scientists an advantage in identifying the neural circuits that make them tick. Yet, under a typical optical microscope, neurons that are active and firing look much the same as their quieter counterparts. To see which neurons are active and when, neuroscientists have therefore developed a variety of indicators and dyes. For example, when a neuron fires, it is flooded with calcium ions, which can cause some of the dyes to light up. Still, the approach has limitations. Traditionally, Meyer explains, researchers would immobilize the head or entire body of a zebrafish larva so that they could get a clearer picture of what was happening inside the brain. Even so, it was difficult to interpret neural activity for just a few neurons and over a short period of time. Researchers needed a better way to study the zebrafish brain in real time. © 2010 American Association for the Advancement of Science
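The summary doesn't describe how activity is actually read out from those calcium dyes. A standard approach in calcium imaging generally (an assumption here, not a detail from this study) is to compute the relative fluorescence change, ΔF/F, and flag frames where it crosses a threshold. A minimal sketch with synthetic data:

```python
import numpy as np

def delta_f_over_f(trace, baseline_frames=50):
    """Relative fluorescence change (dF/F) of a calcium-indicator trace.

    `trace` holds raw fluorescence per imaging frame; the baseline F0 is
    estimated from the first `baseline_frames` frames.
    """
    f0 = np.median(trace[:baseline_frames])
    return (trace - f0) / f0

# Synthetic trace: flat baseline plus one calcium transient around frame 80.
np.random.seed(0)
trace = 100.0 + 2.0 * np.random.randn(200)
decay = np.exp(-np.arange(20) / 8.0)
trace[80:100] += 40.0 * decay  # the "dye lights up" after a burst of firing

dff = delta_f_over_f(trace)
active = np.where(dff > 0.2)[0]  # crude threshold for "this neuron was active"
print(f"Putative activity between frames {active.min()} and {active.max()}")
```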

Keyword: Brain imaging
Link ID: 17742 - Posted: 02.02.2013

By Laura Sanders Some nerve fibers seem to love a good rubdown. These tendrils, which spread across skin like upside-down tree roots, detect smooth, steady stroking and send a feel-good message to the brain, researchers report in the Jan. 31 Nature. Although the researchers found these neurons in mice, similar cells in people may trigger massage bliss. The results are the latest to emphasize the strong and often underappreciated connection between emotions and the sensation of touch, says study coauthor David Anderson, a Howard Hughes Medical Institute investigator at Caltech. “It may seem frivolous to be studying massage neurons in mice, but it raises a profound issue — why do certain stimuli feel a certain way?” he says. It’s no surprise that many people find a caress pleasant. Earlier studies in people suggested that a particular breed of nerve fibers detects a caress and carries that signal to the brain. But scientists hadn’t been able to directly link this type of neuron to good feelings, either in people or in animals. “The beauty of this paper is that it goes one step further and adds behavioral elements,” says cognitive neuroscientist Francis McGlone of Liverpool John Moores University in England. Directly linking these neurons with pleasure clarifies the importance of touch, McGlone says. “Skin is a social organ,” he says. A growing number of studies show that the sensation of touch, particularly early in life, profoundly sculpts the brain. Young animals deprived of touch grow up with severe behavioral abnormalities. Babies fare better when they are held and touched frequently. And touch sensation can be altered in certain disorders. People with autism, for instance, often dislike caresses. © Society for Science & the Public 2000 - 2013

Keyword: Pain & Touch; Emotions
Link ID: 17741 - Posted: 02.02.2013

By Tanya Lewis and LiveScience Drug cravings can be brought on by many factors, such as the sight of drugs, drug availability and lack of self-control. Now, researchers have uncovered some of the neural mechanisms involved in cigarette craving. Two brain areas, the orbitofrontal cortex and the prefrontal cortex, interact to turn cravings on or off depending on whether drugs are available, the study reports today (Jan. 28) in the Proceedings of the National Academy of Sciences. The researchers scanned the brains of 10 moderate-to-heavy smokers using functional magnetic resonance imaging (fMRI), which measures brain activity by changes in blood flow. Researchers measured activity while the participants watched video clips of people smoking as well as neutral videos. Before viewing, some subjects were told cigarettes would be available immediately after the experiment, while others were told they would have to wait 4 hours before lighting up. When participants watched the smoking videos, their brains showed increased activity in the medial orbitofrontal cortex, a brain area that assigns value to a behavior. When the cigarettes were available immediately as opposed to hours later, smokers reported greater cravings and their brains showed more activity in the dorsolateral prefrontal cortex. The researchers hypothesize that this area modulates value. In other words, it can turn up or down the "value level" of cigarettes (or other rewards) in the first area, the medial orbitofrontal cortex. The results show that addiction involves a brain circuit important for self-control and decision-making. © 2013 Scientific American,

Keyword: Drug Abuse
Link ID: 17740 - Posted: 01.30.2013

By GRETCHEN REYNOLDS Recently, researchers from the department of sport science at the University of Innsbruck in Austria stood on the slopes at a local ski resort and trained a radar gun on a group of about 500 skiers and snowboarders, each of whom had completed a lengthy personality questionnaire about whether he or she tended to be cautious or a risk taker. The researchers had asked their volunteers to wear their normal ski gear and schuss or ride down the slopes at their preferred speed. Although they hadn’t informed the volunteers, their primary aim was to determine whether wearing a helmet increased people’s willingness to take risks, in which case helmets could actually decrease safety on the slopes. What they found was reassuring. To many of us who hit the slopes with, in my case, literal regularity — I’m an ungainly novice snowboarder — the value of wearing a helmet can seem self-evident. They protect your head from severe injury. During the Big Air finals at the Winter X Games in Aspen, Colo., this past weekend, for instance, 23-year-old Icelandic snowboarder Halldor Helgason over-rotated on a triple back flip, landed head-first on the snow, and was briefly knocked unconscious. But like the other competitors he was wearing a helmet, and didn’t fracture his skull. Indeed, studies have concluded that helmets reduce the risk of a serious head injury by as much as 60 percent. But a surprising number of safety experts and snowsport enthusiasts remain unconvinced that helmets reduce overall injury risk. Why? A telling 2009 survey of ski patrollers from across the country found that 77 percent did not wear helmets because they worried that the headgear could reduce their peripheral vision, hearing and response times, making them slower and clumsier. In addition, many worried that if they wore helmets, less-adept skiers and snowboarders might do likewise, feel invulnerable and engage in riskier behavior on the slopes. Copyright 2013 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 17739 - Posted: 01.30.2013

By BENJAMIN HOFFMAN NEW ORLEANS — It has become a staple of Super Bowl week, as much a part of the pregame to the N.F.L.’s biggest event as the annual media day: a discussion of how football is being affected by head injuries and the mounting evidence that long-term brain damage can be linked to injuries sustained on the field. Years ago, players rarely spoke about the issue and league officials dismissed suggestions that on-field injuries could lead to life-altering health problems. Now, however, the league is facing lawsuits from thousands of former players, rules are being instituted in an attempt to diminish injuries on the field and even President Obama has said that the way football is played will have to change. This week, Bernard Pollard, a hard-hitting safety for the Baltimore Ravens, created a stir by saying that the N.F.L. would not exist in 30 years because of the rules changes designed with safety in mind, but that he also believed there would be a death on the field at some point. At media day Tuesday, players reacted to the comments made by Pollard and Obama, with some agreeing with Pollard that recent rules changes would change the sport to such an extent that it would be less entertaining and lead to a loss of popularity. Pollard stood by his comments. He added, however, that while he was comfortable with the physical risk he was taking by playing football, he was not sure he would want future generations, including his 4-year-old son, to follow his example. “My whole stance right now is that I don’t want him to play football,” Pollard said. “Football has been good to me. It has been my outlet. God has blessed me with a tremendous talent to be able to play this game. But we want our kids to have things better than us.” He said he did not want his son to go through the aches and pains caused by the physicality of the game. © 2013 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 17738 - Posted: 01.30.2013

By Gareth Cook Michael Trimble, a British professor at the Institute of Neurology in London, begins his new book with Gana the gorilla. In the summer of 2009, 11-year-old Gana gave birth to a boy at a Muenster zoo. But one day in August, the baby suddenly and mysteriously died. Gana held up her son in front of her, staring at his limp body. She held him close, stroking him. To onlookers it appeared that Gana was trying to reawaken him, and, as the hours passed, that she was mourning his passing. Some at the zoo that day cried. But Gana did not. Humans, Trimble tells us, are the only creatures who cry for emotional reasons. “Why Humans Like to Cry” is an exploration of why this would be so, a neuroanatomical “where do tears come from.” It’s also a meditation on human psychology. Many distinctions have been offered between humans and the rest of the animal world, and to this list Trimble adds another: the anguished tear, the apprehension that life is tragic. Trimble answered questions from Mind Matters editor Gareth Cook. Cook: How did you first become interested in crying? Trimble: Of course, because I cry, and some things bring tears quite easily, notably music, and opera with the power of the human voice. Crying tears, for emotional reasons, is unique to humans. There has been a game of catch me if you can, which has been played by those interested in finding attributes or behaviours which separate humans from our nearest living relatives – namely the chimpanzees and bonobos. Certainly our propositional language is very special, but primate communities have very sophisticated ways of communicating. Other contenders, such as play, using tools, or having what is called theory of mind (the sense that I know that others have a mind very like mine, with similar inclinations and intentions) have all been argued as unique to our species, but all these have been demonstrated, in some form, to be found in other primates. Emotional crying makes us human. © 2013 Scientific American

Keyword: Emotions
Link ID: 17737 - Posted: 01.30.2013

Some but not all antidepressant drugs known as SSRIs pose a very small but serious heart risk, say researchers. Citalopram and escitalopram, which fall into this drug group, can trigger a heart rhythm disturbance, a new study in the British Medical Journal shows. UK and US regulators have already warned doctors to be extra careful about which patients they prescribe these medicines to. And they have lowered the maximum recommended dose. The UK's Medicines and Healthcare products Regulatory Agency (MHRA) says people with pre-existing heart conditions should have a heart trace before going on these drugs, to check for a rhythm disturbance known as long QT interval. Experts say that complications are very rare and that in most cases the benefits for the patient taking the drug will outweigh the risks. The QT interval is measured with an electrocardiogram (ECG) and varies with the heart rate - it gets longer when the heart beats slower and is shorter when the heart beats faster. Some variation is normal, but if it gets too long it can upset the timing of heartbeat with potentially dire consequences - dizziness, faints and, rarely, sudden death. To assess how common a problem long QT linked to SSRI use might be, US researchers decided to look at the medical records of more than 38,000 patients from New England. BBC © 2013
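The summary notes that the QT interval stretches and shrinks with heart rate but doesn't say how that is handled in practice; clinicians usually report a rate-corrected value (QTc). A minimal sketch using Bazett's correction, a standard formula that is not taken from this study:

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Heart-rate-corrected QT interval (Bazett's formula), in milliseconds.

    QTc = QT / sqrt(RR), with QT and RR both in seconds; RR is the time
    between beats, i.e. 60 / heart rate.
    """
    rr_s = 60.0 / heart_rate_bpm
    qt_s = qt_ms / 1000.0
    return 1000.0 * qt_s / math.sqrt(rr_s)

# The same measured QT of 400 ms reads very differently once rate is factored in:
print(qtc_bazett(400, 60))   # 400 ms at 60 bpm (RR = 1 s, no change)
print(qtc_bazett(400, 100))  # ~516 ms at 100 bpm, which would count as prolonged
```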

Keyword: Depression
Link ID: 17736 - Posted: 01.30.2013

By Rachel Ehrenberg A rare peek into drug company documents reveals troubling differences between publicly available information and materials the company holds close to its chest. In comparing public and private descriptions of drug trials conducted by pharmaceutical giant Pfizer, researchers discovered discrepancies including changes in the number of study participants and inconsistent definitions of protocols and analyses. The researchers, led by Kay Dickersin, director of the Center for Clinical Trials at the Johns Hopkins Bloomberg School of Public Health, gained access to internal Pfizer reports after a lawsuit made them available. Dickersin and her colleagues compared the internal documents with 10 publications in peer-reviewed journals about randomized trials of Pfizer’s anti-epilepsy drug gabapentin (brand name Neurontin) that tested its effectiveness for treating other disorders. The results, the researchers say, suggest that the published trials were biased and misleading, even though they read as if standard protocols were followed. That lack of transparency could mean that clinicians prescribe drugs based on incomplete or incorrect information. “We could see all of the biases right in front of us all at once,” says Dickersin, who was an expert witness in the suit, which was brought by a health insurer against Pfizer. Pfizer lost the case in 2010, and a judge ruled it should pay $142 million in damages for violating federal racketeering laws in promoting Neurontin for treating migraines and bipolar disorder. Pfizer had in 2004 settled a case and paid $430 million in civil fines and criminal penalties for promoting Neurontin for unapproved use. © Society for Science & the Public 2000 - 2013

Keyword: Depression; Schizophrenia
Link ID: 17735 - Posted: 01.30.2013

Alison Abbott & Quirin Schiermeier Two of the biggest awards ever made for research have gone to boosting studies of the wonder material graphene and an elaborate simulation of the brain. The winners of the European Commission’s two-year Future and Emerging Technologies ‘flagship’ competition, announced on 28 January, will receive €500 million (US$670 million) each for their planned work, which the commission hopes will help to improve the lives, health and prosperity of millions of Europeans. The Human Brain Project, a supercomputer simulation of the human brain conceived and led by neuroscientist Henry Markram at the Swiss Federal Institute of Technology in Lausanne, scooped one of the prizes. The other winning team, led by Jari Kinaret at Chalmers University of Technology in Gothenburg, Sweden, hopes to develop the potential of graphene — an ultrathin, flexible, electrically conducting form of carbon — in applications such as personal-communication technologies, energy storage and sensors. The size of the awards — matching funds raised by the participants are expected to bring each project’s budget up to €1 billion over ten years — has some researchers worrying that the flagship programme may draw resources from other research. And both winners have already faced criticism. Many neuroscientists have argued, for example, that the Human Brain Project’s approach to modelling the brain is too cumbersome to succeed (see Nature 482, 456–458; 2012). Markram is unfazed. He explains that the project will have three main thrusts. One will be to study the structure of the mouse brain, from the molecular to the cellular scale and up. Another will generate similar human data. A third will try to identify the brain wiring associated with particular behaviours. The long-term goals, Markram says, include improved diagnosis and treatment of brain diseases, and brain-inspired technology. © 2013 Nature Publishing Group

Keyword: Robotics; Development of the Brain
Link ID: 17734 - Posted: 01.30.2013

By Christof Koch Blindness is a private matter between a person and the eyes with which he or she was born. The sentiment expressed by the late Portuguese writer José Saramago in his famous novel Blindness may be appropriate for a person born unable to see. But what about the tens of millions of people worldwide who suffer from a variety of degenerative diseases that progressively rob them of their eyesight? The problem arises in the nerve cells that line the back of their eyes, their retinas. Fortunately, help is on the way to restore some of the lost vision using advanced neuroengineering. The hallmark of the two most common forms of adult-onset blindness in the West, age-related macular degeneration and retinitis pigmentosa, is that the photoreceptors responsible for converting the incoming rays of light into nervous energy gradually die off. Yet the roughly one million ganglion cells, whose output wires bundle up and leave the eyeball in the form of the optic nerve, remain intact. So visionary (pun intended) clinical ophthalmologists have paired up with technologists to bypass the defective parts of the retina by directly stimulating ganglion cells via advanced electronics. One of the most successful of such prosthetic devices, manufactured by a California company called Second Sight, uses a camera integrated into eyeglasses to convert images into electronic patterns. These patterns are sent to a small, 10- by 6-pixel microelectrode array surgically positioned onto the retina. It stimulates neural processes that relay their information in the form of binary electrical pulses, so-called action potentials or spikes, to the brain proper. © 2013 Scientific American,
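As a rough illustration of the signal path described (a camera frame reduced to a coarse grid of stimulation levels, one per electrode), here is a minimal sketch. The 10-by-6 grid matches the array size mentioned above, but the block-averaging and the brightness-to-amplitude mapping are assumptions made purely for illustration; the device's real image processing is far more involved.

```python
import numpy as np

def frame_to_electrode_grid(frame, grid_shape=(6, 10), max_amplitude=1.0):
    """Reduce a grayscale camera frame to a coarse grid of stimulation levels.

    Purely illustrative: block-average the image down to the electrode layout
    and scale mean brightness (0-255) to a normalized stimulation amplitude.
    """
    rows, cols = grid_shape
    h, w = frame.shape
    frame = frame[: h - h % rows, : w - w % cols]  # crop so blocks divide evenly
    blocks = frame.reshape(rows, frame.shape[0] // rows,
                           cols, frame.shape[1] // cols)
    return max_amplitude * blocks.mean(axis=(1, 3)) / 255.0

# Fake 480x640 grayscale frame standing in for the glasses-mounted camera.
frame = np.random.randint(0, 256, size=(480, 640)).astype(float)
grid = frame_to_electrode_grid(frame)
print(grid.shape)  # (6, 10): one stimulation level per electrode
```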

Keyword: Vision; Robotics
Link ID: 17733 - Posted: 01.30.2013

By Mark Fischetti Various scholars have tried to explain consciousness in long articles and books, but one neuroscience pioneer has just released an unusual video blog to get the point across. In the sharply filmed and edited production, Joseph LeDoux, a renowned expert on the emotional brain at New York University, interrogates his NYU colleague Ned Block on the nature of consciousness. Block is a professor of philosophy, psychology and neural science and is considered a leading thinker on the subject. The interview ends with a transition into a music video performed by LeDoux’s longstanding band, the Amygdaloids. The whole exercise is a bit quirky, yet it succeeds in explaining consciousness in simple, even entertaining terms. LeDoux intends to produce a series of these video blogs to explore other intriguing aspects of the mind and brain, and he is giving Scientific American the chance to post them first on our Web site. LeDoux has already interviewed Michael Gazzaniga at the University of California, Santa Barbara, on free will and Nobel Prize winner Eric Kandel at Columbia University on mapping the mind. The video is not a quick hit, like most on the Net these days. The interview runs about 10 minutes, followed by the four-minute music video. The idea is for viewers to sit back and actually think along with the expert as his or her explanation unfolds. Yet video producer Alexis Gambis has generated some compelling imagery to keep our visual attention as Block unwraps his subject. Gambis directs the Imagine Science Film Festival, is about to complete his graduate degree in film and has a doctorate in molecular biology. © 2013 Scientific American

Keyword: Consciousness
Link ID: 17732 - Posted: 01.29.2013

The offspring of promiscuous baboon males are more successful when they have contact with their father, scientists have found. A study by a team of European researchers has documented increased feeding success among young baboons foraging with adult males. Paternity analyses allowed the scientists to determine whether the males were, in fact, the fathers. The findings are published in the journal Behavioural Ecology. Paternal care is uncommon in promiscuous mammals where it is not obvious which male actually is the father. Lead researcher, Dr Elise Huchard of the University of Cambridge's Department of Zoology, told BBC Nature: "Caring for offspring can be costly in terms of time and energy for the parents." She explained that parental care increases the chances of offspring survival, as well as improving an individual's survival and reproductive performance later on in life. "Paternal care is usually observed in species where paternity certainty is high, [such as] in monogamous species," according to Dr Huchard. So when research suggested that juveniles benefitted from paternal input in promiscuous baboon troops, Dr Huchard and colleagues decided to perform field research on two troops of chacma baboons (Papio ursinus) in Tsaobis Leopard Park, central Namibia. BBC © 2013

Keyword: Sexual Behavior
Link ID: 17731 - Posted: 01.29.2013

by Michael Balter CAMBRIDGE, UNITED KINGDOM—Siberia may not be everyone's idea of a tourist destination, but it has been home to humans for tens of thousands of years. Now a new study of indigenous Siberian peoples presented here earlier this month at a meeting on human evolution reveals how natural selection helped people adapt to the frigid north. The findings also show that different living populations adapted in somewhat different ways. Siberia occupies nearly 10% of Earth's land mass, but today it's home to only about 0.5% of the world's population. This is perhaps not surprising, since January temperatures average as low as -25°C. Geneticists have sampled only a few of the region's nearly one dozen indigenous groups; some, such as the 2000-member Teleuts, descendants of a once powerful group of horse and cattle breeders also known for their skill in making leather goods, are in danger of disappearing. Previous research on cold adaptation included two Siberian populations and implicated a couple of related genes. For example, genes called UCP1 and UCP3 tend to be found in more active forms in populations that live in colder climes, according to work published in 2010 by University of Chicago geneticist Anna Di Rienzo and her colleagues. These genes help the body's fat stores directly produce heat rather than producing chemical energy for muscle movements or brain functions, a process called "nonshivering thermogenesis." The new study sampled Siberians much more intensely, including 10 groups that represent nearly all of the region's native populations. © 2010 American Association for the Advancement of Science

Keyword: Genes & Behavior; Evolution
Link ID: 17730 - Posted: 01.29.2013

Voluntary movements involve the coordinated activation of two brain pathways that connect parts of deep brain structures called the basal ganglia, according to a study in mice by researchers at the National Institute on Alcohol Abuse and Alcoholism (NIAAA), part of the National Institutes of Health. The findings, which challenge the classical view of basal ganglia function, were published online in Nature on Jan. 23. “By improving our understanding of how the basal ganglia control movements, these findings could aid in the development of treatments for disorders in which these circuits are disrupted, such as Parkinson’s disease, Huntington’s disease and addiction,” says NIAAA Acting Director Kenneth R. Warren, Ph.D. The predominant model of basal ganglia function proposes that direct and indirect pathways originating in a brain region called the striatum have opposing effects on movement. Activity of neurons in the direct pathway is thought to promote movement, while activity in the indirect pathway is thought to inhibit movement. Newer models, however, suggest that co-activation of these pathways is necessary to synchronize basal ganglia circuits during movement. “Testing these models has been difficult due to the lack of methods to measure specific neurons in the direct and indirect pathways in freely moving animals,” explains first author Guohong Cui, Ph.D., of the NIAAA Laboratory for Integrated Neuroscience (LIN). To overcome these difficulties, Dr. Cui and colleagues devised a new approach for measuring the activity of neurons deep within the brain during complex behaviors. Their technique uses fiber optic probes implanted in the mouse brain striatum to measure light emissions from neurons engineered to glow when activated.

Keyword: Movement Disorders
Link ID: 17729 - Posted: 01.29.2013

When you eat could play an important role in weight loss, a new study suggests. Researchers looked at the role of meal timing in 420 men and women in southeast Spain participating in a 20-week weight-loss treatment following several studies in animals showing a relationship between the timing of feeding and weight regulation. Lunch was the main meal among the Mediterranean population studied. "Our results indicate that late eaters displayed a slower weight-loss rate and lost significantly less weight than early eaters, suggesting that the timing of large meals could be an important factor in a weight loss program," Frank Scheer, director of the medical chronobiology program at Brigham and Women's Hospital in Boston, said in a release. Of the participants, 51 per cent were early eaters who ate their main meal, lunch, before 3 p.m. The other 49 per cent had lunch after three. The researchers found energy and nutrient intake, estimates of calories burned, appetite hormones and hours of sleep were similar between both groups. "Nevertheless, late eaters were more evening types, had less energetic breakfasts and skipped breakfast more frequently than early eaters," Scheer and his co-authors wrote in Tuesday's issue of the International Journal of Obesity. They suggested that new weight loss strategies should incorporate the timing of food as well as the classic look at calorie intake and distribution of carbohydrates, fats and protein. © CBC 2013

Keyword: Obesity; Biological Rhythms
Link ID: 17728 - Posted: 01.29.2013