Most Recent Links

Links 10641 - 10660 of 28882

By GRETCHEN REYNOLDS If you consider yourself to be a born morning person or an inveterate night owl, there is new research that supports your desire to wake up early or stay up late. Each of us has a personal “chronotype,” or unique circadian rhythm, says Till Roenneberg, a professor of chronobiology at Ludwig Maximilian University in Munich and one of the world’s experts on sleep. In broad strokes, these chronotypes are usually characterized as early, intermediate or late, corresponding to people who voluntarily go to bed and wake early, at a moderate hour or vampirishly late. If you are forced to wake up earlier than your body naturally would, you suffer from what Roenneberg calls “social jet lag.” People with an early chronotype may do well with a 7 a.m. workday rising time, but others do not. Sleeping out of sync with your innate preferences can be detrimental to your health, especially for late chronotypes, who tend to be the most at odds with typical work schedules. A study conducted by the National Institutes of Health and published in March in PLOS ONE found that obese adults with late chronotypes tended to eat larger meals, develop more sleep apnea and have higher levels of stress hormones and lower levels of HDL, or “good,” cholesterol than obese people with other chronotypes. Their chronotype may also have contributed to weight gain in the first place, Roenneberg says. Research has shown that a single hour of social jet lag, the mismatch between your chronotype and your schedule, increases your risk for obesity by about 33 percent. In a study published in June in Chronobiology International, late-night chronotypes gained more weight during their freshman years at college than other new students did, even though college is one of the best fits for night owls. Copyright 2013 The New York Times Company

Keyword: Biological Rhythms; Genes & Behavior
Link ID: 18815 - Posted: 10.21.2013

Research indicates that American girls and boys are indeed going through puberty earlier than ever, though the reasons are unclear. Many believe our widespread exposure to synthetic chemicals is at least partly to blame, but it’s hard to pinpoint exactly why our bodies react in certain ways to various environmental stimuli. Researchers first noticed the earlier onset of puberty in the late 1990s, and recent studies confirm the mysterious public health trend. A 2012 analysis by the U.S. Centers for Disease Control and Prevention (CDC) found that American girls exposed to high levels of common household chemicals had their first periods seven months earlier than those with lower exposures. “This study adds to the growing body of scientific research that exposure to environmental chemicals may be associated with early puberty,” says Danielle Buttke, a researcher at CDC and lead author on the study. Buttke found that the age when a girl has her first period (menarche) has fallen over the past century from an average of age 16-17 to age 12-13. Earlier puberty isn’t just for girls. In 2012 researchers from the American Academy of Pediatrics (AAP) surveyed data on 4,100 boys from 144 pediatric practices in 41 states and found a similar trend: American boys are reaching puberty six months to two years earlier than just a few decades ago. African-American boys are starting the earliest, at around age nine, while Caucasians and Hispanics start on average at age 10. One culprit could be rising obesity rates. Researchers believe that puberty (at least for girls) may be triggered in part by the body building up sufficient reserves of fat tissue, signaling fitness for reproductive capabilities. Clinical pediatrician Robert Lustig of Benioff Children’s Hospital in San Francisco reports that obese girls have higher levels of the hormone leptin, which in and of itself can lead to early puberty while setting off a domino effect of more weight gain and faster overall physical maturation. © 2013 Scientific American

Keyword: Development of the Brain; Hormones & Behavior
Link ID: 18814 - Posted: 10.21.2013

by Tina Hesman Saey Sleep hoses garbage out of the brain, a study of mice finds. The trash, including pieces of proteins that cause Alzheimer’s disease, piles up while the rodents are awake. Sleep opens spigots that bathe the brain in fluids and wash away the potentially toxic buildup, researchers report in the Oct. 18 Science. The discovery may finally reveal why sleep seems mandatory for every animal. It may also shed new light on the causes of neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases. “It’s really an eye-opening and intriguing finding,” says Chiara Cirelli, a sleep researcher at the University of Wisconsin–Madison. The results have already led her and other sleep scientists to rethink some of their own findings. Although sleep requirements vary from individual to individual and across species, a complete lack of it is deadly. But no one knows why. One popular idea is that sleep severs weak connections between brain cells and strengthens more robust connections to solidify memories (SN Online: 4/2/09; SN Online: 6/23/11). But a good memory is not a biological imperative. “You don’t die from forgetting what you learned yesterday,” says Maiken Nedergaard, a neuroscientist at the University of Rochester Medical Center in New York who led the study. Researchers in Nedergaard’s lab stumbled upon sleep’s role in garbage clearance while studying a brain drainage system they described last year (SN: 9/22/12, p. 15). This service, called the glymphatic system, flushes fluid from the brain and spinal cord into the space between brain cells. Ultimately, the fluid and any debris it carries wash into the liver for disposal. © Society for Science & the Public 2000 - 2013

Keyword: Sleep; Learning & Memory
Link ID: 18813 - Posted: 10.19.2013

by Hal Hodson American football is a rough game, but the toll it takes on players' grey matter is only now becoming clear. For the first time, the number of head impacts on the playing field has been linked with cognitive problems and functional brain abnormalities in ex-footballers. Brain autopsies on retired National Football League (NFL) players have previously shown levels of damage that are higher than those in the general population. Now, this damage has been correlated with performance in tasks related to reasoning, problem solving and planning, highlighting the worrying impact of repeated head trauma. To investigate the relationship between head trauma and cognitive damage, Adam Hampshire of Imperial College London and his colleagues scanned the brains of 13 retired professional American football players and 60 people who had never played the sport, while they performed a series of cognitive tests in an fMRI machine. It wasn't an easy task: David Hubbard, who ran the tests at the Applied fMRI Institute in San Diego, California, says they initially had 15 ex-sportsmen, but two were too large to fit in the machine. The football players showed only modest deficits on the cognitive tasks, which included tests of planning, spatial awareness, memory and counting; however, their brains had to work a lot harder to achieve the same results as the non-footballers. Regions of the frontal cortices that normally communicate with each other to handle reasoning and planning tasks were far less efficient in the footballers' brains. © Copyright Reed Business Information Ltd.

Keyword: Brain Injury/Concussion
Link ID: 18812 - Posted: 10.19.2013

By Jesse Bering Disgust, in its most familiar form, is our response to something vile in the world—spoiled food, a dirty floor or rats cavorting in the subway. It is a contamination-avoidance mechanism that evolved to help us make biologically adaptive decisions in the heat of the moment. Yet disgust has also come to have powerful symbolic elements. When left unchecked, these symbolic qualities can have devastating impacts on our mental states. Consider, for example, the often dramatized, heartbreaking image of a woman crouched in the corner of a shower and frantically trying to scrub her body clean after being raped. Empirical evidence supports the characterization. Seventy percent of female victims of sexual assault report a strong impulse to wash afterward, and a quarter of these continue to wash excessively up to three months later. For women, simply imagining an unwanted advance can turn on this moral-cleansing effect. Psychiatrist Nichole Fairbrother of the University of British Columbia Hospital and her colleagues looked more closely at the phenomenon of mental pollution in a study published in 2005. Two groups of female participants were told to close their eyes and picture being kissed. The members of one group were instructed to imagine being aggressively cornered and kissed against their will. The members of the other group were asked to envision themselves in a consensual embrace. Only those women in the coercive condition chose to wash up after the study. In many cases, it seems as though a person's sense of self has become contaminated. © 2013 Scientific American

Keyword: Emotions
Link ID: 18811 - Posted: 10.19.2013

by Bruce Bower Thomas Jefferson defended the right to pursue happiness in the Declaration of Independence. But that’s so 237 years ago. Many modern societies champion everyone’s right to be happy pretty much all the time. Good luck with that, says psychologist Joseph Forgas of the University of New South Wales in Sydney. A lack of close friends, unfulfilled financial dreams and other harsh realities leave many people feeling lonely and forlorn a lot of the time. But there’s a mental and social upside to occasional downers that often goes unappreciated. “Bad moods are seen in our happiness-focused culture as representing a problem, but we need to be aware that temporary, mild negative feelings have important benefits,” Forgas says. Growing evidence suggests that gloomy moods improve key types of thinking and behavior, Forgas asserts in a new review paper aptly titled “Don’t worry, be sad!” For good evolutionary reasons, positive and negative moods subtly recruit thinking styles suited to either benign or troubling situations, he says. Each way of dealing with current circumstances generally works well, if imperfectly. New and recent studies described by Forgas in the June Current Directions in Psychological Science illustrate some of the ways in which periods of sadness spontaneously recruit a detail-oriented, analytical thinking style. Morose moods have evolved as early-warning signs of problematic or dangerous situations that demand close attention, these reports suggest. © Society for Science & the Public 2000 - 2013.

Keyword: Emotions; Depression
Link ID: 18810 - Posted: 10.19.2013

By Brian Palmer Myopia isn’t an infectious disease, but it has reached nearly epidemic proportions in parts of Asia. In Taiwan, for example, the percentage of 7-year-old children suffering from nearsightedness increased from 5.8 percent in 1983 to 21 percent in 2000. An incredible 81 percent of Taiwanese 15-year-olds are myopic. If you think that the consequences of myopia are limited to a lifetime of wearing spectacles—and, let’s be honest, small children look adorable in eyeglasses—you are mistaken. The prevalence of high myopia, an extreme form of the disorder, in Asia has more than doubled since the 1980s, and children who suffer myopia early in life are more likely to progress to high myopia. High myopia is a risk factor for such serious problems as retinal detachment, glaucoma, early-onset cataracts, and blindness. The explosion of myopia is a serious public health concern, and doctors have struggled to identify the source of the problem. Nearsightedness has a strong element of heritability, but the surge in cases shows that a child’s environment plays a significant role. A variety of risk factors has been linked to the disorder: frequent reading, participation in sports, television watching, protein intake, and depression. When each risk factor was isolated, however, its overall effect on myopia rates seemed to be fairly minimal. Researchers believe they are now closing in on a primary culprit: too much time indoors. In 2008 orthoptics professor Kathryn Rose found that only 3.3 percent of 6- and 7-year-olds of Chinese descent living in Sydney, Australia, suffered myopia, compared with 29.1 percent of those living in Singapore. The usual suspects, reading and time in front of an electronic screen, couldn’t account for the discrepancy. The Australian cohort read a few more books and spent slightly more time in front of the computer, but the Singaporean children watched a little more television. On the whole, the differences were small and probably canceled each other out. The most glaring difference between the groups was that the Australian kids spent 13.75 hours per week outdoors compared with a rather sad 3.05 hours for the children in Singapore. © 2013 The Slate Group, LLC.

Keyword: Vision; Development of the Brain
Link ID: 18809 - Posted: 10.19.2013

Daniel Cossins It may not always seem like it, but humans usually take turns speaking. Research published today in Current Biology shows that marmosets, too, wait for each other to stop calling before they respond during extended vocal exchanges. The discovery could help to explain how humans came to be such polite conversationalists. Taking turns is a cornerstone of human verbal communication, and is common across all languages. But with no evidence that non-human primates 'converse' similarly, it was not clear how such behaviour evolved. The widely accepted explanation, known as the gestural hypothesis, suggests that humans might somehow have taken the neural machinery underlying cooperative manual gestures such as pointing to something to attract another person's attention to it, and applied that to vocalization. Not convinced, a team led by Daniel Takahashi, a neurobiologist at Princeton University in New Jersey, wanted to see whether another primate species is capable of cooperative calling. The researchers turned to common marmosets (Callithrix jacchus) because, like humans, they are prosocial — that is, generally friendly towards each other — and they communicate using vocalizations. The team recorded exchanges between pairs of marmosets that could hear but not see each other, and found that the monkeys never called at the same time. Instead, they always waited for roughly 5 seconds after a caller had finished before responding. © 2013 Nature Publishing Group

Keyword: Language; Animal Communication
Link ID: 18808 - Posted: 10.19.2013

Sounding the alarm when a predator approaches seems like a good idea on the surface. But it isn’t always, because such warnings might help the predator pinpoint the location of its next meal. So animals often take their audience into account when deciding whether or not to warn it of impending danger. And a new study in Biology Letters finds that the vulnerability of that audience matters, at least when we’re talking about baby birds and their parents. Tonya Haff and Robert Magrath of Australian National University in Canberra studied a local species, the white-browed scrubwren, by setting up an experiment to see if parents' reactions to predators changed when the babies were more vulnerable. Baby birds are vulnerable pretty much all the time but more so when they’re begging for food. That whining noise can lead a predator right to them. But a parent’s alarm call can shut them right up. Haff and Magrath began by determining that parent scrubwrens would respond normally when they heard recordings of baby birds. (They used recordings because those are more reliable than getting little chicks to act on cue.) Then they played those recordings or one of background noise near scrubwren nests. The role of the predator was played by a taxidermied pied currawong, with a harmless fake crimson rosella (a kind of parrot) used as a control. The mama and papa birds called out their “buzz” alarm more often when the pied currawong was present and the baby bird recording was being played. They barely buzzed when the parrot was present or only background noise was played. The parents weren’t alarm calling more just to be heard over the noise, the researchers say. If that were the case, then a second type of call — a contact “chirp” that mamas and papas give when approaching a nest — should also have become more common, which it didn’t. © Society for Science & the Public 2000 - 2013.

Keyword: Animal Communication; Language
Link ID: 18807 - Posted: 10.19.2013

Sid Perkins One of the most complete early human skulls yet found suggests that what scientists thought were three hominin species may in fact be one. This controversial claim comes from a comparison between the anatomical features of a 1.8-million-year-old fossil skull with those of four other skulls from the same excavation site at Dmanisi, Georgia. The wide variability in their features suggests that Homo habilis, Homo rudolfensis and Homo erectus, the species so far identified as existing worldwide in that era, might represent a single species. The research is published in Science today. The newly described skull — informally known as 'skull 5' — was unearthed in 2005. When combined with a jawbone found five years before and less than 2 metres away, it “is the most complete skull of an adult from this date”, says Marcia Ponce de León, a palaeoanthropologist at the Anthropological Institute and Museum in Zurich, Switzerland, and one of the authors of the study. The volume of skull 5’s braincase is only 546 cubic centimetres, about one-third that of modern humans, she notes. Despite that low volume, the hominin’s face was relatively large and protruded more than the faces of the other four skulls found at the site, which have been attributed to H. erectus. Having five skulls from one site provides an unprecedented opportunity to study variation in what presumably was a single population, says co-author Christoph Zollikofer, a neurobiologist at the same institute as Ponce de León. All of the skulls excavated so far were probably deposited within a 20,000-year time period, he notes. © 2013 Nature Publishing Group

Keyword: Evolution
Link ID: 18806 - Posted: 10.19.2013

by Helen Thomson ONE moment you are alive. The next you are dead. A few hours later and you are alive again. Pharmacologists have discovered a mechanism that triggers Cotard's syndrome – the mysterious condition that leaves people feeling like they, or parts of their body, no longer exist. With the ability to switch the so-called walking corpse syndrome on and off comes the prospect of new insights into how conscious experiences are constructed. Acyclovir – also known by the brand name Zovirax – is a common drug used to treat cold sores and other herpes infections. It usually has no harmful side effects. However, about 1 per cent of people who take the drug orally or intravenously experience some psychiatric side effects, including Cotard's. These occur mainly in people who have renal failure. To investigate the potential link between acyclovir and Cotard's, Anders Helldén at Karolinska University Hospital in Stockholm and Thomas Lindén at the Sahlgrenska Academy in Gothenburg pooled data from Swedish drug databases along with hospital admissions. They identified eight people with acyclovir-induced Cotard's. One woman with renal failure began using acyclovir to treat shingles. She ran into a hospital screaming, says Helldén. After an hour of dialysis, she started to talk: she said the reason she was so anxious was that she had a strong feeling she was dead. After a few more hours of dialysis she said, "I'm not quite sure whether I'm dead any more but I'm still feeling very strange." Four hours later: "I'm pretty sure I'm not dead any more but my left arm is definitely not mine." Within 24 hours, the symptoms had disappeared. © Copyright Reed Business Information Ltd.

Keyword: Attention
Link ID: 18805 - Posted: 10.17.2013

by Denise Chow, LiveScience The discovery of a fossilized brain in the preserved remains of an extinct "mega-clawed" creature has revealed an ancient nervous system that is remarkably similar to that of modern-day spiders and scorpions, according to a new study. The fossilized Alalcomenaeus is a type of arthropod known as a megacheiran (Greek for "large claws") that lived approximately 520 million years ago, during a period known as the Lower Cambrian. The creature was unearthed in the fossil-rich Chengjiang formation in southwest China. Researchers studied the fossilized brain, the earliest known complete nervous system, and found similarities between the extinct creature's nervous system and the nervous systems of several modern arthropods, which suggest they may be ancestrally related. Living arthropods are commonly separated into two major groups: chelicerates, which include spiders, horseshoe crabs and scorpions, and a group that includes insects, crustaceans and millipedes. The new findings shed light on the evolutionary processes that may have given rise to modern arthropods, and also provide clues about where these extinct mega-clawed creatures fit in the tree of life. "We now know that the megacheirans had central nervous systems very similar to today's horseshoe crabs and scorpions," senior author Nicholas Strausfeld, a professor in the department of neuroscience at the University of Arizona in Tucson, said in a statement. "This means the ancestors of spiders and their kin lived side by side with the ancestors of crustaceans in the Lower Cambrian." © 2013 Discovery Communications, LLC.

Keyword: Evolution
Link ID: 18804 - Posted: 10.17.2013

By Christopher Wanjek and LiveScience Your liver could be "eating" your brain, new research suggests. People with extra abdominal fat are three times more likely than lean individuals to develop memory loss and dementia later in life, and now scientists say they may know why. It seems that the liver and the hippocampus (the memory center in the brain) share a craving for a certain protein called PPARalpha. The liver uses PPARalpha to burn belly fat; the hippocampus uses PPARalpha to process memory. In people with a large amount of belly fat, the liver needs to work overtime to metabolize the fat, and uses up all the PPARalpha — first depleting local stores and then raiding the rest of the body, including the brain, according to the new study. The process essentially starves the hippocampus of PPARalpha, thus hindering memory and learning, researchers at Rush University Medical Center in Chicago wrote in the study, published in the current issue of Cell Reports. Other news reports were incorrect in stating that the researchers established that obese individuals were 3.6 times more likely than lean individuals to develop dementia. That finding dates back to a 2008 study by researchers at the Kaiser Permanente Division of Research in Oakland, Calif. In another study, described in a 2010 article in the Annals of Neurology, researchers at Boston University School of Medicine found that the greater the amount of belly fat, the greater the brain shrinkage in old age. © 2013 Scientific American

Keyword: Learning & Memory; Development of the Brain
Link ID: 18803 - Posted: 10.17.2013

by Bob Holmes The great flowering of human evolution over the past 2 million years may have been driven not by the African savannahs, but by the lakes of that continent's Great Rift Valley. This novel idea, published this week, may explain why every major advance in the evolution of early humans, from speciation to the vast increase in brain size, appears to have taken place in eastern Africa. Anthropologists have surmised for several years that early humans, or hominins, might have evolved their unusually large, powerful brains to cope with an increasingly variable climate over the past few million years. However, studies testing this hypothesis have been equivocal, perhaps because most use global or continental-scale measures of climate, such as studying trends in the amount of airborne dust from dry earth that is blown into the ocean and incorporated into deep-sea sediments. Mark Maslin, a palaeoclimatologist at University College London, and his colleague Susanne Shultz at the University of Manchester, UK, have taken a local approach instead, by studying whether the presence or absence of lakes in the Rift Valley affected the hominins living there. Maslin's hunch is that relatively short periods of extreme variability 2.6, 1.8, and 1 million years ago – which are important periods for human evolution – corresponded to times of rapid change in the large lakes of the Great Rift Valley. Because the valley concentrates rainfall from a wide area into relatively small basins, these lakes are unusually sensitive to rainfall and swell or disappear depending on climate. © Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 18802 - Posted: 10.17.2013

By Jason G. Goldman Scientists love yawning. No, that’s not quite right. Scientists love doing research on yawning. It seems to be of interest to folks in fields ranging from primatology to developmental psychology to psychopathology to animal behavior. If the notion of scientifically investigating the purpose of yawning makes you, well, yawn, then you’re missing one of the more interesting debates in the social cognition literature. To understand why yawning is about more than feeling tired or bored, we have to go back a few years. Once upon a time, scientists thought that yawning might be a process through which the brain keeps itself cool. Yawning is associated with increases in blood pressure, and the consequent increase in blood flow might mean that the vascular system acts as a radiator, replacing the warm blood in the brain with cooler blood. It could also be that the deep inhalation of cold air during a yawn can, through convection, alter blood temperature, which in turn could cool the brain. Even if it turns out that some yawns can be explained through purely physiological means, yawning is also contagious for humans and other species. If someone watches someone else yawning, they’ll be likely to yawn as well. That means that there is a social component to yawning, and it might be related to empathy. It turns out that there’s a correlation between a person’s self-reported empathy and their susceptibility to yawn contagion, and those who are more skilled at theory of mind tasks are also more likely to yawn contagiously. © 2013 Scientific American

Keyword: Emotions; Evolution
Link ID: 18801 - Posted: 10.17.2013

Henry Astley In the Mark Twain story The Celebrated Jumping Frog of Calaveras County, a frog named Daniel Webster "could get over more ground at one straddle than any animal of his breed you ever see." Now, scientists have visited the real Calaveras County in hopes of learning more about these hopping amphibians. They’ve found that what they see in the lab doesn’t always match the goings-on in the real world. If you wanted to know how far the bullfrog Rana catesbeiana could jump, the scientific literature would give you one answer: 1.295 meters, published in Smithsonian Contributions to Zoology in 1978. If you looked at the Guinness Book of World Records, though, you'd find a different answer. In 1986, a bullfrog called Rosie the Ribeter covered 6.55 meters in three hops. If you divide by three, at least one of those hops had to be no shorter than 2.18 meters—about four bullfrog body lengths more than the number in the scientific paper. The disparity matters. If bullfrogs can hop only 1.3 meters, they have enough power in their muscles to pull off the jump without any other anatomical help. But if they can jump farther, they must also be using a stretchy tendon to power their hops—an ability that other frogs have but that researchers thought bullfrogs had lost. These particular amphibians, scientists speculated, might have made some kind of evolutionary tradeoff that shortened their jumps but enabled them to swim better in the water, where they spend much of their lives. © 2013 American Association for the Advancement of Science

Keyword: Miscellaneous
Link ID: 18800 - Posted: 10.17.2013

By Lary C. Walker Clumps of proteins twisted into aberrant shapes cause the prion diseases that have perplexed biologists for decades. The surprises just keep coming with a new report that the simple clusters of proteins responsible for mad cow and other prion diseases may, without help from DNA or RNA, be capable of changing form to escape the predations of drugs that target their eradication. Prion drug resistance could be eerily similar to that found in cancer and HIV—and may have implications for drug development for Alzheimer’s and Parkinson’s, neurodegenerative diseases also characterized by misfolded proteins. Prion diseases include scrapie, chronic wasting disease and bovine spongiform encephalopathy (mad cow disease) in nonhuman species, and Creutzfeldt-Jakob disease and fatal insomnia in humans. They are unusual in that they can arise spontaneously, as a result of genetic mutations, or, in some instances, through infection. Remarkably, the infectious agent is not a microbe or virus, but rather the prion itself, a clump of proteins without genetic material. The noxious agents originate when a normally generated protein – called the prion protein – mistakenly folds into a stable, sticky, and potentially toxic shape. When the misfolded protein contacts other prion protein molecules, they too are corrupted and begin to bind to one another. In the ensuing chain reaction, the prions grow, break apart, and spread; within the nervous system, they relentlessly destroy neurons, ultimately, and invariably, leading to death. © 2013 Scientific American

Keyword: Prions; Alzheimers
Link ID: 18799 - Posted: 10.17.2013

Anne Trafton, MIT News Office Schizophrenia patients usually suffer from a breakdown of organized thought, often accompanied by delusions or hallucinations. For the first time, MIT neuroscientists have observed the neural activity that appears to produce this disordered thinking. The researchers found that mice lacking the brain protein calcineurin have hyperactive brain-wave oscillations in the hippocampus while resting, and are unable to mentally replay a route they have just run, as normal mice do. Mutations in the gene for calcineurin have previously been found in some schizophrenia patients. Ten years ago, MIT researchers led by Susumu Tonegawa, the Picower Professor of Biology and Neuroscience, created mice lacking the gene for calcineurin in the forebrain; these mice displayed several behavioral symptoms of schizophrenia, including impaired short-term memory, attention deficits, and abnormal social behavior. In the new study, which appears in the Oct. 16 issue of the journal Neuron, Tonegawa and colleagues at the RIKEN-MIT Center for Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory recorded the electrical activity of individual neurons in the hippocampus of these knockout mice as they ran along a track. Previous studies have shown that in normal mice, “place cells” in the hippocampus, which are linked to specific locations along the track, fire in sequence when the mice take breaks from running the course. This mental replay also occurs when the mice are sleeping. These replays occur in association with very high frequency brain-wave oscillations known as ripple events.

Keyword: Schizophrenia
Link ID: 18798 - Posted: 10.17.2013

By MICHAEL WINES NEW HOLSTEIN, Wis. — Next to their white clapboard house on a rural road here, in long rows of cages set beneath the roofs of seven open-air sheds, Virginia and Gary Bonlander are raising 5,000 minks. Or were, anyway, until two Saturdays ago, when the police roused them from bed at 5 a.m. with a rap on their door. Outside, 2,000 minks were scampering away — up to 50 top-quality, full-length and, suddenly, free-range mink coats. “The backyard was full of mink. The driveway was full of mink,” Mrs. Bonlander recalled a few days ago. “Then, pshew” — she made a whooshing sound — “they were gone.” And not only in Wisconsin, the mink-raising capital of the United States. After something of a hiatus, the animal rights movement has resumed a decades-old guerrilla war against the fur industry with a vengeance — and hints of more to come. In New Holstein; in Grand Meadow, Minn.; in Coalville, Utah; in Keota, Iowa; and four other states, activists say, eight dark-of-night raids on mink farms have liberated at least 7,700 of the critters — more than $770,000 worth of pelts — just since late July. That is more such raids than in the preceding three years combined. Two more raids in Ontario and British Columbia freed 1,300 other minks and foxes during the same period, according to the North American Animal Liberation Press Office, which bills itself as a conduit for messages from anonymous animal rights activists. “What we’re seeing now is unprecedented,” Peter Young, a Santa Cruz, Calif., activist who was imprisoned in 2005 for his role in raids on six mink ranches, said in a telephone interview. Though still an outspoken defender of the animal rights movement and mink-ranch raids, Mr. Young says he has no contact with those who raid fur farms or commit other illegal acts and, in fact, does not know who they are. © 2013 The New York Times Company

Keyword: Animal Rights
Link ID: 18797 - Posted: 10.17.2013

by Colin Barras A part of all of us loves sums. Eavesdropping on the brain while people go about their daily activity has revealed the first brain cells specialised for numbers. Josef Parvizi and his colleagues at Stanford University in California enlisted the help of three people with epilepsy whose therapy involved placing on the surface of their brain a grid of electrodes that record activity. Neurons fired in a region called the intraparietal sulcus when the three volunteers performed arithmetic tests, suggesting they dealt with numbers. The team continued to monitor brain activity while the volunteers went about their normal activity in hospital. Comparing video footage of their stay with their brain activity revealed that the neurons remained virtually silent for most of the time, bursting into life only when the volunteers talked about numbers or numerical concepts such as "more than" or "less than". There is debate over whether some neural populations perform many functions or are involved in very precise tasks. "We show here that there is specialisation for numeracy," says Parvizi. Journal reference: Nature Communications, DOI: 10.1038/ncomms3528 © Copyright Reed Business Information Ltd.

Keyword: Attention
Link ID: 18796 - Posted: 10.16.2013