Most Recent Links

Follow us on Facebook or subscribe to our mailing list to receive news updates.


Links 11081 - 11100 of 29326

by NPR Staff Soon you'll be able to direct the path of a cockroach with a smartphone and the swipe of your finger. Greg Gage and his colleagues at Backyard Brains have developed a device called the RoboRoach that lets you control the path of an insect. It may make you squirm, but Gage says the device could inspire a new generation of neuroscientists. "The sharpest kids amongst us are probably going into other fields right now. And so we're kind of in the dark ages when it comes to neuroscience," he tells NPR's Arun Rath. He wants to get kids interested in neuroscience early enough to guide them toward that career path. And a cyborg cockroach might be the inspiration. "The neurons in the insects are very, very similar to the neurons inside the human brain," Gage says. "It's a beautiful way to just really understand what's happening inside your brain by looking at these little insects." The idea was spawned by an earlier device the Backyard Brain-iacs developed, which is capable of amplifying the signals of real living neurons. Insert a small wire into a cockroach's antenna, and you can hear the sound of actual neurons firing. "Lining the inside of the cockroach are these neurons that are picking up touch or vibration sensing, chemical sensing," Gage says. "They use it like a nose or a large tongue, their antennas, and they use it to sort of navigate the world. "So when you put a small wire inside of there, you can actually pick up the information as it's being encoded and being sent to the brain." With the RoboRoach device and smartphone app, you can interact with the antennae to influence the insect's behavior. ©2013 NPR
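To make the "picking up the information" step concrete, here is a minimal, hypothetical sketch of how an amplified electrode signal is typically turned into discrete spikes by simple threshold crossing. The sample rate, threshold and simulated trace are all invented for illustration; this is not Backyard Brains' actual processing.

```python
import numpy as np

# Hypothetical sketch: once the electrode signal is amplified and sampled,
# the simplest way to pull discrete "spikes" out of it is upward threshold
# crossing. All numbers here are made up for illustration.
rng = np.random.default_rng(0)
sample_rate_hz = 10_000
signal_uv = rng.normal(0.0, 5.0, size=sample_rate_hz)  # 1 s of baseline noise
signal_uv[[1200, 4800, 7750]] += 60.0                  # three fake spikes

threshold_uv = 30.0
# A spike is counted each time the trace crosses the threshold upward.
crossings = np.flatnonzero(
    (signal_uv[1:] >= threshold_uv) & (signal_uv[:-1] < threshold_uv)
)
spike_times_s = (crossings + 1) / sample_rate_hz
print(f"Detected {len(spike_times_s)} spikes at {spike_times_s} s")
```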

Keyword: Robotics
Link ID: 18819 - Posted: 10.22.2013

by Susan Milius Your calamari, it turns out, may have come from a temporary transvestite with rainbows in its armpits. Well, not armpits, but spots just below where the fins flare out. “Finpits,” cell biologist Daniel DeMartini nicknamed them. He and his colleagues have documented unusual color-change displays in female California market squid, popular in restaurants. Squids, octopuses and cuttlefishes are nature’s iPads, changing their living pixels at will. DeMartini, of the University of California, Santa Barbara, saw so many sunset shimmers, blink-of-an-eye blackouts and other marvels in California’s Doryteuthis opalescens that it took him a while to notice that only females shimmered the finpit stripe. It shows up now and then during life, and reliably for about 24 hours after decapitation, DeMartini found. The squid are color-blind, and what prompts their display is known only to them. But the researchers have figured out how it works. The squid make rainbows when color-change cells called iridocytes lose water. Other kinds of color-change cells work their magic via pigments, but not iridocytes. “If you take a bunch of iridocyte cells in red, blue, green or yellow and you grind them up, then you wouldn’t see any color,” DeMartini says. Instead, little stacks of protein plates inside the cells turn colorful only when water rushes out of the stack. How closely the plates snug together determines whether the stack looks blue, scarlet or anything in between. © Society for Science & the Public 2000 - 2013
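The plate-spacing mechanism can be made concrete with an idealized multilayer-reflector model: alternating thin layers reflect most strongly near twice the optical thickness of one plate-plus-gap pair. A minimal sketch, assuming rough illustrative refractive indices and thicknesses rather than measured values from the study; real iridocyte stacks are messier than this ideal formula.

```python
# Idealized multilayer ("Bragg stack") reflector: alternating protein
# plates and watery gaps reflect most strongly near
#   lambda_peak = 2 * (n_plate * d_plate + n_gap * d_gap)
# All numbers below are rough illustrative values, not data from the paper.
def peak_wavelength_nm(d_plate_nm: float, d_gap_nm: float,
                       n_plate: float = 1.59, n_gap: float = 1.33) -> float:
    return 2.0 * (n_plate * d_plate_nm + n_gap * d_gap_nm)

# As water leaves the stack, the gaps shrink and the peak slides across
# the visible spectrum -- the "anything in between" of the article.
for d_gap_nm in (130, 100, 50):
    print(f"gap {d_gap_nm:3d} nm -> peak ~{peak_wavelength_nm(100, d_gap_nm):.0f} nm")
```

With these made-up dimensions the peak moves from about 664 nm (red) down to about 451 nm (blue) as the gaps shrink, which is the sense in which plate spacing sets the color.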

Keyword: Vision; Sexual Behavior
Link ID: 18818 - Posted: 10.22.2013

By MAGGIE KOERTH-BAKER Between the fall of 2011 and the spring of 2012, people across the United States suddenly found themselves unable to get their hands on A.D.H.D. medication. Low-dose generics were particularly in short supply. There were several factors contributing to the shortage, but the main cause was that supply was suddenly being outpaced by demand. The number of diagnoses of Attention Deficit Hyperactivity Disorder has ballooned over the past few decades. Before the early 1990s, fewer than 5 percent of school-age kids were thought to have A.D.H.D. Earlier this year, data from the Centers for Disease Control and Prevention showed that 11 percent of children ages 4 to 17 had at some point received the diagnosis — and that doesn’t even include first-time diagnoses in adults. (Full disclosure: I’m one of them.) That amounts to millions of extra people receiving regular doses of stimulant drugs to keep neurological symptoms in check. For a lot of us, the diagnosis and subsequent treatments — both behavioral and pharmaceutical — have proved helpful. But still: Where did we all come from? Were that many Americans always pathologically hyperactive and unable to focus, and only now are getting the treatment they need? Probably not. Of the 6.4 million kids who have been given diagnoses of A.D.H.D., a large percentage are unlikely to have any kind of physiological difference that would make them more distractible than the average non-A.D.H.D. kid. It’s also doubtful that biological or environmental changes are making physiological differences more prevalent. Instead, the rapid increase in people with A.D.H.D. probably has more to do with sociological factors — changes in the way we school our children, in the way we interact with doctors and in what we expect from our kids. © 2013 The New York Times Company

Keyword: ADHD; Development of the Brain
Link ID: 18817 - Posted: 10.21.2013

Maggie Fox NBC News Every cell in your body has a little clock ticking away in it, researchers reported on Sunday. And while most of you is aging in a coordinated way, there are odd anomalies that have the researchers curious: Your heart may be "younger" than the rest of your tissues, and a woman's breasts are older. Tumors are the oldest of all, a finding reported in the journal Genome Biology that might help scientists better understand cancer, explain why breast cancer is so common and help researchers find better ways to prevent it. Less surprising, but intriguing: embryonic stem cells, the body's master cells, look just like newborns, with a biological age of zero. The new measurements might be useful in the search for drugs or other treatments that can turn back the clock on aging tissue, and perhaps in treating or preventing diseases of aging, such as heart disease and cancer, says Steve Horvath, a professor of genetics at the David Geffen School of Medicine at UCLA. "The big question is whether the biological clock controls a process that leads to aging," Horvath said. Horvath looked at a genetic process called methylation, a kind of chemical reaction that turns stretches of DNA on or off. All cells have the entire genetic map inside; methylation helps determine which bits of the map the cells use to perform specific functions.
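Epigenetic "clocks" of this kind are typically linear models over methylation levels (fractions from 0 to 1) at selected DNA sites. A toy sketch of that idea follows; the sites, weights and intercept are made up, and Horvath's actual clock uses hundreds of CpG sites fit with penalized regression plus a calibration step.

```python
import numpy as np

# Toy illustration of a methylation-based age estimate. The per-site
# weights and intercept below are invented, not Horvath's coefficients.
weights = np.array([31.0, -12.4, 22.8, -7.9])  # made-up per-site weights
intercept_years = 35.0                         # made-up baseline

def predicted_age_years(methylation_fractions: np.ndarray) -> float:
    """Weighted sum of per-site methylation levels plus an intercept."""
    return intercept_years + float(weights @ methylation_fractions)

tissue_sample = np.array([0.62, 0.18, 0.41, 0.77])  # fraction methylated
print(f"Estimated biological age: {predicted_age_years(tissue_sample):.1f} years")
```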

Keyword: Biological Rhythms
Link ID: 18816 - Posted: 10.21.2013

By GRETCHEN REYNOLDS If you consider yourself to be a born morning person or an inveterate night owl, there is new research that supports your desire to wake up early or stay up late. Each of us has a personal “chronotype,” or unique circadian rhythm, says Till Roenneberg, a professor of chronobiology at Ludwig Maximilian University in Munich and one of the world’s experts on sleep. In broad strokes, these chronotypes are usually characterized as early, intermediate or late, corresponding to people who voluntarily go to bed and wake early, at a moderate hour or vampirishly late. If you are forced to wake up earlier than your body naturally would, you suffer from what Roenneberg calls “social jet lag.” People with an early chronotype may do well with a 7 a.m. workday rising time, but others do not. Sleeping out of sync with your innate preferences can be detrimental to your health, especially for late chronotypes, who tend to be the most at odds with typical work schedules. A study conducted by the National Institutes of Health and published in March in PLOS ONE found that obese adults with late chronotypes tended to eat larger meals, develop more sleep apnea and have higher levels of stress hormones and lower levels of HDL, or “good,” cholesterol than obese people with other chronotypes. Their chronotype may also have contributed to weight gain in the first place, Roenneberg says. Research has shown that a single hour of social jet lag, the mismatch between your chronotype and your schedule, increases your risk for obesity by about 33 percent. In a study published in June in Chronobiology International, late-night chronotypes gained more weight during their freshman years at college than other new students did, even though college is one of the best fits for night owls. Copyright 2013 The New York Times Company
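Roenneberg's group quantifies social jet lag as the gap between the midpoint of sleep on free days and the midpoint of sleep on work days. A minimal sketch of that calculation follows; the bed and wake times are invented for illustration.

```python
def mid_sleep_h(onset_h: float, wake_h: float) -> float:
    """Midpoint of a sleep episode, on a 24 h clock.
    If wake <= onset, the episode is assumed to cross midnight."""
    if wake_h <= onset_h:
        wake_h += 24.0
    return ((onset_h + wake_h) / 2.0) % 24.0

# Workdays: asleep 23:30-06:30 -> mid-sleep 03:00
msw = mid_sleep_h(23.5, 6.5)
# Free days: asleep 01:30-10:30 -> mid-sleep 06:00
msf = mid_sleep_h(1.5, 10.5)

social_jet_lag_h = abs(msf - msw)  # 3.0 hours for this late chronotype
print(f"Mid-sleep work {msw:.1f} h, free {msf:.1f} h, "
      f"social jet lag {social_jet_lag_h:.1f} h")
```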

Keyword: Biological Rhythms; Genes & Behavior
Link ID: 18815 - Posted: 10.21.2013

Research indicates that American girls and boys are indeed going through puberty earlier than ever, though the reasons are unclear. Many believe our widespread exposure to synthetic chemicals is at least partly to blame, but it's hard to pinpoint exactly why our bodies react in certain ways to various environmental stimuli. Researchers first noticed the earlier onset of puberty in the late 1990s, and recent studies confirm the mysterious public health trend. A 2012 analysis by the U.S. Centers for Disease Control and Prevention (CDC) found that American girls exposed to high levels of common household chemicals had their first periods seven months earlier than those with lower exposures. "This study adds to the growing body of scientific research that exposure to environmental chemicals may be associated with early puberty," says Danielle Buttke, a researcher at CDC and lead author on the study. Buttke found that the age when a girl has her first period (menarche) has fallen over the past century from an average of age 16-17 to age 12-13. Earlier puberty isn't just for girls. In 2012 researchers from the American Academy of Pediatrics (AAP) surveyed data on 4,100 boys from 144 pediatric practices in 41 states and found a similar trend: American boys are reaching puberty six months to two years earlier than just a few decades ago. African-American boys are starting the earliest, at around age nine, while Caucasian and Hispanic boys start on average at age 10. One culprit could be rising obesity rates. Researchers believe that puberty (at least for girls) may be triggered in part by the body building up sufficient reserves of fat tissue, signaling readiness for reproduction. Clinical pediatrician Robert Lustig of Benioff Children's Hospital in San Francisco reports that obese girls have higher levels of the hormone leptin, which in and of itself can lead to early puberty while setting off a domino effect of more weight gain and faster overall physical maturation. © 2013 Scientific American

Keyword: Development of the Brain; Hormones & Behavior
Link ID: 18814 - Posted: 10.21.2013

by Tina Hesman Saey Sleep hoses garbage out of the brain, a study of mice finds. The trash, including pieces of proteins that cause Alzheimer’s disease, piles up while the rodents are awake. Sleep opens spigots that bathe the brain in fluids and wash away the potentially toxic buildup, researchers report in the Oct. 18 Science. The discovery may finally reveal why sleep seems mandatory for every animal. It may also shed new light on the causes of neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases. “It’s really an eye-opening and intriguing finding,” says Chiara Cirelli, a sleep researcher at the University of Wisconsin–Madison. The results have already led her and other sleep scientists to rethink some of their own findings. Although sleep requirements vary from individual to individual and across species, a complete lack of it is deadly. But no one knows why. One popular idea is that sleep severs weak connections between brain cells and strengthens more robust connections to solidify memories (SN Online: 4/2/09; SN Online: 6/23/11). But a good memory is not a biological imperative. “You don’t die from forgetting what you learned yesterday,” says Maiken Nedergaard, a neuroscientist at the University of Rochester Medical Center in New York who led the study. Researchers in Nedergaard’s lab stumbled upon sleep’s role in garbage clearance while studying a brain drainage system they described last year (SN: 9/22/12, p. 15). This service, called the glymphatic system, flushes fluid from the brain and spinal cord into the space between brain cells. Ultimately, the fluid and any debris it carries washes into the liver for disposal. © Society for Science & the Public 2000 - 2013

Keyword: Sleep; Learning & Memory
Link ID: 18813 - Posted: 10.19.2013

by Hal Hodson American football is a rough game, but the toll it takes on players' grey matter is only now becoming clear. For the first time, the number of head impacts on the playing field has been linked with cognitive problems and functional brain abnormalities in ex-footballers. Brain autopsies on retired National Football League (NFL) players have previously shown levels of damage that are higher than those in the general population. Now, this damage has been correlated with performance in tasks related to reasoning, problem solving and planning, highlighting the worrying impact of repeated head trauma. To investigate the relationship between head trauma and cognitive damage, Adam Hampshire of Imperial College London and his colleagues scanned the brains of 13 retired professional American football players and 60 people who had never played the sport, while they performed a series of cognitive tests in an fMRI machine. It wasn't an easy task: David Hubbard, who ran the tests at the Applied fMRI Institute in San Diego, California, says they initially had 15 ex-sportsmen, but two were too large to fit in the machine. The football players showed only modest deficits on the cognitive tasks, which included tests of planning, spatial awareness, memory and counting; however, their brains had to work a lot harder to achieve the same results as the non-footballers. Regions of the frontal cortices that normally communicate with each other to handle reasoning and planning tasks were far less efficient in the footballers' brains. © Copyright Reed Business Information Ltd.

Keyword: Brain Injury/Concussion
Link ID: 18812 - Posted: 10.19.2013

By Jesse Bering Disgust, in its most familiar form, is our response to something vile in the world—spoiled food, a dirty floor or rats cavorting in the subway. It is a contamination-avoidance mechanism that evolved to help us make biologically adaptive decisions in the heat of the moment. Yet disgust has also come to have powerful symbolic elements. When left unchecked, these symbolic qualities can have devastating impacts on our mental states. Consider, for example, the often dramatized, heartbreaking image of a woman crouched in the corner of a shower and frantically trying to scrub her body clean after being raped. Empirical evidence supports the characterization. Seventy percent of female victims of sexual assault report a strong impulse to wash afterward, and a quarter of these continue to wash excessively up to three months later. For women, simply imagining an unwanted advance can turn on this moral-cleansing effect. Psychiatrist Nichole Fairbrother of the University of British Columbia Hospital and her colleagues looked more closely at the phenomenon of mental pollution in a study published in 2005. Two groups of female participants were told to close their eyes and picture being kissed. The members of one group were instructed to imagine being aggressively cornered and kissed against their will. The members of the other group were asked to envision themselves in a consensual embrace. Only those women in the coercive condition chose to wash up after the study. In many cases, it seems as though a person's sense of self has become contaminated. © 2013 Scientific American

Keyword: Emotions
Link ID: 18811 - Posted: 10.19.2013

by Bruce Bower Thomas Jefferson defended the right to pursue happiness in the Declaration of Independence. But that’s so 237 years ago. Many modern societies champion everyone’s right to be happy pretty much all the time. Good luck with that, says psychologist Joseph Forgas of the University of New South Wales in Sydney. A lack of close friends, unfulfilled financial dreams and other harsh realities leave many people feeling lonely and forlorn a lot of the time. But there’s a mental and social upside to occasional downers that often goes unappreciated. “Bad moods are seen in our happiness-focused culture as representing a problem, but we need to be aware that temporary, mild negative feelings have important benefits,” Forgas says. Growing evidence suggests that gloomy moods improve key types of thinking and behavior, Forgas asserts in a new review paper aptly titled “Don’t worry, be sad!” For good evolutionary reasons, positive and negative moods subtly recruit thinking styles suited to either benign or troubling situations, he says. Each way of dealing with current circumstances generally works well, if imperfectly. New and recent studies described by Forgas in the June Current Directions in Psychological Science illustrate some of the ways in which periods of sadness spontaneously recruit a detail-oriented, analytical thinking style. Morose moods have evolved as early-warning signs of problematic or dangerous situations that demand close attention, these reports suggest. © Society for Science & the Public 2000 - 2013.

Keyword: Emotions; Depression
Link ID: 18810 - Posted: 10.19.2013

By Brian Palmer Myopia isn’t an infectious disease, but it has reached nearly epidemic proportions in parts of Asia. In Taiwan, for example, the percentage of 7-year-old children suffering from nearsightedness increased from 5.8 percent in 1983 to 21 percent in 2000. An incredible 81 percent of Taiwanese 15-year-olds are myopic. If you think that the consequences of myopia are limited to a lifetime of wearing spectacles—and, let’s be honest, small children look adorable in eyeglasses—you are mistaken. The prevalence of high myopia, an extreme form of the disorder, in Asia has more than doubled since the 1980s, and children who suffer myopia early in life are more likely to progress to high myopia. High myopia is a risk factor for such serious problems as retinal detachment, glaucoma, early-onset cataracts, and blindness. The explosion of myopia is a serious public health concern, and doctors have struggled to identify the source of the problem. Nearsightedness has a strong element of heritability, but the surge in cases shows that a child’s environment plays a significant role. A variety of risk factors has been linked to the disorder: frequent reading, participation in sports, television watching, protein intake, and depression. When each risk factor was isolated, however, its overall effect on myopia rates seemed to be fairly minimal. Researchers believe they are now closing in on a primary culprit: too much time indoors. In 2008 orthoptics professor Kathryn Rose found that only 3.3 percent of 6- and 7-year-olds of Chinese descent living in Sydney, Australia, suffered myopia, compared with 29.1 percent of those living in Singapore. The usual suspects, reading and time in front of an electronic screen, couldn’t account for the discrepancy. The Australian cohort read a few more books and spent slightly more time in front of the computer, but the Singaporean children watched a little more television. On the whole, the differences were small and probably canceled each other out. The most glaring difference between the groups was that the Australian kids spent 13.75 hours per week outdoors compared with a rather sad 3.05 hours for the children in Singapore. © 2013 The Slate Group, LLC.

Keyword: Vision; Development of the Brain
Link ID: 18809 - Posted: 10.19.2013

Daniel Cossins It may not always seem like it, but humans usually take turns speaking. Research published today in Current Biology shows that marmosets, too, wait for each other to stop calling before they respond during extended vocal exchanges. The discovery could help to explain how humans came to be such polite conversationalists. Taking turns is a cornerstone of human verbal communication, and is common across all languages. But with no evidence that non-human primates 'converse' similarly, it was not clear how such behaviour evolved. The widely accepted explanation, known as the gestural hypothesis, suggests that humans might somehow have taken the neural machinery underlying cooperative manual gestures such as pointing to something to attract another person's attention to it, and applied that to vocalization. Not convinced, a team led by Daniel Takahashi, a neurobiologist at Princeton University in New Jersey, wanted to see whether another primate species is capable of cooperative calling. The researchers turned to common marmosets (Callithrix jacchus) because, like humans, they are prosocial — that is, generally friendly towards each other — and they communicate using vocalizations. The team recorded exchanges between pairs of marmosets that could hear but not see each other, and found that the monkeys never called at the same time. Instead, they always waited for roughly 5 seconds after a caller had finished before responding. © 2013 Nature Publishing Group
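The turn-taking measurement itself is simple to state: for each of one animal's call offsets, find the partner's next call onset and take the difference. A hypothetical sketch follows, with invented timestamps rather than the study's data.

```python
# Hypothetical sketch of measuring response latency in a vocal exchange.
# Each call is a (start_s, end_s) pair; the times below are invented.
calls_a = [(0.0, 2.1), (9.8, 12.0), (20.3, 22.5)]
calls_b = [(7.2, 9.0), (17.1, 19.2)]

def response_latencies_s(partner_calls, own_calls):
    """Seconds from each partner call offset to the next own call onset."""
    latencies = []
    for _, partner_end in partner_calls:
        later_onsets = [start for start, _ in own_calls if start >= partner_end]
        if later_onsets:
            latencies.append(min(later_onsets) - partner_end)
    return latencies

# B answers A about 5 s after A falls silent, as in the marmoset exchanges.
print(response_latencies_s(calls_a, calls_b))  # [5.1, 5.1]
```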

Keyword: Language; Animal Communication
Link ID: 18808 - Posted: 10.19.2013

Sending up the alarm when a predator approaches seems like a good idea on the surface. But it isn’t always, because such warnings might help the predator pinpoint the location of its next meal. So animals often take their audience into account when deciding whether or not to warn it of impending danger. And a new study in Biology Letters finds that the vulnerability of that audience matters, at least when we’re talking about baby birds and their parents. Tonya Haff and Robert Magrath of Australian National University in Canberra studied a local species, the white-browed scrubwren, by setting up an experiment to see if parents' reactions to predators changed when the babies were more vulnerable. Baby birds are vulnerable pretty much all the time but more so when they’re begging for food. That whining noise can lead a predator right to them. But a parent’s alarm call can shut them right up. Haff and Magrath began by determining that parent scrubwrens would respond normally when they heard recordings of baby birds. (They used recordings because those are more reliable than getting little chicks to act on cue.) Then they played those recordings or one of background noise near scrubwren nests. The role of the predator was played by a taxidermied pied currawong, with a harmless fake crimson rosella (a kind of parrot) used as a control. The mama and papa birds called out their “buzz” alarm more often when the pied currawong was present and the baby bird recording was being played. They barely buzzed when the parrot was present or only background noise was played. The parents weren’t alarm calling more just to be heard over the noise, the researchers say. If that were the case, then a second type of call — a contact “chirp” that mamas and papas give when approaching a nest — should also have become more common, which it didn’t. © Society for Science & the Public 2000 - 2013.

Keyword: Animal Communication; Language
Link ID: 18807 - Posted: 10.19.2013

Sid Perkins One of the most complete early human skulls yet found suggests that what scientists thought were three hominin species may in fact be one. This controversial claim comes from a comparison between the anatomical features of a 1.8-million-year-old fossil skull with those of four other skulls from the same excavation site at Dmanisi, Georgia. The wide variability in their features suggests that Homo habilis, Homo rudolfensis and Homo erectus, the species so far identified as existing worldwide in that era, might represent a single species. The research is published in Science today. The newly described skull — informally known as 'skull 5' — was unearthed in 2005. When combined with a jawbone found five years before and less than 2 metres away, it “is the most complete skull of an adult from this date”, says Marcia Ponce de León, a palaeoanthropologist at the Anthropological Institute and Museum in Zurich, Switzerland, and one of the authors of the study. The volume of skull 5’s braincase is only 546 cubic centimetres, about one-third that of modern humans, she notes. Despite that low volume, the hominin’s face was relatively large and protruded more than the faces of the other four skulls found at the site, which have been attributed to H. erectus. Having five skulls from one site provides an unprecedented opportunity to study variation in what presumably was a single population, says co-author Christoph Zollikofer, a neurobiologist at the same institute as Ponce de León. All of the skulls excavated so far were probably deposited within a 20,000-year time period, he notes. © 2013 Nature Publishing Group

Keyword: Evolution
Link ID: 18806 - Posted: 10.19.2013

by Helen Thomson ONE moment you are alive. The next you are dead. A few hours later and you are alive again. Pharmacologists have discovered a mechanism that triggers Cotard's syndrome – the mysterious condition that leaves people feeling like they, or parts of their body, no longer exist. With the ability to switch the so-called walking corpse syndrome on and off comes the prospect of new insights into how conscious experiences are constructed. Acyclovir – also known by the brand name Zovirax – is a common drug used to treat cold sores and other herpes infections. It usually has no harmful side effects. However, about 1 per cent of people who take the drug orally or intravenously experience some psychiatric side effects, including Cotard's. These occur mainly in people who have renal failure. To investigate the potential link between acyclovir and Cotard's, Anders Helldén at Karolinska University Hospital in Stockholm and Thomas Lindén at the Sahlgrenska Academy in Gothenburg pooled data from Swedish drug databases along with hospital admissions. They identified eight people with acyclovir-induced Cotard's. One woman with renal failure began using acyclovir to treat shingles. She ran into a hospital screaming, says Helldén. After an hour of dialysis, she started to talk: she said the reason she was so anxious was that she had a strong feeling she was dead. After a few more hours of dialysis she said, "I'm not quite sure whether I'm dead any more but I'm still feeling very strange." Four hours later: "I'm pretty sure I'm not dead any more but my left arm is definitely not mine." Within 24 hours, the symptoms had disappeared. © Copyright Reed Business Information Ltd.

Keyword: Attention
Link ID: 18805 - Posted: 10.17.2013

by Denise Chow, LiveScience The discovery of a fossilized brain in the preserved remains of an extinct "mega-clawed" creature has revealed an ancient nervous system that is remarkably similar to that of modern-day spiders and scorpions, according to a new study. The fossilized Alalcomenaeus is a type of arthropod known as a megacheiran (Greek for "large claws") that lived approximately 520 million years ago, during a period known as the Lower Cambrian. The creature was unearthed in the fossil-rich Chengjiang formation in southwest China. Researchers studied the fossilized brain, the earliest known complete nervous system, and found similarities between the extinct creature's nervous system and the nervous systems of several modern arthropods, which suggest they may be ancestrally related. Living arthropods are commonly separated into two major groups: chelicerates, which include spiders, horseshoe crabs and scorpions, and a group that includes insects, crustaceans and millipedes. The new findings shed light on the evolutionary processes that may have given rise to modern arthropods, and also provide clues about where these extinct mega-clawed creatures fit in the tree of life. "We now know that the megacheirans had central nervous systems very similar to today's horseshoe crabs and scorpions," senior author Nicholas Strausfeld, a professor in the department of neuroscience at the University of Arizona in Tucson, said in a statement. "This means the ancestors of spiders and their kin lived side by side with the ancestors of crustaceans in the Lower Cambrian." © 2013 Discovery Communications, LLC.

Keyword: Evolution
Link ID: 18804 - Posted: 10.17.2013

By Christopher Wanjek and LiveScience Your liver could be "eating" your brain, new research suggests. People with extra abdominal fat are three times more likely than lean individuals to develop memory loss and dementia later in life, and now scientists say they may know why. It seems that the liver and the hippocampus (the memory center in the brain), share a craving for a certain protein called PPARalpha. The liver uses PPARalpha to burn belly fat; the hippocampus uses PPARalpha to process memory. In people with a large amount of belly fat, the liver needs to work overtime to metabolize the fat, and uses up all the PPARalpha — first depleting local stores and then raiding the rest of the body, including the brain, according to the new study. The process essentially starves the hippocampus of PPARalpha, thus hindering memory and learning, researchers at Rush University Medical Center in Chicago wrote in the study, published in the current issue of Cell Reports. Other news reports were incorrect in stating that the researchers established that obese individuals were 3.6 times more likely than lean individuals to develop dementia. That finding dates back to a 2008 study by researchers at the Kaiser Permanente Division of Research in Oakland, Calif. In another study, described in a 2010 article in the Annals of Neurology, researchers at Boston University School of Medicine found that the greater the amount of belly fat, the greater the brain shrinkage in old age. © 2013 Scientific American

Keyword: Learning & Memory; Development of the Brain
Link ID: 18803 - Posted: 10.17.2013

by Bob Holmes The great flowering of human evolution over the past 2 million years may have been driven not by the African savannahs, but by the lakes of that continent's Great Rift Valley. This novel idea, published this week, may explain why every major advance in the evolution of early humans, from speciation to the vast increase in brain size, appears to have taken place in eastern Africa. Anthropologists have surmised for several years that early humans, or hominins, might have evolved their unusually large, powerful brains to cope with an increasingly variable climate over the past few million years. However, studies testing this hypothesis have been equivocal, perhaps because most use global or continental-scale measures of climate, such as studying trends in the amount of airborne dust from dry earth that is blown into the ocean and incorporated into deep-sea sediments. Mark Maslin, a palaeoclimatologist at University College London, and his colleague Susanne Shultz at the University of Manchester, UK, have taken a local approach instead, by studying whether the presence or absence of lakes in the Rift Valley affected the hominins living there. Maslin's hunch is that relatively short periods of extreme variability 2.6, 1.8, and 1 million years ago – which are important periods for human evolution – corresponded to times of rapid change in the large lakes of the Great Rift Valley. Because the valley concentrates rainfall from a wide area into relatively small basins, these lakes are unusually sensitive to rainfall and swell or disappear depending on climate. © Copyright Reed Business Information Ltd.

Keyword: Evolution
Link ID: 18802 - Posted: 10.17.2013

By Jason G. Goldman Scientists love yawning. No, that’s not quite right. Scientists love doing research on yawning. It seems to be of interest to folks in fields ranging from primatology to developmental psychology to psychopathology to animal behavior. If the notion of scientifically investigating the purpose of yawning makes you, well, yawn, then you’re missing one of the more interesting debates in the social cognition literature. To understand why yawning is about more than feeling tired or bored, we have to go back a few years. Once upon a time, scientists thought that yawning might be a process through which the brain keeps itself cool. Yawning is associated with increases in blood pressure, and the consequential increase in blood flow might mean that the vascular system acts as a radiator, replacing the warm blood in the brain with cooler blood. It could also be that the deep inhalation of cold air during a yawn can, through convection, alter blood temperature, which in turn could cool the brain. Even if it turns out that some yawns can be explained through purely physiological means, yawning is also contagious for humans and other species. If someone watches someone else yawning, they’ll be likely to yawn as well. That means that there is a social component to yawning, and it might be related to empathy. It turns out that there’s a correlation between a person’s self-reported empathy and their susceptibility to yawn contagion, and those who are more skilled at theory of mind tasks are also more likely to yawn contagiously. © 2013 Scientific American

Keyword: Emotions; Evolution
Link ID: 18801 - Posted: 10.17.2013

Henry Astley In the Mark Twain story The Celebrated Jumping Frog of Calaveras County, a frog named Daniel Webster "could get over more ground at one straddle than any animal of his breed you ever see." Now, scientists have visited the real Calaveras County in hopes of learning more about these hopping amphibians. They’ve found that what they see in the lab doesn’t always match the goings-on in the real world. If you wanted to know how far the bullfrog Rana catesbeiana could jump, the scientific literature would give you one answer: 1.295 meters, published in Smithsonian Contributions to Zoology in 1978. If you looked at the Guinness Book of World Records, though, you'd find a different answer. In 1986, a bullfrog called Rosie the Ribeter covered 6.55 meters in three hops. If you divide by three, at least one of those hops had to be no shorter than 2.18 meters—about four bullfrog body lengths more than the number in the scientific paper. The disparity matters. If bullfrogs can hop only 1.3 meters, they have enough power in their muscles to pull off the jump without any other anatomical help. But if they can jump farther, they must also be using a stretchy tendon to power their hops—an ability that other frogs have but that researchers thought bullfrogs had lost. These particular amphibians, scientists speculated, might have made some kind of evolutionary tradeoff that shortened their jumps but enabled them to swim better in the water, where they spend much of their lives. © 2013 American Association for the Advancement of Science
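The "at least 2.18 meters" claim is a simple averaging bound: the longest of the three hops cannot be shorter than the mean hop. A minimal check in Python, using only the figures quoted in the excerpt above:

```python
# Averaging bound: the longest of three hops is at least the mean hop,
# which already exceeds the published lab record.
total_distance_m = 6.55  # Rosie the Ribeter, Guinness record, three hops
num_hops = 3
lab_record_m = 1.295     # Smithsonian Contributions to Zoology, 1978

min_longest_hop_m = total_distance_m / num_hops
print(f"Longest hop was at least {min_longest_hop_m:.2f} m")            # 2.18 m
print(f"That beats the lab record by {min_longest_hop_m - lab_record_m:.2f} m")
```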

Keyword: Miscellaneous
Link ID: 18800 - Posted: 10.17.2013