Most Recent Links
By Breanna Draxler

Infants are known for their impressive ability to learn language, which most scientists say kicks in somewhere around the six-month mark. But a new study indicates that language recognition may begin even earlier, while the baby is still in the womb. Using a creative means of measurement, researchers found that babies could already recognize their mother tongue by the time they left their mothers’ bodies.

The researchers tested American and Swedish newborns between seven hours and three days old. Each baby was given a pacifier hooked up to a computer. When the baby sucked on the pacifier, it triggered the computer to produce a vowel sound—sometimes in English and sometimes in Swedish. The vowel sound was repeated until the baby stopped sucking. When the baby resumed sucking, a new vowel sound would start. The sucking was used as a metric to determine the babies’ interest in each vowel sound. More interest meant more sucks, according to the study soon to be published in Acta Paediatrica.

In both countries, babies sucked on the pacifier longer when they heard foreign vowel sounds as compared to those of their mom’s native language. The researchers suggest that this is because the babies already recognized the vowels from their mothers and were keen to learn new ones. Hearing develops in a baby’s brain at around the 30th week of pregnancy, which leaves the last 10 weeks of gestation for babies to put that newfound ability to work. Baby brains are quick to learn, so a better understanding of these mechanisms may help researchers figure out how to improve the learning process for the rest of us.
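The sucking metric described above lends itself to a toy illustration. The sketch below uses invented per-trial suck counts (the article reports no raw numbers) to show how "more interest means more sucks" would be quantified:

```python
# Hypothetical sketch of the sucking-as-interest metric.
# All numbers are invented for illustration; the study's actual data differ.

def mean(xs):
    """Average sucks per trial."""
    return sum(xs) / len(xs)

# Sucks per trial for one (imaginary) newborn, grouped by vowel language.
native_vowel_sucks = [12, 10, 11, 9, 13]    # mother-tongue vowels
foreign_vowel_sucks = [18, 16, 19, 17, 20]  # unfamiliar-language vowels

# A positive gap would indicate more interest in the foreign vowels.
interest_gap = mean(foreign_vowel_sucks) - mean(native_vowel_sucks)
print(f"Extra sucks per trial for foreign vowels: {interest_gap:.1f}")
```

A real analysis would compare such averages across many infants with a statistical test rather than eyeballing one baby's gap.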
Keyword: Language; Development of the Brain
Link ID: 17648 - Posted: 01.05.2013
By Christof Koch

Unless you have been deaf and blind to the world over the past decade, you know that functional magnetic resonance brain imaging (fMRI) can look inside the skull of volunteers lying still inside the claustrophobic, coffinlike confines of a loud, banging magnetic scanner. The technique relies on a fortuitous property of the blood supply to reveal regional activity. Active synapses and neurons consume power and therefore need more oxygen, which is delivered by the hemoglobin molecules inside the circulating red blood cells. When these molecules give off their oxygen to the surrounding tissue, they not only change color—from arterial red to venous blue—but also turn slightly magnetic. Activity in neural tissue causes an increase in the volume and flow of fresh blood. This change in the blood supply, called the hemodynamic signal, is tracked by sending radio waves into the skull and carefully listening to their return echoes.

FMRI does not directly measure synaptic and neuronal activity, which occurs over the course of milliseconds; instead it uses a relatively sluggish proxy—changes in the blood supply—that rises and falls in seconds. The spatial resolution of fMRI is currently limited to a volume element (voxel) the size of a pea, encompassing about one million nerve cells. Neuroscientists routinely exploit fMRI to infer what volunteers are seeing, imagining or intending to do. It is really a primitive form of mind reading.

Now a team has taken that reading to a new, startling level. A number of groups have deduced the identity of pictures viewed by volunteers while lying in the magnet scanner from the slew of maplike representations found in primary, secondary and higher-order visual cortical regions underneath the bump on the back of the head. © 2012 Scientific American
Keyword: Vision; Consciousness
Link ID: 17647 - Posted: 01.01.2013
By SINDYA N. BHANOO

The human brain responds to music in different ways, depending on the listener’s emotional reaction, among other things. Now researchers report that the same holds true for birds listening to birdsong. “The same regions that respond to music in humans, at least the areas that can also be found in the bird brain, responded to song in our sparrows,” said an author of the new report, Donna Maney, a neuroscientist at Emory University.

Primed with estrogen to simulate their state during breeding, female white-throated sparrows responded to the songs of male sparrows in the same way as humans listening to pleasant music, she said. Females in a nonbreeding state responded no differently to birdsong than to generic tones of the same frequencies. “So during breeding season, birdsong is received differently by females,” Dr. Maney said.

Moreover, male birds treated with testosterone showed a response in the amygdala, the brain’s emotional center, when they heard other males singing. The response is akin to the reaction humans have when they hear the sort of music used in a scary movie scene. “If you’re a male and you hear the song, it means that you’re invading territory or being invaded,” Dr. Maney said. “It’s an aggressive signal.” © 2012 The New York Times Company
Keyword: Emotions; Hearing
Link ID: 17646 - Posted: 01.01.2013
Barry Gordon

The intuitive notion of a “photographic” memory is that it is just like a photograph: you can retrieve it from your memory at will and examine it in detail, zooming in on different parts. But a true photographic memory in this sense has never been proved to exist.

Most of us do have a kind of photographic memory, in that most people's memory for visual material is much better and more detailed than our recall of most other kinds of material. For instance, most of us remember a face much more easily than the name associated with that face. But this isn't really a photographic memory; it just shows us the normal difference between types of memory. Even visual memories that seem to approach the photographic ideal are far from truly photographic. These memories seem to result from a combination of innate ability, zealous study and familiarity with the material, such as the Bible or fine art.

Sorry to disappoint further, but even an amazing memory in one domain, such as vision, is not a guarantee of great memory across the board. That must be rare, if it occurs at all. A winner of the memory Olympics, for instance, still had to keep sticky notes on the refrigerator to remember what she had to do during the day.

So how does an exceptional, perhaps photographic, memory come to be? It depends on a slew of factors, including our genetics, brain development and experiences. It is difficult to disentangle memory abilities that appear early from those cultivated through interest and training. Most people who have exhibited truly extraordinary memories in some domain have seemed to possess them all their lives and honed them further through practice. © 2012 Scientific American
Keyword: Learning & Memory
Link ID: 17645 - Posted: 01.01.2013
By John Horgan

We’re approaching the end of one year and the beginning of another, when people resolve to quit smoking, swill less booze, gobble less ice cream, jog every day, or every other day, work harder, or less hard, be nicer to kids, spouses, ex-spouses, co-workers, read more books, watch less TV, except Homeland, which is awesome. In other words, it’s a time when people seek to alter their life trajectories by exercising their free will. Some mean-spirited materialists deny that free will exists, and this specious claim—not mere physiological processes in my brain—motivates me to reprint a defense of free will that I wrote for The New York Times 10 years ago:

When I woke this morning, I stared at the ceiling above my bed and wondered: To what extent will my rising really be an exercise of my free will? Let’s say I got up right . . . now. Would my subjective decision be the cause? Or would computations unfolding in a subconscious neural netherworld actually set off the muscular twitches that slide me out of the bed, quietly, so as not to wake my wife (not a morning person), and propel me toward the door?

One of the risks of science journalism is that occasionally you encounter research that threatens something you cherish. Free will is something I cherish. I can live with the idea of science killing off God. But free will? That’s going too far. And yet a couple of books I’ve been reading lately have left me brooding over the possibility that free will is as much a myth as divine justice. © 2012 Scientific American
Keyword: Consciousness; Attention
Link ID: 17644 - Posted: 12.29.2012
By Lisa Raffensperger

Among the many unpleasant side effects of chemotherapy treatment, researchers have just confirmed another: chemo brain. The term refers to the mental fog that chemotherapy patients report feeling during and after treatment. According to Jame Abraham, a professor at West Virginia University, about a quarter of patients undergoing chemotherapy have trouble focusing, processing numbers, and using short-term memory.

A recent study points to the cause. The study relied on PET (positron emission tomography) brain scanning to examine brain blood flow, a marker for brain activity. Abraham and colleagues scanned the brains of 128 breast cancer patients before chemotherapy began and then 6 months later. The results showed a significant decrease in activity in regions responsible for memory, attention, planning and prioritizing.

The findings aren’t immediately useful for treating or preventing the condition of chemo brain, but the hard and fast evidence may comfort those experiencing chemo-related forgetfulness. And luckily chemo brain is almost always temporary: patients’ mental processing generally returns to normal within a year or two after chemotherapy treatment ends.
Keyword: Neurotoxins; Brain imaging
Link ID: 17643 - Posted: 12.29.2012
By Susan Lunn, CBC News

It's the time of year when people take stock of the past 12 months and make resolutions for the New Year. That's kind of what Svante Paabo is doing — but the Swedish archeological geneticist is looking over a time span of 30,000 years. He's almost finished mapping the DNA of Neanderthal man, a distant cousin of modern humans. Paabo has found that many people today share about 3 to 5 per cent of their DNA with Neanderthals.

Paabo says it's important to learn more about our caveman cousins' DNA to reveal the differences between us and them — differences that have seen modern humans survive and thrive over the millennia while Neanderthals became extinct. "I really hope that over the next 10 years we will understand much more of those things that set us apart. Which changes in our genome made human culture and technology possible? And allowed us to expand and become 7, 8, 9 billion people and spread all over the world?" he asked at a recent genetics conference in Ottawa.

The room was packed with people from across North America who wanted to hear Paabo speak. He's recognized as the inspiration for Michael Crichton's Jurassic Park. © CBC 2012
Keyword: Genes & Behavior; Evolution
Link ID: 17642 - Posted: 12.29.2012
by Carrie Arnold

‘Tis the season for twinkling lights, wrapping paper, and virgin birth. For billions of Christians around the world, the holidays are a time to celebrate Jesus’s birth to the Virgin Mary. But for many animals, virgin birth is far from a miraculous event. Researchers have discovered a growing number of species that reproduce without assistance from the opposite sex.

Known formally as parthenogenesis, virgin birth occurs when an embryo develops from an unfertilized egg cell. The development of an embryo usually requires genetic material from sperm and egg, as well as a series of chemical changes sparked by fertilization. In some parthenogenetic species, egg cells don’t undergo meiosis, the typical halving of the cell’s chromosomes, before dividing into new cells. These offspring are generally all female and clones of their mother. Other forms of parthenogenesis occur when two egg cells fuse after meiosis.

Biologists think that sexual reproduction evolved as a way to mix the gene pool and reduce the impact of harmful mutations. Still, parthenogenesis can be beneficial if the mother is particularly well adapted to her environment, since all of her offspring will be just as well adapted. © 2010 American Association for the Advancement of Science.
Keyword: Sexual Behavior; Evolution
Link ID: 17641 - Posted: 12.29.2012
By Ben Thomas

They called him “Diogenes the Cynic,” because “cynic” meant “dog-like,” and he had a habit of basking naked on the lawn while his fellow philosophers talked on the porch. While they debated the mysteries of the cosmos, Diogenes preferred to soak up some rays – some have called him the Jimmy Buffett of ancient Greece. Anyway, one morning, the great philosopher Plato had a stroke of insight. He caught everyone’s attention, gathered a crowd around him, and announced his deduction: “Man is defined as a hairless, featherless, two-legged animal!” Whereupon Diogenes abruptly leaped up from the lawn, dashed off to the marketplace, and burst back onto the porch carrying a plucked chicken – which he held aloft as he shouted, “Behold: I give you… Man!”

I’m sure Plato was less than thrilled at this stunt, but the story reminds us that these early philosophers were still hammering out the most basic tenets of the science we now know as taxonomy: the grouping of objects from the world into abstract categories.

This technique of chopping up reality wasn’t invented in ancient Greece, though. In fact, as a recent study shows, it’s fundamental to the way our brains work. At the most basic level, we don’t really perceive separate objects at all – we perceive our nervous systems’ responses to a boundless flow of electromagnetic waves and biochemical reactions. Our brains slot certain neural response patterns into sensory pathways we call “sight,” “smell” and so on – but abilities like synesthesia and echolocation show that even the boundaries between our senses can be blurry. © 2012 Scientific American
Keyword: Language; Attention
Link ID: 17640 - Posted: 12.27.2012
A simple eye test may offer a fast and easy way to monitor patients with multiple sclerosis (MS), medical experts say in the journal Neurology. Optical Coherence Tomography (OCT) is a scan that measures the thickness of the lining at the back of the eye - the retina. It takes a few minutes per eye and can be performed in a doctor's surgery. In a trial involving 164 people with MS, those with thinning of their retina had earlier and more active MS. The team of researchers from the Johns Hopkins University School of Medicine say larger trials with a long follow up are needed to judge how useful the test might be in everyday practice. The latest study tracked the patients' disease progression over a two-year period.

Unpredictable disease

Multiple sclerosis is an illness that affects the nerves in the brain and spinal cord causing problems with muscle movement, balance and vision. In MS, the protective sheath or layer around nerves, called myelin, comes under attack which, in turn, leaves the nerves open to damage. There are different types of MS - most people with the condition have the relapsing remitting type where the symptoms come and go over days, weeks or months. Usually after a decade or so, half of patients with this type of MS will develop secondary progressive disease where the symptoms get gradually worse and there are no or very few periods of remission. BBC © 2012
Keyword: Multiple Sclerosis; Vision
Link ID: 17639 - Posted: 12.27.2012
Scientists say they have found a way to distinguish between different types of dementia without the need for invasive tests, like a lumbar puncture. US experts could accurately identify Alzheimer's disease and another type of dementia from structural brain patterns on medical scans, Neurology reports. Currently, doctors can struggle to diagnose dementia, meaning the most appropriate treatment may be delayed. More invasive tests can help, but are unpleasant for the patient.

Despite being two distinct diseases, Alzheimer's and frontotemporal dementia share similar clinical features and symptoms and can be hard to tell apart without medical tests. Both cause the person to be confused and forgetful and can affect their personality, emotions and behaviour. Alzheimer's tends to attack the cerebral cortex - the layer of grey matter covering the brain - whereas frontotemporal dementia, as the name suggests, tends to affect the temporal and frontal lobes of the brain, which can show up on brain scans, but these are not always diagnostic. A lumbar puncture - a needle in the spine - may also be used to check protein levels in the brain, which tend to be higher in Alzheimer's than with frontotemporal dementia. BBC © 2012
Keyword: Alzheimers; Brain imaging
Link ID: 17638 - Posted: 12.27.2012
By DAVID DOBBS

Psychological trauma dims tens of millions of lives around the world and helps create costs of at least $42 billion a year in the United States alone. But what is trauma, exactly? Both culturally and medically, we have long seen it as arising from a single, identifiable disruption. You witness a shattering event, or fall victim to it — and as the poet Walter de la Mare put it, “the human brain works slowly: first the blow, hours afterward the bruise.” The world returns more or less to normal, but you do not. In 1980, the Diagnostic and Statistical Manual of Mental Disorders defined trauma as “a recognizable stressor that would evoke significant symptoms of distress in almost everyone” — universally toxic, like a poison.

But it turns out that most trauma victims — even survivors of combat, torture or concentration camps — rebound to live full, normal lives. That has given rise to a more nuanced view of trauma — less a poison than an infectious agent, a challenge that most people overcome but that may defeat those weakened by past traumas, genetics or other factors.

Now, a significant body of work suggests that even this view is too narrow — that the environment just after the event, particularly other people’s responses, may be just as crucial as the event itself. The idea was demonstrated vividly in two presentations this fall at the Interdisciplinary Conference on Culture, Mind and Brain at the University of California, Los Angeles. Each described reframing a classic model of traumatic experience — one in lab rats, the other in child soldiers. © 2012 The New York Times Company
Keyword: Stress; Learning & Memory
Link ID: 17637 - Posted: 12.27.2012
Becky Summers

Monkeys might not be known for their generosity, but when they do seem to act selflessly, a specific area in their brains keeps track of these kindnesses. The discovery of this neuronal tally chart may help scientists to understand the neural mechanisms underlying normal social behaviour in primates and humans, and might even provide insight into disorders such as autism, in which social processing is disrupted.

Steve Chang and his colleagues from Duke University in Durham, North Carolina, used electrodes to directly record neuronal activity in three areas of the brain's prefrontal cortex that are known to be involved in social decision-making, while monkeys performed reward-related tasks. When given the option either to drink juice from a tube themselves or to give the juice away to a neighbour, the test monkeys would mostly keep the drink. But when the choice was between giving the juice to the neighbour or neither monkey receiving it, the choosing monkey would frequently opt to give the drink to the other monkey.

The researchers found that in two out of the three brain areas being recorded, neurons fired in the presence or absence of the juice reward only. By contrast, the third area — known as the anterior cingulate gyrus (ACG) — responded only when the monkey allocated the juice to the neighbour and observed it being received. The authors suggest the neurons in the ACG respond to and record the act simultaneously. The study's results are published today in Nature Neuroscience. © 2012 Nature Publishing Group
Keyword: Emotions
Link ID: 17636 - Posted: 12.27.2012
By GRETCHEN REYNOLDS

Anyone whose resolve to exercise in 2013 is a bit shaky might want to consider an emerging scientific view of human evolution. It suggests that we are clever today in part because a million years ago, we could outrun and outwalk most other mammals over long distances. Our brains were shaped and sharpened by movement, the idea goes, and we continue to require regular physical activity in order for our brains to function optimally.

The role of physical endurance in shaping humankind has intrigued anthropologists and gripped the popular imagination for some time. In 2004, the evolutionary biologists Daniel E. Lieberman of Harvard and Dennis M. Bramble of the University of Utah published a seminal article in the journal Nature titled “Endurance Running and the Evolution of Homo,” in which they posited that our bipedal ancestors survived by becoming endurance athletes, able to bring down swifter prey through sheer doggedness, jogging and plodding along behind them until the animals dropped. Endurance produced meals, which provided energy for mating, which meant that adept early joggers passed along their genes.

In this way, natural selection drove early humans to become even more athletic, Dr. Lieberman and other scientists have written, their bodies developing longer legs, shorter toes, less hair and complicated inner-ear mechanisms to maintain balance and stability during upright ambulation. Movement shaped the human body. But simultaneously, in a development that until recently many scientists viewed as unrelated, humans were becoming smarter. Their brains were increasing rapidly in size. Copyright 2012 The New York Times Company
Keyword: Evolution; Learning & Memory
Link ID: 17635 - Posted: 12.27.2012
By Rachel Ehrenberg

Outfitted with a bionic eye, arm, legs and fantastic ’70s hair, Steve Austin was a cyborg whose implants allowed him to recover stolen atomic weapons, fight aliens and protect cryptographers in distress. Finally, real life is starting to catch up with the Six Million Dollar Man.

In one of this year’s bionic breakthroughs, a paralyzed woman carried out her own superhuman feat: Using an implanted brain chip, she controlled a robotic arm with her mind (SN: 6/16/12, p. 5). She used the arm to grasp a cuppa joe and take a long, satisfying sip of coffee through a straw, an act she hadn’t done on her own for nearly 15 years. “We’re entering a really exciting area where we can develop all sorts of very complicated technologies that can actually have biomedical applications and improve the quality of life for people,” says bioengineer Grégoire Courtine of the Swiss Federal Institute of Technology in Lausanne. “It’s a revolution.” After her groundbreaking sip, Cathy Hutchinson, who had been paralyzed years earlier by a stroke, smiled and then laughed. A roomful of scientists burst into applause.

This was a big year for prosthetic parts, both in and out of the lab. Athletes in London for the Paralympics and the Olympics sprinted on high-tech carbon blades and hurled javelins while balancing on the microprocessor-controlled C-Leg. People in wheelchairs used battery-powered robotic suits to keep their lower limbs in shape. A young man who lost his right leg in a motorcycle accident climbed the 103 flights of stairs in Chicago’s Willis Tower with a thought-controlled limb. That technology is still in development. But some bionic add-ons are starting to come out of the lab and into the clinic for the first time, though costs remain prohibitive for many potential users. © Society for Science & the Public 2000 - 2012
Keyword: Robotics
Link ID: 17634 - Posted: 12.27.2012
Cannabis makes pain more bearable rather than actually reducing it, a study from the University of Oxford suggests. Using brain imaging, researchers found that the psychoactive ingredient in cannabis reduced activity in a part of the brain linked to emotional aspects of pain. But the effect on the pain experienced varied greatly, they said. The researchers' findings are published in the journal Pain. The Oxford researchers recruited 12 healthy men to take part in their small study. Participants were given either a 15mg tablet of THC (delta-9-tetrahydrocannabinol) - the ingredient that is responsible for the high - or a placebo. The volunteers then had a cream rubbed into the skin of one leg to induce pain, which was either a dummy cream or a cream that contained chilli - which caused a burning and painful sensation. Each participant had four MRI scans which revealed how their brain activity changed when their perception of the pain reduced. Dr Michael Lee, lead study author from Oxford University's Centre for Functional Magnetic Resonance Imaging of the Brain, said: "We found that with THC, on average people didn't report any change in the burn, but the pain bothered them less." BBC © 2012
Keyword: Pain & Touch; Drug Abuse
Link ID: 17633 - Posted: 12.22.2012
by Karl Gruber

Cigarettes leave you with more than a smoky scent on your clothes and fingernails. A new study has found strong evidence that tobacco use can chemically modify and affect the activity of genes known to increase the risk of developing cancer. The finding may give researchers a new tool to assess cancer risk among people who smoke.

DNA isn't destiny. Chemical compounds that affect the functioning of genes can bind to our genetic material, turning certain genes on or off. These so-called epigenetic modifications can influence a variety of traits, such as obesity and sexual preference. Scientists have even identified specific epigenetic patterns on the genes of people who smoke. None of the modified genes has a direct link to cancer, however, making it unclear whether these chemical alterations increase the risk of developing the disease.

In the new study, published in Human Molecular Genetics, researchers analyzed epigenetic signatures in blood cells from 374 individuals enrolled in the European Prospective Investigation into Cancer and Nutrition. EPIC, as it's known, is a massive study aimed at linking diet, lifestyle, and environmental factors to the incidence of cancer and other chronic diseases. Half of the group consisted of people who went on to develop colon or breast cancer 5 to 7 years after first joining the study, whereas the other half remained healthy. © 2010 American Association for the Advancement of Science.
Keyword: Drug Abuse; Genes & Behavior
Link ID: 17632 - Posted: 12.22.2012
By Liz Kowalczyk

Health officials investigating the national fungal meningitis outbreak caused by tainted steroid injections had thought that the worst was over. The number of new cases was dwindling. Then came patients like Anna Adair.

An avid gardener and dog-breeder, Adair was rolled into a Michigan emergency room in a wheelchair Nov. 15. She had been bedridden for days, and that morning a bolt of pain in her lower back had caused her to tumble to the bathroom floor. Doctors quickly reached a disturbing realization: An infection caused by black mold had infiltrated her spine, near where she had received an injection made by a Massachusetts pharmacy, and spread into the bone. It was not the meningitis that sickened hundreds of others in late summer and early fall, but part of a frightening second wave of fungal infections caused by contaminated drugs. Dozens more people have now been diagnosed with excruciating abscesses or inflamed nerves in their backs that are proving formidable to cure.

In a health alert issued Thursday, the federal Centers for Disease Control and Prevention said it is worried that some patients with spinal infections may not even be aware of their condition because the symptoms mimic the very back pain they originally sought to treat with steroids. The agency is now recommending that doctors consider performing MRI scans to screen all patients who have persistent back pain and received steroids from one of three contaminated batches. Previously, it advised scanning just those with new or worsening pain. © 2012 NY Times Co.
Keyword: Pain & Touch
Link ID: 17631 - Posted: 12.22.2012
Analysis by Sheila Eldred

This spring, it's likely there will be a new diet pill on the market. Belviq (lorcaserin) won approval from the FDA last spring, making it the first weight-loss drug approved in 13 years, and the DEA proposed this week that the drug be classified as a Schedule IV controlled substance.

Belviq is an appetite suppressant. The new chemical entity works by activating the brain's response to serotonin. Serotonin is a neurotransmitter known for evoking happy moods; some anti-depressants work by keeping serotonin levels elevated. Belviq works specifically with the serotonin receptors involved with appetite, according to Time.

In trials, patients who took Belviq lost 3 to 3.7 percent more weight than those taking a placebo; after taking it for one or two years, 47 percent lost at least 5 percent of their body weight (compared to 23 percent of those who took a placebo), WebMD reports. Another new weight-loss drug, Qsymia, is already on the market, although sales have been slow.

Belviq is approved for obese people and for overweight people who have another weight-related disease or risk factor. Side effects include headache, dizziness, fatigue, nausea, dry mouth and constipation; in patients with diabetes, additional side effects include low blood sugar, back pain and coughing. © 2012 Discovery Communications, LLC.
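To make the trial percentages concrete, here is a small sketch of the arithmetic behind the "lost at least 5 percent of body weight" responder threshold. The starting weight is hypothetical; only the percentages come from the article.

```python
# Illustrative arithmetic for the weight-loss thresholds quoted above.
# The 100 kg starting weight is invented; the percentages are from the trial.

def weight_after_loss(start_kg, percent_lost):
    """Body weight (kg) remaining after losing the given percentage."""
    return start_kg * (1 - percent_lost / 100)

start = 100.0
# A "responder" in the trial lost at least 5% of body weight:
print(weight_after_loss(start, 5.0))   # weight at the responder threshold
# The average drug-vs-placebo gap was 3 to 3.7 percentage points:
print(weight_after_loss(start, 3.7))   # weight at the upper end of that gap
```

So for a hypothetical 100 kg patient, clearing the trial's responder bar means ending up at or below 95 kg, a modest but clinically meaningful change.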
Keyword: Obesity
Link ID: 17630 - Posted: 12.22.2012
Fighting may have shaped the evolution of the human hand, according to a new study by a US team. The University of Utah researchers used instruments to measure the forces and acceleration when martial artists hit a punch bag. They found that the structure of the fist provides support that increases the ability of the knuckles to transmit "punching" force. Details have been published in the Journal of Experimental Biology. "We asked the question: 'can you strike harder with a fist than with an open palm?'," co-author David Carrier told BBC News. "We were surprised because the fist strikes were not more forceful than the strikes with the palm. In terms of the work on the bag there is really no difference." Of course, the surface that strikes the target with a fist is smaller, so there is more stress from a fist strike. "The force per area is higher in a fist strike and that is what causes localised tissue damage," said Prof Carrier. BBC © 2012
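The "force per area" point can be sketched numerically: identical strike force over a smaller contact area produces higher pressure. The force and area values below are invented for illustration; the study did not publish these figures in the article.

```python
# Minimal sketch of why equal-force fist strikes stress tissue more than
# palm strikes: pressure = force / contact area.
# All numbers are hypothetical, chosen only to show the relationship.

def pressure_pa(force_n, area_m2):
    """Pressure in pascals from a force (N) spread over an area (m^2)."""
    return force_n / area_m2

strike_force = 1000.0  # newtons; the study found fist and palm comparable
palm_area = 0.010      # m^2, hypothetical open-palm contact area
fist_area = 0.004      # m^2, hypothetical knuckle contact area

print(pressure_pa(strike_force, palm_area))  # pascals, palm strike
print(pressure_pa(strike_force, fist_area))  # pascals, fist strike
```

With these made-up numbers the fist delivers 2.5 times the pressure of the palm for the same total force, which is the mechanism behind the localised tissue damage Prof Carrier describes.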
Keyword: Aggression; Evolution
Link ID: 17629 - Posted: 12.22.2012