Most Recent Links
Taking beta-blocker drugs may cut the risk of dementia, a trial in 774 men suggests. The medication is used to treat high blood pressure, a known risk factor for dementia. In the study, which will be presented at the American Academy of Neurology's annual meeting in March, men on beta-blockers were less likely to have brain changes suggestive of dementia. Experts say it is too early to recommend beta-blockers for dementia. The findings are preliminary and larger studies in men and women from different ethnicities are needed to see what benefit beta-blockers might offer. People with high blood pressure are advised to see their doctor and get their condition under control to prevent associated complications like heart disease, stroke and vascular dementia. Having high blood pressure may damage the small vessels that supply the brain with blood. Blood carries essential oxygen and nourishment to the brain and without it, brain cells can die. Vascular dementia is the second most common cause of dementia after Alzheimer's disease and can occur if blood flow to the brain is reduced. Other research in a much larger sample of men - 800,000 in all - suggests another type of blood pressure drug known as an angiotensin receptor blocker (ARB) may cut dementia risk, including Alzheimer's disease, by as much as 50%. BBC © 2013
Keyword: Alzheimers
Link ID: 17661 - Posted: 01.08.2013
By KENNETH CHANG Mosquito bite? Poison ivy? Dry skin? Fuzzy sweater? Everyone has an itch to scratch. Why we and other animals itch remains something of a mystery. But now researchers at Johns Hopkins and Yale in the United States and several universities in China have found a key piece of the puzzle, identifying sensory neurons in mice that are dedicated to relaying itchy sensations from the top layers of skin to the spinal cord. “Our study, for the first time, shows the existence of itch-specific nerves,” said Xinzhong Dong, a professor of neuroscience at the Johns Hopkins University School of Medicine and the senior author of a paper about the findings in the journal Nature Neuroscience. Scientists have debated for decades whether separate circuitry existed for itchiness or whether its signals passed through the same nerves used to transmit pain. Earlier data — suppressing pain with morphine can cause chronic itching, for example — indicated some overlap between the two sensations. But the fact that evolution also produced dedicated itch nerves in mice — and almost certainly in people as well — suggests that itching serves an important role in survival and is not just a byproduct of the pain nerves. © 2013 The New York Times Company
Keyword: Pain & Touch
Link ID: 17660 - Posted: 01.08.2013
By Cheryl Murphy Enhancing your level of vision on demand sounds like something out of a comic book. Superman, if you recall, had the power to turn his x-ray vision on and off like a light switch. So is x-ray vision possible? I’m sorry to say: no. The ability of our naked eyes to see through layers of objects remains an idea conjured up in the minds of science fiction writers. However, the possibility of training your brain to flip to a heightened level of visual discrimination and detection whenever you want may in fact be a reality. Last month, researchers in Switzerland found that participants who were trained, using real-time fMRI neurofeedback, to consciously up-regulate activity in their early visual cortex were also able to voluntarily boost their visual discrimination and detection. This study may sound like science fiction, but it is not. Here is how it was done. Sixteen young, healthy participants with normal or corrected-to-normal vision were told to focus on a central fixation light while they imagined high-resolution pictures of changing color, shape and intensity in a particular part of their visual field, which the researchers called the target region of interest. They visualized such things as writing their name in the air, a boat sailing on the ocean, patterns of spinning wheels and spirals, a model walking down the runway or their pet. To aid their brain training, they received on-the-spot visual feedback indicating how well their visualizations were boosting their brain activity. By imagining these detailed objects, seven of the sixteen participants were able to train themselves to consciously up-regulate activity in areas of their early visual cortex over the course of a series of separate training sessions. In essence, what the participants did was learn how to jump-start their visual cortex. Once their visual cortex was held at a higher state of activity, it was more sensitive and could better detect other stimuli in the target region of interest where they projected their visualizations. © 2013 Scientific American
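In engineering terms, this kind of neurofeedback is a closed loop: measure activity in the target region of interest, translate it into a feedback display, and let the participant adjust their mental imagery accordingly. Below is a minimal sketch of that loop, assuming a percent-signal-change measure, a fixed baseline window and a 0-10 "thermometer" display; these are illustrative choices, not details reported by the Swiss group.

```python
# A minimal sketch of a real-time neurofeedback computation (illustrative
# assumptions: percent signal change vs. a resting baseline, clipped to a
# 0-10 "thermometer" value shown to the participant).
import numpy as np

def feedback_level(roi_timeseries, baseline_volumes=20):
    """Return a 0-10 feedback value from an ROI signal time series."""
    baseline = np.mean(roi_timeseries[:baseline_volumes])
    percent_change = 100.0 * (roi_timeseries[-1] - baseline) / baseline
    return float(np.clip(percent_change, 0.0, 10.0))

# Simulated ROI signal: rest, then imagery blocks that raise activity.
rng = np.random.default_rng(1)
signal = np.concatenate([
    1000 + rng.normal(0, 2, 20),   # resting baseline volumes
    1030 + rng.normal(0, 2, 20),   # up-regulation during imagery
])
print(feedback_level(signal))      # e.g., ~3.0, i.e. a 3% signal increase
```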
Keyword: Vision; Learning & Memory
Link ID: 17659 - Posted: 01.08.2013
By Laura Sanders Astronauts on a months-long mission to Mars and back will have more to contend with than boredom and a lack of gourmet cuisine: Disrupted sleep may be a serious side effect of extended space flight, potentially changing crew dynamics and affecting performance on high-pressure tasks. In an epic feat of playacting, a crew of six men lived for 520 days inside a hermetically sealed 550-cubic-meter capsule in Moscow. As the grueling experiment wore on, the crew drifted into torpor, moving less and sleeping more. Four men experienced sleep problems, scientists report online January 7 in the Proceedings of the National Academy of Sciences. Developed by the Russian Academy of Sciences, the “Mars 500” project was designed to test the feasibility of sending people on a journey to Mars and back. The simulation was realistic: The chamber was sealed, mission control was on standby 24 hours a day with built-in communications delays during parts of the mission, and the crew had specific jobs to do during transit and on a simulated landing on Mars. “If we at some point really want to go to Mars and we want to send humans, then we need to know how they will cope with this long period of confinement,” says study coauthor Mathias Basner, of the University of Pennsylvania’s Perelman School of Medicine in Philadelphia. Basner’s team was one of many that conducted studies on the six men during the long simulation. © Society for Science & the Public 2000 - 2013
Keyword: Sleep
Link ID: 17658 - Posted: 01.08.2013
A strong family history of seizures could increase the chances of having severe migraines, says a study in the journal Epilepsia. Scientists from Columbia University, New York, analysed 500 families containing two or more close relatives with epilepsy. Their findings could mean that genes exist that cause both epilepsy and migraine. Epilepsy Action said it could lead to targeted treatments. Previous studies have shown that people with epilepsy are substantially more likely than the general population to have migraine headaches, but it was not clear whether that was due to a shared genetic cause. The researchers found that people with three or more close relatives with a seizure disorder were more than twice as likely to experience 'migraine with aura' as patients from families with fewer individuals with seizures. Migraine with aura is a severe headache preceded by symptoms such as seeing flashing lights, temporary visual loss, speech problems or numbness of the face. Dr Melodie Winawer, lead author of the study from Columbia University Medical Centre, said the findings had implications for epilepsy patients. "Our study demonstrates a strong genetic basis for migraine and epilepsy, because the rate of migraine is increased only in people who have close (rather than distant) relatives with epilepsy." BBC © 2013
Keyword: Epilepsy; Pain & Touch
Link ID: 17657 - Posted: 01.07.2013
By Diane Mapes The video touched millions: An 8-month-old boy smiles with unabashed adoration at his mother as he hears her voice, seemingly for the first time, thanks to a new cochlear implant. Posted on YouTube in April of 2008, the video of "Jonathan's Cochlear Implant Activation" has received more than 3.6 million hits and thousands of comments from viewers, many clamoring for an update. Five-year-old Jonathan is “doing great,” according to his parents, Brigette and Mark Breaux of Houston, Texas. "He's in kindergarten and we're working on speech," Brigette, his 35-year-old stay-at-home mom, told TODAY.com. "He can hear everything that we say to him. It's of course artificial hearing, but he can hear and understand what we're saying." After a bout with bacterial meningitis left him deaf, Jonathan Breaux regained hearing with the help of a cochlear implant and is now a happy 5-year-old. "He's a flirt," adds Mark, a 36-year-old corporate controller. "He was chasing girls around the playground when Brigette went to see him for his class party. He's a handful." © 2013 NBCNews.com
Keyword: Hearing; Robotics
Link ID: 17656 - Posted: 01.07.2013
By James Gallagher, health and science reporter, BBC News Totally blind mice have had their sight restored by injections of light-sensing cells into the eye, UK researchers report. The team in Oxford said their studies closely resemble the treatments that would be needed in people with degenerative eye disease. Similar results have already been achieved with night-blind mice. Experts said the field was advancing rapidly, but there were still questions about the quality of vision restored. Patients with retinitis pigmentosa gradually lose light-sensing cells from the retina and can become blind. The research team, at the University of Oxford, used mice with a complete lack of light-sensing photoreceptor cells in their retinas. The mice were unable to tell the difference between light and dark. They injected "precursor" cells which will develop into the building blocks of a retina once inside the eye. Two weeks after the injections a retina had formed, according to the findings presented in the Proceedings of the National Academy of Sciences journal. Prof Robert MacLaren said: "We have recreated the whole structure, basically it's the first proof that you can take a completely blind mouse, put the cells in and reconstruct the entire light-sensitive layer." BBC © 2013
Keyword: Vision; Development of the Brain
Link ID: 17655 - Posted: 01.07.2013
By DAN FROSCH ALBUQUERQUE — It has been almost four decades since Betty Jo Lopez started using heroin. Her face gray and wizened well beyond her 59 years, Ms. Lopez would almost certainly still be addicted, if not for the fact that she is locked away in jail, not to mention the cup of pinkish liquid she downs every morning. “It’s the only thing that allows me to live a normal life,” Ms. Lopez said of the concoction, which contains methadone, a drug used to treat opiate dependence. “These nurses that give it to me, they’re like my guardian angels.” For the last six years, the Metropolitan Detention Center, New Mexico’s largest jail, has been administering methadone to inmates with drug addictions, one of a small number of jails and prisons around the country that do so. At this vast complex, sprawled out among the mesas west of downtown Albuquerque, any inmate who was enrolled at a methadone clinic just before being arrested can get the drug behind bars. Pregnant inmates addicted to heroin are also eligible. Here in New Mexico, which has long been plagued by one of the nation’s worst heroin scourges, there is no shortage of participants — hundreds each year — who have gone through the program. © 2013 The New York Times Company
Keyword: Drug Abuse
Link ID: 17654 - Posted: 01.07.2013
Ed Yong For years, a particular protein has been cast as a lynchpin of long-term memory. Inhibiting this enzyme could erase old memories, whereas adding it could strengthen faded ones [1-3]. But two independent groups of US scientists have now seriously challenged the role of this 'memory molecule' by developing mice that completely lack it — and showing that these mice have no detectable memory problems. Their results are published today in Nature [4,5]. The excitement around the enzyme, called protein kinase M-ζ (PKM-ζ), started building in 2006, when Todd Sacktor at the SUNY Downstate Medical Center in New York City wiped out established spatial memories in rats. He did so by injecting their brains with ZIP, a small peptide that is meant to block the enzyme [1]. Other teams obtained similar results, erasing different types of memory by injecting ZIP into various brain regions in rodents, flies and sea slugs. And in 2011, Sacktor did the opposite: he strengthened rats' memory of unpleasant tastes by injecting their brains with viruses carrying extra copies of PKM-ζ [3]. These fascinating studies suggested that long-term memory, rather than being static and stable, is surprisingly fragile, and depends on the continuous activity of a single enzyme. Richard Huganir of Johns Hopkins University in Baltimore, Maryland, was intrigued by these results, but was concerned that much of the data depended on the actions of ZIP. He and his collaborators took a different route, by deleting two genes — one for PKM-ζ and one for a related protein called PKC-ζ — in embryonic mice [4]. Working independently, Robert Messing and colleagues at the University of California, San Francisco, created similar mice [5]. © 2013 Nature Publishing Group
Keyword: Learning & Memory
Link ID: 17653 - Posted: 01.05.2013
By Rita Levi-Montalcini and Pietro Calissano The human nervous system is a vast network of several billion neurons, or nerve cells, endowed with the remarkable ability to receive, store and transmit information. In order to communicate with one another and with non-neuronal cells, the neurons rely on the long extensions called axons, which are somewhat analogous to electrically conducting wires. Unlike wires, however, the axons are fluid-filled cylindrical structures that not only transmit electrical signals but also ferry nutrients and other essential substances to and from the cell body. Many basic questions remain to be answered about the mechanisms governing the formation of this intricate cellular network. How do the nerve cells differentiate into thousands of different types? How do their axons establish specific connections (synapses) with other neurons and non-neuronal cells? And what is the nature of the chemical messages neurons send and receive once the synaptic connections are made? This article will describe some major characteristics and effects of a protein called the nerve-growth factor (NGF), which has made it possible to induce and analyze under highly favorable conditions some crucial steps in the differentiation of neurons, such as the growth and maturation of axons and the synthesis and release of neurotransmitters: the bearers of the chemical messages. The discovery of NGF has also promoted an intensive search for other specific growth factors, leading to the isolation and characterization of a number of proteins with the ability to enhance the growth of different cell lines. © 2013 Scientific American
Keyword: Development of the Brain; Trophic Factors
Link ID: 17652 - Posted: 01.05.2013
By ANAHAD O'CONNOR A new study of driving behavior across the country found that slightly more than 4 percent of adults admit to having fallen asleep at the wheel. Certain people were particularly likely to report drowsiness while driving, including those who slept less than six hours daily and those who snored at night, a potential sign of a sleep disorder. Though only 4.2 percent of adults said they had actually fallen asleep while driving in the past 30 days, the researchers said they believed the true number was probably several times that, since people who doze or nod off for a moment at the wheel may not realize it at the time or recall it later on. Drowsy driving has a widespread impact on the nation’s highways, experts say. In 2009, an estimated 730 deadly motor vehicle accidents involved a driver who was either sleepy or dozing off, and an additional 30,000 nonfatal crashes involved a drowsy driver. Accidents involving sleepy drivers are more likely to be deadly or cause injuries, in part because people who fall asleep at the wheel either fail to hit their brakes or veer off the road before crashing. To get a sense of just how prevalent the phenomenon is, Anne G. Wheaton, an epidemiologist at the Centers for Disease Control and Prevention, led a study looking at 147,000 adults in 19 states and the District of Columbia. The subjects were asked detailed questions about their daily activities, including their driving, sleep and work habits. Dr. Wheaton and her colleagues found that men were more likely to report drowsy driving than women, and that the behavior decreased with age: about 1.7 percent of adults age 65 or older admitted to it, compared with 5 percent or more of those between 18 and 44. The findings were published in the latest issue of Morbidity and Mortality Weekly Report. Copyright 2013 The New York Times Company
Keyword: Sleep; Attention
Link ID: 17651 - Posted: 01.05.2013
By Alexandra Witze Quietly, on the top floor of a nondescript commercial building overlooking Boston Harbor, the future is being born. Rows of young scientists tap intently in front of computer monitors, their concentration unbroken even as the occasional plane from Logan Airport buzzes by. State-of-the-art lab equipment hums away in the background. This office, in Boston’s Marine Industrial Park, is what California’s Silicon Valley was four decades ago — the vanguard of an industry that will change your life. Just as researchers from Stanford provided the brains behind the semiconductor revolution, so are MIT and Harvard fueling the next big transformation. Students and faculty cross the Charles River not to build computer chips, but to re-engineer life itself. Take Reshma Shetty, one of the young minds at work in the eighth-floor biological production facility. After receiving her doctorate at MIT in 2008, she, like many new graduates, decided she wanted to make her mark on the world. She got together with four colleagues, including her Ph.D. adviser Tom Knight, to establish a company that aims “to make biology easy to engineer.” Place an order with Ginkgo BioWorks and its researchers will make an organism to do whatever you want. Need to suck carbon dioxide out of the atmosphere? They can engineer the insides of a bacterium to do just that. Want clean, biologically based fuels to replace petroleum taken from the ground? Company scientists will design a microbe to poop those out. © Society for Science & the Public 2000 - 2013
Keyword: Robotics
Link ID: 17650 - Posted: 01.05.2013
By Christie Wilcox There’s a lot to be said for smarts—at least we humans, with some of the biggest brains in relation to our bodies in the animal kingdom, certainly seem to think so. The size of animal brains is extravagantly well studied, as scientists have long sought to understand why our ancestors developed such complex and energetically costly neural circuitry. One of the most interesting evolutionary hypotheses about brain size is the Expensive Tissue Hypothesis. Back in the early 1990s, scientists were looking to explain how brain size evolves. Brains are exceedingly useful organs; more brain cells allow for more behavioral flexibility, better control of larger bodies and, of course, intelligence. But if bigger brains were always better, every animal would have them. Thus, scientists reasoned, there must be a downside. The hypothesis suggests that while brains are great and all, their extreme energetic cost limits their size and tempers their growth. When it comes to humans, for example, though our brains make up only 2% of our body mass, they take up a whopping 20% of our energy requirements. And you have to wonder: with all that energy being used by our brains, what body parts have paid the price? The hypothesis suggested our guts took the hit, but that intelligence made for more efficient foraging and hunting, thus overcoming the obstacle. This makes sense, but despite over a century of research on the evolution of brain size, there is still controversy, largely stemming from the fact that evidence for the expensive tissue hypothesis is based entirely on between-species comparisons and correlations, with no empirical tests. © 2013 Scientific American
Keyword: Evolution; Obesity
Link ID: 17649 - Posted: 01.05.2013
By Breanna Draxler Infants are known for their impressive ability to learn language, which most scientists say kicks in somewhere around the six-month mark. But a new study indicates that language recognition may begin even earlier, while the baby is still in the womb. Using a creative means of measurement, researchers found that babies could already recognize their mother tongue by the time they left their mothers’ bodies. The researchers tested American and Swedish newborns between seven hours and three days old. Each baby was given a pacifier hooked up to a computer. When the baby sucked on the pacifier, it triggered the computer to produce a vowel sound—sometimes in English and sometimes in Swedish. The vowel sound was repeated until the baby stopped sucking. When the baby resumed sucking, a new vowel sound would start. The sucking was used as a metric to determine the babies’ interest in each vowel sound. More interest meant more sucks, according to the study soon to be published in Acta Paediatrica. In both countries, babies sucked on the pacifier longer when they heard foreign vowel sounds as compared to those of their mom’s native language. The researchers suggest that this is because the babies already recognize the vowels from their mothers and were keen to learn new ones. Hearing develops in a baby’s brain at around the 30th week of pregnancy, which leaves the last 10 weeks of gestation for babies to put that newfound ability to work. Baby brains are quick to learn, so a better understanding of these mechanisms may help researchers figure out how to improve the learning process for the rest of us.
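The measurement logic described above (each sucking bout triggers a vowel, the vowel repeats until the bout ends, and total sucking time per vowel indexes interest) is simple enough to capture in a short simulation. The sketch below is hypothetical throughout: the vowel categories, bout durations and the longer-sucking-to-foreign-sounds effect are placeholder assumptions, not the researchers' actual software or data.

```python
# Hypothetical simulation of the sucking-contingent playback protocol:
# a sucking bout starts a vowel, the vowel repeats until the bout ends,
# and total sucking time per vowel class is the interest measure.
import random

def simulate_session(n_bouts=100, seed=0):
    rng = random.Random(seed)
    interest = {"native": 0.0, "foreign": 0.0}
    for _ in range(n_bouts):
        vowel_class = rng.choice(["native", "foreign"])
        # Placeholder effect: newborns suck a little longer to unfamiliar
        # vowels (mean bout lengths in seconds are invented for illustration).
        mean_bout = 4.0 if vowel_class == "foreign" else 3.0
        interest[vowel_class] += rng.expovariate(1.0 / mean_bout)
    return interest

print(simulate_session())  # longer total time = more interest in that class
```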
Keyword: Language; Development of the Brain
Link ID: 17648 - Posted: 01.05.2013
By Christof Koch Unless you have been deaf and blind to the world over the past decade, you know that functional magnetic resonance brain imaging (fMRI) can look inside the skull of volunteers lying still inside the claustrophobic, coffinlike confines of a loud, banging magnetic scanner. The technique relies on a fortuitous property of the blood supply to reveal regional activity. Active synapses and neurons consume power and therefore need more oxygen, which is delivered by the hemoglobin molecules inside the circulating red blood cells. When these molecules give off their oxygen to the surrounding tissue, they not only change color—from arterial red to venous blue—but also turn slightly magnetic. Activity in neural tissue causes an increase in the volume and flow of fresh blood. This change in the blood supply, called the hemodynamic signal, is tracked by sending radio waves into the skull and carefully listening to their return echoes. FMRI does not directly measure synaptic and neuronal activity, which occurs over the course of milliseconds; instead it uses a relatively sluggish proxy—changes in the blood supply—that rises and falls in seconds. The spatial resolution of fMRI is currently limited to a volume element (voxel) the size of a pea, encompassing about one million nerve cells. Neuroscientists routinely exploit fMRI to infer what volunteers are seeing, imagining or intending to do. It is really a primitive form of mind reading. Now a team has taken that reading to a new, startling level. A number of groups have deduced the identity of pictures viewed by volunteers while lying in the magnet scanner from the slew of maplike representations found in primary, secondary and higher-order visual cortical regions underneath the bump on the back of the head. © 2012 Scientific American
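For readers wondering what "deducing the identity of pictures" from voxel responses involves, it is essentially pattern classification: train a model on the voxel patterns evoked by known pictures, then predict the picture from a new pattern. Here is a minimal sketch using synthetic data; the trial and voxel counts, the signal strength and the use of scikit-learn's logistic regression are all assumptions for illustration, not the methods of the groups Koch describes.

```python
# Minimal decoding sketch: classify which of several pictures was viewed
# from (synthetic) voxel activity patterns. All sizes and the model choice
# are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels, n_stimuli = 200, 500, 4

labels = rng.integers(0, n_stimuli, n_trials)      # which picture was shown
patterns = rng.normal(size=(n_trials, n_voxels))   # baseline voxel noise

# Each stimulus gets a faint but reliable voxel signature, standing in for
# the stimulus-specific patterns real visual cortex produces.
signatures = rng.normal(size=(n_stimuli, n_voxels))
patterns += 0.3 * signatures[labels]

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, patterns, labels, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = {1 / n_stimuli:.2f})")
```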
Keyword: Vision; Consciousness
Link ID: 17647 - Posted: 01.01.2013
By SINDYA N. BHANOO The human brain responds to music in different ways, depending on the listener’s emotional reaction, among other things. Now researchers report that the same holds true for birds listening to birdsong. “The same regions that respond to music in humans, at least the areas that can also be found in the bird brain, responded to song in our sparrows,” said an author of the new report, Donna Maney, a neuroscientist at Emory University. Primed with estrogen to simulate their state during breeding, female white-throated sparrows responded to the songs of male sparrows in the same way as humans listening to pleasant music, she said. Females in a nonbreeding state responded no differently to birdsong than to generic tones of the same frequencies. “So during breeding season, birdsong is received differently by females,” Dr. Maney said. Moreover, male birds treated with testosterone showed a response in the amygdala, the brain’s emotional center, when they heard other males singing. The response is akin to the reaction humans have when they hear the sort of music used in a scary movie scene. “If you’re a male and you hear the song, it means that you’re invading territory or being invaded,” Dr. Maney said. “It’s an aggressive signal.” © 2012 The New York Times Company
Keyword: Emotions; Hearing
Link ID: 17646 - Posted: 01.01.2013
Barry Gordon The intuitive notion of a “photographic” memory is that it is just like a photograph: you can retrieve it from your memory at will and examine it in detail, zooming in on different parts. But a true photographic memory in this sense has never been proved to exist. Most of us do have a kind of photographic memory, in that most people's memory for visual material is much better and more detailed than our recall of most other kinds of material. For instance, most of us remember a face much more easily than the name associated with that face. But this isn't really a photographic memory; it just shows us the normal difference between types of memory. Even visual memories that seem to approach the photographic ideal are far from truly photographic. These memories seem to result from a combination of innate abilities, combined with zealous study and familiarity with the material, such as the Bible or fine art. Sorry to disappoint further, but even an amazing memory in one domain, such as vision, is not a guarantee of great memory across the board. That must be rare, if it occurs at all. A winner of the memory Olympics, for instance, still had to keep sticky notes on the refrigerator to remember what she had to do during the day. So how does an exceptional, perhaps photographic, memory come to be? It depends on a slew of factors, including our genetics, brain development and experiences. It is difficult to disentangle memory abilities that appear early from those cultivated through interest and training. Most people who have exhibited truly extraordinary memories in some domain have seemed to possess them all their lives and honed them further through practice. © 2012 Scientific American
Keyword: Learning & Memory
Link ID: 17645 - Posted: 01.01.2013
By John Horgan We’re approaching the end of one year and the beginning of another, when people resolve to quit smoking, swill less booze, gobble less ice cream, jog every day, or every other day, work harder, or less hard, be nicer to kids, spouses, ex-spouses, co-workers, read more books, watch less TV, except Homeland, which is awesome. In other words, it’s a time when people seek to alter their life trajectories by exercising their free will. Some mean-spirited materialists deny that free will exists, and this specious claim—not mere physiological processes in my brain—motivates me to reprint a defense of free will that I wrote for The New York Times 10 years ago: When I woke this morning, I stared at the ceiling above my bed and wondered: To what extent will my rising really be an exercise of my free will? Let’s say I got up right . . . now. Would my subjective decision be the cause? Or would computations unfolding in a subconscious neural netherworld actually set off the muscular twitches that slide me out of the bed, quietly, so as not to wake my wife (not a morning person), and propel me toward the door? One of the risks of science journalism is that occasionally you encounter research that threatens something you cherish. Free will is something I cherish. I can live with the idea of science killing off God. But free will? That’s going too far. And yet a couple of books I’ve been reading lately have left me brooding over the possibility that free will is as much a myth as divine justice. © 2012 Scientific American
Keyword: Consciousness; Attention
Link ID: 17644 - Posted: 12.29.2012
By Lisa Raffensperger Among the many unpleasant side effects of chemotherapy treatment, researchers have just confirmed another: chemo brain. The term refers to the mental fog that chemotherapy patients report feeling during and after treatment. According to Jame Abraham, a professor at West Virginia University, about a quarter of patients undergoing chemotherapy have trouble focusing, processing numbers, and using short-term memory. A recent study points to the cause. The study relied on PET (positron emission tomography) brain scanning to examine brain blood flow, a marker for brain activity. Abraham and colleagues scanned the brains of 128 breast cancer patients before chemotherapy began and then 6 months later. The results showed a significant decrease in activity in regions responsible for memory, attention, planning and prioritizing. The findings aren’t immediately useful for treating or preventing the condition of chemo brain, but the hard and fast evidence may comfort those experiencing chemo-related forgetfulness. And luckily chemo brain is almost always temporary: patients’ mental processing generally returns to normal within a year or two after chemotherapy treatment ends.
Keyword: Neurotoxins; Brain imaging
Link ID: 17643 - Posted: 12.29.2012
By Susan Lunn, CBC News It's the time of year when people take stock of the past 12 months, and make resolutions for the New Year. That's kind of what Svante Paabo is doing — but the Swedish archeological geneticist is looking over a time span of 30,000 years. He's almost finished mapping the DNA of Neanderthal man, a distant cousin of modern humans. Paabo has found that many people today share about 3 to 5 per cent of their DNA with Neanderthals. Paabo says it's important to learn more about our caveman cousins' DNA to reveal the differences between us and them, differences that have seen modern humans survive and thrive over the millennia, while Neanderthals have become extinct. "I really hope that over the next 10 years we will understand much more of those things that set us apart. Which changes in our genome made human culture and technology possible? And allowed us to expand and become 7, 8, 9 billion people and spread all over the world?" he asked at a recent genetics conference in Ottawa. The room was packed with people from across North America who wanted to hear Paabo speak. He's recognized as the inspiration for Michael Crichton's Jurassic Park. © CBC 2012
Keyword: Genes & Behavior; Evolution
Link ID: 17642 - Posted: 12.29.2012