Most Recent Links



Links 11581 - 11600 of 28886

By Partha Mitra The Sherlock Holmes novel The Hound of the Baskervilles features the great Grimpen Mire, a treacherous marsh in Dartmoor, England. Holmes’ antagonist, the naturalist Stapleton, knows where the few secure footholds are, allowing him to cross the mire and reach the hills with rare plants and butterflies, but he warns Dr. Watson that a false step can be fatal, the bog inexorably consuming the unsuspecting traveller. Trying to unravel the complexities of the brain is a bit like crossing the great Grimpen Mire: one needs to know where the secure stepping-stones are, and a false step can mean sinking into a morass. As we enter the era of Big Brain Science projects, it is important to know where the next firm foothold is. As a goal worthy of a multi-billion dollar brain project, we have now been offered a motto that is nearly as rousing as “climb every mountain”: “record every action potential from every neuron.” According to recent reporting in the New York Times, this goal, proclaimed in a paper published in 2012, will be the basis of a decade-long “Brain Activity Map” project. Not content with a goal as lofty as this in worms, flies and mice, the press reports imply (and the authors also speculate) that these technologies will be used for comprehensive spike recordings in the human brain, generating a “Brain Activity Map” that will provide the answers to Alzheimer’s and schizophrenia and lead us out of the “impenetrable jungles of the brain” in which hapless neuroscientists have wandered for the past century. Neuroscience is most certainly in need of integration, and brain research will without doubt benefit from the communal excitement and scaled-up funding associated with a Big Brain Initiative. However, success will depend on setting the right goals and guarding against irrational exuberance. Successful big science projects are engineering projects with clear, technically feasible goals: setting a human on the moon, sequencing the Human Genome, finding the Higgs Boson. The technologies proposed in the paper under discussion may or may not be feasible in a given species (they will not be feasible in the normal human brain, since the methods involved are invasive and require that the skull be surgically opened). However, technology development is notoriously difficult to predict, and may carry unforeseen benefits. What we really need to understand is whether the overall goal is meaningful. © 2013 Scientific American

Keyword: Brain imaging
Link ID: 17879 - Posted: 03.09.2013

By JAMES GORMAN Nothing kicks the brain into gear like a jolt of caffeine. For bees, that is. And they don’t need to stand in line for a triple soy latte. A new study shows that the naturally caffeine-laced nectar of some plants enhances the learning process for bees, so that they are more likely to return to those flowers. “The plant is using this as a drug to change a pollinator’s behavior for its own benefit,” said Geraldine Wright, a honeybee brain specialist at Newcastle University in England, who, with her colleagues, reported those findings in Science on Thursday. The research, other scientists said, not only casts a new light on the ancient evolutionary interaction between plants and pollinators, but is an intriguing confirmation of deep similarities in brain chemistry across the animal kingdom. Plants are known to go to great lengths to attract pollinators. They produce all sorts of chemicals that affect animal behavior: sugar in nectar, memorable fragrances, even substances in fruit that can act like laxatives in the service of quick seed dispersal. Lars Chittka, who studies bee behavior at Queen Mary, University of London, and wrote a commentary on the research in the same issue of Science, said that in the marketplace of plants seeking pollinators, the plants “want their customers to remain faithful,” thus the sugary nectar and distinctive scents. © 2013 The New York Times Company

Keyword: Drug Abuse; Evolution
Link ID: 17878 - Posted: 03.09.2013

by Andy Coghlan Stimulating the brain with electrical signals can sharpen some of your faculties, but now it seems it can dim others at the same time. Transcranial electrical stimulation (TES), delivered by electrodes on the surface of the head, has been shown to double people's speed of learning. Now the first evidence has emerged that improvements in one aspect of learning might come at the expense of other abilities. Roi Cohen Kadosh of the University of Oxford showed volunteers pairs of unfamiliar symbols. Each symbol had a secret numerical value, and the volunteers' task was to state – as quickly as possible while avoiding mistakes – which symbol in a pair had the bigger value. The correct answer was then displayed. Over six sessions in one week, it was possible to measure how quickly and efficiently the volunteers learned the value of each symbol. In a second task, participants had to register which of each pair of symbols was physically larger, a measure of automatic thinking. "Automaticity is the skill of doing things without thinking about them, such as reading, driving or mounting stairs," says Cohen Kadosh, who conducted the experiment with Teresa Iucalano of the Stanford Cognitive and Systems Neuroscience Laboratory in Palo Alto, California. During the experiments, volunteers received TES to their posterior parietal cortex – vital for numerical learning – or their dorsolateral prefrontal cortex – vital for automaticity. Some unknowingly received a sham treatment. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 17877 - Posted: 03.09.2013

by Trevor Quirk Many smartphones claim to filter out background noise, but they've got nothing on the human brain. We can tune in to just one speaker at a noisy cocktail party with little difficulty—an ability that has been a scientific mystery since the early 1950s. Now, researchers argue that the competing noise of other partygoers is filtered out in the brain before it reaches regions involved in higher cognitive functions, such as language and attention control. Their experiments were the first to demonstrate this process. The scientists didn't do anything as social as attend a noisy party. Instead, Charles Schroeder, a psychiatrist at the Columbia University College of Physicians and Surgeons in New York City, and colleagues recorded the brain activity of six people with intractable epilepsy who required brain surgery. In order to identify the part of their brains responsible for seizures, the patients underwent 1 to 4 weeks of observation through electrocorticography (ECoG), a technique that provides precise neural recordings via electrodes placed directly on the surface of the brain. Schroeder and his team, using the ECoG data, conducted their experiments during this time. The researchers showed the patients two videos simultaneously, each of a person telling a 9- to 12-second story; they were asked to concentrate on just one speaker. To determine which neural recordings corresponded to the "ignored" and "attended" speech, the team reconstructed speech patterns from the brain's electrical activity using a mathematical model. The scientists then matched the reconstructed patterns with the original patterns coming from the ignored and attended speakers. © 2010 American Association for the Advancement of Science.

Keyword: Attention; Hearing
Link ID: 17876 - Posted: 03.07.2013

Meredith Wadman Ron Kalil, a neuroscientist at the University of Wisconsin–Madison, didn’t expect to see his son among the 28,500 attendees at the meeting of the Society for Neuroscience in New Orleans last October. And he wondered why Tom Kalil, deputy director for policy at the White House’s Office of Science and Technology Policy (OSTP), was accompanied by Miyoung Chun, vice-president of science programmes at the Kavli Foundation in Oxnard, California. Tom Kalil told his father that the Kavli Foundation had wanted his help in bringing nanoscientists together behind an ambitious idea. Ron Kalil says he thought: “Why are you talking about it at a neuroscience meeting?” He understands now. These two people, neither of them a working scientist, had been quietly pushing into existence the Brain Activity Map (BAM), the largest and most ambitious effort in fundamental biology since the Human Genome Project — and one that would need advances in both nanoscience and neuroscience to achieve its goals. This is the kind of science — big and bold — that politicians like. President Barack Obama praised brain mapping in his State of the Union address on 12 February. Soon after, Francis Collins, director of the US National Institutes of Health (NIH) in Bethesda, Maryland, which will be the lead agency on the project, talked up the idea in a television appearance. The Obama administration is expected to provide more details about the initiative this month, possibly in conjunction with the release of the federal 2014 budget request. But already, some scientists are wondering whether the project, a concept less than two years old and still evolving, can win new funding from Congress, or whether it would crowd out projects pitched by individual scientists. “Creative science is bottom-up, not top-down,” says Cori Bargmann, a neurobiologist at the Rockefeller University in New York. “Are we talking about central planning inside the Beltway?” © 2013 Nature Publishing Group

Keyword: Brain imaging
Link ID: 17875 - Posted: 03.07.2013

By Scicurious I heard the rumblings on Twitter, and then on the blogs. It was telepathy. No, it wasn’t telepathy, but it was close. It was like the Borg. No it wasn’t. It was a mind meld! Ok, maybe. So what was it? It was one rat learning to do something, while electrodes recorded his every move. In the meantime, on another continent, another rat received the signals into his own brain…and changed his behavior. Telepathy? No. A good solid proof of concept? I’m not sure. An interesting idea? Absolutely. So I wanted to look at this paper in depth. We know already that some other experts weren’t really thrilled with the results. But I’m going to look at WHY, and what a more convincing experiment might look like. So what actually happened here? Each experiment involved two sets of rats. First, you have your “encoder rats”. These rats were water-deprived (not terribly, just thirsty), and trained to press a lever for a water reward (water deprivation is one training technique for lever pressing, and is one of the fastest. But you can also food-deprive and train for food, or just train the animal using something tasty, like Crisco or sweetened milk). The rats were trained until they were 95% accurate at the task. They were then implanted with electrodes in the motor cortex, which recorded the firing of the neurons as the rats pressed the left or right lever. © 2013 Scientific American

Keyword: Robotics
Link ID: 17874 - Posted: 03.07.2013

Surgically implanting pacemaker-like devices into the brains of people with severe anorexia might help improve their symptoms, a small Canadian study suggests. Anorexia affects an estimated 15,000 to 20,000 people in Canada, mainly young women who face a high risk of premature death. The mortality rate is between six and 11 per cent. About 60 to 70 per cent of people with anorexia recover fully with traditional treatments, said Dr. Blake Woodside, medical director of the eating disorders program at Toronto General Hospital. But in Wednesday's online issue of the medical journal The Lancet, Woodside and his co-authors describe using deep brain stimulation to treat six women with severe anorexia that did not respond to treatment. The treatment involves surgery to implant the electrical stimulators. It's considered minimally invasive and the stimulation can be turned off. In the pilot study, the average age of the women at diagnosis was 20 and they ranged in age from 24 to 57 when the surgery was performed. Five had a history of repeated hospital admissions for eating disorders. While the study was meant to test the safety of the procedure, not its effectiveness, Woodside's team found three of the six patients achieved and maintained a body mass index greater than their historical level. © CBC 2013

Keyword: Anorexia & Bulimia
Link ID: 17873 - Posted: 03.07.2013

By Meghan Rosen Zombies aren’t the only things that feast on brains. Immune cells called microglia gorge on neural stem cells in developing rat and monkey brains, researchers report in the March 6 Journal of Neuroscience. Chewing up neuron-spawning stem cells could help control brain size by pruning away excess growth. Scientists have previously linked abnormal human brain size to autism and schizophrenia. “It shows microglia are very important in the developing brain,” says neuroscientist Joseph Mathew Antony of the University of Toronto, who was not involved in the research. Scientists have long known that in adult brains, microglia hunt for injured cells as well as pathogens. “They mop up all the dead and dying cells,” Antony says. And when the scavengers find a dangerous intruder, they pounce. “These guys are relentless,” says study coauthor Stephen Noctor, of the University of California, Davis MIND Institute in Sacramento. “They seek and destroy bacteria — it’s really quite amazing.” Microglia also lurk in embryonic brains, but the immune cells’ role there is less well understood. Previous studies had found microglia near neural stem cells — tiny factories that pump out new neurons. When Noctor’s team examined slices of embryonic human, monkey and rodent brains, he was struck by just how many microglia crowded around the stem cells and how closely the two cell types touched. © Society for Science & the Public 2000 - 2013

Keyword: Glia; Neuroimmunology
Link ID: 17872 - Posted: 03.07.2013

By Helen Shen When does a monkey turn down a free treat? When it is offered by a selfish person, apparently. Given the choice between accepting goodies from helpful, neutral or unhelpful people, capuchin monkeys (Cebus apella) tend to avoid individuals who refuse aid to others, according to a study published today in Nature Communications. “Humans can build up an impression about somebody just based on what we see,” says author James Anderson, a comparative psychologist at the University of Stirling, UK. The capuchin results suggest that this skill “probably extends to other species”, he says. Anderson chose to study capuchins because of their highly social and cooperative instincts. Monkeys in the study watched as a person either agreed or refused to help another person to open a jar containing a toy. Afterwards, both people offered a food pellet to the animal. The monkey was allowed to accept food from only one. When help was given, the capuchins showed little preference between the person requesting help and the one providing aid. But when help was denied, the seven monkeys tended to accept food less often from the unhelpful person than from the requester. To try to understand the monkeys’ motivations, Anderson and his team tested different scenarios. The animals showed no bias against people who failed to help because they were busy opening their own jar. But they tended to avoid people who were available to help but did not do so. © 2013 Scientific American

Keyword: Emotions; Aggression
Link ID: 17871 - Posted: 03.07.2013

by Elizabeth Norton The prospect of undergoing surgery while not fully "under" may sound like the stuff of horror movies. But one patient in a thousand remembers moments of awareness while under general anesthesia, physicians estimate. The memories are sometimes neutral images or sounds of the operating room, but occasionally patients report being fully aware of pain, terror, and immobility. Though surgeons scrupulously monitor vital signs such as pulse and blood pressure, anesthesiologists have no clear signal of whether the patient is conscious. But a new study finds that the brain may produce an early-warning signal that consciousness is returning—one that's detectable by electroencephalography (EEG), the recording of neural activity via electrodes on the skull. "We've known since the 1930s that brain activity changes dramatically with increasing doses of anesthetic," says the study's corresponding author, anesthesiologist Patrick Purdon of Massachusetts General Hospital in Boston. "But monitoring a patient's brain with EEG has never become routine practice." Beginning in the 1990s, some anesthesiologists began using an approach called the bispectral (BIS) index, in which readings from a single electrode are connected to a device that calculates, and displays, a single number indicating where the patient's brain activity falls on a scale of 100 (fully conscious) to zero (a "flatline" EEG). Anything between 40 and 60 is considered the target range for unconsciousness. But this index and other similar ones are only indirect measurements, Purdon explains. In 2011, a team led by anesthesiologist Michael Avidan at the Washington University School of Medicine in St. Louis, Missouri, found that monitoring with the BIS index was slightly less successful at preventing awareness during surgery than the nonbrain-based method of measuring exhaled anesthesia in the patient's breath. Of the 2861 patients monitored with the BIS index, seven had memories of the surgery, whereas only two of 2852 patients whose breath was analyzed remembered anything. © 2010 American Association for the Advancement of Science.

Keyword: Sleep; Consciousness
Link ID: 17870 - Posted: 03.05.2013

By David Brown, Hey, you, yawning in your cubicle at 2 in the afternoon. Your genes feel it, too. A new study, paid for by the U.S. Air Force but relevant for anyone with a small child, a large prostate or a lot on the mind, is helping illuminate what’s happening at the genetic level when we don’t get enough sleep. It turns out that chronic sleep deprivation — in this experiment, less than six hours a night for a week — changes the activity of about 700 genes, which is roughly 3 percent of all we carry. About one-third of the affected genes are ramped up when we go with insufficient sleep night after night. The other two-thirds are partially suppressed. Hundreds of “circadian genes” whose activity rises and falls each day abruptly lose their rhythm. Among the genes disturbed by sleep deprivation are ones involved in metabolism, immunity, inflammation, hormone response, the expression of other genes and the organization of material called chromatin on chromosomes. These changes may help explain how inadequate sleep alters attention and thinking and raises the risk for illnesses such as diabetes and coronary heart disease. “The findings will identify some of the pathways linking insufficient sleep and negative health outcomes,” said Derk-Jan Dijk, a physiologist at the University of Surrey in England, who led the study. “But how these things ultimately lead to obesity or diabetes is an unanswered question at this moment in time.” © 1996-2013 The Washington Post

Keyword: Sleep; Genes & Behavior
Link ID: 17869 - Posted: 03.05.2013

Steve Connor Physical exhaustion can occur when the brain – as well as the muscles – grows tired, according to a study that sheds fresh light on the role played by the mind in determining endurance levels. Scientists have found that a key neurotransmitter in the brain, which controls signalling between nerve cells, can determine whether someone feels exhausted following physical exercise or after taking anti-depressant drugs such as Prozac. Although levels of serotonin rise during exercise, which provides a psychological boost and “feel-good” factor, it can also result in a widespread central fatigue that ultimately leads to someone feeling exhausted and unable to carry on, scientists found. Researchers led by Professor Jean-Francois Perrier of the University of Copenhagen found that while serotonin helps to keep people going during the early stage of vigorous exercise, a build-up of the neurotransmitter in the brain can have the opposite effect by causing “central fatigue” of the nervous system even when the muscles are still able to carry on. “We can now see it is actually a surplus of serotonin that triggers a braking mechanism in the brain. In other words, serotonin functions as an accelerator but also as a brake when the strain becomes excessive,” said Professor Perrier, whose study is published in the Proceedings of the National Academy of Sciences. © independent.co.uk

Keyword: Sleep
Link ID: 17868 - Posted: 03.05.2013

By George Johnson In the week since I wrote about Oliver Sacks and the idiot savant twins, I’ve been catching up with Season 2 of “Touch,” the TV series about an autistic boy named Jake who has an inexplicable ability to commune with a secret world of numbers — a buried skein of mathematics in which the Golden Mean, the Fibonacci sequence, the genetic code, and the Kabbalah are all mysteriously connected. Jungian synchronicity, quantum entanglement, chaos theory — all turn out to be manifestations of an underlying order in which everything that perplexes us ultimately makes sense. It is the dream of both mystics and scientists, and I had wondered shortly after the show first began how the conceit was going to be sustained through more than a few episodes. The connecting thread has turned out to be a conspiracy by a shadowy corporation called AsterCorp — as secretive and powerful as Massive Dynamic, purveyors of the mind-enhancing medicine Cortexiphan in “Fringe” — to kidnap Jake and others like him in their attempt to control the world. Or the universe. It is too soon to tell. Dr. Sacks’s twins, with their power to see, hear, smell — somehow sense within minutes whether a number was prime — would also have been on AsterCorp’s wish list. Something keeps pulling me back to Sacks’s story. That is how enchanting a writer he is. (His memoir, Uncle Tungsten, is my favorite of his books.) There are plenty of accounts in the psychiatric literature of amazing human calculators and mnemonists. Sacks describes some famous cases in his essay. But what he thought he saw in the twins went far beyond that. Somehow, as Sacks described it, they could recognize that a number is prime in the way that one might recognize a face. Something on the surface of 3334401341 told them it was prime while 3334401343 was not.

Keyword: Attention
Link ID: 17867 - Posted: 03.05.2013

By GINA KOLATA The psychiatric illnesses seem very different — schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Yet they share several genetic glitches that can nudge the brain along a path to mental illness, researchers report. Which disease, if any, develops is thought to depend on other genetic or environmental factors. Their study, published online Wednesday in the Lancet, was based on an examination of genetic data from more than 60,000 people worldwide. Its authors say it is the largest genetic study yet of psychiatric disorders. The findings strengthen an emerging view of mental illness that aims to make diagnoses based on the genetic aberrations underlying diseases instead of on the disease symptoms. Two of the aberrations discovered in the new study were in genes used in a major signaling system in the brain, giving clues to processes that might go awry and suggestions of how to treat the diseases. “What we identified here is probably just the tip of an iceberg,” said Dr. Jordan Smoller, lead author of the paper and a professor of psychiatry at Harvard Medical School and Massachusetts General Hospital. “As these studies grow we expect to find additional genes that might overlap.” The new study does not mean that the genetics of psychiatric disorders are simple. Researchers say there seem to be hundreds of genes involved and the gene variations discovered in the new study confer only a small risk of psychiatric disease. Steven McCarroll, director of genetics for the Stanley Center for Psychiatric Research at the Broad Institute of Harvard and M.I.T., said it was significant that the researchers had found common genetic factors that pointed to a specific signaling system. © 2013 The New York Times Company

Keyword: Schizophrenia; Genes & Behavior
Link ID: 17866 - Posted: 03.04.2013

by Sheila M. Eldred Picture someone with attention deficit hyperactivity disorder, or ADHD, and you probably conjure up an image of an elementary school-age boy. But an analysis of data from the first large, population-based study to follow kids through to adulthood shows that the neurobehavioral disorder rarely goes away with age. Indeed, as ADHD patients make the transition to adulthood, the issues they face often multiply: they are more likely to have other psychiatric disorders and even commit suicide, reports a new study published online today in Pediatrics. In fact, researchers found that only 37.5 percent of the adults who had been diagnosed with the disorder as a child were free of other psychiatric disorders, including alcohol and drug dependence, in their late 20s. Very few of the children with ADHD were still being treated as adults -- although neuropsychiatric interviews confirmed that 29 percent still had it. “I think there has been a view that ADHD is a childhood disorder, and it’s only relatively recently that people have been trained to detect it in adults,” said Nathan Blum, a developmental-behavioral pediatrician at Children’s Hospital in Philadelphia, who was not involved in the study. Among the adults who’d had ADHD as a child, 57 percent had at least one other psychiatric disorder, compared with 35 percent of the controls. Just under 2 percent had died; of the seven deaths, three were suicides. Of the controls, less than 1 percent had died. Of those 37 deaths, five were from suicide. And 2.7 percent were incarcerated at the time of recruitment for the study. © 2013 Discovery Communications, LLC.

Keyword: ADHD
Link ID: 17865 - Posted: 03.04.2013

by Emily Underwood No single cause has yet been discovered for schizophrenia, the devastating neuropsychiatric syndrome characterized by hallucinations, disordered thoughts, and other cognitive and emotional problems, typically beginning in early adulthood. Although schizophrenia runs in families, in many cases no genetic risk is apparent, leading many researchers to look for environmental explanations. Now, research in mice provides support for a long-held hypothesis: that the syndrome, and other neurological disorders, can emerge when multiple environmental insults such as prenatal infection and adolescent trauma combine. Environmental stressors such as infection and abuse were long ago shown to be risk factors for schizophrenia. Large studies have shown, for example, that children whose mothers were infected with influenza during the last months of pregnancy have a roughly twofold increase in risk of developing the syndrome compared with the general population. That doesn't explain why a few people who are exposed to an infection in the womb go on to develop schizophrenia while most don't, however, says Urs Meyer, a behavioral neurobiologist at the Swiss Federal Institute of Technology in Zurich and co-author of the study reported online today in Science. One long-held hypothesis, he says, is that early infection creates a latent vulnerability to schizophrenia that is only "unmasked" by later insults, such as physical injury or psychological trauma. Such stressors are thought to be particularly damaging during critical periods of brain development such as early puberty, he says. Although the "multiple-hit" hypothesis has been prominent in the literature for some time, it is difficult to test the idea with human epidemiology studies, he says. "You need huge, huge data sets to see anything." © 2010 American Association for the Advancement of Science.

Keyword: Schizophrenia; Stress
Link ID: 17864 - Posted: 03.02.2013

By Daisy Yuhas It's news chocolate lovers have been craving: raw cocoa may be packed with brain-boosting compounds. Researchers at the University of L'Aquila in Italy, with scientists from Mars, Inc., and their colleagues published findings last September that suggest cognitive function in the elderly is improved by ingesting high levels of natural compounds found in cocoa called flavanols. The study included 90 individuals with mild cognitive impairment, a precursor to Alzheimer's disease. Subjects who drank a cocoa beverage containing either moderate or high levels of flavanols daily for eight weeks demonstrated greater cognitive function, on three separate tests measuring verbal fluency, visual searching and attention, than those who consumed low levels of flavanols. Exactly how cocoa causes these changes is still unknown, but emerging research points to one flavanol in particular: (-)-epicatechin, pronounced “minus epicatechin.” Its name signifies its structure, differentiating it from other catechins, organic compounds highly abundant in cocoa and present in apples, wine and tea. Other studies suggest that the compound supports increased circulation and the growth of blood vessels, which could explain improvements in cognition, because better blood flow would bring the brain more oxygen and improve its function. Animal research has already demonstrated how pure (-)-epicatechin enhances memory. Findings published last October in the Journal of Experimental Biology note that snails can remember a trained task—such as holding their breath in deoxygenated water—for more than a day when given (-)-epicatechin but for less than three hours without the flavanol. Salk Institute neuroscientist Fred Gage and his colleagues found previously that (-)-epicatechin improves spatial memory and increases vasculature in mice. “It's amazing that a single dietary change could have such profound effects on behavior,” Gage says. If further research confirms the compound's cognitive effects, flavanol supplements—or raw cocoa beans—could be just what the doctor ordered. © 2013 Scientific American

Keyword: Attention; Obesity
Link ID: 17863 - Posted: 03.02.2013

Critics are sceptical about the predicted organic computer. Ed Yong The brains of two rats on different continents have been made to act in tandem. When the first, in Brazil, uses its whiskers to choose between two stimuli, an implant records its brain activity and signals to a similar device in the brain of a rat in the United States. The US rat then usually makes the same choice on the same task. Miguel Nicolelis, a neuroscientist at Duke University in Durham, North Carolina, says that this system allows one rat to use the senses of another, incorporating information from its far-away partner into its own representation of the world. “It’s not telepathy. It’s not the Borg,” he says. “But we created a new central nervous system made of two brains.” Nicolelis says that the work, published today in Scientific Reports, is the first step towards constructing an organic computer that uses networks of linked animal brains to solve tasks. But other scientists who work on neural implants are sceptical. Lee Miller, a physiologist at Northwestern University in Evanston, Illinois, says that Nicolelis’s team has made many important contributions to neural interfaces, but the current paper could be mistaken for a “poor Hollywood science-fiction script”. He adds, “It is not clear to what end the effort is really being made.” In earlier work, Nicolelis’s team developed implants that can send and receive signals from the brain, allowing monkeys to control robotic or virtual arms and get a sense of touch in return. This time, Nicolelis wanted to see whether he could use these implants to couple the brains of two separate animals. © 2013 Nature Publishing Group

Keyword: Robotics
Link ID: 17862 - Posted: 03.02.2013

by Lizzie Wade With its complex interweaving of symbols, structure, and meaning, human language stands apart from other forms of animal communication. But where did it come from? A new paper suggests that researchers look to bird songs and monkey calls to understand how human language might have evolved from simpler, preexisting abilities. One reason that human language is so unique is that it has two layers, says Shigeru Miyagawa, a linguist at the Massachusetts Institute of Technology (MIT) in Cambridge. First, there are the words we use, which Miyagawa calls the lexical structure. "Mango," "Amanda," and "eat" are all components of the lexical structure. The rules governing how we put those words together make up the second layer, which Miyagawa calls the expression structure. Take these three sentences: "Amanda eats the mango," "Eat the mango, Amanda," and "Did Amanda eat the mango?" Their lexical structure—the words they use—is essentially identical. What gives the sentences different meanings is the variation in their expression structure, or the different ways those words fit together. The more Miyagawa studied the distinction between lexical structure and expression structure, "the more I started to think, 'Gee, these two systems are really fundamentally different,' " he says. "They almost seem like two different systems that just happen to be put together," perhaps through evolution. One preliminary test of his hypothesis, Miyagawa knew, would be to show that the two systems exist separately in nature. So he started studying the many ways that animals communicate, looking for examples of lexical or expressive structures. © 2010 American Association for the Advancement of Science.

Keyword: Language; Evolution
Link ID: 17861 - Posted: 03.02.2013

By Tina Hesman Saey If someone shouts “look behind you,” tadpoles in Michael Levin’s laboratory may be ready. The tadpoles can see out of eyes growing from their tails, even though the organs aren’t directly wired to the animals’ brains, Levin and Douglas Blackiston, both of Tufts University in Medford, Mass., report online February 27 in the Journal of Experimental Biology. Levin and Blackiston’s findings may help scientists better understand how the brain and body communicate, including in humans, and could be important for regenerative medicine or designing prosthetic devices to replace missing body parts, says Günther Zupanc, a neuroscientist at Northeastern University in Boston. Researchers have transplanted frog eyes to other body parts for decades, but until now, no one had shown that those oddly placed eyes (called “ectopic” eyes) actually worked. Ectopic eyes on tadpoles’ tails allow the animals to distinguish blue light from red light, the Tufts team found. Levin wanted to know whether the brain is hardwired to get visual information only from eyes in the head, or whether the brain could use data coming from elsewhere. To find out, he and Blackiston started with African clawed frog tadpoles (Xenopus laevis) and removed the normal eyes. They then transplanted cells that would grow into eyes onto the animals’ tails. The experiment seemed like a natural to test how well the brain can adapt, Levin says. “There’s no way the tadpole’s brain is expecting an eye on its tail.” Expected or not, some of the tadpoles managed to detect red and blue light from their tail eyes. The researchers placed tadpoles with transplanted eyes in chambers in which half of the chamber was illuminated in blue light and the other half in red light. A mild electric shock zapped the tadpole when it was in one half of the dish so that the animal learned to associate the color with the shock. The researchers periodically switched the colors in the chamber so that the tadpoles didn’t learn that staying still would save them. © Society for Science & the Public 2000 - 2013

Keyword: Vision; Evolution
Link ID: 17860 - Posted: 03.02.2013