Most Recent Links
Carl Zimmer A novelist scrawling away in a notebook in seclusion may not seem to have much in common with an NBA player doing a reverse layup on a basketball court before a screaming crowd. But if you could peer inside their heads, you might see some striking similarities in how their brains were churning. That’s one of the implications of new research on the neuroscience of creative writing. For the first time, neuroscientists have used fMRI scanners to track the brain activity of both experienced and novice writers as they sat down — or, in this case, lay down — to turn out a piece of fiction. The researchers, led by Martin Lotze of the University of Greifswald in Germany, observed a broad network of regions in the brain working together as people produced their stories. But there were notable differences between the two groups of subjects. The inner workings of the professionally trained writers in the bunch, the scientists argue, showed some similarities to people who are skilled at other complex actions, like music or sports. The research is drawing strong reactions. Some experts praise it as an important advance in understanding writing and creativity, while others criticize the research as too crude to reveal anything meaningful about the mysteries of literature or inspiration. Dr. Lotze has long been intrigued by artistic expression. In previous studies, he has observed the brains of piano players and opera singers, using fMRI scanners to pinpoint regions that become unusually active in the brain. Needless to say, that can be challenging when a subject is singing an aria. Scanners are a lot like 19th-century cameras: They can take very sharp pictures, if their subject remains still. To get accurate data, Dr. Lotze has developed software that can take into account fluctuations caused by breathing or head movements. © 2014 The New York Times Company
Keyword: Language; Brain imaging
Link ID: 19756 - Posted: 06.21.2014
Karen Ravn To the west, the skies belong to the carrion crow. To the east, the hooded crow rules the roost. In between, in a narrow strip running roughly north to south through central Europe, the twain have met, and mated, for perhaps as long as 10,000 years. But although the crows still look very different — carrion crows are solid black, whereas hooded crows are grey — researchers have found that they are almost identical genetically. The taxonomic status of carrion crows (Corvus corone) and hooded crows (Corvus cornix) has been debated ever since Carl Linnaeus, the founding father of taxonomy, declared them to be separate species in 1758. A century later, Darwin called any such classification impossible until the term 'species' had been defined in a generally accepted way. But the definition is still contentious, and many believe it always will be. The crows are known to cross-breed and produce viable offspring, so lack the reproductive barriers that some biologists consider essential to the distinction of a species, leading to proposals that they are two subspecies of carrion crow. In fact, evolutionary biologist Jochen Wolf from Uppsala University in Sweden and his collaborators have now found that the populations living in the cross-breeding zone are so similar genetically that the carrion crows there are more closely related to hooded crows than to the carrion crows farther west. Only a small part of the genome — less than 0.28% — differs between the populations, the team reports in this week's Science. This section is located on chromosome 18, in an area associated with pigmentation, visual perception and hormonal regulation. It is no coincidence, the researchers suggest, that the main differences between carrion and hooded crows are in colouring, mating preferences (both choose mates whose colouring matches theirs), and hormone-influenced social behaviours (carrion crows lord it over hooded ones). © 2014 Nature Publishing Group
Keyword: Sexual Behavior; Evolution
Link ID: 19755 - Posted: 06.21.2014
By Gary Stix James DiCarlo: We all have this intuitive feel for what object recognition is. It’s the ability to discriminate your face from other faces, a car from other cars, a dog from a camel, that ability we all intuitively feel. But making progress in understanding how our brains are able to accomplish that is a very challenging problem, and part of the reason is that it’s challenging to define what it is and isn’t. We take this problem for granted because it seems effortless to us. However, a computer vision person would tell you that this is an extremely challenging problem, because each object presents an essentially infinite number of images to your retina, so you essentially never see the same image of an object twice. SA: It seems like object recognition is actually one of the big problems both in neuroscience and in the computational science of machine learning? DiCarlo: That’s right. Not only in machine learning but also in psychology or cognitive science, because the objects that we see are the sources in the world of what we use to build higher cognition, things like memory and decision-making. Should I reach for this, should I avoid it? Our brains can’t do what you would call higher cognition without these foundational elements that we often take for granted. SA: Maybe you can talk about what’s actually happening in the brain during this process. DiCarlo: It’s been known for several decades that there’s a portion of the brain, the temporal lobe down the sides of our head, that, when lost or damaged in humans and non-human primates, leads to deficits of recognition. So we had clues that that’s where these algorithms for object recognition are living. But just saying that part of your brain solves the problem is not really specific. It’s still a very large piece of tissue.
Anatomy tells us that there’s a whole network of areas that exist there, and now the tools of neurophysiology and still more advanced tools allow us to go in and look more closely at the neural activity, especially in non-human primates. We can then begin to decipher the actual computations to the level that an engineer might, for instance, in order to emulate what’s going on in our heads. © 2014 Scientific American
Keyword: Vision; Attention
Link ID: 19754 - Posted: 06.21.2014
—By Indre Viskontas He might be fictional. But the gigantic Hodor, a character in the blockbuster Game of Thrones series, nonetheless sheds light on something very much in the realm of fact: how our ability to speak emerges from a complex ball of neurons, and how certain brain-damaged patients can lose very specific aspects of that ability. According to George R.R. Martin, who wrote the epic books that inspired the HBO show, the 7-foot-tall Hodor could only say one word—"Hodor"—and everyone therefore tended to assume that was his name. Here's one passage about Hodor from the first novel in Martin's series: Theon Greyjoy had once commented that Hodor did not know much, but no one could doubt that he knew his name. Old Nan had cackled like a hen when Bran told her that, and confessed that Hodor's real name was Walder. No one knew where "Hodor" had come from, she said, but when he started saying it, they started calling him by it. It was the only word he had. Yet it's clear that Hodor can understand much more than he can say; he's able to follow instructions, anticipate who needs help, and behave in socially appropriate ways (mostly). Moreover, he says this one word in many different ways, implying very different meanings. So what might be going on in Hodor's brain? Hodor's combination of impoverished speech production with relatively normal comprehension is a classic, albeit particularly severe, presentation of expressive aphasia, a neurological condition usually caused by a localized stroke in the front of the brain, on the left side. Some patients, however, have damage to that part of the brain from other causes, such as a tumor, or a blow to the head. ©2014 Mother Jones
Keyword: Language
Link ID: 19753 - Posted: 06.21.2014
Heidi Ledford The lure of a sunny day at the beach may be more than merely the promise of fun and relaxation. A study published today reports that mice exposed to ultraviolet (UV) rays exhibit behaviours akin to addiction. The researchers found that mice exposed repeatedly to UV light produced an opioid called β-endorphin, which numbs pain and is associated with addiction to drugs. When they were given a drug that blocks the effect of opioids, the mice also showed signs of withdrawal — including shaky paws and chattering teeth. If the results hold true in humans, they would suggest an explanation for why many tanners continue to seek out sunlight, despite the risks — and, in some cases, even after being diagnosed with skin cancer. “This offers a clear potential mechanism for how UV radiation can be rewarding and, in turn, potentially addictive,” says Bryon Adinoff, an addiction psychiatrist at the University of Texas Southwestern Medical Center in Dallas, who was not involved with the study. “That’s a big deal.” Oncologist David Fisher of the Massachusetts General Hospital in Boston and his colleagues became interested in sunlight addiction after studying the molecular mechanisms of pigment production in the skin after UV light exposure. In the new study, published today in Cell, they show that in mice, some skin cells also synthesize β-endorphin in response to chronic, low doses of UV light. © 2014 Nature Publishing Group
Keyword: Drug Abuse
Link ID: 19752 - Posted: 06.21.2014
By Robert Dudley When we think about the origins of agriculture and crop domestication, alcohol isn’t necessarily the first thing that comes to mind. But our forebears may well have been intentionally fermenting fruits and grains in parallel with the first Neolithic experiments in plant cultivation. Ethyl alcohol, the product of fermentation, is an attractive and psychoactively powerful inebriant, but fermentation is also a useful means of preserving food and of enhancing its digestibility. The presence of alcohol prolongs the edibility window of fruits and gruels, and can thus serve as a means of short-term storage for various starchy products. And if the right kinds of bacteria are also present, fermentation will stabilize certain foodstuffs (think cheese, yogurt, sauerkraut, and kimchi, for example). Whoever first came up with the idea of controlling the natural yeast-based process of fermentation was clearly on to a good thing. Using spectroscopic analysis of chemical residues found in ceramic vessels unearthed by archaeologists, scientists know that the earliest evidence for intentional fermentation dates to about 7000 BCE. But if we look deeper into our evolutionary past, alcohol was a component of our ancestral primate diet for millions of years. In my new book, The Drunken Monkey, I suggest that alcohol vapors and the flavors produced by fermentation stimulate modern humans because of our ancient tendencies to seek out and consume ripe, sugar-rich, and alcohol-containing fruits. Alcohol is present because of particular strains of yeasts that ferment sugars, and this process is most common in the tropics where fruit-eating primates originated and today remain most diverse. © 1986-2014 The Scientist
Keyword: Drug Abuse; Evolution
Link ID: 19751 - Posted: 06.21.2014
by Colin Barras The Neanderthals knew how to make an entrance: teeth first. Our sister species' distinctive teeth were among the first unique aspects of their anatomy to evolve, according to a study of their ancestors. These early Neanderthals may have used their teeth as a third hand, gripping objects that they then cut with tools. The claim comes from a study of fossils from Sima de los Huesos in northern Spain. This "pit of bones" may be an early burial site, and 28 near-complete skeletons have been pulled from it, along with a large hand-axe that might be a funeral gift. The hominins in the pit look like Neanderthals, but are far too old. That suggests they are forerunners of the Neanderthals, and if that is the case they can tell us how the species evolved. To find out, Juan Luis Arsuaga Ferreras at the UCM-ISCIII Joint Centre for Research into Human Evolution and Behaviour in Madrid, Spain, and colleagues studied 17 of the skulls. They found that the brain case was still the same shape as in older species. But the skulls' protruding faces and small molar teeth were much more Neanderthal-like. This suggests the earliest Neanderthals used their jaws in a specialised way. It's not clear how, but it probably wasn't about food, says Ferreras. "There are no indications of any dietary specialisation in the Neanderthals and their ancestors. They were basically carnivores." © Copyright Reed Business Information Ltd.
Keyword: Evolution
Link ID: 19750 - Posted: 06.21.2014
By Elizabeth Norton A single dose of a century-old drug has eliminated autism symptoms in adult mice with an experimental form of the disorder. Originally developed to treat African sleeping sickness, the compound, called suramin, quells a heightened stress response in neurons that researchers believe may underlie some traits of autism. The finding raises the hope that some hallmarks of the disorder may not be permanent, but could be correctable even in adulthood. That hope is bolstered by reports from parents who describe their autistic children as being caught behind a veil. "Sometimes the veil parts, and the children are able to speak and play more normally and use words that didn't seem to be there before, if only for a short time during a fever or other stress," says Robert Naviaux, a geneticist at the University of California, San Diego, who specializes in metabolic disorders. Research also shows that the veil can be parted. In 2007, scientists found that 83% of children with autism disorders showed temporary improvement during a high fever. The timing of a fever is crucial, however: A fever in the mother can confer a higher risk for the disorder in the unborn child. As a specialist in the cell's life-sustaining metabolic processes, Naviaux was intrigued. Autism is generally thought to result from scrambled signals at synapses, the points of contact between nerve cells. But given the specific effects of something as general as a fever, Naviaux wondered if the problem lay "higher up" in the cell's metabolism. © 2014 American Association for the Advancement of Science.
Keyword: Autism
Link ID: 19749 - Posted: 06.19.2014
by Helen Thomson KULLERVO HYNYNEN is preparing to cross neuroscience's final frontier. In July he will work with a team of doctors in the first attempt to open the blood-brain barrier in humans – the protective layer around blood vessels that shields our most precious organ against threats from the outside world. If successful, the method would be a huge step in the treatment of pernicious brain diseases such as cancer, Parkinson's and Alzheimer's, by allowing drugs to pass into the brain. The blood-brain barrier (BBB) keeps toxins in the bloodstream away from the brain. It consists of a tightly packed layer of endothelial cells that wrap around every blood vessel throughout the brain. It prevents viruses, bacteria and any other toxins from passing into the brain, while simultaneously ushering in vital molecules such as glucose via specialised transport mechanisms. The downside of this is that the BBB also completely blocks the vast majority of drugs. Exceptions include some classes of fat- and lipid-soluble chemicals, but these aren't much help as such drugs penetrate every cell in the body – resulting in major side effects. "Opening the barrier is really of huge importance. It is probably the major limitation for innovative drug development for neurosciences," says Bart De Strooper, co-director of the Leuven Institute for Neuroscience and Disease in Belgium. © Copyright Reed Business Information Ltd.
Keyword: Glia
Link ID: 19748 - Posted: 06.19.2014
By Brady Dennis Government warnings a decade ago about the risks associated with children and adolescents taking antidepressants appear to have backfired, causing an increase in suicide attempts and discouraging many depressed young people from seeking treatment, according to a study published Wednesday in the academic journal BMJ. Researchers said their findings underscore how even well-intentioned public health warnings can produce unintended consequences, particularly when they involve widespread media attention and sensitive topics such as depression and suicide. In 2003 and 2004, the Food and Drug Administration issued a series of warnings based on data that pointed to an increase in suicidal thinking among some children and adolescents prescribed a class of antidepressants known as selective serotonin reuptake inhibitors, or SSRIs. They included such drugs as Paxil and Zoloft. In late 2004, the agency directed manufacturers to include a “black box” warning on their labels notifying consumers and doctors about the increased risk of suicidal thoughts and behaviors in youths being treated with these medications. The FDA warnings received a flood of media coverage that researchers said focused more on the tiny percentage of patients who had experienced suicidal thinking due to the drugs than on the far greater number who benefited from them. “There was a huge amount of publicity,” said Stephen Soumerai, professor of population medicine at Harvard Medical School and a co-author of Wednesday’s study. “The media concentrated more on the relatively small risk than on the significant upside.”
Keyword: Depression
Link ID: 19747 - Posted: 06.19.2014
by Lauren Hitchings Our brain's ability to rapidly interpret and analyse new information may lie in the musical hum of our brainwaves. We continuously take in information about the world, but establishing new neural connections and pathways – the process thought to underlie memory formation – is too slow to account for our ability to learn rapidly. Evan Antzoulatos and Earl Miller at the Massachusetts Institute of Technology decided to see if brainwaves – the surges of electricity produced by individual neurons firing en masse – play a role. They used EEG to observe patterns of electrical activity in the brains of monkeys as they taught the animals to categorise patterns of dots into two distinct groups. At first, the monkeys memorised which dots went where, but as the task became harder, they shifted to learning the rules that defined the categories. Humming brainwaves The researchers found that, initially, brainwaves of different frequencies were being produced independently by the prefrontal cortex and the striatum – two brain regions involved in learning. But as the monkeys made sense of the game, the waves began to synchronise and "hum" at the same frequency – with each category of dots having its own frequency. Miller says the synchronised brainwaves indicate the formation of a communication circuit between the two brain regions. He believes this happens before anatomical changes in brain connections take place, giving our minds time to think through various options when presented with new information before the right one gets laid down as a memory. Otherwise, the process is too time-consuming to account for the flexibility and speed of the human mind, says Miller. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 19746 - Posted: 06.19.2014
Migraines have been diagnosed in about eight per cent of Canadians, a quarter or more of whom say the severe headaches affect day-to-day life, such as getting a good night’s sleep or driving, Statistics Canada says. The federal agency on Wednesday released its first report on the prevalence of migraine, saying an estimated 2.7 million Canadians, or 8.3 per cent, reported they had been diagnosed with the severe headaches in 2010-2011. Chronic migraines are frequent, severe, pulsating headaches accompanied by nausea, vomiting, and sensitivity to light and sound. "I think the key finding that was quite interesting was the impact of migraine," said report author Pamela Ramage-Morin, a senior analyst in Ottawa. "For three-quarters to say that it had an impact on their getting a good night sleep, over half said it prevented them from driving on some occasions, even people feeling left out of things because of their condition. There's some social isolation that could be occurring. It may be limiting on people's education and employment opportunities. That can have a long-term effect." The sleep findings are important given lack of sleep can impact other aspects of life, Ramage-Morin said, noting how the effects can extend beyond the individual to the larger community. For both men and women surveyed, migraines were most common at ages 30 to 49, a group that represents 12 per cent of the population and spans the prime working years. © CBC 2014
Keyword: Pain & Touch
Link ID: 19745 - Posted: 06.19.2014
by Laura Sanders Some brain cells need a jolt of stress to snap to attention. Cells called astroglia help regulate blood flow, provide energy to nearby cells and even influence messages’ movement between nerve cells. Now, scientists report June 18 in Neuron that astroglia can be roused by the stress molecule norepinephrine, an awakening that may help the entire brain jump into action. As mice were forced to walk on a treadmill, an activity that makes them alert, astroglia in several parts of their brains underwent changes in calcium levels, a sign of activity, neuroscientist Dwight Bergles of Johns Hopkins University School of Medicine and colleagues found. Norepinephrine, which acts as a fight-or-flight hormone in the body and a neural messenger in the brain, seemed to cause the cell activity boost. When researchers depleted norepinephrine, treadmill walking no longer activated astroglia. It’s not clear whether astroglia in all parts of the brain heed this wake-up call, nor is it clear whether this activation influences behavior. Norepinephrine might help shift brain cells, both neurons and astroglia, into a state of heightened vigilance, the authors write. © Society for Science & the Public 2000 - 2013.
By PAM BELLUCK Cindy Wachenheim was someone people didn’t think they had to worry about. She was a levelheaded lawyer working for the State Supreme Court, a favorite aunt who got down on the floor to play with her nieces and nephews, and, finally, in her 40s, the mother she had long dreamed of becoming. But when her baby was a few months old, she became obsessed with the idea that she had caused him irrevocable brain damage. Nothing could shake her from that certainty, not even repeated assurances from doctors that he was normal. “I love him so much, but it’s obviously a terrible kind of love,” she agonized in a 13-page handwritten note. “It’s a love where I can’t bear knowing he is going to suffer physically and mentally/emotionally for much of his life.” Ms. Wachenheim’s story provides a wrenching case study of one woman’s experience with maternal mental illness in its most extreme and rare form. It also illuminates some of the surprising research findings that are redefining the scientific understanding of such disorders: that they often develop later than expected and include symptoms not just of depression, but of other psychiatric illnesses. Now these mood disorders, long hidden in shame and fear, are coming out of the shadows. Many women have been afraid to admit to terrifying visions or deadened emotions, believing they should be flush with maternal joy or fearing their babies would be taken from them. But now, advocacy groups on maternal mental illness are springing up, and some mothers are blogging about their experiences with remarkable candor. A dozen states have passed laws encouraging screening, education and treatment. And celebrities, including Brooke Shields, Gwyneth Paltrow and Courteney Cox, have disclosed their postpartum depression. © 2014 The New York Times Company
Keyword: Depression; Hormones & Behavior
Link ID: 19743 - Posted: 06.17.2014
by Bethany Brookshire When a cartoon character gets an idea, you know it. A lightbulb goes on over Wile E. Coyote’s head, or a ding sounds as Goofy puts two and two together. While the lightbulb and sound effects are the stuff of cartoons, scientists can, in a way, watch learning in action. In a new study, a learning task in rats was linked to increases in activity patterns in groups of brain cells. The results might help scientists pin down what learning looks like at the nerve cell level, and give us a clue about how memories are made. Different areas of the brain communicate with each other, transferring information from one area to another for processing and interpretation. Brain cell meets brain cell at connections called synapses. But to transfer information between areas often takes more than one neuron firing a lonely signal. It takes cortical oscillations — networks of brain cells sending electrical signals in concert — over and over again for a message to transmit from one brain area to another. Changes in electrical fields increase the probability that neurons in a population will fire. These cortical oscillations are like a large crowd chanting. Not all voices may be yelling at once, some people may be ahead or behind, some may even be whispering, but you still hear an overwhelming “USA! USA!” Cortical oscillations can occur within a single brain area, or they can extend from one area to another. “The oscillation tells you what the other brain area is likely to ‘see’ when it gets that input,” explains Leslie Kay, a neuroscientist at the University of Chicago. Once the receiving area ‘sees’ the incoming oscillation, it may synchronize its own population firing, joining in the chant. “A synchronized pattern of oscillations in two separate brain regions serves to communicate between the two regions,” says Kei Igarashi, a neuroscientist at the Norwegian University of Science and Technology in Trondheim. © Society for Science & the Public 2000 - 2013
Keyword: Learning & Memory
Link ID: 19742 - Posted: 06.17.2014
By Michelle Roberts Health editor, BBC News online Scientists say they have devised a helmet that can quickly determine whether a patient has had a stroke. It could speed diagnosis and treatment of stroke to boost chances of recovery, the scientists say. The wearable cap bounces microwaves off the brain to determine whether there has been a bleed or clot deep inside. The Swedish scientists who made the device plan to give it to ambulance crews to test after successful results in early studies with 45 patients. When a person has a stroke, doctors must work quickly to limit any brain damage. If it takes more than four hours to get to hospital and start treatment, parts of their brain tissue may already be dying. But to give the best treatment, doctors first need to find out if the stroke is caused by a leaky blood vessel or one blocked by a clot. A computerised tomography (CT) scan will show this, but it can take some time to organise one for a patient, even if they have been admitted as an emergency to a hospital that has one of these scanners. Any delay in this "golden hour" of treatment opportunity could hamper recovery. To speed up the process, researchers in Sweden, from Chalmers University of Technology, Sahlgrenska Academy and Sahlgrenska University Hospital, have come up with a mobile device that could be used on the way to hospital. The helmet uses microwave signals - the same as the ones emitted by microwave ovens and mobile phones but much weaker - to build a picture of what is going on throughout the brain. BBC © 2014
Keyword: Stroke; Brain imaging
Link ID: 19741 - Posted: 06.17.2014
A selfie video that a 49-year-old Toronto-area woman took to show numbness and slurred speech she was experiencing helped doctors to diagnose her as having a mini-stroke, after she had earlier been given a diagnosis of stress. When Stacey Yepes’s face originally froze and she had trouble speaking in April, she remembered the signs of stroke from public service announcements. After the symptoms subsided, she went to a local emergency room, but the tests were clear and she was given tips on how to manage stress. The numbing sensation happened again as she left the hospital. When the left side of her body went numb while driving two days later, she pulled over, grabbed her smartphone and hit record. "The sensation is happening again," the Thornhill, Ont., woman says at the beginning of the video posted on YouTube by Toronto’s University Health Network. "It’s all tingling on left side," as she points to her lower lip, trying to smile. Yepes remembers that doctors said to breathe in and out and to try to manage stress, and she says she's trying. "I don’t know why this is happening to me." About a minute later, she shows that it’s hard to lift up her hand. "I think it was just to show somebody, because I knew it was not stress-related," she said in an interview. "And I thought if I could show somebody what was happening, they would have a better understanding." After going to Mount Sinai Hospital in downtown Toronto, Yepes was referred to Toronto Western Hospital’s stroke centre. © CBC 2014
Keyword: Stroke
Link ID: 19740 - Posted: 06.17.2014
By Denali Tietjen Caffeine isn’t healthy, but that’s no news. The withdrawal headaches, jitteriness and dehydration kind of gave that one away. What is news, however, is that starting at puberty, it’s worse for boys than girls. Girls and boys have the same cardiovascular reactions to caffeine in childhood, but begin to react differently in adolescence, finds a new study conducted by researchers from the University at Buffalo. In the double-blind study published in the June issue of Pediatrics, researchers examined the cardiovascular reactions of 52 pre-pubescent (ages eight to nine) and 49 post-pubescent (ages 15 to 17) children to varying levels of caffeine. Participants consumed either a placebo or a soda containing 1 mg/kg or 2 mg/kg of caffeine, and then had their heart rates and blood pressures taken. The results showed that pre-pubescent children had the same reaction to caffeine regardless of gender, while post-pubescent boys had much stronger cardiovascular reactions to caffeine than girls. The study also examined post-pubescent girls’ reactions to caffeine at various phases of their menstrual cycles. At different stages of the cycle, the girls metabolized caffeine differently. “We found differences in responses to caffeine across the menstrual cycle in post-pubertal girls, with decreases in heart rate that were greater in the mid-luteal phase and blood pressure increases that were greater in the mid-follicular phase of the menstrual cycle,” Dr. Jennifer Temple, one of the researchers who conducted the study, said in a University at Buffalo press release announcing the study.
Keyword: Sexual Behavior; Drug Abuse
Link ID: 19739 - Posted: 06.17.2014
By Adam Brimelow Health Correspondent, BBC News Researchers from Oxford University say they've made a breakthrough in developing smart glasses for people with severe sight loss. The glasses project enhanced images of nearby people and objects onto the lenses, providing a much clearer sense of surroundings. They have allowed some people to see their guide dogs for the first time. The Royal National Institute of Blind People says they could be "incredibly important". Lyn Oliver has a progressive eye disease which means she has very limited vision. Now 70, she was diagnosed with retinitis pigmentosa in her early twenties. She can spot movement but describes her sight as "smudged and splattered". Her guide dog Jess helps her find her way around - avoiding most obstacles and hazards - but can't convey other information about her surroundings. Lyn is one of nearly two million people in the UK with a sight problem which seriously affects their daily lives. Most though have at least some residual sight. Researchers at Oxford University have developed a way to enhance this - using smart glasses. They are fitted with a specially adapted 3D camera. The images are processed by computer and projected in real time onto the lenses - so people and objects nearby become bright and clearly defined. 'More independent' Lyn Oliver has tried some of the early prototypes, but the latest model marks a key stage in the project, offering greater clarity and detail than ever before. Dr Stephen Hicks, from the University of Oxford, who has led the project, says they are now ready to be taken from the research setting to be used in the home. BBC © 2014
Keyword: Vision; Robotics
Link ID: 19738 - Posted: 06.17.2014
by Tania Lombrozo Science doesn't just further technology and help us predict and control our environment. It also changes the way we understand ourselves and our place in the natural world. This understanding can inspire wonder and a sense of hope. But it can also be unsettling, especially when it calls into question our basic assumptions about the kinds of creatures we are and the universe we inhabit. Current developments in neuroscience seem to be triggering precisely this jumble of reactions: wonder alongside disquiet, hope alongside alarm. A recent article at Salon.com, for example, promises an explanation for "how neuroscience could save addicts from relapse," while an article by Nathan Greenslit at The Atlantic, published less than a week later, raises worries that neuroscience is being used to reinforce racist drug policy. Obama's BRAIN Initiative promises rapid advances in our understanding of the brain, but with it comes the need to rapidly work out the implications of what we're learning about the brain and about ourselves. We're learning more and more; but we're not always sure what to make of it. In a new paper in the journal Psychological Science, psychologists Azim Shariff, Joshua Greene and six of their colleagues bring these heady issues down to earth by considering whether learning about neuroscience can influence judgments in a real-world situation: deciding how someone who commits a crime should be punished. The motivating intuition is this: to hold someone responsible for her actions, she must have acted with free will. ©2014 NPR
Keyword: Consciousness
Link ID: 19737 - Posted: 06.17.2014