Most Recent Links



Links 9721 - 9740 of 29398

GrrlScientist Since today is caturday, that wonderful day when the blogosphere takes a breather from hell-raising to celebrate pets, I thought I would celebrate some of my favourite animals: corvids. I ran across this lovely video created by Cornell University’s Laboratory of Ornithology (more fondly referred to as the “Lab of O”) that discusses the differences between, and potential meanings of, the sounds made by crows and ravens. If you watch birds, even casually, you might be confused when trying to distinguish these two large black corvid species. However, both species are quite chatty, and these birds’ sounds provide important identifying information. In this video, narrated by Kevin McGowan, an ornithologist at the Cornell Lab of O, you’ll learn how to distinguish crows and ravens on the basis of their voices alone. Both crows and ravens make loud raspy signature calls, described as “caw” and “kraa,” respectively, but American crows and common ravens have large repertoires of sounds in addition to these calls. They also can learn to imitate the calls of other birds. As you’ll learn in this video, crows often make a “rattle” sound along with their territorial “caw”. They also communicate using a wide variety of other sounds, including clicks and bell-like notes. Ravens, on the other hand, produce deep, throaty kraa calls.

Keyword: Animal Communication
Link ID: 20255 - Posted: 10.29.2014

By Virginia Morell Human fetuses are clever students, able to distinguish male from female voices and the voices of their mothers from those of strangers between 32 and 39 weeks after conception. Now, researchers have demonstrated that the embryos of the superb fairy-wren (Malurus cyaneus, pictured), an Australian songbird, also learn to discriminate among the calls they hear. The scientists played 1-minute recordings to 43 fairy-wren eggs collected from nests in the wild. The eggs were between days 9 and 13 of a 13- to 14-day incubation period. The sounds included white noise, a contact call of a winter wren, or a female fairy-wren’s incubation call. Those embryos that listened to the fairy-wrens’ incubation calls and the contact calls of the winter wrens lowered their heart rates, a sign that they were learning to discriminate between the calls of a different species and those of their own kind, the researchers report online today in the Proceedings of the Royal Society B. (None showed this response to the white noise.) Thus, even before hatching, these small birds’ brains are engaged in tasks requiring attention, learning, and possibly memory—the first time embryonic learning has been seen outside humans, the scientists say. The behavior is key because fairy-wren embryos must learn a password from their mothers’ incubation calls; otherwise, they’re less successful at soliciting food from their parents after hatching. © 2014 American Association for the Advancement of Science.

Keyword: Language; Development of the Brain
Link ID: 20254 - Posted: 10.29.2014

By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby); others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Keyword: Attention
Link ID: 20253 - Posted: 10.28.2014

By Melissa Hogenboom Science reporter, BBC News A genetic analysis of almost 900 offenders in Finland has revealed two genes associated with violent crime. Those with the genes were 13 times more likely to have a history of repeated violent behaviour. The authors of the study, published in the journal Molecular Psychiatry, said at least 4-10% of all violent crime in Finland could be attributed to individuals with these genotypes. But they stressed the genes could not be used to screen criminals. Many more genes may be involved in violent behaviour, and environmental factors are also known to have a fundamental role. Even if an individual has a "high-risk combination" of these genes, the majority of carriers will never commit a crime, said the lead author of the work, Jari Tiihonen of the Karolinska Institutet in Sweden. "Committing a severe, violent crime is extremely rare in the general population. So even though the relative risk would be increased, the absolute risk is very low," he told the BBC. The study, which involved analysis of almost 900 criminals, is the first to have looked at the genetic make-up of so many violent criminals in this way. Each criminal was given a profile based on their offences, categorising them into violent or non-violent. The association between genes and previous behaviour was strongest for the 78 who fitted the "extremely violent offender" profile. This group had committed a total of 1,154 murders, manslaughters, attempted homicides or batteries. A replication group of 114 criminals had all committed at least one murder. BBC © 2014

Keyword: Aggression; Genes & Behavior
Link ID: 20252 - Posted: 10.28.2014

Daniel Duane, Men's Journal For more than half a century, the conventional wisdom among nutritionists and public health officials was that fat is dietary enemy No. 1 — the leading cause of obesity and heart disease. It appears the wisdom was off. And not just off. Almost entirely backward. According to a new study from the National Institutes of Health, a diet that reduces carbohydrates in favor of fat — including the saturated fat in meat and butter — improves nearly every health measurement, from reducing our waistlines to keeping our arteries clear, more than the low-fat diets that have been recommended for generations. "The medical establishment got it wrong," says cardiologist Dennis Goodman, director of Integrative Medicine at New York Medical Associates. "The belief system didn't pan out." It's not the conclusion you would expect given the NIH study's parameters. Lead researcher Lydia Bazanno, of the Tulane University School of Public Health, pitted this high-fat, low-carb diet against a fat-restricted regimen prescribed by the National Cholesterol Education Program. "We told both groups to get carbs from green, leafy vegetables, because those are high in nutrients and fiber to keep you sated," Bazanno says. "We also told everyone to stay away from trans fats." The fat-restricted group continued to eat carbs, including bread and cereals, while keeping saturated fat — common in animal products — below 7 percent of total calories. By contrast, the high-fat group cut carbs in half and did not avoid butter, meat, and cheese. Most important, both groups ate as much as they wanted — no calorie counting, no going hungry.

Keyword: Obesity
Link ID: 20251 - Posted: 10.28.2014

By Paula Span First, an acknowledgment: Insomnia bites. S. Bliss, a reader from Albuquerque, comments that even taking Ativan, he or she awakens at 4:30 a.m., can’t get back to sleep and suffers “a state of sleep deprivation and eventually a kind of walking exhaustion.” Molly from San Diego bemoans “confusion, anxiety, exhaustion, depression, loss of appetite, frankly a loss of will to go on,” all consequences of her sleeplessness. She memorably adds, “Give me Ambien or give me death.” Marciacornute reports that she’s turned to vodka (prompting another reader to wonder if Medicare will cover booze). After several rounds of similar laments here (and not only here; insomnia is prevalent among older adults), I found the results of a study by University of Chicago researchers particularly striking. What if people who report sleep problems are actually getting enough hours of sleep, overall? What if they’re not getting significantly less sleep than people who don’t complain of insomnia? Maybe there’s something else going on. It has always been difficult to ascertain how much people sleep; survey questions are unreliable (how can you tell when you’ve dozed off?), and wiring people with electrodes creates such an abnormal situation that the results may bear little resemblance to ordinary nightlife. Enter the actigraph, a wrist-motion monitor. “The machines have gotten better, smaller, less clunky and more reliable,” said Linda Waite, a sociologist and a co-author of the study. By having 727 older adults across the United States (average age: almost 72) wear actigraphs for three full days, Dr. Waite and her colleagues could tell when subjects were asleep and when they weren’t. Then they could compare their reported insomnia to their actual sleep patterns. Overall, in this random sample, taken from an ongoing national study of older adults, people didn’t appear sleep-deprived. They fell asleep at 10:27 p.m. on average, and awakened at 6:22 a.m. 
After subtracting wakeful periods during the night, they slept an average of seven and a quarter hours. But averages don’t tell us much, so let’s look more closely at their reported insomnia. “What was surprising to us is that there’s very little association between people’s specific sleep problems and what the actigraph shows,” Dr. Waite said. © 2014 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 20250 - Posted: 10.28.2014

By Eric Niiler Has our reliance on iPhones and other instant-info devices harmed our memories? Michael Kahana, a University of Pennsylvania psychology professor who studies memory, says maybe: “We don’t know what the long-lasting impact of this technology will be on our brains and our ability to recall.” Kahana, 45, who has spent the past 20 years looking at how the brain creates memories, is leading an ambitious four-year Pentagon project to build a prosthetic memory device that can be implanted into human brains to help veterans with traumatic brain injuries. He spoke by telephone with The Post about what we can do to preserve or improve memory: Practicing the use of your memory is helpful. The other thing which I find helpful is sleep, which I don’t get enough of. As a general principle, skills that one continues to practice are skills that one will maintain in the face of age-related changes in cognition. [As for all those brain games available], I am not aware of any convincing data that mental exercises have a more general effect other than maintaining the skills for those exercises. I think the jury is out on that. If you practice doing crossword puzzles, you will preserve your ability to do crossword puzzles. If you practice any other cognitive skill, you will get better at that as well. Michael Kahana once could name every student in a class of 100. Now, says the University of Pennsylvania psychology professor who studies memory, “I find it too difficult even with a class of 20.”

Keyword: Learning & Memory
Link ID: 20249 - Posted: 10.28.2014

With the death of Professor Allison Doupe from cancer on Friday, October 24, UCSF and biomedical science have lost a scholar of extraordinary intelligence and erudition and a campus leader. Allison Doupe was a psychiatrist and systems neuroscientist who became a leader of her field, the study of sensorimotor learning and its neural control. Allison was recruited to the Departments of Psychiatry and Physiology and the Neuroscience Graduate Program in 1993, rising to Professor in 2000. Her academic career was outstanding at every stage, including First Class Honors at McGill, an MD and PhD in Neurobiology from Harvard, and a prestigious Junior Fellowship from the Harvard University Society of Fellows. Her PhD work with Professor Paul Patterson definitively established the role of particular environmental factors in the development of autonomic neurons and was important in the molecular and cellular investigations of the roles of hormones and growth factors in that system. After internship at the Massachusetts General Hospital and residency in psychiatry at UCLA, she chose to pursue a postdoctoral fellowship at Caltech, studying song learning in birds with Professor Mark Konishi as a way of combining her clinical interests in behavior and development with research in cognitive neuroscience. The development of birdsong is in many important respects similar to language development in humans. The pioneering work of Peter Marler, on song sparrows in Golden Gate Park, showed that each baby songbird learns its father’s dialect but could readily learn the dialect of any singing bird of the same species placed in the role of tutor. Many birds, including the ones studied by Allison Doupe, learn their song by listening to their father sing during a period of life in which they are not themselves singing, and they later practice and perfect their own song by comparison with their memory of the father’s (or tutor’s) song.

Keyword: Animal Communication; Language
Link ID: 20248 - Posted: 10.28.2014

by Bethany Brookshire In many scientific fields, the study of the body is the study of boys. In neuroscience, for example, studies in male rats, mice, monkeys and other mammals outnumber studies in females 5.5 to 1. When scientists are hunting for clues, treatments or cures for a human population that is around 50 percent female, this boys-only club may miss important questions about how the other half lives. So in an effort to reduce this sex bias in biomedical studies, National Institutes of Health director Francis Collins and Office of Research on Women’s Health director Janine Clayton announced in May a new policy that will roll out practices promoting sex parity in research, beginning with a requirement that scientists state whether males, females or both were used in experiments, and moving on to mandate that both males and females are included in all future funded research. The end goal will be to make sure that NIH-funded scientists “balance male and female cells and animals in preclinical studies in all future [grant] applications” to the NIH. In 1993, the NIH Revitalization Act mandated the inclusion of women and minorities in clinical trials. This latest move extends that inclusion to cells and animals in preclinical research. Because NIH funds the work of more than 300,000 researchers in the United States and other countries, many of whom work on preclinical and basic biomedical science, the new policy has broad implications for the biomedical research community. And while some scientists are pleased with the effort, others are worried that the mandate is ill-conceived and underfunded. In the end, whether it succeeds or fails comes down to interpretation and future implementation. © Society for Science & the Public 2000 - 2014

Keyword: Sexual Behavior; Pain & Touch
Link ID: 20247 - Posted: 10.27.2014

By PAM BELLUCK Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age. In a small study in the journal Nature Neuroscience, healthy people, ages 50 to 69, who drank a mixture high in antioxidants called cocoa flavanols for three months performed better on a memory test than people who drank a low-flavanol mixture. On average, the improvement of high-flavanol drinkers meant they performed like people two to three decades younger on the study’s memory task, said Dr. Scott A. Small, a neurologist at Columbia University Medical Center and the study’s senior author. They performed about 25 percent better than the low-flavanol group. “An exciting result,” said Craig Stark, a neurobiologist at the University of California, Irvine, who was not involved in the research. “It’s an initial study, and I sort of view this as the opening salvo.” He added, “And look, it’s chocolate. Who’s going to complain about chocolate?” The findings support recent research linking flavanols, especially epicatechin, to improved blood circulation, heart health and memory in mice, snails and humans. But experts said the new study, although involving only 37 participants and partly funded by Mars Inc., the chocolate company, goes further and was a well-controlled, randomized trial led by experienced researchers. Besides improvements on the memory test — a pattern recognition test involving the kind of skill used in remembering where you parked the car or recalling the face of someone you just met — researchers found increased function in an area of the brain’s hippocampus called the dentate gyrus, which has been linked to this type of memory. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20246 - Posted: 10.27.2014

By Gary Stix Scott Small, a professor of neurology at Columbia University’s College of Physicians and Surgeons, researches Alzheimer’s, but he also studies the memory loss that occurs during the normal aging process. Research on the commonplace “senior moments” focuses on the hippocampus, an area of the brain involved with formation of new memories. In particular, one area of the hippocampus, the dentate gyrus, which helps distinguish one object from another, has lured researchers on age-related memory problems. In a study by Small and colleagues published Oct. 26 in Nature Neuroscience, naturally occurring chemicals in cocoa increased dentate gyrus blood flow. Psychological testing showed that the pattern recognition abilities of a typical 60-year-old on a high dose of the cocoa phytochemicals in the 37-person study matched those of a 30- or 40-year-old after three months. The study received support from the food company Mars, but Small cautions against going out to gorge on Snickers Bars, as most of the beneficial chemicals, or flavanols, are removed when processing cocoa. An edited transcript of an interview with Small follows: Can you explain what you found in your study? The main motive of the study was to causally establish an anatomical source of age-related memory loss. A number of labs have shown in the last 10 years that there’s one area of the brain called the dentate gyrus that is linked to the aging process. But no one has tested that concept. Until now the observations have been correlational. There is decreased function in that region and, to prove causation, we were trying to see if we could reverse that. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20245 - Posted: 10.27.2014

By GABRIELE OETTINGEN MANY people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Keyword: Attention; Emotions
Link ID: 20244 - Posted: 10.27.2014

By Neuroskeptic A new paper threatens to turn the world of autism neuroscience upside down. Its title is Anatomical Abnormalities in Autism?, and it claims that, well, there aren’t very many. Published in Cerebral Cortex by Israeli researchers Shlomi Haar and colleagues, the new research reports that there are virtually no differences in brain anatomy between people with autism and those without. What makes Haar et al.’s essentially negative claims so powerful is that their study had a huge sample size: they included structural MRI scans from 539 people diagnosed with high-functioning autism spectrum disorder (ASD) and 573 controls. This makes the paper an order of magnitude bigger than a typical structural MRI anatomy study in this field. The age range was 6 to 35. The scans came from the public Autism Brain Imaging Data Exchange (ABIDE) database, a data sharing initiative which pools scans from 18 different neuroimaging centers. Haar et al. examined the neuroanatomy of the cases and controls using the popular FreeSurfer software package. What did they find? Well… not much. First off, the ASD group had no differences in overall brain size (intracranial volume). Nor were there any group differences in the volumes of most brain areas; the only significant finding here was an increased ventricle volume in the ASD group, but even this had a small effect size (d = 0.34). Enlarged ventricles are not specific to ASD by any means – the same thing has been reported in schizophrenia, dementia, and many other brain disorders.

Keyword: Autism; Brain imaging
Link ID: 20243 - Posted: 10.27.2014

Sarah Boseley, health editor A record haul of “smart” drugs, sold to students to enhance their memory and thought processes, stay awake and improve concentration, has been seized from a UK website by the medicines regulator, which is alarmed about the recent rise of such sites. The seizure, worth £200,000, illustrates the increasing internet trade in cognitive enhancement drugs and suggests people who want to stay focused and sharp are moving on from black coffee and legally available caffeine tablets. Most of the seized drugs are medicines that should only be available on a doctor’s prescription. One, Sunifiram, is entirely experimental and has never been tested on humans in clinical trials. Investigators from the Medicines and Healthcare products Regulatory Agency (MHRA) are worried at what they see as a new phenomenon – the polished, plausible, commercial website targeting students and others who are looking for a mental edge over the competition. In addition to Ritalin, the drug that helps young people with attention deficit disorder (ADD) focus in class and while writing essays, and Modafinil (sold as Provigil), licensed in the US for people with narcolepsy, such sites are also offering experimental drugs and research chemicals. MHRA head of enforcement, Alastair Jeffrey, said the increase in people buying cognitive-enhancing drugs or “nootropics” is recent and very worrying. “The idea that people are willing to put their overall health at risk in order to attempt to get an intellectual edge over others is deeply troubling,” he said. © 2014 Guardian News and Media Limited

Keyword: Drug Abuse; ADHD
Link ID: 20242 - Posted: 10.27.2014

by Clare Wilson Call them the neuron whisperers. Researchers are eavesdropping on conversations going on between brain cells in a dish. Rather than hearing the chatter, they watch neurons that have been genetically modified so that the electrical impulses moving along their branched tendrils cause sparkles of red light (see video). Filming these cells at up to 100,000 frames a second is allowing researchers to analyse their firing in unprecedented detail. Until recently, a neuron's electrical activity could only be measured with tiny electrodes. As well as being technically difficult, such "patch clamping" only reveals the voltage at those specific points. The new approach makes the neuron's entire surface fluoresce as the impulse passes by. "Now we see the whole thing sweep through," says Adam Cohen of Harvard University. "We get much more information - like how fast and where does it start and what happens at a branch." The idea is a reverse form of optogenetics – where neurons are given a gene from bacteria that make a light-sensitive protein, so the cells fire when illuminated. The new approach uses genes that make the neurons do the opposite - glow when they fire. "It's pretty cool," says Dimitri Kullmann of University College London. "It's amazing that you can dispense with electrodes." Cohen's team is using the technique to compare cells from typical brains with those from people with disorders such as motor neuron disease or amyotrophic lateral sclerosis. Rather than taking a brain sample, they remove some of the person's skin cells and grow them alongside chemicals that rewind the cells into an embryonic-like state. Another set of chemicals is used to turn these stem cells into neurons. "You can recreate something reminiscent of the person's brain in the dish," says Cohen. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging
Link ID: 20241 - Posted: 10.25.2014

By Michael Hedrick I have a hard time making friends. Getting to trust people well enough to call them a friend takes a lot of work. It’s especially hard when you are living with schizophrenia and think everyone is making fun of you. Schizophrenia is the devil on your shoulder that keeps whispering in your ear and, no matter what you try, the little demon won’t stop. He hasn’t stopped in the almost nine years I’ve lived with the illness, and he’s not about to stop now. He’s just quieted down a bit. I’d call him my companion but that would imply a degree of friendship, and there’s no way in hell I’m the little devil’s friend. I have plenty of acquaintances, and a couple hundred “friends” on Facebook. But real friends, mostly family, I can count on one hand. For me, making friends is like climbing a vertical rock wall with no ropes, requiring a degree of thrill-seeking, and a good deal of risk. For someone to be my friend, they have to accept that I’m crazy, and even getting to the point of telling them that is daunting when all you hear is the devil’s whispering that they’re making snap judgments about you or will be going back to their real friends and laughing about you. But interestingly, in my efforts to make friends, coffee shops have helped. The simple routine of going to get your fix of liquid energy every day provides a sort of breeding ground for community. You see these people every day, whether you like it or not, and over time, friendships form. I used to live in a small town called Niwot, about five miles down the highway from Boulder, where I now live. Every morning around 6 I would go to Winot Coffee, the small independent coffee shop, and every morning, without fail, there was a guy my age sitting outside with his computer smoking clove cigarettes. Given the regularity of seeing him every morning, and given that we were some of the only 20-somethings in town, we got to talking. © 2014 The New York Times Company

Keyword: Schizophrenia
Link ID: 20240 - Posted: 10.25.2014

by Neurobonkers A paper published in Nature Reviews Neuroscience last week addressed the prevalence of neuromyths among educators. The paper has been widely reported, but the lion's share of the coverage glossed over the impact that neuromyths have had in the real world. Your first thought after reading the neuromyths in the table below — which were widely believed by teachers — may well be, "so what?" It is true that some of the false beliefs are relatively harmless. For example, encouraging children to drink a little more water might perhaps result in the consumption of less sugary drinks. This may do little if anything to reduce hyperactivity but could encourage a more nutritious diet which might have impacts on problems such as Type II diabetes. So, what's the harm? The paper addressed a number of areas where neuromyths have had real world impacts on educators and policymakers, which may have resulted negatively on the provision of education. The graph above, reprinted in the Nature Reviews Neuroscience, paper has been included as empirical data in educational policy documents to provide evidence for an "allegedly scientific argument for withdrawing public funding of university education." The problem? The data is made up. The graph is in fact a model that is based on the false assumption that investment before the age of three will have many times the benefit of investment made in education later in life. The myth of three — the belief that there is a critical window to educate children before the age of three, after which point the trajectory is fixed — is one of the most persistent neuromyths. Viewed on another level, while some might say investment in early education can never be a bad thing, how about the implication that the potential of a child is fixed at such an early point in their life, when in reality their journey has just begun. © Copyright 2014, The Big Think, Inc

Keyword: Development of the Brain; Learning & Memory
Link ID: 20239 - Posted: 10.25.2014

By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. 
“It’s meant to really push it.” “Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy. © 2014 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 20238 - Posted: 10.23.2014

By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science

Keyword: Alzheimers; Learning & Memory
Link ID: 20237 - Posted: 10.23.2014

By J. PEDER ZANE Striking it rich is the American dream, a magnetic myth that has drawn millions to this nation. And yet, a countervailing message has always percolated through the culture: Money can’t buy happiness. From Jay Gatsby and Charles Foster Kane to Tony Soprano and Walter White, the woefully wealthy are among the seminal figures of literature, film and television. A thriving industry of gossipy, star-studded magazines and websites combines these two ideas, extolling the lifestyles of the rich and famous while exposing the sadness of celebrity. All of which raises the question: Is the golden road paved with misery? Yes, in a lot of cases, according to a growing body of research exploring the connection between wealth and happiness. Studies in behavioral economics, cognitive psychology and neuroscience are providing new insights into how a changing American economy and the wiring of the human brain can make life on easy street feel like a slog. Make no mistake, it is better to be rich than poor — psychologically as well as materially. Levels of depression, anxiety and stress diminish as incomes rise. What has puzzled researchers is that the psychological benefits of wealth seem to stop accruing once people reach an income of about $75,000 a year. “The question is, What are the factors that dampen the rewards of income?” said Scott Schieman, a professor of sociology at the University of Toronto. “Why doesn’t earning even more money — beyond a certain level — make us feel even happier and more satisfied?” The main culprit, he said, is the growing demands of work. For millenniums, leisure was wealth’s bedfellow. The rich were different because they worked less. The tables began to turn in America during the 1960s, when inherited privilege gave way to educational credentials and advancement became more closely tied to merit. © 2014 The New York Times Company

Keyword: Emotions
Link ID: 20236 - Posted: 10.23.2014