Most Recent Links
By GRETCHEN REYNOLDS The value of mental-training games may be speculative, as Dan Hurley writes in his article on the quest to make ourselves smarter, but there is another, easy-to-achieve, scientifically proven way to make yourself smarter. Go for a walk or a swim. For more than a decade, neuroscientists and physiologists have been gathering evidence of the beneficial relationship between exercise and brainpower. But the newest findings make it clear that this isn’t just a relationship; it is the relationship. Using sophisticated technologies to examine the workings of individual neurons — and the makeup of brain matter itself — scientists in just the past few months have discovered that exercise appears to build a brain that resists physical shrinkage and enhance cognitive flexibility. Exercise, the latest neuroscience suggests, does more to bolster thinking than thinking does. The most persuasive evidence comes from several new studies of lab animals living in busy, exciting cages. It has long been known that so-called “enriched” environments — homes filled with toys and engaging, novel tasks — lead to improvements in the brainpower of lab animals. In most instances, such environmental enrichment also includes a running wheel, because mice and rats generally enjoy running. Until recently, there was little research done to tease out the particular effects of running versus those of playing with new toys or engaging the mind in other ways that don’t increase the heart rate. So, last year a team of researchers led by Justin S. Rhodes, a psychology professor at the Beckman Institute for Advanced Science and Technology at the University of Illinois, gathered four groups of mice and set them into four distinct living arrangements. © 2012 The New York Times Company
Keyword: Learning & Memory; Attention
Link ID: 16691 - Posted: 04.23.2012
By DAN HURLEY Early on a drab afternoon in January, a dozen third graders from the working-class suburb of Chicago Heights, Ill., burst into the Mac Lab on the ground floor of Washington-McKinley School in a blur of blue pants, blue vests and white shirts. Minutes later, they were hunkered down in front of the Apple computers lining the room’s perimeter, hoping to do what was, until recently, considered impossible: increase their intelligence through training. Games based on N-back tests require players to remember the location of a symbol or the sound of a particular letter presented just before (1-back), the time before last (2-back), the time before that (3-back) and so on. Some researchers say that playing games like this may actually make us smarter. “Can somebody raise their hand,” asked Kate Wulfson, the instructor, “and explain to me how you get points?” On each of the children’s monitors, there was a cartoon image of a haunted house, with bats and a crescent moon in a midnight blue sky. Every few seconds, a black cat appeared in one of the house’s five windows, then vanished. The exercise was divided into levels. On Level 1, the children earned a point by remembering which window the cat was just in. Easy. But the game is progressive: the cats keep coming, and the kids have to keep watching and remembering. © 2012 The New York Times Company
Keyword: Attention; Intelligence
Link ID: 16690 - Posted: 04.23.2012
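The progressive N-back mechanic described in the entry above is simple enough to sketch in code. Below is a minimal, hypothetical Python version of the cat-in-the-window game; the function name, prompts and scoring are illustrative assumptions, not taken from the software used in the Chicago Heights study. Each trial, a position is shown, and the player earns a point for recalling the position from n trials back.

    import random

    def play_n_back(n=1, trials=10, windows=5):
        # Hypothetical sketch: a cat flashes in one of `windows` positions
        # each trial; the player scores by recalling where it was n trials ago.
        history = []
        score = 0
        for t in range(trials):
            pos = random.randrange(windows)
            print(f"Trial {t + 1}: the cat appears in window {pos}, then vanishes.")
            history.append(pos)
            if len(history) >= n:
                target = history[-n]  # n = 1 asks about the cat just shown
                guess = int(input(f"Which window was the cat in {n} trial(s) ago? "))
                if guess == target:
                    score += 1
        print(f"Score: {score} out of {trials - n + 1} prompts")
        return score

Moving from Level 1 to Level 2 in the sketch corresponds to calling play_n_back with n=2 instead of n=1, so the player must hold one more item in working memory at all times.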
By SIDDHARTHA MUKHERJEE Few medicines, in the history of pharmaceuticals, have been greeted with as much exultation as a green-and-white pill containing 20 milligrams of fluoxetine hydrochloride — the chemical we know as Prozac. In her 1994 book “Prozac Nation,” Elizabeth Wurtzel wrote of a nearly transcendental experience on the drug. Before she began treatment with antidepressants, she was living in “a computer program of total negativity . . . an absence of affect, absence of feeling, absence of response, absence of interest.” She floated from one “suicidal reverie” to the next. Yet, just a few weeks after starting Prozac, her life was transformed. “One morning I woke up and really did want to live. . . . It was as if the miasma of depression had lifted off me, in the same way that the fog in San Francisco rises as the day wears on. Was it the Prozac? No doubt.” Like Wurtzel, millions of Americans embraced antidepressants. In 1988, a year after the Food and Drug Administration approved Prozac, 2,469,000 prescriptions for it were dispensed in America. By 2002, that number had risen to 33,320,000. By 2008, antidepressants were the third-most-common prescription drug taken in America. Fast forward to 2012 and the same antidepressants that inspired such enthusiasm have become the new villains of modern psychopharmacology — overhyped, overprescribed chemicals, symptomatic of a pill-happy culture searching for quick fixes for complex mental problems. In “The Emperor’s New Drugs,” the psychologist Irving Kirsch asserted that antidepressants work no better than sugar pills and that the clinical effectiveness of the drugs is, largely, a myth. If the lodestone book of the 1990s was Peter Kramer’s near-ecstatic testimonial, “Listening to Prozac,” then the book of the 2000s is David Healy’s “Let Them Eat Prozac: The Unhealthy Relationship Between the Pharmaceutical Industry and Depression.” © 2012 The New York Times Company
Keyword: Depression
Link ID: 16689 - Posted: 04.23.2012
By LAUREN SLATER Pam Sakuda was 55 when she found out she was dying. Shortly after having a tumor removed from her colon, she heard the doctor’s dreaded words: Stage 4; metastatic. Sakuda was given 6 to 14 months to live. Determined to slow her disease’s insidious course, she ran several miles every day, even during her grueling treatment regimens. By nature upbeat, articulate and dignified, Sakuda — who died in November 2006, outlasting everyone’s expectations by living for four years — was alarmed when anxiety and depression came to claim her after she passed the 14-month mark, her days darkening as she grew closer to her biological demise. Norbert Litzinger, Sakuda’s husband, explained it this way: “When you pass your own death sentence by, you start to wonder: When? When? It got to the point where we couldn’t make even the most mundane plans, because we didn’t know if Pam would still be alive at that time — a concert, dinner with friends; would she still be here for that?” “When” came to claim the couple’s life completely, their anxiety building as they waited for the final day. As her fears intensified, Sakuda learned of a study being conducted by Charles Grob, a psychiatrist and researcher at Harbor-U.C.L.A. Medical Center who was administering psilocybin — an active component of magic mushrooms — to end-stage cancer patients to see if it could reduce their fear of death. Twenty-two months before she died, Sakuda became one of Grob’s 12 subjects. When the research was completed in 2008 (and published in the Archives of General Psychiatry last year), the results showed that administering psilocybin to terminally ill subjects could be done safely while reducing the subjects’ anxiety and depression about their impending deaths. © 2012 The New York Times Company
Keyword: Drug Abuse; Depression
Link ID: 16688 - Posted: 04.23.2012
CBC News A special computer game is as effective for treating adolescent depression as one-on-one counselling with trained clinicians, New Zealand researchers report in the British Medical Journal. Researchers at the University of Auckland in New Zealand observed 187 young people 12 to 19 years old who showed mild to moderate depressive symptoms. Roughly half played SPARX, a 3D fantasy game that has users work through a series of seven levels dealing with a range of topics, including emotions, finding hope and recognizing unhelpful thoughts. Over four to seven weeks, players must restore balance in a world dominated by GNATs, or gloomy negative automatic thoughts. Another group underwent face-to-face treatment by trained counsellors and psychologists. According to the researchers, SPARX was as effective in reducing symptoms of depression. Moreover, 44 per cent of those who played at least four levels completely recovered, compared to 26 per cent in usual care. The results were based on several depression rating scales. © CBC 2012
Keyword: Depression; Development of the Brain
Link ID: 16687 - Posted: 04.21.2012
By Harvey Black Some people are friendly drunks, whereas others are hostile, potentially endangering themselves and others. The difference may lie in their ability to foresee the consequences of their actions, according to a recent study in the online Journal of Experimental Social Psychology. Brad Bushman, a psychologist at Ohio State University, and his colleagues asked nearly 500 volunteers to play a simple game. The subjects, an even mix of women and men, believed they were competing against an opponent to press a button as quickly as possible. In reality, they were simply using a computer program that randomly decided if they had won or lost. When they lost, they received a shock. When the “opponent” lost, the participant gave the shock and chose how long and intense it should be. Before playing, the participants completed a survey designed to measure their general concern for the future consequences of their actions. Half the participants then received enough alcohol mixed with orange juice to make them legally drunk, and the other half received a drink with a very tiny amount of alcohol in it. Subjects who expressed little interest in consequences were more likely to administer longer, more intense shocks. In the sober group, they were slightly more aggressive than people who cared about consequences. When drunk, however, their belligerence was off the charts. “They are by far the most aggressive people in the study,” Bushman says. © 2012 Scientific American
Keyword: Drug Abuse; Aggression
Link ID: 16686 - Posted: 04.21.2012
By JAMES GORMAN First things first: The hyrax is not the Lorax. And it does not speak for the trees. It sings, on its own behalf. The hyrax is a bit Seussian, however. It looks something like a rabbit, something like a woodchuck. Its closest living relatives are elephants, manatees and dugongs. And male rock hyraxes have complex songs like those of birds, in the sense that males will go on for 5 or 10 minutes at a stretch, apparently advertising themselves. One might have expected that the hyrax would have some unusual qualities — the animals’ feet, if you know how to look at them, resemble elephants’ toes, the experts say. And their visible front teeth are actually very small tusks. But Arik Kershenbaum and colleagues at the University of Haifa and Tel Aviv University have found something more surprising. Hyraxes’ songs have something rarely found in mammals: syntax that varies according to where the hyraxes live, geographical dialects in how they put their songs together. The research was published online Wednesday in The Proceedings of the Royal Society B. Bird songs show syntax, this ordering of song components in different ways, but very few mammals make such orderly, arranged sounds. Whales, bats and some primates show syntax in their vocalizations, but nobody really expected such sophistication from the hyrax, and it was thought that the selection of sounds in the songs was relatively random. © 2012 The New York Times Company
Keyword: Language; Sexual Behavior
Link ID: 16685 - Posted: 04.21.2012
by Jane J. Lee Whales use sound to communicate over entire oceans, search for food, and coordinate attacks. But just how baleen whales—a group that uses comblike projections from the roof of their mouth to catch food—heard these grunts and moans was something of a mystery. Toothed whales, including dolphins and porpoises, use lobes of fat connected to their jawbones and ears to pick up sounds. But in-depth analyses of baleen whales weren't previously possible because their sheer size made them impossible to fit into scanners that use computed tomography and magnetic resonance imaging, which analyze soft tissues. So in a new study, published online this month in The Anatomical Record, researchers focused on one of the smaller species, minke whales (Balaenoptera acutorostrata). They found that triangular patches of fat surrounding minke whale ears could be key to how they hear. They scanned seven minke whale heads in CT and MRI machines, created computer models of the ears and surrounding soft tissue, and dissected the whale noggins to reveal ear fat running from blubber just under the skin to the ear bones. This is similar to the arrangement found in toothed whales. The novel analysis allowed the authors to speculate that the ear fat in both toothed and baleen whales could have shared a common evolutionary origin. © 2010 American Association for the Advancement of Science.
Keyword: Hearing; Evolution
Link ID: 16684 - Posted: 04.21.2012
By Nathan Seppa Neighborhood amenities such as green space and a nearby grocery store may offer residents more than just curb appeal. Children who live in such neighborhoods are roughly half as likely to be obese as kids living in areas lacking these features, researchers report in two studies in the May American Journal of Preventive Medicine. The research combines two health aspects of residential life that studies usually examine separately — neighborhood amenities that boost physical activity and ready access to a grocery store in place of fast food outlets. The new studies “are important contributions to the needed evidence documenting the influence of environmental factors on people's health, in particular obesity,” says Laura Kettel Khan, a nutritionist at the Centers for Disease Control and Prevention in Atlanta. To assess those effects, Lawrence Frank, an urban planner and public health researcher at the University of British Columbia in Vancouver, and his colleagues rated the “built environment” of hundreds of neighborhoods in San Diego County, Calif., and King County, Wash., which includes Seattle. The researchers rated the number and quality of parks and a neighborhood’s “walkability” — whether its layout had a low level of sprawl, few cul-de-sacs and easy access to retail outlets. © Society for Science & the Public 2000 - 2012
Keyword: Obesity
Link ID: 16683 - Posted: 04.21.2012
By LINDA LEE A Field Notes column last Sunday (“Bridal Hunger Games”) reported on some diets that brides use to drop 15 or 20 pounds before their weddings: Weight Watchers and a personal fitness trainer, juice cleanses, the Dukan diet, diet pills, hormone shots and, new to the United States, a feeding tube diet. Readers began to respond as soon as the article went online and was posted on the Times’s Facebook page. “If you’re with someone who wants a swimsuit model for a partner, then he is free to contact Sports Illustrated and ask to date one directly,” one woman wrote on Facebook. Or why not just buy a larger size dress, asked one reader, a man. Several commenters suggested that the solution to looking good in wedding photos wasn’t losing weight, but acquiring skills in Photoshop. There were complaints about the commodification of marriage: “Just one more example of the disgusting spectacle weddings have become,” another grumped. A man jokingly suggested reverse psychology: “I say balloon up so you look as big as a house on your wedding day (wear a fat suit if you have to).” Ten years later, he wrote, people “will say admiringly how great you look today.” BluePrintCleanse’s Web site was mentioned in the column for suggesting that a bridal party cleanse together. “If a friend asked me to lose weight, or join her in such an awful venture, to be in her wedding, she wouldn’t be my friend any longer,” a woman wrote. © 2012 The New York Times Company
Keyword: Obesity; Anorexia & Bulimia
Link ID: 16682 - Posted: 04.21.2012
By LINDA LEE JENNIFER DERRICK’S weight had crept to 159 pounds from 125, and she knew she would not fit into her grandmother’s wedding dress. “Women were smaller back then, and there was nothing to let out,” said Ms. Derrick, of Rockford, Ill. She took prescription pills, had vitamin B shots and made weekly $45 visits to a Medithin clinic in Janesville, Wis. When she married on March 18, she was back to 125 pounds; the gown, from 1938, fit perfectly. In March, Jessica Schnaider, 41, of Surfside, Fla., was preparing to shop for a wedding gown by spending eight days on a feeding tube. The diet, under a doctor’s supervision, offered 800 calories a day while she went about her business, with a tube in her nose. A 2007 Cornell University study by Lori Neighbors and Jeffery Sobal found that 70 percent of 272 engaged women said they wanted to lose weight, typically 20 pounds. So brides are increasingly going on crash diets, inspired by seeing celebrities like Sarah Jessica Parker or Gwyneth Paltrow, cowed by the prospect of wearing a revealing and expensive gown and knowing that wedding photos (if not the marriage) are forever. In the two months of fittings before most clients’ weddings at Kleinfeld Bridal in New York, seamstresses are kept busy taking in gowns. “Brides-to-be say, ‘I don’t want the size 16, I want the 14 or the 12,’” said Jennette Kruszka, Kleinfeld’s marketing director. © 2012 The New York Times Company
Keyword: Obesity; Anorexia & Bulimia
Link ID: 16681 - Posted: 04.21.2012
Amy Maxmen By tacking drugs onto molecules targeting rogue brain cells, researchers have alleviated symptoms in newborn rabbits that are similar to those of cerebral palsy in children. Cerebral palsy refers to a group of incurable disorders characterized by impairments in movement, posture and sensory abilities. In general, medicines tend to act broadly rather than influence certain sets of cells in the brain. “You don’t expect large molecules to enter the brain, and if they do, you don’t expect them to target specific cells, and immediately act therapeutically — but all of this happened,” says study co-author Rangaramanujam Kannan, a chemical engineer at the Johns Hopkins University School of Medicine in Baltimore, Maryland. The paper is published today in Science Translational Medicine. According to the US Centers for Disease Control and Prevention, approximately 1 in 303 children has cerebral palsy by age 8, which usually results from neurological damage in the womb, caused by, for example, a kink in the umbilical cord that briefly diminishes the foetus' oxygen, or maternal infection. Such injuries lead to the activation of immune cells in the brain called microglia and astrocytes, which cause further inflammation and exacerbate the damage. Calming the cells is difficult, because anti-inflammatory drugs don’t easily cross the blood–brain barrier. And those that do tend to diffuse nonspecifically. “What’s amazing here is that the authors target the drug directly to the microglia,” says Mike Johnston, a paediatric neurologist at the Kennedy Krieger Institute in Baltimore. © 2012 Nature Publishing Group
Keyword: Development of the Brain; Glia
Link ID: 16680 - Posted: 04.21.2012
Ewen Callaway Medical geneticists are giving genome sequencing its first big test in the clinic by applying it to some of their most baffling cases. By the end of this year, hundreds of children with unexplained forms of intellectual disability and developmental delay will have had their genomes decoded as part of the first large-scale, national clinical sequencing projects. These programmes, which were discussed last month at a rare-diseases conference hosted by the Wellcome Trust Sanger Institute near Cambridge, UK, aim to provide a genetic diagnosis that could end years of uncertainty about a child’s disability. In the longer term, they could provide crucial data that will underpin efforts to develop therapies. The projects are also highlighting the logistical and ethical challenges of bringing genome sequencing to the consulting room. “The overarching theme is that genome-based diagnosis is now hitting mainstream medicine,” says Han Brunner, a medical geneticist at the Radboud University Nijmegen Medical Centre in the Netherlands, who leads one of the projects. About 2% of children experience some form of intellectual disability. Many have disorders such as Down’s syndrome and fragile X syndrome, which are linked to known genetic abnormalities and so are easily diagnosed. Others have experienced environmental risk factors, such as fetal alcohol exposure, that rule out a simple genetic explanation. However, a large proportion of intellectual disability cases are thought to be the work of single, as-yet-unidentified mutations. © 2012 Nature Publishing Group
Keyword: Genes & Behavior; Development of the Brain
Link ID: 16679 - Posted: 04.19.2012
Teenagers can suffer severe sleep deprivation when the clocks change, say researchers at the University of Surrey. The amount they sleep decreases to less than six hours a night on average the week following the move to British Summer Time. During this period their concentration may be lower and mood affected. Scientists also found that even before the change, teenagers were getting less than the recommended hours of sleep. The activity of some sixth-form students from Claremont Fan Court School in Surrey was studied using wristwatches. These were worn constantly over a 10-day period before and after the clocks moved forward on 25 March. The watches reliably indicated when the teenagers were awake and asleep. The researchers found that in the days following the clock change, the teenagers had less than six hours of sleep a night. Adults generally have eight. Joanne Bower, a sleep researcher at the University of Surrey, said: "During adolescence, teenagers experience a shift in their circadian rhythm [body clock] - these make sure the same things happen at the same time every day. One of these things is the production of the sleep-promoting hormone, melatonin." BBC © 2012
Keyword: Sleep; Development of the Brain
Link ID: 16678 - Posted: 04.19.2012
By Jennifer Welsh and LiveScience Can a psychiatric disorder be diagnosed with a blood test? That may be the future if two recent studies pan out. Researchers are figuring out how to differentiate the blood of a depressed person from that of someone without depression. In the latest study, published today (April 17) in the journal Translational Psychiatry, researchers identified 11 new markers, or chemicals in the blood, for early-onset depression. These markers were found in different levels in teens with depression compared with their levels in teens who didn't have the condition. Currently, depression is diagnosed by a subjective test, dependent upon a person's own explanation of their symptoms, and a psychiatrist's interpretation of them. These blood tests aren't meant to replace a psychiatrist, but could make the diagnosis process easier. If a worried parent could have a family physician run a blood test, it might ease the diagnosis process during the already tough time of adolescence, said Eva Redei, a professor at Northwestern University in Evanston, Ill., who was involved in the study of the teen-depression blood test. If they hold up to further testing, blood tests could help young adults, who often go untreated because they aren't aware of their disease, get treated. The biological basis of a blood test could also help to reduce the stigma attached to a depression diagnosis, researchers suggest. © 2012 Scientific American
Keyword: Depression; Development of the Brain
Link ID: 16677 - Posted: 04.19.2012
By Brian Alexander Good news for all those who ever had a teacher or a parent say “If you would just apply yourself you could learn anything! You’re only using 10 percent of your brain!” All those people were wrong. If we did use only 10 percent of our brains we’d be close to dead, according to Eric Chudler, director of the Center for Sensorimotor Neural Engineering at the University of Washington, who maintains an entertaining brain science website for kids. “When recordings are made from brain EEGs, or PET scans, or any type of brain scan, there’s no part of the brain just sitting there unused,” he said. Larry Squire, a research neuroscientist with the Veterans Administration hospital in San Diego, and at the University of California San Diego, pointed out that “any place the brain is damaged there is a consequence.” Damaged brains may have been where this myth originated. During the first half of the last century, a pioneering neuroscientist named Karl Lashley experimented on rodents by excising portions of their brains to see what happened. When he put these rodents in mazes they’d been trained to navigate, he found that animals with missing bits of brain often successfully navigated the mazes. This wound up being transmuted into the idea that humans must be wasting vast brain potential. With the rise of the human potential movement in the 1960s, some preached that all sorts of powers, including bending spoons and psychic abilities, were lying dormant in our heads and that all we had to do was get off our duffs and activate them. © 2012 msnbc.com
Keyword: Brain imaging
Link ID: 16676 - Posted: 04.19.2012
by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.
Keyword: Robotics
Link ID: 16675 - Posted: 04.19.2012
Leila Haghighat At the turn of the twentieth century, the promise of regenerating damaged tissue was so far-fetched that Thomas Hunt Morgan, despairing that his work on earthworms could ever be applied to humans, abandoned the field to study heredity instead. Though he won the Nobel Prize in 1933 for his work on the role of chromosomes in inheritance, were he alive today, the advances in regenerative medicine might have tempted him to reconsider. Three studies published this week show that introducing new cells into mice can replace diseased cells — whether hair, eye or heart — and help to restore the normal function of those cells. These proof-of-principle studies now have researchers setting their sights on clinical trials to see if the procedures could work in humans. “You can grow cells in a Petri dish, but that’s not regenerative medicine,” says Robin Ali, a geneticist at University College London, who led the eye study. “You have to think about the biology of repair in a living system.” Sprouting hair In work published in Nature Communications, Japanese researchers grew different types of hair on nude mice, using stem cells from normal mice and balding humans to recreate the follicles from which hair normally emerges. Takashi Tsuji, a regenerative-medicine specialist at Tokyo University of Science who led the study, says that the technique holds promise for treating male pattern baldness. © 2012 Nature Publishing Group
Keyword: Regeneration; Stem Cells
Link ID: 16674 - Posted: 04.19.2012
By Tina Hesman Saey The farmer’s wife in the nursery rhyme Three Blind Mice may need a different mouse hunting strategy. Thanks to new cell transplants, some formerly night-blind mice can see in the dark again, perhaps even well enough to evade the carving knife of the farmer’s wife. Injections of light-gathering nerve cells called rods into the retinas of night-blind mice integrated into the brain’s visual system and restored sight, Robin Ali of the University College London Institute of Ophthalmology and colleagues report online April 18 in Nature. The finding gives new hope that cell transplants may reverse damage to the brain and eyes caused by degenerative diseases and help heal spinal cord injuries. Other researchers have tried, and failed, to repair damaged retinas with stem cell transplants, says Christian Schmeer, a neurologist at the University Hospital Jena in Germany. The new study is the first to demonstrate that transplanted nerve cells can restore function. "They show it is possible and they do it convincingly," Schmeer says. At the same time, Schmeer cautions, “there’s still a lot to be done until it’s ready for clinical use.” In the study, Ali’s group transplanted immature rod cells from newborn mice into the retinas of adult mice. Rods, found in the back of the eye, work in dim light conditions. Other retina cells called cones sense bright light. In previous studies, the researchers had been able to transplant about 1,000 rods into mice’s retinas, but that wasn’t enough to restore vision. By optimizing techniques, the researchers coaxed about 26,000 rod cells to incorporate into the retina of each injected eye. © Society for Science & the Public 2000 - 2012
Keyword: Regeneration; Stem Cells
Link ID: 16673 - Posted: 04.19.2012
By Julian De Freitas and Brandon Liverence Notice that, even as you fixate on the screen in front of you, you can still shift your attention to different regions in your peripheries. For decades, cognitive scientists have conceptualized attention as akin to a shifting spotlight that “illuminates” regions it shines upon, or as a zoom lens, focusing on things so that we see them in finer detail. These metaphors are commonplace because they capture the intuition that attention illuminates or sharpens things, and thus, enhances our perception of them. Some of the important early studies to directly confirm this intuition were conducted by NYU psychologist Marisa Carrasco and colleagues, who showed that attention enhances the perceived sharpness of attended patterns. In their experiment, participants saw two textured patterns presented side-by-side on a computer screen, and judged which of the two patterns looked sharper. However, just before the patterns appeared, an attention-attracting cue was flashed at the upcoming location of one of the patterns. They found that attended patterns were perceived as sharper than physically identical unattended patterns. In other words, attention may make physically blurry (or otherwise degraded) images appear sharper – much like a zoom lens on a camera. Subsequent studies by Carrasco’s group and others found that attention also enhances perception of other features – for example, color saturation, orientation, and speed. This research suggests that attention causes incoming sensory information from attended locations to be processed more fully, without changing the information itself. © 2012 Scientific American
Keyword: Attention; Vision
Link ID: 16672 - Posted: 04.19.2012