Most Recent Links
By LINDA LEE A Field Notes column last Sunday (“Bridal Hunger Games”) reported on some diets that brides use to drop 15 or 20 pounds before their weddings: Weight Watchers and a personal fitness trainer, juice cleanses, the Dukan diet, diet pills, hormone shots and, new to the United States, a feeding tube diet. Readers began to respond as soon as the article went online and was posted on the Times’s Facebook page. “If you’re with someone who wants a swimsuit model for a partner, then he is free to contact Sports Illustrated and ask to date one directly,” one woman wrote on Facebook. Or why not just buy a larger size dress, asked one reader, a man. Several commenters suggested that the solution to looking good in wedding photos wasn’t losing weight, but acquiring skills in Photoshop. There were complaints about the commodification of marriage: “Just one more example of the disgusting spectacle weddings have become,” another grumped. A man jokingly suggested reverse psychology: “I say balloon up so you look as big as a house on your wedding day (wear a fat suit if you have to).” Ten years later, he wrote, people “will say admiringly how great you look today.” BluePrintCleanse’s Web site was mentioned in the column for suggesting that a bridal party cleanse together. “If a friend asked me to lose weight, or join her in such an awful venture, to be in her wedding, she wouldn’t be my friend any longer,” a woman wrote. © 2012 The New York Times Company
Keyword: Obesity; Anorexia & Bulimia
Link ID: 16682 - Posted: 04.21.2012
By LINDA LEE JENNIFER DERRICK’S weight had crept to 159 pounds from 125, and she knew she would not fit into her grandmother’s wedding dress. “Women were smaller back then, and there was nothing to let out,” said Ms. Derrick, of Rockford, Ill. She took prescription pills, had vitamin B shots and made weekly $45 visits to a Medithin clinic in Janesville, Wis. When she married on March 18, she was back to 125 pounds; the gown, from 1938, fit perfectly. In March, Jessica Schnaider, 41, of Surfside, Fla., was preparing to shop for a wedding gown by spending eight days on a feeding tube. The diet, under a doctor’s supervision, offered 800 calories a day while she went about her business, with a tube in her nose. A 2007 Cornell University study by Lori Neighbors and Jeffery Sobal found that 70 percent of 272 engaged women said they wanted to lose weight, typically 20 pounds. So brides are increasingly going on crash diets, inspired by seeing celebrities like Sarah Jessica Parker or Gwyneth Paltrow, cowed by the prospect of wearing a revealing and expensive gown and knowing that wedding photos (if not the marriage) are forever. In the two months of fittings before most clients’ weddings at Kleinfeld Bridal in New York, seamstresses are kept busy taking in gowns. Brides-to-be say, “‘I don’t want the size 16, I want the 14 or the 12,’” said Jennette Kruszka, Kleinfeld’s marketing director. © 2012 The New York Times Company
Keyword: Obesity; Anorexia & Bulimia
Link ID: 16681 - Posted: 04.21.2012
Amy Maxmen By tacking drugs onto molecules targeting rogue brain cells, researchers have alleviated symptoms in newborn rabbits that are similar to those of cerebral palsy in children. Cerebral palsy refers to a group of incurable disorders characterized by impairments in movement, posture and sensory abilities. In general, medicines tend to act broadly rather than influence certain sets of cells in the brain. “You don’t expect large molecules to enter the brain, and if they do, you don’t expect them to target specific cells, and immediately act therapeutically — but all of this happened,” says study co-author Rangaramanujam Kannan, a chemical engineer at the Johns Hopkins University School of Medicine in Baltimore, Maryland. The paper is published today in Science Translational Medicine. According to the US Centers for Disease Control and Prevention, approximately 1 in 303 children has cerebral palsy by age 8, which usually results from neurological damage in the womb, caused by, for example, a kink in the umbilical cord that briefly diminishes the foetus’s oxygen, or maternal infection. Such injuries lead to the activation of immune cells in the brain called microglia and astrocytes, which cause further inflammation and exacerbate the damage. Calming the cells is difficult, because anti-inflammatory drugs don’t easily cross the blood–brain barrier. And those that do tend to diffuse nonspecifically. “What’s amazing here is that the authors target the drug directly to the microglia,” says Mike Johnston, a paediatric neurologist at the Kennedy Krieger Institute in Baltimore. © 2012 Nature Publishing Group
Keyword: Development of the Brain; Glia
Link ID: 16680 - Posted: 04.21.2012
Ewen Callaway Medical geneticists are giving genome sequencing its first big test in the clinic by applying it to some of their most baffling cases. By the end of this year, hundreds of children with unexplained forms of intellectual disability and developmental delay will have had their genomes decoded as part of the first large-scale, national clinical sequencing projects. These programmes, which were discussed last month at a rare-diseases conference hosted by the Wellcome Trust Sanger Institute near Cambridge, UK, aim to provide a genetic diagnosis that could end years of uncertainty about a child’s disability. In the longer term, they could provide crucial data that will underpin efforts to develop therapies. The projects are also highlighting the logistical and ethical challenges of bringing genome sequencing to the consulting room. “The overarching theme is that genome-based diagnosis is now hitting mainstream medicine,” says Han Brunner, a medical geneticist at the Radboud University Nijmegen Medical Centre in the Netherlands, who leads one of the projects. About 2% of children experience some form of intellectual disability. Many have disorders such as Down’s syndrome and fragile X syndrome, which are linked to known genetic abnormalities and so are easily diagnosed. Others have experienced environmental risk factors, such as fetal alcohol exposure, that rule out a simple genetic explanation. However, a large proportion of intellectual disability cases are thought to be the work of single, as-yet-unidentified mutations. © 2012 Nature Publishing Group
Keyword: Genes & Behavior; Development of the Brain
Link ID: 16679 - Posted: 04.19.2012
Teenagers can suffer severe sleep deprivation when the clocks change, say researchers at the University of Surrey. The amount they sleep decreases to less than six hours a night on average the week following the move to British Summer Time. During this period their concentration may be lower and mood affected. Scientists also found that even before the change, teenagers were getting less than the recommended hours of sleep. The activity of some sixth-form students from Claremont Fan Court School in Surrey was studied using wristwatches. These were worn constantly over a 10-day period before and after the clocks moved forward on 25 March. The watches reliably indicated when the teenagers were awake and asleep. The researchers found that in the days following the clock change, the teenagers had less than six hours of sleep a night. Adults generally have eight. Joanne Bower, a sleep researcher at the University of Surrey, said: "During adolescence, teenagers experience a shift in their circadian rhythm [body clock] - these make sure the same things happen at the same time every day. One of these things is the production of the sleep-promoting hormone, melatonin. BBC © 2012
Keyword: Sleep; Development of the Brain
Link ID: 16678 - Posted: 04.19.2012
By Jennifer Welsh and LiveScience Can a psychiatric disorder be diagnosed with a blood test? That may be the future if two recent studies pan out. Researchers are figuring out how to differentiate the blood of a depressed person from that of someone without depression. In the latest study, published today (April 17) in the journal Translational Psychiatry, researchers identified 11 new markers, or chemicals in the blood, for early-onset depression. These markers were found at different levels in teens with depression than in teens who didn't have the condition. Currently, depression is diagnosed by a subjective test, dependent upon a person's own explanation of their symptoms, and a psychiatrist's interpretation of them. These blood tests aren't meant to replace a psychiatrist, but could make the diagnosis process easier. If a worried parent could have a family physician run a blood test, it might ease the diagnosis process during the already tough time of adolescence, said Eva Redei, a professor at Northwestern University in Evanston, Ill., who was involved in the study of the teen-depression blood test. If they hold up to further testing, blood tests could help young adults, who often go untreated because they aren't aware of their disease, get treated. The biological basis of a blood test could also help to reduce the stigma associated with depression, researchers suggest. © 2012 Scientific American
Keyword: Depression; Development of the Brain
Link ID: 16677 - Posted: 04.19.2012
By Brian Alexander Good news for all those who ever had a teacher or a parent say “If you would just apply yourself you could learn anything! You’re only using 10 percent of your brain!” All those people were wrong. If we did use only 10 percent of our brains we’d be close to dead, according to Eric Chudler, director of the Center for Sensorimotor Neural Engineering at the University of Washington, who maintains an entertaining brain science website for kids. “When recordings are made from brain EEGs, or PET scans, or any type of brain scan, there’s no part of the brain just sitting there unused,” he said. Larry Squire, a research neuroscientist with the Veterans Administration hospital in San Diego, and at the University of California San Diego, pointed out that “any place the brain is damaged there is a consequence.” Damaged brains may have been where this myth originated. During the first half of the last century, a pioneering neuroscientist named Karl Lashley experimented on rodents by excising portions of their brains to see what happened. When he put these rodents in mazes they’d been trained to navigate, he found that animals with missing bits of brain often successfully navigated the mazes. This wound up being transmuted into the idea that humans must be wasting vast brain potential. With the rise of the human potential movement in the 1960s, some preached that all sorts of powers, including bending spoons and psychic abilities, were lying dormant in our heads and that all we had to do was get off our duffs and activate them. © 2012 msnbc.com
Keyword: Brain imaging
Link ID: 16676 - Posted: 04.19.2012
by Greg Miller Spinal cord injuries cause paralysis because they sever crucial communication links between the brain and the muscles that move limbs. A new study with monkeys demonstrates a way to re-establish those connections. By implanting electrodes in a movement control center in the brain and wiring them up to electrodes attached to muscles in the arm, researchers restored movement to monkeys with a temporarily paralyzed hand. The work is the latest promising development in the burgeoning field of neuroprosthetics. In recent years, scientists have taken many steps toward creating prosthetics to help paralyzed people interact more with the world around them. They've developed methods to decode signals from electrodes implanted in the brain so that a paralyzed person can control a cursor on a computer screen or manipulate a robotic arm with their thoughts alone. Such brain implants are still experimental, and only a handful of people have received them. Several hundred patients have received a different kind of neural prosthetic that uses residual shoulder movement or nerve activity to stimulate arm muscles, allowing them to grasp objects with their hands. The new study combines these two approaches. Neuroscientist Lee Miller of the Northwestern University Feinberg School of Medicine in Chicago, Illinois, and colleagues implanted electrode grids into the primary motor cortex of two monkeys. This brain region issues commands that move muscles throughout the body, and the researchers positioned the electrodes in the part of the primary motor cortex that controls the hand, enabling them to record the electrical activity of about 100 neurons there. © 2010 American Association for the Advancement of Science.
Keyword: Robotics
Link ID: 16675 - Posted: 04.19.2012
Leila Haghighat At the turn of the twentieth century, the promise of regenerating damaged tissue was so far-fetched that Thomas Hunt Morgan, despairing that his work on earthworms would ever be applied to humans, abandoned the field to study heredity instead. Though he won the Nobel Prize in 1933 for his work on the role of chromosomes in inheritance, were he alive today, the advances in regenerative medicine might have tempted him to reconsider. Three studies published this week show that introducing new cells into mice can replace diseased cells — whether hair, eye or heart — and help to restore the normal function of those cells. These proof-of-principle studies now have researchers setting their sights on clinical trials to see if the procedures could work in humans. “You can grow cells in a Petri dish, but that’s not regenerative medicine,” says Robin Ali, a geneticist at University College London, who led the eye study. “You have to think about the biology of repair in a living system.” Sprouting hair: In work published in Nature Communications, Japanese researchers grew different types of hair on nude mice, using stem cells from normal mice and balding humans to recreate the follicles from which hair normally emerges. Takashi Tsuji, a regenerative-medicine specialist at Tokyo University of Science who led the study, says that the technique holds promise for treating male pattern baldness. © 2012 Nature Publishing Group
Keyword: Regeneration; Stem Cells
Link ID: 16674 - Posted: 04.19.2012
By Tina Hesman Saey The farmer’s wife in the nursery rhyme Three Blind Mice may need a different mouse hunting strategy. Thanks to new cell transplants, some formerly night-blind mice can see in the dark again, perhaps even well enough to evade the carving knife of the farmer’s wife. Injections of light-gathering nerve cells called rods into the retinas of night-blind mice integrated into the brain’s visual system and restored sight, Robin Ali of the University College London Institute of Ophthalmology and colleagues report online April 18 in Nature. The finding gives new hope that cell transplants may reverse damage to the brain and eyes caused by degenerative diseases and help heal spinal cord injuries. Other researchers have tried, and failed, to repair damaged retinas with stem cell transplants, says Christian Schmeer, a neurologist at the University Hospital Jena in Germany. The new study is the first to demonstrate that transplanted nerve cells can restore function. "They show it is possible and they do it convincingly," Schmeer says. At the same time, Schmeer cautions, “there’s still a lot to be done until it’s ready for clinical use.” In the study, Ali’s group transplanted immature rod cells from newborn mice into the retinas of adult mice. Rods, found in the back of the eye, work in dim light conditions. Other retina cells called cones sense bright light. In previous studies, the researchers had been able to transplant about 1,000 rods into mice’s retinas, but that wasn’t enough to restore vision. By optimizing techniques, the researchers coaxed about 26,000 rod cells to incorporate into the retina of each injected eye. © Society for Science & the Public 2000 - 2012
Keyword: Regeneration; Stem Cells
Link ID: 16673 - Posted: 04.19.2012
By Julian De Freitas and Brandon Liverence Notice that, even as you fixate on the screen in front of you, you can still shift your attention to different regions in your periphery. For decades, cognitive scientists have conceptualized attention as akin to a shifting spotlight that “illuminates” regions it shines upon, or as a zoom lens, focusing on things so that we see them in finer detail. These metaphors are commonplace because they capture the intuition that attention illuminates or sharpens things, and thus enhances our perception of them. Some of the important early studies to directly confirm this intuition were conducted by NYU psychologist Marisa Carrasco and colleagues, who showed that attention enhances the perceived sharpness of attended patterns. In their experiment, participants saw two textured patterns presented side-by-side on a computer screen, and judged which of the two patterns looked sharper. However, just before the patterns appeared, an attention-attracting cue was flashed at the upcoming location of one of the patterns. They found that attended patterns were perceived as sharper than physically identical unattended patterns. In other words, attention may make physically blurry (or otherwise degraded) images appear sharper – much like a zoom lens on a camera. Subsequent studies by Carrasco’s group and others found that attention also enhances perception of other features – for example, color saturation, orientation, and speed. This research suggests that attention causes incoming sensory information from attended locations to be processed more fully, without changing the information itself. © 2012 Scientific American
Keyword: Attention; Vision
Link ID: 16672 - Posted: 04.19.2012
By Laura Sanders The brain’s power to focus can make a single voice seem like the only sound in a room full of chatter, a new study shows. The results help explain how people can pick out a speaker from a jumbled stream of incoming sounds. A deeper understanding of this feat could help scientists better treat people who can’t sort out sound signals effectively, an ability that can decline with age. “I think this is a truly outstanding study, which has deep implications for the way we think about the auditory brain,” says auditory neuroscientist Christophe Micheyl of the University of Minnesota, who was not involved in the new research. For the project, engineer Nima Mesgarani and neurosurgeon Edward Chang, both of the University of California, San Francisco, studied what happens in the brains of people who are trying to follow one of two talkers, a scenario known to scientists as the cocktail party problem. Electrodes placed under the skulls of three people for an epilepsy treatment picked up signs of brain signals called high gamma waves produced by groups of nerve cells. The pattern and strength of these signals reflect which sounds people are paying attention to. “We are able to assess what someone is actually hearing — not just what’s coming in through their ears,” Chang says. Volunteers listened to two speakers, one female and one male, saying nonsense sentences such as “Ready tiger go to red two now.” The participants had to report the color and number spoken by the person who said one of two call signs (“ringo” or “tiger”). © Society for Science & the Public 2000 - 2012
Keyword: Attention; Hearing
Link ID: 16671 - Posted: 04.19.2012
by Carl Zimmer Ahmad Hariri stands in a dim room at the Duke University Medical Center, watching his experiment unfold. There are five computer monitors spread out before him. On one screen, a giant eye jerks its gaze from one corner to another. On a second, three female faces project terror, only to vanish as three more female faces, this time devoid of emotion, pop up instead. A giant window above the monitors looks into a darkened room illuminated only by the curve of light from the interior of a powerful functional magnetic resonance imaging (fMRI) scanner. A Duke undergraduate—we’ll call him Ross—is lying in the tube of the scanner. He’s looking into his own monitor, where he can observe pictures as the apparatus tracks his eye movements and the blood oxygen levels in his brain. Ross has just come to the end of an hour-long brain scanning session. One of Hariri’s graduate students, Yuliya Nikolova, speaks into a microphone. “Okay, we’re done,” she says. Ross emerges from the machine, pulls his sweater over his head, and signs off on his paperwork. As he’s about to leave, he notices the image on the far-left computer screen: It looks like someone has sliced his head open and imprinted a grid of green lines on his brain. The researchers will follow those lines to figure out which parts of Ross’s brain became most active as he looked at the intense pictures of the women. He looks at the brain image, then looks at Hariri with a smile. “So, am I sane?” Hariri laughs noncommittally. “Well, that I can’t tell you.” © 2012, Kalmbach Publishing Co.
Keyword: Brain imaging
Link ID: 16670 - Posted: 04.19.2012
Lizzie Buchen Teenagers can do terrible things. In 1999, Kuntrell Jackson, then 14, was walking with his cousin and a friend in Blytheville, Arkansas, when they decided to rob a local video store. On the way there, his friend, Derrick Shields, revealed that he was carrying a sawn-off shotgun in his coat sleeve. During the robbery, Shields shot a shop worker in the face, killing her. Four years later, 14-year-old Evan Miller and an older friend were getting drunk and stoned with a middle-aged neighbour in a trailer park in Moulton, Alabama. A fight broke out, and Miller and the friend beat the neighbour with a baseball bat. Then they set fire to his home and ran, leaving him to die. Both Miller and Jackson were found guilty of homicide and sentenced to life without parole, meaning that both will spend the rest of their lives in prison. They are not alone. The United States currently has more than 2,500 individuals serving such sentences for crimes they committed as juveniles — that is, before their eighteenth birthdays. It is the only country that officially punishes juveniles in this way. Both Miller and Jackson appealed, arguing that their immaturity at the time of the crime rendered them less culpable for their actions than adults, and that they deserved a less severe punishment. The Supreme Court heard arguments in Miller v. Alabama and Jackson v. Hobbs in March, and is expected to deliver its ruling by this summer. The cases are notable not only because they could abolish life-without-parole sentences for juveniles, but also because neuroscience research may play a part in the decision. © 2012 Nature Publishing Group,
Keyword: Development of the Brain; Aggression
Link ID: 16669 - Posted: 04.19.2012
Dr. Becca, author of the blog ‘Fumbling Toward Tenure’. Last week, the New York Times’ “Well” section ran a piece titled, “How Exercise Can Prime the Brain for Addiction.” Scary, right? One minute you’re cruising along on the treadmill, and next thing you know, you’re ADDICTED TO COCAINE. Hovering over the web page tab header, however, reveals what may have been the original title—the more qualified, but less provocative “How Exercise May Make Addictions Better, or Worse.” Ironically, it’s the cutting-room-floor version of the title that more accurately (but only marginally so) reflects the findings of Mustroph et al (2012), an Illinois-based research group who studied the influence of exercise on the learning processes associated with drug use. In a nutshell, the researchers showed that the timing of exercise and drug exposure mattered: animals that exercised after getting a few injections of cocaine had an easier time “letting go” of their drug-associated cues than animals that exercised before cocaine exposure did. What Mustroph et al were not studying, though, was addiction—and this is only the beginning of where NYT writer Gretchen Reynolds does a disappointingly poor job of science reporting. This paper is about learning. With every experience we have, we learn something about the circumstances in which that experience occurred, and experience with drugs is no different. If you always do drugs in a certain room of your house, or at one particular club, you’re going to start associating those places with the drug, and, in all likelihood, with the way the drug makes you feel. You might even enjoy hanging out in those places when you’re not using the drug, because of the positive associations you’ve formed. © 2012 Scientific American
Keyword: Drug Abuse; Learning & Memory
Link ID: 16668 - Posted: 04.19.2012
by Andy Coghlan A massive genetics study relying on fMRI brain scans and DNA samples from over 20,000 people has revealed what is claimed as the biggest effect yet of a single gene on intelligence – although the effect is small. There is little dispute that genetics accounts for a large amount of the variation in people's intelligence, but studies have consistently failed to find any single genes that have a substantial impact. Instead, researchers typically find that hundreds of genes contribute. Following a brain study on an unprecedented scale, an international collaboration has now managed to tease out a single gene that does have a measurable effect on intelligence. But the effect – although measurable – is small: the gene alters IQ by just 1.29 points. According to some researchers, that essentially proves that intelligence relies on the action of a multitude of genes after all. "It seems like the biggest single-gene impact we know of that affects IQ," says Paul Thompson of the University of California, Los Angeles, who led the collaboration of 207 researchers. "But it's not a massive effect on IQ overall," he says. The variant is in a gene called HMGA2, which has previously been linked with people's height. At the site of the relevant mutation, the IQ difference depends on a change of a single DNA "letter" from C, standing for cytosine, to T, standing for thymine. © Copyright Reed Business Information Ltd.
Keyword: Intelligence; Genes & Behavior
Link ID: 16667 - Posted: 04.17.2012
By DOUGLAS QUENQUA CAMBRIDGE, Mass. — For a table set up by a campus student group, this one held some unusual items: a gynecologist’s speculum, diaphragms, condoms (his and hers) and several packets of lubricant. Nearby, two students batted an inflated condom back and forth like a balloon. “This is Implanon,” said Gabby Bryant, a 22-year-old senior who had helped set up the table, showing off a sample of the implantable birth control. “Here at Harvard, you get it for free.” It was Sex Week at Harvard, a student-run program of lectures, panel discussions and blush-inducing conversations about all things sexual. The event was Harvard’s first, though the tradition started at Yale in 2002 and has since spread to colleges around the country: Brown, Northeastern, the University of Kentucky, Indiana University and Washington University have all held some version of Sex Week in recent years. Despite the busy national debate over contraception and financing for reproductive health, Sex Week at Harvard (and elsewhere) has veered away from politics, emerging instead as a response to concern among students that classroom lessons in sexuality — whether in junior high school or beyond — fall short of preparing them for the experience itself. Organizers of these events say that college students today face a confusing reality: At a time when sexuality is more baldly and blatantly on display, young people are, paradoxically, having less sex than in generations past, surveys indicate. © 2012 The New York Times Company
Keyword: Sexual Behavior
Link ID: 16666 - Posted: 04.17.2012
By Ferris Jabr In kindergarten, several of my friends and I were very serious about learning to tie our shoes. I remember sitting on the edge of the playground, looping laces into bunny ears and twisting them into a knot over and over again until I had it just right. A few years later, whistling became my new challenge. On the car ride to school or walking between classes, I puckered my lips and blew, shifting my tongue like a rudder to direct the air. Finally, after weeks of nothing but tuneless whooshing, I whistled my first note. Although I had no inkling of it at the time, my persistence rewired my brain. Just about everything we do modifies connections between brain cells—learning and memory are dependent on this flexibility. When we improve a skill through practice, we strengthen connections between neurons involved in that skill. In a recent study, scientists peeked into the brains of living mice as the rodents learned some new tricks. Mice who repeated the same task day after day grew more clusters of mushroomlike appendages on their neurons than mice who divided their attention among different tasks. In essence, the scientists observed a physical trace of practice in the brain. Yi Zuo of the University of California, Santa Cruz, and her colleagues studied how neurons changed in the brains of three groups of mice that learned different kinds of behaviors over four days, as well as a fourth group of mice that went about business as usual, learning nothing new. Of the three learning groups, the first practiced the same task each day, learning how to stretch their paws through gaps in a Plexiglass cage to get a tasty seed just within reach. The second group practiced two tasks: reaching for a seed and learning how to eat slippery bits of capellini, a very thin pasta. Each day mice in the third group played in a cage outfitted with a different set of toys, such as ropes, ladders and mesh on which to scamper and climb. © 2012 Scientific American
Keyword: Learning & Memory
Link ID: 16665 - Posted: 04.17.2012
By Ferris Jabr In search of nectar, a honeybee flies into a well-manicured suburban garden and lands on one of several camellia bushes planted in a row. After rummaging through the ruffled pink petals of several flowers, the bee leaves the first bush for another. Finding hardly any nectar in the flowers of the second bush, the bee flies to a third. And so on. Our brains may have evolved to forage for some kinds of memories in the same way, shifting our attention from one cluster of stored information to another depending on what each patch has to offer. Recently, Thomas Hills of the University of Warwick in England and his colleagues found experimental evidence for this potential parallel. "Memory foraging" is only one way of thinking about memory—and it does not apply universally to all types of information retained in the brain—but, so far, the analogy seems to work well for particular cases of active remembering. Hills and his colleagues asked 141 Indiana University Bloomington students to type the names of as many animals as they could think of in three minutes. For decades, psychologists have used such "verbal fluency tasks" to study memory and diseases in which memory breaks down, such as Alzheimer's and dementia. Again and again, researchers have found that people name animals—or vegetables or movies—in clusters of related items. They might start out saying "cat, dog, goldfish, hamster"—animals kept as pets—and then, having exhausted that subcategory, move on to ocean animals: "dolphin, whale, shark, octopus." © 2012 Scientific American
Keyword: Learning & Memory
Link ID: 16664 - Posted: 04.17.2012
by John Bohannon A smile and a frown mean the same thing everywhere—or so say many anthropologists and evolutionary psychologists, who for more than a century have argued that all humans express basic emotions the same way. But a new study of people's perceptions of computer-generated faces suggests that facial expressions may not be universal and that our culture strongly shapes the way we read and express emotions. The hypothesis that facial expressions convey the same meaning the world over goes all the way back to Charles Darwin. In his 1872 book The Expression of the Emotions in Man and Animals, the famed naturalist identified six basic emotional states: happiness, surprise, fear, disgust, anger, and sadness. If facial expressions were just cultural traits, passed down through the generations by imitation, their meanings would have diverged by now, he argued. A smile would signal happiness for some and disgust for others. But that's not what he found, based on his correspondence with researchers around the world using photos of various facial expressions. So Darwin concluded that the common ancestors of all living humans had the same set of basic emotions, with corresponding facial expressions as part of our genetic inheritance. Smiles and frowns are biological, not cultural. Or are they? Rachael Jack, a psychologist at the University of Glasgow in the United Kingdom, says that there is a fundamental flaw in the facial expression studies carried out since Darwin's time: Researchers have been using Darwin's six basic expressions as their starting point, and yet they were first identified by Western European scientists studying Western European subjects. The fact that non-Western subjects can recognize the emotions from photographs of those facial expressions has been taken as support for the universality hypothesis. But what if non-Western cultures have different basic emotions that underlie their expressions? Those expressions may be similar to those of Westerners, but with subtle differences that have gone undetected because no one has looked. © 2010 American Association for the Advancement of Science
Keyword: Emotions; Evolution
Link ID: 16663 - Posted: 04.17.2012



