Most Recent Links


Women who are optimistic have a lower risk of heart disease and death, an American study shows. The latest study by US investigators mirrors the findings of earlier work by a Dutch team showing optimism reduces heart risk in men. The research on nearly 100,000 women, published in the journal Circulation, found pessimists had higher blood pressure and cholesterol. Even taking these risk factors into account, attitude alone altered risks. Optimistic women had a 9% lower risk of developing heart disease and a 14% lower risk of dying from any cause after more than eight years of follow-up. In comparison, cynical women who harboured hostile thoughts about others or were generally mistrusting were 16% more likely to die over the same timescale. One possibility is that optimists are better at coping with adversity, and might, for example, take better care of themselves when they do fall ill. In the study, the optimistic women exercised more and were leaner than their pessimistic peers. Lead researcher Dr Hilary Tindle, assistant professor of medicine at the University of Pittsburgh, said: "The majority of evidence suggests that sustained, high degrees of negativity are hazardous to health." A spokeswoman for the British Heart Foundation said: "We know that hostile emotions can release certain chemicals in the body which may increase the risk of heart disease, but we don't fully understand how and why." © BBC

Keyword: Emotions; Stress
Link ID: 13171 - Posted: 08.17.2009

By RONI CARYN RABIN Elderly people who are physically active appear to be at lower risk of developing Alzheimer’s disease, as are those who eat a heart-healthy Mediterranean-style diet, rich in fruits and vegetables and low in red meat. Now, a new study has found that the effects of the two lifestyle behaviors are independent — and the benefits add up. The Columbia University study followed a diverse group of 1,880 septuagenarian New Yorkers, assessing their diets and levels of physical activity, and screening them periodically for Alzheimer’s disease. After an average of five years, 282 cases of Alzheimer’s were diagnosed. Those who followed the healthiest diets were 40 percent less likely to develop Alzheimer’s than those with the worst diets, and those who got the most exercise were 37 percent less likely to develop the disease than those who got none. But the greatest benefits occurred in those who both ate a healthy diet and remained active. Participants who scored in the top one-third for both diet and exercise were 59 percent less likely to be diagnosed with Alzheimer’s than those who scored in the lowest one-third. While one in five participants with the lowest scores developed Alzheimer’s, fewer than one in 10 of the top scorers developed the disease. Copyright 2009 The New York Times Company

Keyword: Alzheimers
Link ID: 13170 - Posted: 06.24.2010

Daniel H. Silverman, M.D., Ph.D. Personal Health columnist Jane Brody recently wrote about chemo brain, the sometimes prolonged mental fogginess that can follow cancer treatments, in “The Fog That Follows Chemotherapy” and “Taking Steps to Cope With Chemo Brain.” Dr. Daniel Silverman, a leading researcher in the field and co-author of “Your Brain on Chemo,” joined the Consults blog to answer readers’ questions. Here, Dr. Silverman responds to two readers’ concerns about chemotherapy, Alzheimer’s disease and attention deficit disorder. Q. Is There a Link Between Cancer Treatment and Alzheimer’s? Dr. Silverman responds: Sally, you have brought up what are really four related great questions that are of critical pertinence to the ultimate welfare of your sister, as well as of literally tens of thousands of older cancer patients who have received chemotherapy and have subsequently been diagnosed with Alzheimer’s disease. The most important thing to realize is that, no matter how confident your sister’s neurologists may be that she has Alzheimer’s disease, there’s a very good chance that they’re wrong — unless your sister has undergone biological testing that is specific for Alzheimer’s disease. Such tests might include a biopsy of her brain tissue, a lumbar puncture to extract and analyze cerebrospinal fluid, and/or (least invasively) a brain PET scan that has been interpreted by someone with specific expertise in reading that kind of scan. Copyright 2009 The New York Times Company

Keyword: Alzheimers
Link ID: 13169 - Posted: 06.24.2010

by Andy Coghlan Women: do you have a man? If you do, better beware. Chances are that some lone female has her eye on him. A new study provides evidence for what many have long suspected: that single women are much keener on pursuing a man who's already taken than one who is single. "The single women really, really liked the guy when he was taken," says Melissa Burkley of Oklahoma State University in Stillwater, who conducted the "mate-poaching" study with her colleague Jessica Parker. They asked 184 heterosexual students at the university to participate in a study on sexual attraction and told the volunteers that a computer program would match them with an ideal partner. Half the participants were single and half attached, with equal numbers of men and women in each group. Unknown to the participants, everyone was offered a fictitious candidate partner who had been tailored to match their interests exactly. The photograph of "Mr Right" was the same for all women participants, as was that of the ideal woman presented to the men. Half the participants were told their ideal mate was single, and the other half that he or she was already in a romantic relationship. "Everything was the same across all participants, except whether their ideal mate was already attached or not," says Burkley. Journal reference: Journal of Experimental Social Psychology, DOI: 10.1016/j.jesp.2009.04.022

Keyword: Sexual Behavior; Evolution
Link ID: 13168 - Posted: 08.17.2009

By Katherine Harmon Gene therapy has been rhapsodized and vilified in its nearly two decades of human testing, helping some and making others sicker. But a new 12-month clinical trial has shown that, at least in one ocular disease, it appears safe and—perhaps even more impressive—effective. The research, part of a phase I clinical trial to test the safety of the treatment, was published as a letter to the editor in The New England Journal of Medicine earlier this week and will be in the September issue of Human Gene Therapy. (The paper was co-authored by about a dozen researchers, two of whom own equity in a company that could profit from a commercialized version of this procedure.) The researchers report that three young adults with severe vision impairment from a hereditary disease maintained improved eyesight a year after gene therapy was administered—and didn't suffer any health side effects in the meantime. Gene therapy, which often employs viruses to deliver the good genes to a body's target cells, has been known to trigger severe immune responses and was blamed for the 1999 death of an 18-year-old who was receiving gene therapy for a hereditary metabolic disorder. The test subjects suffer from Leber congenital amaurosis (LCA), a form of hereditary retinal degeneration that occurs in infants and young children and is relatively rare. Most people who have lost vision due to hereditary retinal degeneration have either no photoreceptors with which to perceive light or photoreceptors that don't work. "This disease has a little bit of both," explains lead study author Artur Cideciyan, an associate research professor at the Scheie Eye Institute at the University of Pennsylvania. "It's a complex disease." © 1996-2009 Scientific American Inc.

Keyword: Vision; Genes & Behavior
Link ID: 13167 - Posted: 06.24.2010

By Vilayanur S. Ramachandran and Diane Rogers-Ramachandran All primates, including humans, have two eyes facing forward. With this binocular vision, the views through the two eyes are nearly identical. In contrast, many other animal groups, especially herbivores such as ungulates (hooved animals, including cows, sheep and deer) and lagomorphs (rabbits, for example), have eyes pointing sideways. This perspective provides largely independent views for each eye and an enormously enlarged field of view overall. Why did primates sacrifice panoramic vision? What benefit did they gain? We know binocular vision evolved several times independently in vertebrates. For example, among birds, predatory species such as owls and hawks have forward-pointing eyes. One theory is that the feature conferred a statistical advantage—two eyes are better than one—for detecting and discriminating objects, such as prey, in low light levels. But whatever the original reason for its emergence, the evolutionary novelty afforded a huge advantage: stereoscopic (literally, solid) vision. How does it work? Even though both your eyes point forward, they are separated horizontally so that they look at the world from two slightly different vantage points. It follows that each eye receives a slightly different picture of the three-dimensional scene around you; the differences (called retinal disparities) are proportional to the relative distances of the objects from you. Try this quick experiment to see what we mean: hold two fingers up, one in front of the other. Now, while fixating on the closer finger, alternately open and close each eye. You’ll notice that the farther the far finger is from you (don’t move the near finger), the greater the lateral shift in its position as you open and close each eye. On the retinas, this difference in line-of-sight shift manifests itself as disparity between the left and right eye images. © 1996-2009 Scientific American Inc.
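To put a rough number on the relationship described above (this formula is our addition, not part of the Scientific American piece, and the symbols are assumed labels): for an eye separation I, a fixation distance d and a small depth difference Δd between the two fingers, the retinal disparity η, measured in radians, is approximately

\[ \eta \;\approx\; \frac{I \, \Delta d}{d^{2}} \]

so disparity grows in proportion to the depth separation and falls off with the square of the viewing distance. For example, with a typical eye separation of about 6.5 cm, an object 2 cm behind a fixation point 40 cm away produces a disparity of roughly 0.008 radians, or about half a degree.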

Keyword: Vision
Link ID: 13166 - Posted: 06.24.2010

By Judith Burns A new study suggests that people from different cultures read facial expressions differently. East Asian participants in the study focused mostly on the eyes, but those from the West scanned the whole face. In the research, carried out by a team from Glasgow University, East Asian observers found it more difficult to distinguish some facial expressions. The work, published in the journal Current Biology, challenges the idea that facial expressions are universally understood. In the study, East Asians were more likely than Westerners to read the expression for "fear" as "surprise", and "disgust" as "anger". The researchers say the confusion arises because people from different cultural groups observe different parts of the face when interpreting expression. East Asian participants tended to focus on the eyes of the other person, while Western subjects took in the whole face, including the eyes and the mouth. Co-author Dr Rachael Jack, from the University of Glasgow, said: "Interestingly, although the eye region is ambiguous, subjects tended to bias their judgements towards less socially-threatening emotions - surprise rather than fear, for example. "This perhaps highlights cultural differences when it comes to the social acceptability of emotions." The team showed 13 Western Caucasians and 13 East Asians a set of standardised images depicting the seven main facial expressions: happy, sad, neutral, angry, disgusted, fearful and surprised. © BBC

Keyword: Emotions; Evolution
Link ID: 13165 - Posted: 08.15.2009

By Jenny Lauren Lee You are hiking in the mountains when, out of the corner of your eye, you see something suspiciously snakelike. You freeze and look more carefully, this time identifying the source of your terror: a stick. Yet you could have sworn it was a snake. The brain may play tricks, but in this case it was actually doing you a favor. The context — a mountain trail — was right for a snake. So your brain was primed to see one. And the stick was sufficiently snakelike to make your brain jump to a visual conclusion. But it turns out emotions are involved here, too. A fear of snakes means that given an overwhelming number of items to look at — rocks, shrubs, a hiking buddy — “snake” would take precedence. Studies show that the brain guesses the identity of objects before it has finished processing all the sensory information collected by the eyes. And now there is evidence that how you feel may play a part in this guessing game. A number of recent studies show that these two phenomena — the formation of an expectation about what one will see based on context and the visual precedence that emotions give to certain objects — may be related. In fact, they may be inseparable. New evidence suggests that the brain uses “affect” (pronounced AFF-ect) — a concept researchers use to talk about emotion in a cleaner, more clearly defined way — not only to tell whether an object is important enough to merit further attention, but also to see that object in the first place. © Society for Science & the Public 2000 - 2009

Keyword: Emotions; Vision
Link ID: 13164 - Posted: 06.24.2010

By Elizabeth Pennisi Imitation may be the sincerest form of flattery, but it's also a good way to make friends. People warm up to us when we unconsciously mimic them. Now, it turns out that capuchin monkeys also favor people who "flatter" them with imitation--the first time the behavior has been seen in nonhumans. "This work is really important because it shows that human social interaction ... [is] rooted in very implicit evolutionarily ancient processes," says Laurie Santos, a psychologist at Yale University. Researchers speculate that, because imitation seems to foster friendship in humans, it helps maintain the peaceful relationships that have been essential to the success of our species. Humans aren't the only social animal, however, and psychologist Annika Paukner wondered whether other primates share the imitation-friendship connection. Paukner and colleagues at the National Institute of Child Health and Human Development in Poolesville, Maryland, focused on capuchin monkeys, which live in highly social groups of 30 to 40 animals. They placed a monkey in the middle of three interconnected cages, while a person stood in front of each of the end cages. The monkey and the two people each had a small ball: One person poked, mouthed, and pounded the ball randomly; the other mimicked whatever the monkey did with the ball. Like humans, the monkeys seemed drawn to the imitation. After watching the two people for just a few minutes, the capuchins spent twice as much time looking at the imitator and almost 38% of the test period sitting directly in front of the imitator, compared with 27% in front of the other person. When the people switched places, the monkey tended to shift cages as well, showing a greater affiliation for the imitator.

Keyword: Emotions; Evolution
Link ID: 13163 - Posted: 08.15.2009

By TARA PARKER-POPE Researchers have found a genetic mutation in two people who need far less sleep than average, a discovery that might open the door to understanding human sleep patterns and lead to treatments for insomnia and other sleep disorders. The finding, published in the Friday issue of the journal Science, marks the first time scientists have identified a genetic mutation that relates to sleep duration in any animal or human. Although the mutation has been identified in only two people, the power of the research stems from the fact that the shortened sleep effect was replicated in mouse and fruit-fly studies. As a result, the research now gives scientists a clearer sense of where to look for genetic traits linked to sleep patterns. “I think it’s really a landmark study,” said Dr. Charles A. Czeisler, a leading sleep researcher and chief of sleep medicine at Brigham and Women’s Hospital in Boston. “It opens up a window to the understanding of the genetic basis of individual differences in sleep duration. Now you have a piece of the puzzle and you can begin to try to trace back as opposed to having little information as to where to start.” The gene mutation was found by scientists at the University of California, San Francisco, who were conducting DNA screening on several hundred blood samples from people who had taken part in sleep studies. Copyright 2009 The New York Times Company

Keyword: Sleep; Genes & Behavior
Link ID: 13162 - Posted: 06.24.2010

Colin Ellard is an associate professor of psychology at the University of Waterloo and the director of the university’s Research Laboratory for Immersive Virtual Environments, which is devoted to studies of the psychology of space, especially as it pertains to architecture, planning and design. He is also the author of You Are Here, a new book about the emerging psychology of direction. Mind Matters editor Gareth Cook chatted with him about the surprising ways we misunderstand the world around us. COOK: In your book, you pose the question: Which city is farther west, Reno or Los Angeles? Can you please explain? ELLARD: I based this question on some interesting research done by Barbara Tversky in which she showed that most people answer this question by saying that LA is farther west. This happens because our minds play all kinds of tricks to schematize space -- that is, to reduce complicated spatial relationships to very simple ones by aligning things that aren't aligned, straightening things that are curved, and grouping things together in ways that may not reflect reality. (California is west of Nevada so therefore everything in California must be west of everything in Nevada -- but it's not). The tricks are there for a very good reason -- they can help us to organize memories for spaces, but they can also let us down sometimes. COOK: I have looked at a map, and, honestly, I am still having trouble believing it is true... ELLARD: I'm laughing here because I actually had to fire up Google Maps to be sure I had this right. And I wrote the book! It's a testament to the power of these mental space-warping tools of ours that even when we understand how they work we still fall prey to them. © 1996-2009 Scientific American Inc.

Keyword: Miscellaneous
Link ID: 13161 - Posted: 06.24.2010

By Tina Hesman Saey Human see. Human do. As with monkeys, it’s apparently the same for some nerve cells in the brain. Macaque monkeys have specialized brain cells — called mirror neurons — that activate when a monkey performs an action involving an object, such as picking up a grape, or when watching someone else do the same task. The discovery of these neurons in 1996 led to speculation that they could be involved in everything from simulating others’ actions to language development to autism. There was only one problem: no one had definite proof that such cells exist in humans. Now a new study in the Aug. 12 Journal of Neuroscience provides strong evidence that humans have mirror neurons too. Researchers used functional MRI to examine volunteers’ brains for signs of mirror neurons. While in a scanner, volunteers either performed two different types of grips — a precision grip or putting a finger through a ring and pulling the ring — or watched videos of someone else making the movements. Groups of neurons in a part of the brain called the inferior frontal gyrus responded both to watching and doing the same action, researchers led by James Kilner, a neuroscientist at the Wellcome Trust Center for Neuroimaging at University College London in England, reported. Other groups have tried various techniques to discover human mirror neurons, but without success. Those groups had volunteers perform or imitate actions that didn’t involve objects, such as playing rock, paper, scissors or making undirected motions. But interactions with objects are necessary to activate mirror neurons in monkeys, says Scott Grafton, a cognitive neuroscientist at the University of California, Santa Barbara. Kilner succeeded in finding the neurons because his experiments explored the interaction between movements and objects, Grafton says. © Society for Science & the Public 2000 - 2009

Keyword: Vision; Autism
Link ID: 13160 - Posted: 06.24.2010

By Carolyn Y. Johnson It’s fat as you’ve never thought of it before: A few ounces can burn off up to a fifth of a day’s calories. Recent discoveries are highlighting a good type of fat, called “brown fat,’’ that offers a potential new weapon to scientists looking for ways to fight obesity. Unlike better-known white fat, brown fat converts stored energy into heat. It was long known to exist in infants but had been thought to disappear with age. Then this spring, three research groups reported that brown fat exists in at least some adults. And two groups of Boston researchers have reported finding cellular switches that can be flipped on to make brown fat cells out of ordinary skin cells and other types of cells. “This is definitely a very big change in our thinking, because it really does mean now there is an opportunity to really work with this as a way to burn off energy,’’ said Dr. C. Ronald Kahn, head of obesity and hormone action research at the Joslin Diabetes Center, who is involved in both lines of research. The discoveries raise the possibility that in the future, obesity could be treated by spurring the growth of brown fat cells in patients, transplanting such cells, or increasing the activity level of patients’ existing brown fat, Kahn said. © 2009 NY Times Co.

Keyword: Pain & Touch
Link ID: 13159 - Posted: 06.24.2010

By Ann Gibbons At a recent lunch with scientists who study the evolution of diet, a geneticist casually passed around a vial filled with strips of paper to lick, as though she were handing out toothpicks. Some people tasted nothing, whereas others puckered at the bitter flavor to varying degrees. That's because humans vary genetically in their ability to taste a bitter chemical known as phenylthiocarbamide, which elicits the same response as bitter flavors in Brussels sprouts, cabbage, and broccoli. A new study finds that if our close relatives, the Neandertals, were at that lunch, they too would have had varied tastes for bitter food, suggesting that differences in the ability to detect bitterness stretches back at least half a million years. The vast majority of living humans--at least 75% of people in the world--can detect bitter flavors. They have inherited either one or two copies of the "major taster" variant of the TAS2R38 gene, which mediates how protein receptors on the surface of the tongue detect bitterness. But about 25% of humans are insensitive to these bitter flavors. This has prompted researchers to wonder when and why this feature evolved in human evolution. In the new study, a team of Spanish researchers examined the DNA from a Neandertal from El Sidron cave in northern Spain. The team reports online today in Biology Letters that it sequenced the amino acids encoded by the TAS2R38 gene and found that 55% of the DNA included the taster version of the gene and 44% included the nontaster version. "This Neandertal was a taster although slightly less of a taster" than someone with two copies of the gene, says evolutionary biologist Carles Lalueza-Fox of the Institute of Evolutionary Biology in Barcelona, Spain, who led the study. © 2009 American Association for the Advancement of Science.

Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 13158 - Posted: 06.24.2010

By Adam Brimelow Fresh evidence has emerged of the stigma surrounding mental health problems. A poll by YouGov suggests that more than one in three of the public think people with schizophrenia are likely to be violent. Two short films that challenge this misconception have been released. They can be viewed online and will soon be screened in cinemas. The opening frames create a mood of menace and tension -- with shifting shadows, eyes twitching, a jarring soundtrack, and the flashing banner "Schizo". Bit by bit, the camera edges closer to a white, creaky door... And then, behind it, there is Stuart - pouring a cup of tea, and talking about his life with schizophrenia. "Hi there. I'm sorry to disappoint you if you were expecting a lunatic with a knife and some sort of rampage," he says. He explains he was diagnosed with the condition 12 years ago. He says many people with mental illness face prejudice, but that he had family and friends to help him lead a full life. The film's director, Jonathan Pearson, says he wants to make people confront their own attitudes about mental illness and violence. "It challenges it by using the typical conventions of a horror movie, and then halfway through the film we change," he said. "All the lights change from a horror section to a very inviting comfortable environment." © BBC

Keyword: Schizophrenia
Link ID: 13157 - Posted: 08.11.2009

Radiotherapy used to treat brain tumours may lead to a decline in mental function many years down the line, say Dutch researchers. A study of 65 patients, 12 years after they were treated, found those who had radiotherapy were more likely to have problems with memory and attention. Writing in The Lancet Neurology, the researchers said doctors should hold off using radiotherapy where possible. One UK expert said doctors were cautious about using radiotherapy. The patients in the study all had a form of brain tumour called a low-grade glioma - one of the most common types of brain tumour. In these cases radiotherapy is commonly given after initial surgery to remove the tumour, but there is some debate about whether this should be done immediately or used only if the cancer returns. It is known that radiation treatment in the brain causes some damage to normal tissue, and the study's researchers suspected it could lead to decline in mental function. A previous study in the same patients, done six years after treatment, found no difference in aspects like memory, attention and the speed at which people could process information in those who had received radiotherapy. But the latest research, carried out more than a decade after the original treatment, did find significant variation in the results of several mental tests between those who had had radiotherapy and those who had not. In all, 53% of patients who had radiotherapy showed decline in brain function, compared with 27% of patients who only had surgery. The most profound differences were in tests to measure attention. "It always depends on the patient, but if it is possible to defer radiotherapy, maybe people should," said study leader Dr Linda Douw. © BBC

Keyword: Alzheimers
Link ID: 13156 - Posted: 08.11.2009

By Steve Connor, Science Editor Men worried about keeping their marbles should take a long look at themselves in the shaving mirror. Scientists have found that the more symmetrical a man's face is, the less likely he is to suffer mental decline in very old age. Although the connection between a symmetrical face and cognitive ability may seem surprising, scientists believe that it could be explained by the idea that a good set of genes for facial symmetry may be linked with an equally good set of genes for brain preservation. The same study, however, failed to find any link between facial symmetry in women and mental decline in old age. Scientists said that they were surprised to find the link in men but not in women of the same age. The study is based on the Scottish Mental Survey undertaken in 1932 when hundreds of 11-year-olds were given an IQ test. A sample of the survivors of this study were tested again when they reached the age of 79, and then 216 of them were given a further IQ test when they reached the age of 83. Data from the health survey have enabled scientists to study the mental decline that occurs over a lifetime, and especially the more rapid decline that takes place in much later life in the years just prior to death. A research team led by Lars Penke of the University of Edinburgh analysed photographs of the 216 pensioners and compared their facial symmetries – how similar the left side is to the right – with the degree of mental decline they suffered between the ages of 79 and 83. ©independent.co.uk

Keyword: Alzheimers; Genes & Behavior
Link ID: 13155 - Posted: 06.24.2010

By AMANDA SCHAFFER You can do almost anything on the Internet these days. What about getting a good night’s sleep? It might be possible, some researchers say. Web-based programs to treat insomnia are proliferating, and two small but rigorous studies suggest that online applications based on cognitive behavioral therapy can be effective. “Fifteen years ago, people would have thought it was crazy to get therapy remotely,” said Bruce Wampold, a professor of counseling psychology at the University of Wisconsin. “But as we do more and more things electronically, including have social relationships, more therapists have come to believe that this can be an effective way to deliver services to some people.” The first controlled study of an online program for insomnia was published in 2004. But the results were hard to interpret, because they showed similar benefits for those who used the program and those in the control group. The two new studies, from researchers in Virginia and in Canada, advance the evidence that such programs can work. In the Virginia study, called SHUTi, patients enter several weeks of sleep diaries, and the program calculates a window of time during which they are allowed to sleep. Patients limit the time they spend in bed to roughly the hours that they have actually been sleeping. Copyright 2009 The New York Times Company
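To make the sleep-restriction step concrete, here is a minimal sketch of how an allowed sleep window might be derived from diary data. It is our illustration only, not the SHUTi program's actual code or algorithm; it assumes the common cognitive behavioral rule of limiting time in bed to roughly the average sleep time reported in the diaries, and the function name, diary fields, five-hour floor and fixed wake-up time are all hypothetical.

from datetime import datetime, timedelta

def sleep_window(diary_entries, wake_time="06:30", floor_hours=5.0):
    """Illustrative sleep-restriction window (hypothetical, not SHUTi's algorithm).

    diary_entries: dicts with 'time_asleep_h', the hours actually slept each
    night over a week or more of diaries. The allowed window equals the
    average reported sleep time (never below a safety floor) and ends at the
    patient's chosen wake-up time.
    """
    avg_sleep = sum(e["time_asleep_h"] for e in diary_entries) / len(diary_entries)
    window_h = max(avg_sleep, floor_hours)  # never prescribe less time in bed than the floor
    wake = datetime.strptime(wake_time, "%H:%M")
    earliest_bedtime = (wake - timedelta(hours=window_h)).strftime("%H:%M")
    return earliest_bedtime, wake_time

# A week of diaries averaging 6 hours of sleep yields a 6-hour window.
diaries = [{"time_asleep_h": h} for h in (6.0, 5.5, 6.5, 6.0, 6.2, 5.8, 6.0)]
print(sleep_window(diaries))  # ('00:30', '06:30')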

Keyword: Sleep
Link ID: 13154 - Posted: 06.24.2010

By NICHOLAS BAKALAR Researchers have found experimental evidence that a touch can be worth a thousand words, that fleeting physical contact can express specific emotions — silently, subtly and unmistakably. Scientists led by Matthew J. Hertenstein, an associate professor of psychology at DePauw University, recruited 248 students, each to touch or be touched by a partner previously unknown to them to try to communicate a specific emotion: anger, fear, happiness, sadness, disgust, love, gratitude or sympathy. The person touched was blindfolded and ignorant of the sex of the toucher, who was instructed to try to convey one of the eight emotions, and both participants remained silent. Forty-four women and 31 men touched a female partner, while 25 men and 24 women touched a male partner. Afterward, each person touched was given the list of eight emotions and told to pick the one conveyed. There was also a ninth choice, “none of these terms are correct,” to eliminate the possibility of forcing a choice of emotion when none were truly felt. The touchers were instructed to touch any appropriate part of the body, and they chose variously to touch the head, face, arms, hands, shoulders, trunk and back. Accurate understanding ranged from 50 percent to 78 percent, much higher than the 11 percent expected by chance and comparable to rates seen in studies of verbal and facial emotion. Copyright 2009 The New York Times Company

Keyword: Emotions; Pain & Touch
Link ID: 13153 - Posted: 06.24.2010

By C. CLAIBORNE RAY Q. If I eat a raw jalapeño pepper, my mouth is afire, my eyes water and my nose runs. How can some people eat pepper after pepper without pain? Have they destroyed the sensory receptors in their mouths and throats? A. No receptors are destroyed, said Harry T. Lawless, a professor of food science at Cornell and an expert in the taste, smell and sensory evaluation of food. Instead, “people who eat a lot of the stuff tend to develop a tolerance that we call desensitization,” he said. “There is nothing harmful in the capsaicin molecule, the active ingredient of hot peppers,” he said. “Capsaicin is kind of a harmless drug, and like any drug we develop a tolerance to it.” One theory is that a neurotransmitter gets depleted so that people respond less vigorously to capsaicin the more they are exposed to it, he said. The capsaicin molecule has both stimulating and anesthetic properties, Dr. Lawless said. In 1952, The Dublin Medical Press recommended it as a temporary cure for toothache, he said, and pharmacologists, particularly in Hungary, have studied this anesthetic property in related molecules. “The antidote to the mouth burning and the eyes watering is to eat more,” Dr. Lawless said, “either right away or later.” Chronic desensitization seems to be a matter of long-term dietary change, he said, but there is also the short-term numbing effect. Copyright 2009 The New York Times Company

Keyword: Pain & Touch
Link ID: 13152 - Posted: 06.24.2010