Most Recent Links



Links 13661 - 13680 of 29398

By James Gallagher Health and science reporter, BBC News It may be possible to detect autism at a much earlier age than previously thought, according to an international team of researchers. A study published in Current Biology identified differences in infants' brainwaves from as early as six months. Behavioural symptoms of autism typically develop between a child's first and second birthdays. Autism charities said identifying the disorder at an earlier stage could help with treatment. It is thought that one in every 100 children has an autism spectrum disorder in the UK. It affects more boys than girls. While there is no "cure", education and behavioural programmes can help. One of the researchers, Prof Mark Johnson from Birkbeck College, University of London, told the BBC: "The prevailing view is that if we are able to intervene before the onset of full symptoms, such as a training programme, at least in some cases we can maybe alleviate full symptoms." His team looked for the earliest signs of autism in 104 children aged between six and 10 months. Half were known to be at risk of the disorder because they had an older sibling who had been diagnosed with autism. The rest were low risk. Older children with autism can show a lack of eye contact, so the babies were shown pictures of people's faces that switched between looking at or away from the baby. BBC © 2012

Keyword: Autism
Link ID: 16309 - Posted: 01.28.2012

by Michael Marshall For most of us, blurry vision is a bad thing, if only because it means we're going to have to spend a lot of money on a new pair of glasses. For one jumping spider, though, it's how it catches dinner. Adanson's house jumper, as the name implies, is a jumping spider. It springs on unsuspecting prey insects from several centimetres away and swiftly dispatches them. To pull off these leaps, it has to be an excellent judge of distance. And for that, paradoxically, it has part of its visual field permanently out of focus. It's the only animal known to judge distance in this way. Stalk, jump and bite The Adanson's house jumper is a cosmopolitan species – meaning it lives all over the place. It hunts during the day, pouncing on insects and other prey, although like many jumping spiders it may also take the occasional drink of nectar. To cope with its agile lifestyle, it must have excellent eyesight. How it works is not obvious, though. Lab tests have shown that it has top-class colour vision, but that doesn't help it judge distance. Other animals have all sorts of ways to work out how far away an object is, the most obvious being simply to have two eyes with overlapping fields of vision and compare what they see. © Copyright Reed Business Information Ltd.

Keyword: Vision; Evolution
Link ID: 16308 - Posted: 01.28.2012

By Jason G. Goldman Which limb do you prefer? If you’re like most members of our species, you prefer your right hand for most tasks. If you’re like a smaller minority of our species, you might prefer your left hand. Very, very few of us are truly ambidextrous. Most of us have at least a minor preference for one hand over the other. So do wallabies. On the one hand (ha!), this shouldn’t be all that surprising. Nervous systems became lateralized quite early in the evolution of vertebrates. For example, there is research showing that fish show a preference for touching the sides of aquariums with one side of their ventral fins or another. And it is not surprising that humans overwhelmingly favor their right hands. When it comes to feeding behaviors, fishes, reptiles, and toads all favor their right eye (and their brain’s left hemisphere). The same is true for birds like chickens, pigeons, quails, and stilts. The right-eye preference can be so strong that one bird – New Zealand wry-billed plover – evolved a beak that slopes slightly to the right. And a study of seventy-five whales showed that sixty of them had abrasions on the right side of their jaws, while the other fifteen had only injured the left side of their jaws. As Peter F. MacNeilage, Lesley J. Rogers and Giorgio Vallortigara pointed out in a 2009 article in Scientific American, the data indicated that whales tended to use one side of the jaw more than the other for gathering food, “and that ‘right-jawedness’ is by far the norm.” On the other hand, we have no real reason to automatically assume that wallabies would show a limb preference, just because a diverse handful of other species do. After all, the vast majority of research on limb preference and of behavioral laterality more generally has focused on primates, mainly because researchers’ main goal has been to discern the evolutionary origins of brain asymmetry and handedness in humans. © 2012 Scientific American

Keyword: Laterality
Link ID: 16307 - Posted: 01.28.2012

By Laura Sanders Humankind’s sharpest minds have figured out some of nature’s deepest secrets. Why the sun shines. How humans evolved from single-celled life. Why an apple falls to the ground. Humans have conceived and built giant telescopes that glimpse galaxies billions of light-years away and microscopes that illuminate the contours of a single atom. Yet the peculiar quality that enabled such flashes of scientific insight and grand achievements remains a mystery: consciousness. Though in some ways deeply familiar, consciousness is at the same time foreign to those in its possession. Deciphering the cryptic machinations of the brain — and how they create a mind — poses one of the last great challenges facing the scientific world. For a long time, the very question was considered to be in poor taste, acceptable for philosophical musing but outside the bounds of real science. Whispers of the C-word were met with scorn in polite scientific society. Toward the end of the last century, though, sentiment shifted as some respectable scientists began saying the C-word out loud. Initially these discussions were tantalizing but hazy: Like kids parroting a dirty word without knowing what it means, scientists speculated on what consciousness is without any real data. After a while, though, researchers developed ways to turn their instruments inward to study the very thing that was doing the studying. © Society for Science & the Public 2000 - 2012

Keyword: Consciousness; Attention
Link ID: 16306 - Posted: 01.28.2012

By Tom Siegfried When Francis Crick decided to embark on a scientific research career, he chose his specialty by applying the “gossip test.” He’d noticed that he liked to gossip about two especially hot topics in the 1940s — the molecular basis for heredity and the mysteries of the brain. He decided to tackle biology’s molecules first. By 1953, with collaborator James Watson (and aided by data from competitor Rosalind Franklin), Crick had identified the structure of the DNA molecule, establishing the foundation for modern genetics. A quarter century later, he decided it was time to try the path not taken and turn his attention to the brain — in particular, the enigma of consciousness. At first, Crick believed the mysteries of consciousness would be solved with a striking insight, similar to the way the DNA double helix structure explained heredity’s mechanisms. But after a while he realized that consciousness posed a much tougher problem. Understanding DNA was easier because it appeared in life’s history sooner; the double helix template for genetic replication marked the beginning of evolution as we know it. Consciousness, on the other hand, represented evolution’s pinnacle, the outcome of eons of ever growing complexity in biochemical information processing. “The simplicity of the double helix … probably goes back to near the origin of life when things had to be simple,” Crick said in a 1998 interview. “It isn’t clear there will be a similar thing in the brain.” © Society for Science & the Public 2000 - 2012

Keyword: Consciousness; Attention
Link ID: 16305 - Posted: 01.28.2012

By Rebecca Cheung The protein-based pathogens known as prions may pass between different species more easily than has been thought, a team of French researchers reports in the Jan. 27 Science. By infecting engineered mice with prions from cows and goats, scientists also have shown that the invaders readily target tissues other than the brain. “We may underestimate the threat posed by some of these diseases by focusing only on the brain,” says Pierluigi Gambetti, a prion researcher at Case Western Reserve University in Cleveland. “It adds a new element to the equation.” The research also raises the possibility that new prion strains recently identified in cattle and small rodents might be able to jump to other species, including humans. “We should, in the future, be more exhaustive when looking at the possibility of prions being passed from one species to another,” says Hubert Laude, a professor at the French National Institute for Agricultural Research in Jouy-en-Josas and a coauthor of the study. Prions closely resemble normal proteins made by a host. When prions invade a host, they propagate by forcing these normal host proteins, actually called prion proteins, to assemble improperly. When these malformed proteins accumulate in the brain, they cause mind-wasting conditions such as Creutzfeldt-Jakob disease in people and scrapie in sheep. © Society for Science & the Public 2000 - 2012

Keyword: Prions
Link ID: 16304 - Posted: 01.28.2012

By Amber Dance A fighter pilot heads back to base after a long mission, feeling spent. A warning light flashes on the control panel. Has she noticed? If so, is she focused enough to fix the problem? Thanks to current advances in electroencephalographic (EEG) brain-wave detection technology, military commanders may not have to guess the answers to these questions much longer. They could soon be monitoring her mental state via helmet sensors, looking for signs she is concentrating on her flying and reacting to the warning light. This is possible because of two key advances that made EEG technology wireless and mobile, says Scott Makeig, director of the University of California, San Diego's Swartz Center for Computational Neuroscience (SCCN) in La Jolla, Calif. EEG used to require users to sit motionless, weighted down by heavy wires. Movement interfered with the signals, so that even an eyebrow twitch could garble the brain impulses. Modern technology lightened the load and wirelessly linked the sensors and the computers that collect the data. In addition, Makeig and others developed better algorithms—in particular, independent component analysis. By reading signals from several electrodes, they can infer where, within the skull, a particular impulse originated. This is akin to listening to a single speaker's voice in a crowded room. In so doing, they are also able to filter out movements—not just eyebrow twitches, but also the muscle flexing needed to walk, talk or fly a plane. © 2012 Scientific American,
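The separation idea described above can be illustrated with a toy example. Independent component analysis estimates an unmixing matrix from the data alone; the sketch below assumes the mixing matrix is known and simply inverts it, which is the same final demixing step. All signal names and numbers here are illustrative, not from the research.

```python
# Toy illustration of linear demixing, the idea behind ICA-based EEG
# cleanup: each electrode records a weighted mix of underlying sources
# (brain rhythm, muscle artifact). Real ICA estimates the unmixing
# matrix from the data; here the 2x2 mixing matrix is assumed known.
import math

def mix(sources, A):
    """Apply a 2x2 mixing matrix A to two source channels."""
    s1, s2 = sources
    return ([A[0][0] * a + A[0][1] * b for a, b in zip(s1, s2)],
            [A[1][0] * a + A[1][1] * b for a, b in zip(s1, s2)])

def unmix(channels, A):
    """Recover the sources by applying the inverse of the 2x2 matrix A."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    inv = [[ A[1][1] / det, -A[0][1] / det],
           [-A[1][0] / det,  A[0][0] / det]]
    return mix(channels, inv)

t = [i / 100 for i in range(200)]
brain  = [math.sin(2 * math.pi * 10 * x) for x in t]  # 10 Hz "alpha" source
muscle = [math.sin(2 * math.pi * 45 * x) for x in t]  # 45 Hz artifact source
A = [[0.8, 0.6], [0.3, 0.9]]                          # hypothetical mixing
electrodes = mix((brain, muscle), A)                  # what the sensors see
recovered_brain, recovered_muscle = unmix(electrodes, A)
```

Because the demixing here uses the exact inverse, the sources come back essentially unchanged; ICA's contribution, omitted in this sketch, is recovering that matrix blindly from the electrode recordings.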

Keyword: Robotics
Link ID: 16303 - Posted: 01.28.2012

by Sarah C. P. Williams What do you get when you combine a monkey's brain with the whiskers of a rat? A robotic rodent that can sense its environment almost as well as the real thing. The new rat-bot could lead to the development of robots that can feel their way through earthquake rubble and could provide clues to how live rats analyze sensory information from their whiskers. Although recent research has helped scientists understand what information whiskers send to the brains of rodents, deciphering how rats and mice interpret that sensory information has been trickier. Previous models assumed that rodents looked at whisker movement patterns and vibrations over a set duration of time and that their brain made a decision, based on the whole of the data, about the most likely surface the whiskers were touching. If the overall data best matched the known patterns for a hard vinyl floor, for example, the rats would conclude that's the surface that they're on. But different robots created using this model of reasoning were only 50% to 80% accurate at guessing the floor underneath them after 0.4 seconds of exposure, multiple studies have found. Computational neuroscientist Nathan Lepora of the University of Sheffield in the United Kingdom and his team thought that a model of information processing recently discovered in monkeys might help the robots make better judgments on floor type. The primates don't use a single piece of evidence to make a decision about what they're seeing. Rather, their brains rely on an accumulation of data. When the monkeys watch screens of randomly moving dots, for example, different neurons sense each direction of movement: up, down, left, and right. As dots on the screen flit about, more neurons of each type begin to fire, accumulating a total activity level for the group of neurons. Once, say, the "up" neurons reach a specific threshold, they pass on the message that the dots are moving in that direction. 
© 2010 American Association for the Advancement of Science.
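The accumulate-to-threshold decision scheme described above can be sketched as a sequential probability ratio test: evidence for each hypothesis builds sample by sample until one side crosses a bound. The surface names, Gaussian sample model, and all parameters below are illustrative assumptions, not details of the robot study.

```python
# Hedged sketch of evidence accumulation to a threshold (SPRT-style),
# the kind of decision rule described for the whiskered robot.
import random

def sprt_classify(samples, mu_a, mu_b, sigma, threshold=5.0):
    """Accumulate the log-likelihood ratio of surface A over surface B
    (Gaussian sample models, shared sigma) until it crosses +/- threshold."""
    llr = 0.0
    t = 0
    for t, x in enumerate(samples, start=1):
        # log p(x|A) - log p(x|B) for equal-variance Gaussians
        llr += ((x - mu_b) ** 2 - (x - mu_a) ** 2) / (2 * sigma ** 2)
        if llr >= threshold:
            return "A", t          # enough evidence for A
        if llr <= -threshold:
            return "B", t          # enough evidence for B
    return ("A" if llr > 0 else "B"), t  # forced choice at timeout

random.seed(0)
# Simulated whisker-vibration readings on surface A (mean 1.0)
stream = (random.gauss(1.0, 0.5) for _ in range(200))
label, steps = sprt_classify(stream, mu_a=1.0, mu_b=0.0, sigma=0.5)
```

Unlike a fixed-duration rule (look for 0.4 seconds, then guess), this accumulator answers as soon as the evidence is decisive, so easy surfaces are classified quickly and ambiguous ones get more sampling time.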

Keyword: Pain & Touch; Robotics
Link ID: 16302 - Posted: 01.26.2012

Erin Allday, Chronicle Staff Writer Morgellons disease - a creepy illness that leaves patients with painful lesions, gives them a feeling that bugs are crawling all over their body, and has them seeing colorful, threadlike fibers poking through their skin - isn't infectious and probably isn't caused by anything in the environment, according to the first government study of the condition. Rather, Morgellons is likely to be a mental illness and should probably be treated with the same drug and psychiatric care that works for people who suffer delusions, researchers with the U.S. Centers for Disease Control and Prevention said Wednesday. "There were some possibilities of what could be causing this, and we've taken a couple of the big ones off the table. That's a really big step forward," said Dr. Mark Eberhard, director of the CDC's Division of Parasitic Diseases and Malaria and a lead investigator in the study. Seeking acknowledgement The study focused on patients in the Bay Area, where a cluster of Morgellons cases has been reported over the past several years. Patients all complained of the same strange, often horrifying symptoms, and they became increasingly angry and frustrated that physicians weren't taking their condition seriously. Just getting the research done was a major coup for Morgellons sufferers, who had been clamoring for a serious scientific study of their illness for years. But by ruling out infectious and environmental causes of the disease and suggesting it's a delusional condition, the CDC report was disappointing, patients said Wednesday. © 2012 Hearst Communications Inc.

Keyword: Depression
Link ID: 16301 - Posted: 01.26.2012

By BENEDICT CAREY When does a broken heart become a diagnosis? In a bitter skirmish over the definition of depression, a new report contends that a proposed change to the diagnosis would characterize grieving as a disorder and greatly increase the number of people treated for it. The criteria for depression are being reviewed by the American Psychiatric Association, which is finishing work on the fifth edition of its Diagnostic and Statistical Manual of Mental Disorders, or D.S.M., the first since 1994. The manual is the standard reference for the field, shaping treatment and insurance decisions, and its revisions will affect the lives of millions of people for years to come. In coming months, as the manual is finalized, outside experts will intensify scrutiny of its finer points, many of which are deeply contentious in the field. A controversy erupted last week over the proposed tightening of the definition of autism, possibly sharply reducing the number of people who receive the diagnosis. Psychiatrists say current efforts to revise the manual are shaping up as the most contentious ever. The new report, by psychiatric researchers from Columbia and New York Universities, argues that the current definition of depression — which excludes bereavement, the usual grieving after the loss of a loved one — is far more accurate. If the “bereavement exclusion” is eliminated, they say, “there is the potential for considerable false-positive diagnosis and unnecessary treatment of grief-stricken persons.” Drugs for depression can have side effects, including low sex drive and sleeping problems. © 2012 The New York Times Company

Keyword: Depression
Link ID: 16300 - Posted: 01.26.2012

by Debora MacKenzie Sleeping sickness is a formidable foe, killing thousands in Africa every year. There are only five drugs to combat the parasite, which is carried by tsetse flies, and they can have severe side effects. Worse, the parasite is becoming resistant. "If we knew how the drugs work, we could perhaps design better ones," says David Horn of the London School of Hygiene and Tropical Medicine. To investigate, Horn's lab exploited a phenomenon called RNA interference (RNAi) – the ability of certain small RNA molecules to block the activity of individual genes. Horn's team used a previously created DNA library, in which the parasite's genome was cut into chunks, and these were put into bacteria in a way that generated the interfering RNAs. Each of these inactivated a parasite gene with the corresponding genetic code. The researchers then exposed parasites to all of the interfering RNA molecules as well as each of the five drugs. If the parasites survived, it meant that the RNA sequences that had bound to them must have blocked a gene or genes needed for that drug to work. They then mapped those RNA sequences in the parasite's DNA. This revealed 55 genes that the drugs interact with – a step towards working out how they kill the parasite and finding safer drugs with the same effect. © Copyright Reed Business Information Ltd.

Keyword: Sleep
Link ID: 16299 - Posted: 01.26.2012

By Erica Westly Amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease, is a progressive neuromuscular disease that affects about 130,000 people worldwide a year. The vast majority of patients are isolated cases with no known family history of the disease. They usually start developing symptoms of the loss of motor neurons in middle age and die within five years of diagnosis. Researchers know very little about what causes ALS. Now a recent study in Nature Biotechnology suggests that the neuron death associated with the disease may be caused by astrocytes, a type of brain cell that normally helps neurons. Previous research had suggested that astrocytes could become toxic in the rare form of ALS known to have genetic roots, and the study authors wanted to see if a similar phenomenon might happen in the more common iso­­lated cases. The answer turned out to be yes: when they cultured astro­cytes from those ALS patients, the healthy motor neurons in the culture began to die off after a few days. Other types of neurons were unaffected by the astrocytes, suggesting that they specifically harm the neurons involved in controlling the body’s movements. Lead author Brian Kaspar, a neuroscientist at Ohio State University, and his collaborators next will attempt to figure out what makes the astrocytes behave this way. If researchers can understand why motor neurons die in ALS, they may have a better chance of finding a cure. © 2012 Scientific American,

Keyword: ALS-Lou Gehrig's Disease ; Glia
Link ID: 16298 - Posted: 01.26.2012

By GINA KOLATA Fat people have less than thin people. Older people have less than younger people. Men have less than women. It is brown fat, actually brown in color, and its great appeal is that it burns calories like a furnace. A new study finds that one form of it, which is turned on when people get cold, sucks fat out of the rest of the body to fuel itself. Another new study finds that a second form of brown fat can be created from ordinary white fat by exercise. Of course, researchers say, they are not blind to the implications of their work. If they could turn on brown fat in people without putting them in cold rooms or making them exercise night and day, they might have a terrific weight loss treatment. And companies are getting to work. But Dr. André Carpentier, an endocrinologist at the University of Sherbrooke in Quebec and lead author of one of the new papers, notes that much work lies ahead. It is entirely possible, for example, that people would be hungrier and eat more to make up for the calories their brown fat burns. “We have proof that this tissue burns calories — yes, indeed it does,” Dr. Carpentier said. “But what happens over the long term is unknown.” Until about three years ago, researchers thought brown fat was something found in rodents, which cannot shiver and use heat-generating brown fat as an alternate way to keep warm. Human infants also have it, for the same reason. But researchers expected that adults, who shiver, had no need for it and did not have it. © 2012 The New York Times Company

Keyword: Obesity
Link ID: 16297 - Posted: 01.26.2012

By GRETCHEN REYNOLDS Alzheimer’s disease, with its inexorable loss of memory and self, understandably alarms most of us. This is especially so since, at the moment, there are no cures for the condition and few promising drug treatments. But a cautiously encouraging new study from The Archives of Neurology suggests that for some people, a daily walk or jog could alter the risk of developing Alzheimer’s or change the course of the disease if it begins. For the experiment, researchers at Washington University in St. Louis recruited 201 adults, ages 45 to 88, who were part of a continuing study at the university’s Knight Alzheimer’s Disease Research Center. Some of the participants had a family history of Alzheimer’s, but none, as the study began, showed clinical symptoms of the disease. They performed well on tests of memory and thinking. “They were, as far as we could determine, cognitively normal,” says Denise Head, an associate professor of psychology at Washington University who led the study. The volunteers had not had their brains scanned, however, so the Washington University scientists began their experiment by using positron emission tomography, an advanced scanning technique, to look inside the volunteers’ brains for signs of amyloid plaques, the deposits that are a hallmark of Alzheimer’s. People with a lot of plaque tend to have more memory loss, though the relation is complex. Next they genetically typed their volunteers for APOE, a gene involved in cholesterol metabolism. Everyone carries the APOE gene, but scientists have determined that those who have a particular variation of the gene known as e4 are at 15 times the risk of developing Alzheimer’s compared with those who do not carry the variant. © 2012 The New York Times Company

Keyword: Alzheimers; Genes & Behavior
Link ID: 16296 - Posted: 01.26.2012

Ewen Callaway Skin cells from patients with Alzheimer’s disease have been reprogrammed to form brain cells, offering clues to their dementia and, for others, the prospect of early diagnosis and new ways of finding treatments. An estimated 30 million people worldwide have Alzheimer’s disease, which causes neurodegeneration and typically strikes late in life. The disease is nearly impossible to diagnose before symptoms develop, and no drugs exist at present that can change its course. Scientists aiming to learn the causes of Alzheimer’s have looked to brain biopsies of patients after they die, blood tests and animals as diverse as fruitflies and fish. Until recently, it has not been possible to probe the neurons of Alzheimer’s patients before they show symptoms. “By the time you can see dementia in a person, their brain cells have been behaving in an abnormal way for years, perhaps decades or longer,” says Larry Goldstein, a neuroscientist at the University of California, San Diego, who led the study published online today in Nature. Goldstein and his team created induced pluripotent stem (iPS) cells from four patients with Alzheimer’s and two people without dementia. iPS cells are made by treating fibroblasts, a type of skin cell, with reprogramming factors to revert them to an embryonic-like state. Like the stem cells in early embryos, iPS cells can form any tissue in the body — including neurons. © 2012 Nature Publishing Group

Keyword: Alzheimers
Link ID: 16295 - Posted: 01.26.2012

Mark Napadano watched in horror as his 13-year-old son slammed head first into the hard ground after a motocross accident. In seconds he was at the side of his son, Sam, terrified by the sight of the junior high athlete so full of life just moments before lying limp in front of him - and not breathing. “It was like a nightmare,” Mark remembers. At the hospital doctors examined Sam and gave Mark the frightening news: Sam had a large pocket of blood pooling near the top of his head and two smaller bleeds in the front and two in the back. “They didn’t say he was going to die, but they didn’t say he was going to live,” recalls the 45-year-old car dealer from Butler, Pa. Sam was in a coma for days and in critical care for almost a month. By the time he was released to a rehab facility the 5-foot-4-inch teen had dropped from a trim and muscular 114 pounds to just 84. For months Mark and his wife, Sue, watched as their son learned to talk and walk for a second time. Now, three years after the wreck, Sam is almost back to where he was before, Mark says. Sam returned to school three months after the accident and kept up his rehab for two years. He still has some short term memory problems and though his working memory has improved, it can be a challenge if too many commands are thrown his way at the same time. © 2012 msnbc.com

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 16294 - Posted: 01.24.2012

By JAMES GORMAN Disgust is the Cinderella of emotions. While fear, sadness and anger, its nasty, flashy sisters, have drawn the rapt attention of psychologists, poor disgust has been hidden away in a corner, left to muck around in the ashes. No longer. Disgust is having its moment in the light as researchers find that it does more than cause that sick feeling in the stomach. It protects human beings from disease and parasites, and affects almost every aspect of human relations, from romance to politics. In several new books and a steady stream of research papers, scientists are exploring the evolution of disgust and its role in attitudes toward food, sexuality and other people. Paul Rozin, a psychologist who is an emeritus professor at the University of Pennsylvania and a pioneer of modern disgust research, began researching it with a few collaborators in the 1980s, when disgust was far from the mainstream. “It was always the other emotion,” he said. “Now it’s hot.” It still won’t wear glass slippers, which may be just as well, given the stuff it has to walk through. Nonetheless, its reach takes disgust beyond the realms of rot and excrement. © 2012 The New York Times Company

Keyword: Emotions; Evolution
Link ID: 16293 - Posted: 01.24.2012

By ANDREW POLLACK LOS ANGELES — A treatment for eye diseases that is derived from human embryonic stem cells might have improved the vision of two patients, bolstering the beleaguered field, researchers reported Monday. The report, published online in the medical journal The Lancet, is the first to describe the effect on patients of a therapy involving human embryonic stem cells. The paper comes two months after the Geron Corporation cast a pall over the field by abruptly halting the world’s first clinical trial based on embryonic stem cells — one aimed at treating spinal cord injury. Geron, which has not published results from the aborted trial, also said it would abandon the entire stem cell field. The results reported Monday could help lift some of that pall. They come from the second clinical trial involving the stem cells, using a therapy developed by Advanced Cell Technology to treat macular degeneration, a leading cause of blindness. “It’s a big step forward for regenerative medicine,” said Dr. Steven D. Schwartz, a retina specialist at the University of California, Los Angeles, who treated the two patients. Both patients, who were legally blind, said in interviews that they had gains in eyesight that were meaningful for them. One said she could see colors better and was able to thread a needle and sew on a button for the first time in years. The other said she was able to navigate a shopping mall by herself. © 2012 The New York Times Company

Keyword: Vision; Stem Cells
Link ID: 16292 - Posted: 01.24.2012

by Andy Coghlan It is the light we think we see that counts. Optical illusions designed to seem brighter than they are make your pupils constrict a little more. This suggests that we have evolved systems for anticipating dazzling light to protect our eyes. Our pupils' fast response to light appears to occur even without input from the brain. For example, it is seen in people with damage to the visual cortex. Appearances can be deceptive, though. Bruno Laeng of the University of Oslo in Norway measured tiny changes in pupil size as volunteers viewed various illusions that were all identical in brightness, though they did not look so. If light levels alone dictated pupil size, they would have reacted identically whichever image a person viewed. Instead, people's pupils constricted more when they viewed the illusions designed to appear brightest. "What's surprising is that even something as simple as how bright we think our environment is will be affected by our expectations," says Stuart Peirson of the University of Oxford, who was not involved in the study. Previous studies show that the brain controls pupil size in other situations: our pupils dilate when we make decisions, for instance. Journal reference: Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.1118298109 © Copyright Reed Business Information Ltd.

Keyword: Vision
Link ID: 16291 - Posted: 01.24.2012

Erin Allday, Chronicle Staff Writer A Stanford study sheds new light on the old cliche about women having a higher tolerance for pain than men - according to tens of thousands of electronic patient records, women tend to report much more severe pain than men, no matter the source of the pain. The study being released today found that when asked to rate their pain on a scale of 0 to 10 - with 0 being no pain at all, and 10 being the worst pain imaginable - women on average scored their pain 20 percent more intense than men. The results held up across a wide variety of diseases and injuries, including back and neck pain, digestive disorders, sinus infections, and even ankle strains and sprains. In almost every category researchers looked at, women reported more pain than men. "We may have to adjust our thinking about how men and women report their pain. The killer question is: Do women actually feel more pain than men?" said Dr. Atul Butte, lead author of the study, which was published in the Journal of Pain. "That may be more philosophy than anything - how can we tell that for sure?" Of course, the fact that women report more pain overall doesn't necessarily mean they have more or less tolerance to pain than men, Butte said, adding that his results have been the source of some lighthearted debate with his wife. The study doesn't explain the reason for the difference, and researchers say it could include social, psychological or biological factors. Men may be more reluctant to confess intense pain to a female nurse, for example. Women are more likely than men to suffer from depression and anxiety, two psychological conditions that can increase susceptibility to pain. © 2012 Hearst Communications Inc.

Keyword: Pain & Touch; Sexual Behavior
Link ID: 16290 - Posted: 01.24.2012