Most Recent Links




by Kristen Minogue Most people don't want their parents meddling in their sex lives. But for one species of ape, having mom nearby can actually increase the odds of hooking up with an eligible mate. Bonobos—our closest living relatives, along with chimpanzees—aren't puritanical. Sex for these apes is a public, accepted form of social currency. They use it to acquire food from others, defuse conflicts, and ingratiate themselves with their superiors. But bonobos also live under a rigid social hierarchy. An ape retains its rank even when a community splits up into smaller groups to forage for food, which the primates do frequently. Normally with bonobos, the highest-ranking male in the group also mates the most, typically with the nubile females. But male bonobos also stick close to their mothers, sometimes spending as much as 90% of their time in their company. Because the mother-son bond is so strong, biologist Martin Surbeck of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and colleagues wondered whether having a mother close by could upset the mating hierarchy. For almost 2.5 years, the researchers observed a community of more than 30 wild bonobos in the Democratic Republic of the Congo's Salonga National Park, keeping a close eye on the adult and adolescent males—nine in all. When the apes split into small groups with fertile females but no moms, the highest-ranking male had about 40% of the intercourse with the females, the team will report online tomorrow in the Proceedings of the Royal Society B. But if every male's mom was also present, the top male managed only about 25% of the matings, leaving more for the subordinate males. Mom's presence didn't change the hierarchy, but it did level the playing field somewhat for the apes further down, says Surbeck. "The mother's like a social passport." © 2010 American Association for the Advancement of Science.

Keyword: Sexual Behavior; Evolution
Link ID: 14416 - Posted: 09.03.2010

By Katherine Harmon Marine worms might seem like lowly, slow-witted creatures, but new gene mapping shows that we might share an ancient brainy ancestor with them. Human cognition is largely rooted in the cerebral cortex, the part of the brain that enables consciousness, language and other higher-level functions. We share the basic evolutionary underpinnings of our big brains with other vertebrates, which have a structure known as the pallium. Although lacking palliums, many invertebrates, such as insects, spiders and some worms, instead have what are known as mushroom bodies—sections of the brain so called because their shape resembles mushrooms. Mushroom bodies and vertebrate palliums are both responsible for some sensory integration and memory, and they have "long been noted and interpreted as convergent acquisitions," noted a team of researchers in a new study, published online September 2 in Cell. In other words, the thinking has been that these two kinds of brains evolved along independent paths. The team, however, has proposed instead that these two brain structures do share a single common ancestor, one that likely lived some 600 million years ago. The group based their conclusions on new gene expression maps—"molecular fingerprints"—gathered from the mushroom bodies in developing marine ragworms (Platynereis dumerilii) that could be compared with gene expression patterns of developing vertebrate palliums. © 2010 Scientific American,

Keyword: Evolution
Link ID: 14415 - Posted: 09.03.2010

By Bruce Bower Mental exercise lets seniors outrun Alzheimer’s disease — for a while. Then the race takes a tragic turn for the sharp-minded, a new study finds, as declines in memory and other thinking skills kick into high gear. After age 65, regular participation in mentally stimulating activities, including doing crossword puzzles and reading, delays intellectual decay caused by Alzheimer’s disease, say neuropsychologist Robert Wilson of Rush University Medical Center in Chicago and his colleagues. But when this debilitating condition finally breaks through the defenses of a mentally fortified brain, it rapidly makes up for lost time, the scientists report in a paper published online September 1 in Neurology. “The benefit of delaying initial signs of cognitive decline by keeping mentally active may come at the cost of more rapid dementia progression later on,” Wilson says. His team also found that mental stimulation slows cognitive declines typically experienced by seniors with healthy brains but offers no protection against the onset of memory and thinking problems that fall short of Alzheimer’s disease. Several recent studies have pointed to a delayed but sharp drop in thinking skills among mentally active people who develop Alzheimer’s disease, remarks neuropsychologist Yaakov Stern of Columbia University College of Physicians and Surgeons in New York City. Unlike the new report, though, those studies did not compare mentally active adults who developed Alzheimer’s disease with those who remained healthy or lost some mental function. © Society for Science & the Public 2000 - 2010

Keyword: Alzheimers; Learning & Memory
Link ID: 14414 - Posted: 09.03.2010

by Catherine de Lange FROM the relaxing effects of cannabis to the highs of LSD and ecstasy, illegal drugs are not generally associated with the lab bench. Now, for the first time in decades, that is starting to change. For almost 40 years, mainstream research has shied away from investigating the therapeutic benefits of drugs whose recreational use is prohibited by law. But a better understanding of how these drugs work in animal studies, and the advancement of brain-imaging techniques, has sparked a swathe of new research. What's more, clinical trials of MDMA (ecstasy), LSD and other psychoactive drugs are starting to yield some positive results. This could lead to a call for governments to take a new approach to the funding and regulation of research into the potential benefits of such chemicals. LSD was developed in the 1940s (see "The highs and lows of LSD") but by the 1970s it and many other drugs became classed as schedule 1 in many countries - described as "abuse" drugs with no accepted medical use. "Research on psychedelics was severely restricted and interest in the therapeutic use of these drugs faded," says Franz Vollenweider of the neuropsychopharmacology and brain-imaging unit at the Zurich University Hospital of Psychiatry, Switzerland. The classification of LSD as schedule 1 was a mistake born of "ignorance and taboo", says Amanda Feilding, director of the Beckley Foundation, a charitable trust that promotes investigation into consciousness and its modulation, based in Oxford, UK. © Copyright Reed Business Information Ltd.

Keyword: Drug Abuse
Link ID: 14413 - Posted: 09.03.2010

By MATTHEW PERRONE WASHINGTON — Andrew White returned from a nine-month tour in Iraq beset with signs of post-traumatic stress disorder: insomnia, nightmares, constant restlessness. Doctors tried to ease his symptoms using three psychiatric drugs, including a potent anti-psychotic called Seroquel. Thousands of soldiers suffering from PTSD have received the same medication over the last nine years, helping to make Seroquel one of the Veterans Affairs Department's top drug expenditures and the No. 5 best-selling drug in the nation. Several soldiers and veterans have died while taking the pills, raising concerns among some military families that the government is not being up front about the drug's risks. They want Congress to investigate. In White's case, the nightmares persisted. So doctors recommended progressively larger doses of Seroquel. At one point, the 23-year-old Marine corporal was prescribed more than 1,600 milligrams per day — more than double the maximum dose recommended for schizophrenia patients. A short time later, White died in his sleep. "He was told if he had trouble sleeping he could take another (Seroquel) pill," said his father, Stan White, a retired high school principal. An investigation by the Veterans Affairs Department concluded that White died from a rare drug interaction. He was also taking an antidepressant and an anti-anxiety pill, as well as a painkiller for which he did not have a prescription. Inspectors concluded he received the "standard of care" for his condition. Copyright 2010 The Associated Press.

Keyword: Schizophrenia; Sleep
Link ID: 14412 - Posted: 08.31.2010

Smoking marijuana does help relieve a certain amount of pain, a small but well-designed Canadian study has found. People who suffer chronic neuropathic or nerve pain from damage or dysfunction of the nervous system have few treatment options with varying degrees of effectiveness and side-effects. Neuropathic pain is caused by damage to nerves that don't repair, which can make the skin sensitive to a light touch. Cannabis pills have been shown to help treat some types of pain but the effects and risks from smoked cannabis were unclear. To find out more, Dr. Mark Ware, an assistant professor in family medicine and anesthesia at Montreal's McGill University, and his colleagues conducted a randomized controlled trial — the gold standard of medical research — of inhaled cannabis in 21 adults with chronic neuropathic pain. Investigators used three different strengths of the active drug — THC levels of 2.5 per cent, 6 per cent and 9.4 per cent — as well as a zero per cent placebo. "We found that 25 mg herbal cannabis with 9.4 per cent THC, administered as a single smoked inhalation three times daily for five days, significantly reduces average pain intensity compared with a zero per cent THC cannabis placebo in adult subjects with chronic post traumatic/post surgical neuropathic pain," the study's authors concluded in Monday's online issue of the Canadian Medical Association Journal. Study participants inhaled the 25-milligram dose through a pipe for five days and then took no marijuana for nine days. Then they rotated through the other doses of THC. © CBC 2010

Keyword: Pain & Touch; Drug Abuse
Link ID: 14411 - Posted: 08.31.2010

By RICHARD A. FRIEDMAN, M.D. Of all the things that people do, few are as puzzling to psychiatrists as compulsive drug use. Sure, all drugs of abuse feel good — at least initially. But for most people, the euphoria doesn’t last. A patient of mine is all too typical. “I know this will sound strange,” he said, as I recall, “but cocaine doesn’t get me high any more and still I can’t stop.” When he first started using the drug, in his early 30s, my patient would go for days on a binge, hardly eating or drinking. The high was better than anything, even sex. Within several months, though, he had lost the euphoria — followed by his job. Only when his wife threatened to leave him did he finally seek treatment. When I met him, he told me that he would lose everything if he could not stop using cocaine. Well, I asked, what did he like about this drug, if it cost him so much and no longer made him feel good? He stared at me blankly. He had no clue. Neither did most psychiatrists, until recently. We understand the initial allure of recreational drugs pretty well. Whether it is cocaine, alcohol, opiates, you name it, drugs rapidly activate the brain’s reward system — a primitive neural circuit buried beneath the cortex — and release dopamine. This neurotransmitter, which is central to pleasure and desire, sends a message to the brain: This is an important experience that is worth remembering. Copyright 2010 The New York Times Company

Keyword: Drug Abuse
Link ID: 14410 - Posted: 08.31.2010

By TARA PARKER-POPE For kids around the country it’s back-to-school time. But for many of them, it’s also the return of headache season. Doctors say frequent headaches and migraines are among the most common childhood health complaints, yet the problem gets surprisingly little attention from the medical community. Many pediatricians and parents view migraines as an adult condition. And because many children complain of headaches more often during the school year than the summer, parents often think a child is exaggerating symptoms to get out of schoolwork. Often the real issue, doctors say, is that changes in a child’s sleep schedule (getting up early for school and staying up late to study), along with skipping breakfast, not drinking enough water and weather changes, can all trigger migraines when the school year starts. “In many areas people just don’t think kids can get migraines,” says Dr. Andrew Hershey, professor of pediatrics and neurology and director of the headache center at Cincinnati Children’s Hospital Medical Center. “But kids shouldn’t be missing activities and having trouble at school because they’re having headaches. If it happens, it shouldn’t be ignored.” Migraine is an inherited neurological condition characterized by severe, often disabling headache pain. During a migraine attack, a number of changes occur throughout the brain causing dilation of blood vessels; severe pain; increased sensitivity to lights, sounds and smells; nausea and vomiting; and other symptoms. It’s estimated that about 10 percent of young children and up to 28 percent of older teenagers suffer from migraines. (Hormonal changes during puberty can also be a trigger.) Copyright 2010 The New York Times Company

Keyword: Pain & Touch; Development of the Brain
Link ID: 14409 - Posted: 08.31.2010

By CARL ZIMMER Why are worker ants sterile? Why do birds sometimes help their parents raise more chicks, instead of having chicks of their own? Why do bacteria explode with toxins to kill rival colonies? In 1964, the British biologist William Hamilton published a landmark paper to answer these kinds of questions. Sometimes, he argued, helping your relatives can spread your genes faster than having children of your own. For the past 46 years, biologists have used Dr. Hamilton’s theory to make sense of how animal societies evolve. They’ve even applied it to the evolution of our own species. But in the latest issue of the journal Nature, a team of prominent evolutionary biologists at Harvard tries to demolish the theory. The scientists argue that studies on animals since Dr. Hamilton’s day have failed to support it. The scientists write that a close look at the underlying math reveals that Dr. Hamilton’s theory is superfluous. “It’s precisely like an ancient epicycle in the solar system,” said Martin Nowak, a co-author of the paper with Edward O. Wilson and Corina Tarnita. “The world is much simpler without it.” Other biologists are sharply divided about the paper. Some praise it for challenging a concept that has outlived its usefulness. But others dismiss it as fundamentally wrong. “Things are just bouncing around right now like a box full of Ping-Pong balls,” said James Hunt, a biologist at North Carolina State University. Copyright 2010 The New York Times Company

Keyword: Evolution; Sexual Behavior
Link ID: 14408 - Posted: 08.31.2010

By Steve Connor, Science Editor A revolutionary way of screening the entire human genome for the genetic signposts of disease has produced its latest success – the first inherited link to common migraine and a possible reason for extreme headaches. The technique, which scans all 23 pairs of human chromosomes in a single sweep, has found the first genetic risk factor that predisposes someone to the common form of migraine, which affects one in six women and one in 12 men. The discovery has immediately led to a new possible cause of migraine by alerting scientists to DNA defects involved in the build-up of a substance in the nerves of sufferers that could be the trigger for their migraines. Scientists believe the findings could lead both to a better understanding as well as new treatments for the chronic and debilitating condition which is estimated to be one of the most costly brain-related disorders in society, causing countless lost working days. Scanning the entire blueprint of human DNA by genome-wide association studies (GWAS) has had a profound effect on the understanding of a range of other medical conditions over the past few years, from heart disease and obesity to bipolar disorder and testicular cancer. The study of migraine, published in the journal Nature Genetics, was an archetypal example of the new approach of medical genetics using the GWAS technique. Scientists analysed the genomes of some 5,000 people with migraine and compared their DNA to that of unaffected people to see if there were any significant differences that could be linked statistically to the condition. ©independent.co.uk

Keyword: Pain & Touch; Genes & Behavior
Link ID: 14407 - Posted: 08.30.2010

Lauran Neergaard, Associated Press Scientists have created a new kind of artificial cornea, inserting a sliver of collagen into the eye that coaxes corneal cells to regrow and restore vision. It worked in a first-stage study of 10 patients in Sweden, researchers reported Wednesday. While larger studies are needed, it's a step toward developing an alternative to standard cornea transplants, which aren't available in much of the world because of a shortage of donated corneas. "We're trying to regenerate the cornea from within," said Dr. May Griffith, senior scientist at the Ottawa Hospital Research Institute in Canada and a professor of regenerative medicine at Linkoping University in Sweden. Vision depends on a healthy cornea, the filmlike covering of the eye's surface that helps it focus light. Corneas are fragile and easily harmed by injury or infection. About 42,000 people in the United States receive transplanted corneas every year. While that's considered an adequate supply in this country, donated corneas aren't available in many countries for the estimated 10 million people worldwide with corneal blindness. Transplants also bring risk of rejection. The new work, published in the journal Science Translational Medicine, is a bioartificial cornea - an attempt to use the same natural substances that make up a real cornea to induce healing. © 2010 Hearst Communications Inc.

Keyword: Vision
Link ID: 14406 - Posted: 08.30.2010

By GUY DEUTSCHER Seventy years ago, in 1940, a popular science magazine published a short article that set in motion one of the trendiest intellectual fads of the 20th century. At first glance, there seemed little about the article to augur its subsequent celebrity. Neither the title, “Science and Linguistics,” nor the magazine, M.I.T.’s Technology Review, was most people’s idea of glamour. And the author, a chemical engineer who worked for an insurance company and moonlighted as an anthropology lecturer at Yale University, was an unlikely candidate for international superstardom. And yet Benjamin Lee Whorf let loose an alluring idea about language’s power over the mind, and his stirring prose seduced a whole generation into believing that our mother tongue restricts what we are able to think. In particular, Whorf announced, Native American languages impose on their speakers a picture of reality that is totally different from ours, so their speakers would simply not be able to understand some of our most basic concepts, like the flow of time or the distinction between objects (like “stone”) and actions (like “fall”). For decades, Whorf’s theory dazzled academics and the general public alike. In his shadow, others made a whole range of imaginative claims about the supposed power of language, from the assertion that Native American languages instill in their speakers an intuitive understanding of Einstein’s concept of time as a fourth dimension to the theory that the nature of the Jewish religion was determined by the tense system of ancient Hebrew. Eventually, Whorf’s theory crash-landed on hard facts and solid common sense, when it transpired that there had never actually been any evidence to support his fantastic claims. The reaction was so severe that for decades, any attempts to explore the influence of the mother tongue on our thoughts were relegated to the loony fringes of disrepute.
But in the last few years, new research has revealed that when we learn our mother tongue, we do after all acquire certain habits of thought that shape our experience in significant and often surprising ways. Copyright 2010 The New York Times Company

Keyword: Language
Link ID: 14405 - Posted: 08.30.2010

By GINA KOLATA BETHESDA, Md. — The scene was a kind of science court. On trial was the question “Can anything — running on a treadmill, eating more spinach, learning Arabic — prevent Alzheimer’s disease or delay its progression?” To try to answer that question, the National Institutes of Health sponsored the court, appointing a jury of 15 medical scientists with no vested interests in Alzheimer’s research. They would hear the evidence and reach a judgment on what the data showed. For a day and a half last spring, researchers presented their cases, describing studies and explaining what they had hoped to show. The jury also heard from scientists from Duke University who had been commissioned to look at the body of evidence — hundreds of research papers — and weigh it. And the jury members had read the papers themselves, preparing for this day. The studies included research on nearly everything proposed to prevent the disease: exercise, mental stimulation, healthy diet, social engagement, nutritional supplements, anti-inflammatory drugs or those that lower cholesterol or blood pressure, even the idea that people who marry or stay trim might be saved from dementia. And they included research on traits that might hasten Alzheimer’s onset, like not having much of an education or being a loner. It is an issue that has taken on intense importance because scientists recently reported compelling evidence that two types of tests, PET scans of Alzheimer’s plaque in the brain and tests of spinal fluid, can find signs of the disease years before people have symptoms. That gives rise to the question: What, if anything, can people do to prevent it? Copyright 2010 The New York Times Company

Keyword: Alzheimers
Link ID: 14404 - Posted: 08.30.2010

Jerome Burne Sixty years ago, the philosopher Gilbert Ryle published his famous attack on Cartesian dualism, The Concept of Mind, which claimed to find a logical flaw in the popular notion that mental life has a parallel but separate existence from the physical body. Among other effects it provided sophisticated support for the psychological behaviourists, then in the ascendant, who asserted that since we could not objectively observe mental activity it was not really a fit subject for scientific investigation. Nowhere was the notion of banning mental states taken up more enthusiastically than by the emerging discipline of neuropsychiatry. If consciousness and all its manifestations were "merely" the firing of neurons and the release of chemicals in the brain, what need was there to focus on mental states? Once the physical brain was right, the rest would follow. It was an approach that has spawned a vast pharmaceutical industry to treat any pathological psychological state – anxiety, shyness, depression, psychosis – with a variety of pills. The underlying promise is that scientifically adjusting the levels of various brain chemicals will bring relief and a return to normality. The biggest-selling class of these drugs is the anti-depressant SSRIs – brands include Prozac, Seroxat and Lustral. A recent report revealed that they were the most widely prescribed drugs in America, with an estimated global market value of over $20 billion. However, as is set out calmly and clearly in Irving Kirsch’s The Emperor’s New Drugs, it would seem that the whole golden edifice is based on a lie. © Times Newspapers Ltd 2010

Keyword: Depression
Link ID: 14403 - Posted: 08.30.2010

By Jim Nash Treatment of severe depression with magnetic stimulation is moving beyond large mental health centers and into private practices nationwide, following more than two decades of research on the treatment. Yet even as concern about its efficacy fades, one potential side effect—seizures—continues to shadow the technology. Called repetitive transcranial magnetic stimulation (rTMS), the noninvasive technique uses electromagnets to create localized electrical currents in the brain. The gentle jolts activate certain neurons, reducing symptoms in some patients. Eight psychiatrists contacted for this article, all of whom use rTMS to treat depression, say it is the most significant development in the field since the advent of antidepressant medications. The prevailing theory is that people with depression do not produce enough of certain neurotransmitters, which include serotonin and dopamine. Electricity (administered in combination with antidepressants) stimulates production of those neurotransmitters. A National Institute of Mental Health (NIMH) study released this spring shows that 14 percent of patients with drug-resistant major depressive disorder experience a remission of symptoms after rTMS treatment compared with a control group, which reported a 5 percent rate of remission. Physicians and researchers say those results are similar to the success rate of antidepressants. No notable side effects occurred during the study, according to its authors, who include Mark George, an early rTMS researcher and a professor of psychiatry, radiology and neurosciences at the Medical University of South Carolina in Charleston. They have suggested that higher levels of electrical stimulation might attain better results. © 2010 Scientific American,

Keyword: Depression
Link ID: 14402 - Posted: 08.30.2010

By Vilayanur S. Ramachandran and Diane Rogers-Ramachandran Imagine that you are looking at a dog that is standing behind a picket fence. You do not see several slices of dog; you see a single dog that is partially hidden by a series of opaque vertical slats. The brain’s ability to join these pieces into a perceptual whole demonstrates a fascinating process known as amodal completion. It is clear why such a tendency would have evolved. Animals must be able to spot a mate, predator or prey through dense foliage. The retinal image may contain only fragments, but the brain’s visual system links them, reconstructing the object so the animal can recognize what it sees. The process seems effortless to us, but it has turned out to be one of those things that is horrendously difficult to program computers to do. Nor is it clear how neurons in the brain’s visual pathways manage the trick. In the early 20th century Gestalt psychologists were very interested in this problem. They devised a number of cunningly contrived illusions to investigate how the visual system establishes the continuity of an object and its contours when the object is partially obscured. A striking example of amodal completion is an illusion devised by Italian psychologist Gaetano Kanizsa. In one view, you see a set of “chicken feet” arranged geometrically. But if you merely add a set of opaque diagonal bars, a three-dimensional cube springs into focus seemingly by magic, the chicken feet becoming cube corners. © 2010 Scientific American,

Keyword: Vision
Link ID: 14401 - Posted: 08.30.2010

By Larry O'Hanlon There are theories galore about why some dog breeds appear to be smarter than others, but new research suggests that size alone might make a difference. All larger dogs appear to be better at following pointing cues from humans than smaller dogs, which makes them appear smarter. It's possible that bigger dogs appear smarter not just because they are bred for taking orders, but because their wider-set eyes give them better depth perception. As a result, they can more easily discern the direction a person is pointing. This latter hypothesis was tested by researchers in New Zealand, who think there might be something to it. "We do know that dog breeds are different," said William Helton of the University of Canterbury in Christchurch, New Zealand. Human breeding has created dogs with huge physical differences, like shorter snouts for more powerful bites. Even the internal structure of dogs' eyes can vary among some breeds, he said. But can something as simple as the distance between the eyes be a factor too? To see if all larger dogs in general were better at discerning human pointing cues, Helton and his colleagues put 104 dogs to the test -- 61 large dogs (greater than 50 lbs) and 43 small dogs (less than 50 lbs). © 2010 Discovery Communications, LLC.

Keyword: Intelligence; Evolution
Link ID: 14400 - Posted: 08.30.2010

by David Robson Can you tell a snake from a pretzel? Some can't – and their experiences are revealing how the brain builds up a coherent picture of the world AFTER her minor stroke, BP started to feel as if her eyes were playing tricks on her. TV shows became confusing: in one film, she was surprised to see a character reel as if punched by an invisible man. Sometimes BP would miss seeing things that were right before her eyes, causing her to bump into furniture or people. BP's stroke had damaged a key part of her visual system, giving rise to a rare disorder called simultanagnosia. This meant that she often saw just one object at a time. When looking at her place setting on the dinner table, for example, BP might see just a spoon, with everything else a blur (Brain, vol 114, p 1523). BP's problems are just one example of a group of disorders known collectively as visual agnosias, usually caused by some kind of brain damage. Another form results in people having trouble recognising and naming objects, as experienced by the agnosic immortalised in the title of Oliver Sacks's 1985 best-seller The Man Who Mistook His Wife for a Hat. Agnosias have become particularly interesting to neuroscientists in the past decade or so, as advances in brain scanning techniques have allowed them to close in on what's going on in the brain. This gives researchers a unique opportunity to work out how the brain normally makes sense of the world. "Humans are naturally so good at this, it's difficult to see our inner workings," says Marlene Behrmann, a psychologist who studies vision at Carnegie Mellon University in Pittsburgh, Pennsylvania. Cases like BP's are even shedding light on how our unconscious informs our conscious mind. "Agnosias allow us to adopt a reverse-engineering approach and infer how [the brain] would normally work," says Behrmann. © Copyright Reed Business Information Ltd

Keyword: Vision; Attention
Link ID: 14399 - Posted: 08.30.2010

By Rebecca Kessler In greylag geese, nearly a fifth of all long-term couples are composed of two males. They're not alone: More than 130 bird species are known to engage in homosexual behavior at least occasionally, a fact that has puzzled scientists. After all, in evolutionary terms same-sex mating seems to reduce the birds' chances of reproductive success. But that's not necessarily so, according to a new study. In a given species, the sex with lighter parental duties tends to mate more, period, whether with the same or the opposite sex. Birds engage in all kinds of same-sex hanky panky, from elaborate courtship displays to mounting and genital contact to setting up house together. In some species the same-sex pairs even raise young (conceived with outside partners, obviously) and stay together for several years. In 2007, a team led by Geoff MacFarlane, a biologist at the University of Newcastle in Australia, reported that male homosexual behavior was more common in polygynous bird species, where males mate with numerous females, and that female homosexual behavior was more common in monogamous species. Intrigued, MacFarlane looked for help explaining the pattern in a theory predicting that whichever gender spends less time caring for young tends to have sex with more partners. © 2010 LiveScience.com.

Keyword: Sexual Behavior; Evolution
Link ID: 14398 - Posted: 08.24.2010

By KATHERINE BOUTON “Delusions of Gender” takes on that tricky question, Why exactly are men from Mars and women from Venus?, and eviscerates both the neuroscientists who claim to have found the answers and the popularizers who take their findings and run with them. The author, Cordelia Fine, who has a Ph.D. in cognitive neuroscience from University College London, is an acerbic critic, mincing no words when it comes to those she disagrees with. But her sharp tongue is tempered with humor and linguistic playfulness, as the title itself suggests. Academics like Simon Baron-Cohen and Dr. Louann Brizendine will want to come to this volume well armed. So would Norman Geschwind if he were still alive. Popular authors like John Gray (“Men are from Mars”), Michael Gurian (“What Could He Be Thinking?”) and Dr. Leonard Sax (“Why Gender Matters”) may want to read something else. Sometimes all it takes is their own words, as in this example from Dr. Brizendine’s 2007 book “The Female Brain”: “Maneuvering like an F-15, Sarah’s female brain is a high-performance emotion machine — geared to tracking, moment by moment, the nonverbal signals of the innermost feelings of others.” Is Sarah some kind of psychic? Dr. Fine clarifies: “She is simply a woman who enjoys the extraordinary gift of mind reading that, apparently, is bestowed on all owners of a female brain.” Experts used to attribute gender inequality to the “delicacy of the brain fibers” in women; then to the smaller dimensions of the female brain (the “missing five ounces,” the Victorians called it); then to the ratio of skull length to skull breadth. In 1915 the neurologist Dr. Charles L. Dana wrote in this newspaper that because a woman’s upper spinal cord is smaller than a man’s it affects women’s “efficiency” in the evaluation of “political initiative or of judicial authority in a community’s organization” — and thus compromises their ability to vote. Copyright 2010 The New York Times Company

Keyword: Sexual Behavior
Link ID: 14397 - Posted: 08.24.2010