Most Recent Links
By Maggie Fox WASHINGTON - Use of antidepressant drugs in the United States doubled between 1996 and 2005, probably because of a mix of factors, researchers reported on Monday. About 6 percent of people were prescribed an antidepressant in 1996 — 13 million people. This rose to more than 10 percent, or 27 million people, by 2005, the researchers found. "Significant increases in antidepressant use were evident across all sociodemographic groups examined, except African Americans," Dr. Mark Olfson of Columbia University in New York and Steven Marcus of the University of Pennsylvania in Philadelphia wrote in the Archives of General Psychiatry. "Not only are more U.S. residents being treated with antidepressants, but also those who are being treated are receiving more antidepressant prescriptions," they added. More than 164 million prescriptions were written in 2008 for antidepressants, totaling $9.6 billion in U.S. sales, according to IMS Health. Drugs that affect the brain chemical serotonin, like GlaxoSmithKline's Paxil, known generically as paroxetine, and Eli Lilly and Co's Prozac, known generically as fluoxetine, are the most commonly prescribed class of antidepressant. But the study found increased use across all classes of the drugs. Copyright 2009 Reuters.
Keyword: Depression
Link ID: 13131 - Posted: 06.24.2010
By Emily Anthes The surface of the brain is a complex landscape, featuring endless peaks and valleys. This intricately folded outer layer, known as the cerebral cortex, is one of the brain’s most noticeable features. But it’s also one of the least well understood. “There’s this large expanse of cortex, much of which is like South America to a 17th century cartographer,” said David Van Essen, a neurobiologist at Washington University in St. Louis. “It’s this big mass of land, and we know what the outlines are, but no one’s been able to chart the intricacies.” That’s beginning to change. Technological and computational advances have enabled researchers to image the brain’s wrinkled exterior in stunning detail, mapping the size and shape of each fold. Scientists pursuing this new discipline of “cortical cartography” expect it to yield insights into how the brain develops and what happens when things go awry. Researchers have already discovered that the cerebral cortex - which controls higher-level functions, including thought, emotion, and perception - is folded abnormally in disorders ranging from autism to depression. Such insights could lead to better and earlier diagnoses and perhaps even new clues to treatment. When the human brain develops in the womb, the outer surface is initially almost entirely smooth. But during the last few months of fetal development, the cortex begins to fold and wrinkle; by the time a full-term infant is delivered, most of the folding has been completed, though subtle refinements continue through early childhood. The folds create more surface area, increasing the size of the cerebral cortex that can fit in our skulls, and, it’s believed, partly accounting for the greater cognitive powers of humans compared with species with smoother brains. © 2009 NY Times Co.
Keyword: Development of the Brain; Evolution
Link ID: 13130 - Posted: 06.24.2010
By TARA PARKER-POPE Married people tend to be healthier than single people. But what happens when a marriage ends? New research shows that when married people become single again, whether by divorce or a spouse’s death, they experience much more than an emotional loss. Often they suffer a decline in physical health from which they never fully recover, even if they remarry. And in terms of health, it’s not better to have married and lost than never to have married at all. Middle-aged people who never married have fewer chronic health problems than those who were divorced or widowed. The findings, from a national study of 8,652 men and women in their 50s and early 60s, suggest that the physical stress of marital loss continues long after the emotional wounds have healed. While this does not mean that people should stay married at all costs, it does show that marital history is an important indicator of health, and that the newly single need to be especially vigilant about stress management and exercise, even if they remarry. “When your spouse is getting sick and about to die or your marriage is getting bad and about to die, your stress levels go up,” said Linda Waite, a sociology professor at the University of Chicago and an author of the study, which appears in the September issue of The Journal of Health and Social Behavior. “You’re not sleeping well, your diet gets worse, you can’t exercise, you can’t see your friends. It’s a whole package of awful events.” Copyright 2009 The New York Times Company
Keyword: Stress
Link ID: 13129 - Posted: 06.24.2010
By C. CLAIBORNE RAY Q. After a five-day cruise, I felt the ground ashore rolling under my legs for days, as if I were still at sea. What’s going on? A. It is a condition that has been called mal de débarquement, like a sort of mal de mer after debarking. Timothy C. Hain, an expert on dizziness disorders in Chicago, and others published a study of the condition in Archives of Otolaryngology in 1999, finding that 26 of 27 sufferers responding to a survey of passengers about severe post-cruise symptoms were women ages 40 to 50. Most people suffer for a month or less, though the subjects of the study had it for months or years. Mal de débarquement is not linked to any injury to the ear or brain and is generally described as a variant of motion sickness, though most motion-sickness medications do not work once it starts. There is no proof of any cause. Because of the preponderance of premenopausal women with symptoms, there is some suspicion that the problem could be linked to hormones. Other suggestions for a cause include a type of migraine or the results of overcompensation for sensory input while trying to maintain balance on a boat. Another possible cause is the adaptation to rolling, that is, rocking side to side, while rotating the head. A study could investigate whether people who do a lot of head rotation on a boat are more likely to develop it. Copyright 2009 The New York Times Company
Keyword: Miscellaneous
Link ID: 13128 - Posted: 06.24.2010
by Colin Barras Epilepsy may be sparked by a metal imbalance in the brain caused by a single gene mutation, a study in mice suggests. The finding could help in developing new treatments for people who suffer from the condition. Steven Clapcote's team at the University of Leeds, UK, pinpointed a gene that seems to play an important part in the genesis of epileptic seizures, which result from abnormal bursts of electrical activity in the brain and can occur even when there is no underlying neurological condition. The Atp1a3 gene is one of three that produce a chemical pump mechanism to keep sodium and potassium levels in brain nerve cells and the surrounding tissue at the levels needed for normal activity. "It's been known for a long time that injecting the sodium/potassium pump inhibitor ouabain into the brain can induce seizures in rats," says Clapcote, and it's also known that mice lacking either of two of the pump's three forms – the "alpha1" or "alpha2" forms – are free from seizures. Clapcote's team have now determined that mice with a mutated copy of the Atp1a3 gene and reduced activity of the "alpha3" pump were prone to epileptic seizures. The mouse strain has been dubbed Myshkin after the epilepsy-stricken Prince Myshkin of Dostoevsky's The Idiot. "Mysh" also comes from the Russian for mouse. © Copyright Reed Business Information Ltd.
Keyword: Epilepsy; Genes & Behavior
Link ID: 13127 - Posted: 06.24.2010
By Greg Miller In people suffering from glaucoma, damage to the optic nerve can slowly degrade peripheral vision and, in the worst cases, eventually lead to blindness. But eyedrops containing nerve growth factor (NGF)--a protein that promotes the survival and growth of neurons in the developing brain--appear to prevent nerve damage in rats and restore some vision in three human glaucoma patients, the authors of a new study claim. Not everyone thinks the reported effect is real, however. For the study, ophthalmologist Alessandro Lambiase of the University of Rome Campus Bio-Medico and colleagues first mimicked glaucoma in rats. The researchers recreated the most common form of the disease, in which increased fluid pressure inside the eye damages nerves, by injecting saline solution into a vein in the eye. They kept the intraocular pressure up for 7 weeks, killing about 40% of the neurons in the retina whose tail-like axons give rise to the optic nerve, which conveys visual information to the brain. However, in rats treated four times daily with NGF-laced eyedrops during the 7-week period, the death of these "retinal ganglion cells" was reduced by about 25%, the team reports online today in the Proceedings of the National Academy of Sciences. Encouraged by these findings, the researchers asked three patients with advanced glaucoma to take the drops four times daily for 3 months. Peripheral vision, one of the main visual functions impaired by glaucoma, improved in two of the patients and got no worse in the third, the researchers report in the same paper. They also report improvements in visual acuity, contrast sensitivity, and in electrophysiological measures of nerve conduction in the visual system in some or all of the patients. © 2009 American Association for the Advancement of Science.
Keyword: Vision; Trophic Factors
Link ID: 13126 - Posted: 06.24.2010
By NEIL AMDUR The three new movies would seem to have little in common: a romantic comedy about Upper West Side singles, a biopic about a noted animal science professor, and an animated film about an extended pen-pal relationship. But all three revolve around Asperger’s syndrome, the complex and mysterious neurological disorder linked to autism. Their nearly simultaneous appearance — two open this summer, and the third is planned for next year — underscores how much Asperger’s and high-functioning autism have expanded in the public consciousness since Dustin Hoffman’s portrayal of an autistic savant in “Rain Man” 21 years ago. “The more I learned about Asperger’s,” said Max Mayer, the writer and director of the romance, “Adam,” which opened last week, “the better metaphor it felt like for the condition of all of us in terms of a desire for connection to other people.” People with Asperger’s may have superior intelligence and verbal skills, and they often have an obsessive interest in a particular topic (astronomy, in the case of the title character in “Adam,” played by Hugh Dancy). But they tend to be self-defeatingly awkward in social situations, and romantic relationships can leave them at sea. The syndrome is generally considered a high-functioning form of autism, which in recent years has been diagnosed in more and more children. While the reasons for the explosion in diagnoses are unclear, increased awareness may be part of the explanation, and one reason for the growth in awareness is the rise of online parent communities. Copyright 2009 The New York Times Company
Keyword: Autism
Link ID: 13125 - Posted: 06.24.2010
By Charles Q. Choi Breathing in the hormone oxytocin has been shown in recent years to trigger all kinds of feel-good emotions in people, such as trust, empathy and generosity. Now scientists find it might have a dark side: Snorting oxytocin might also incite envy and gloating. Past studies have shown that oxytocin plays a wide role in social bonding in mammals—between mates, for instance, or mother and child—and recent work suggested the hormone was linked with pro-social behavior in people, such as altruism. Still, neuroscientist Simone Shamay-Tsoory of the University of Haifa in Israel and her colleagues noted that oxytocin was found to raise aggression in rodents, suggesting the hormone might play a wider role in social emotions in humans. The researchers decided to investigate envy and gloating—feelings related to the tendency to compare oneself with others—to see if oxytocin ramped up these emotions or dialed them down. The researchers gave 56 volunteers either oxytocin or a placebo and paid them to take part in a game of chance with another participant that, unknown to them, was a computer. They were shown three doors on a video screen, either red, blue or yellow, and told that behind each door was a different sum of money they could keep after the game. The computer was programmed to either win more money than the players to trigger feelings of envy, lose more money to elicit a form of gloating known as schadenfreude (delight over another's misfortune), or to win or lose equal amounts of money. To encourage these negative emotions, the researchers gave the computer player an arrogant "personality". © 1996-2009 Scientific American Inc.
Keyword: Aggression; Hormones & Behavior
Link ID: 13124 - Posted: 06.24.2010
Sue Barry is a neuroscientist at Mount Holyoke College. She's also the author of the newly released book Fixing My Gaze, which tells the story of how Barry, at the age of 48, finally learned to see in 3-D. Mind Matters editor Jonah Lehrer chats with Barry about what a flat world looks like and what her own experience can teach us about brain plasticity and education.
LEHRER: You begin your new book, Fixing My Gaze, by describing the moment you realized that you lacked stereoscopic vision, which underlies the ability to see in 3-D. Could you describe that moment?
BARRY: I was sitting in my college neurobiology class, somewhat bored and distracted, when the professor began to describe experiments done on wall-eyed and cross-eyed cats. He mentioned that vision in these cats had not developed normally and that these animals probably lacked stereovision or the ability to see in 3D. What's more, these animals could never gain stereovision because this skill developed only during a "critical period" in early life. What was true for cats was also thought to be true for people. The professor's words jerked me right out of my daydream. I realized that I was like the cats in the scientists' experiments, since I had been cross-eyed since early infancy. Three childhood surgeries made my eyes look normal so I assumed that I saw normally as well. Yet, I had just learned in class that I lacked a fundamental way of seeing. After class, I went straight to the college library and read up on stereovision. I searched out and tried every stereovision test I could find and flunked them all. This is how I learned that I was stereoblind. © 1996-2009 Scientific American Inc.
Keyword: Vision
Link ID: 13123 - Posted: 06.24.2010
By Sandra G. Boodman Although he had never seen a case like it in his career, cardiologist David Lomnitz felt certain he knew why his new patient kept blacking out when she ate. At the time of her first appointment in September 2004, Martha Bryce, then a 36-year-old health-care consultant, was feeling desperate. Four years earlier she had been given a diagnosis of epilepsy, and had taken medication to prevent seizures. But doctors had been unable to explain the frequent swooning episodes that occurred when she started to eat, forcing her to put her head down on the table in an intermittently successful attempt to avoid passing out. Doctors seemed unconcerned and told her the episodes might be a symptom of her seizure disorder. Bryce, a registered nurse, wasn't so sure. But after a frightening incident drove home the potential danger of the baffling condition, she made an appointment with Lomnitz, now assistant chief of cardiology at Norwalk Hospital in Norwalk, Conn. "Her story rang a bell for me," he said. His hunch about her condition, triggered by cases he heard about during his training years earlier, would upend her diagnosis and radically alter her treatment. The first sign something was wrong was dramatic. While on a business trip to Las Vegas in January 2000, Bryce, who lives in Ridgefield, Conn., decided to visit the Hoover Dam before catching a red-eye flight home. Standing at an overlook preparing to photograph the concrete behemoth, Bryce recalled, "all of a sudden I felt a way I'd never felt before." She fainted and, after regaining consciousness, learned she had suffered a grand mal seizure during which she had bitten her tongue. © 2009 The Washington Post Company
Keyword: Epilepsy
Link ID: 13122 - Posted: 06.24.2010
By JOHN MARKOFF LOS ANGELES — As a promising Caltech graduate student in applied physics, Stephen Kurtin could have taken a job offer from Intel at the dawn of the microelectronics era 40 years ago. Instead he followed the path of a lone inventor, gaining more than 30 patents in fields including word processing software and sound systems, culminating in the pair of glasses resting on his nose, which he believes can free nearly two billion people around the world from bifocals, trifocals and progressive lenses. The glasses have a tiny adjustable slider on the bridge of the frame that makes it possible to focus alternately on the page of a book, a computer screen or a mountain range in the distance. Dr. Kurtin, 64, has spent almost 20 years of his career on a quest to create a better pair of spectacles for people who suffer from presbyopia — the condition that affects almost everyone over the age of 40 as they progressively lose the ability to focus on close objects. After many false turns and dead ends, he has succeeded in creating glasses with a mechanically adjustable focus. He says they are better than other glasses and some forms of Lasik surgery. And they make an intriguing fashion statement: a bit of Harry Potter with a dash of “Revenge of the Nerds.” Copyright 2009 The New York Times Company
Keyword: Vision
Link ID: 13121 - Posted: 06.24.2010
Matt Kaplan Finches instinctively avoid competitors coloured red, rather than learning to fear the colour during their upbringing, Australian research concludes. The results are tempting researchers to suspect that in other animals, including ourselves, red's aggressive and intimidating character might also be hard-wired into brains from birth. Dozens of experiments have shown that red intimidates competitors. In humans, wearing red improves chances of winning at sports. Studies have also revealed that red is associated with aggression and dominance in fish, reptiles and birds. But whether fear of red is innate or learned is an "unresolved mystery", says Robert Barton, an anthropologist at the University of Durham, UK. Sarah Pryke of Macquarie University in Sydney tested this question in Australian Gouldian finches (Erythrura gouldiae). As adults, the finches develop either red or black heads, a genetically determined trait. The red-headed birds are aggressive, dominant and avoided by others. To find out whether these traits were learned or inborn, Pryke examined competition between young Gouldian finches — whose heads, yet to blossom into coloured adulthood, are all dull grey. She first raised finches that were genetically destined to be red-headed with black-headed parents, raised others that were genetically destined to be black-headed with red-headed parents, and left still other finches to be raised by parents of the same colour group. In contests staged between these young birds over food, it was body size rather than genetic destiny or rearing environment that decided the winner. © 2009 Nature Publishing Group,
Keyword: Vision; Evolution
Link ID: 13120 - Posted: 06.24.2010
By Katherine Harmon Problem: you’re a fungus that can only flourish at a certain temperature, humidity, location and distance from the ground but can’t do the legwork to find that perfect spot yourself. Solution: hijack an ant’s body to do the work for you—and then inhabit it. A paper, to be published in The American Naturalist’s September issue, explores the astounding accuracy with which this fungus compels ants to create its ideal home. The Ophiocordyceps unilateralis fungus infects Camponotus leonardi ants that live in tropical rainforest trees. Once infected, the spore-possessed ant will climb down from its normal habitat and bite down, with what the authors call a "death grip" on a leaf and then die. But the story doesn’t end there. "The death grip occurred in very precise locations," the authors write. All of the C. leonardi ants studied in Thailand’s Khao Chong Wildlife Sanctuary had chomped down on the underside of a leaf, and 98 percent had landed on a vein. Most had: a) found their way to the north side of the plant, b) chomped on a leaf about 25 centimeters above the ground, c) selected a leaf in an environment with 94 to 95 percent humidity and d) ended up in a location with temperatures between 20 and 30 degrees Celsius. The researchers called this specificity "remarkable." In other words, the fungus was transported via the zombie ant to its prime location. To see just how important this accuracy is to the fungus, the researchers identified dozens of infected ants in a small area of the forest. Some of the ants were moved to other nearby heights and locations, and others were left to sprout spores just where they had died. © 1996-2009 Scientific American Inc.
Keyword: Evolution
Link ID: 13119 - Posted: 06.24.2010
By Susan Milius Karen Warkentin speaks admiringly of the eggs of red-eyed tree frogs because, for one thing, they know what’s shaking. Masses of these glistening eggs hang on leaves that dangle over tropical ponds, and the eggs stay put even when branches thrash in storms. A hungry snake biting into one end of an egg mass can make the embryos’ home dip and dance too. But at this jouncing, older embryos flee. They can’t run, but they can hatch. A sudden burst of emergency hatching sends a rain of new tadpoles into the water, often saving some 80 percent of a clutch. Pretty sophisticated for a glob of goo. It turns out that frog eggs and other embryonic blobs possess a rather advanced repertoire for coping with the earliest stages of life. Embryos can react to perils both inside their eggs and out. Even while still within their shelter, they’re learning lessons about eating. And being eaten. These and other recent findings are forcing researchers to expand their ideas of how capably and subtly these half-baked beings can act. Maybe deft dodges and finely judged trade-offs should be expected: Life for the small and underpowered is dangerous, and plenty of good embryos die young. “That creates opportunities for natural selection to shape a response,” says Warkentin, a biologist at Boston University. But there’s still the lingering old view of early life stages as clods. Warkentin says: “We think, ‘It’s a frog egg, it’s a snail embryo — what can it do?’ That’s what surprises us.” © Society for Science & the Public 2000 - 2009
Keyword: Development of the Brain
Link ID: 13118 - Posted: 06.24.2010
By Constance Holden If you take your coffee without sugar or your pancakes without syrup, chances are you've got some European ancestry in your blood. New research reveals that people whose early relatives lived in Europe are more sensitive to sweet tastes than those whose ancestors came from other parts of the world. Scientists led by Alexey Fushan of the National Institute on Deafness and Other Communication Disorders in Bethesda, Maryland, asked 144 people from various ethnic backgrounds to rank the sweetness of nine solutions ranging from 0% to 4% sugar. The volunteers' sucrose sensitivity turned out to be strongly associated with two variants of a gene called TAS1R3, which plays a major role in encoding the main carbohydrate sweet taste receptor. Consulting a reference collection of DNA from 1050 people from around the world held by CEPH, the French gene database, the scientists found that most Europeans have both of the sweetness-sensing variants. The variants are less widespread in people from Asia and the Middle East and are least prevalent in Africans, the team reports in the 11 August issue of Current Biology. Co-author and geneticist Dennis Drayna says the disparity may be evolutionarily significant. "People who study diet and evolution have pointed out most of the high sugar–containing plants like sugarcane are tropical plants," he notes. "So in northerly latitudes, you have to be more sensitive to sugar to find calories." Molecular biologist Stephen Wooding of the University of Texas Southwestern Medical Center at Dallas agrees that the difference may be adaptive. But he says the particular adaptation isn't yet clear. © 2009 American Association for the Advancement of Science
Keyword: Chemical Senses (Smell & Taste); Obesity
Link ID: 13117 - Posted: 06.24.2010
By Laura Sanders When the monitor lizard chomped into Bryan Fry, it did more than turn his hand into a bloody mess. Besides ripping skin and severing tendons, the lizard delivered noxious venom into Fry’s body, injecting molecules that quickly thinned his blood and dilated his vessels. As the tiny toxic assassins dispersed throughout his circulatory system, they hit their targets with speed and precision, ultimately causing more blood to gush from Fry’s wound. Over millions of years, evolution has meticulously shaped these toxins into powerful weapons, and Fry was feeling the devastating consequences firsthand. “I’ve never seen arterial bleeding before, and I really don’t want to ever see it again. Especially coming out of my own arm,” says Fry, a venom researcher at the University of Melbourne in Australia. To unlock the molecular secrets of venom, Fry and other researchers have pioneered a burgeoning field called venomics. With cutting-edge methods, the scientists are teasing apart and cataloging venom’s ingredients, some of which can paralyze muscles, make blood pressure plummet or induce seizures by scrambling brain signals. Researchers are also learning more about how these toxins work. Discovering venom’s tricks may allow scientists to rehabilitate these damaging molecules and convert them from destroyers to healers. Venom might be teeming with wonder drugs, for instance. After all, a perfect venom toxin works with lightning speed, remains stable for a long time and strikes its mark with surgical exactitude — attributes that drugmakers dream about. © Society for Science & the Public 2000 - 2009
Keyword: Neurotoxins
Link ID: 13116 - Posted: 06.24.2010
By Melinda Wenner Your memories of high school biology class may be a bit hazy nowadays, but there are probably a few things you haven’t forgotten. Like the fact that you are a composite of your parents—your mother and father each provided you with half your genes, and each parent’s contribution was equal. Gregor Mendel, often called the father of modern genetics, came up with this concept in the late 19th century, and it has been the basis for our understanding of genetics ever since. But in the past couple of decades, scientists have learned that Mendel’s understanding was incomplete. It is true that children inherit 23 chromosomes from their mother and 23 complementary chromosomes from their father. But it turns out that genes from Mom and Dad do not always exert the same level of influence on the developing fetus. Sometimes it matters which parent you inherit a gene from—the genes in these cases, called imprinted genes because they carry an extra molecule like a stamp, add a whole new level of complexity to Mendelian inheritance. These molecular imprints silence genes; certain imprinted genes are silenced by the mother, whereas others are silenced by the father, and the result is the delicate balance of gene activation that usually produces a healthy baby. When that balance is upset, however, big problems can arise. Because most of these stamped genes influence the brain, major imprinting errors can manifest themselves as rare developmental disorders, such as Prader-Willi syndrome, which is characterized by mild mental retardation and hormonal imbalances that lead to obesity. And recently scientists have started to suspect that more subtle imprinting errors could lead to common mental illnesses such as autism, schizophrenia and Alzheimer’s disease. A better understanding of how imprinting goes awry could provide doctors with new ways to treat or perhaps even prevent some of these disorders. © 1996-2009 Scientific American Inc.
Keyword: Sexual Behavior; Development of the Brain
Link ID: 13115 - Posted: 06.24.2010
by Caroline Williams The smell of the sweat you produce when terrified is not only registered by the brains of others, but changes their behaviour too, according to new research. It adds to a growing body of evidence that humans may communicate using scent in a similar way to how other animals use pheromones. Lilianne Mujica-Parodi, a cognitive neuroscientist at Stony Brook University in New York and colleagues collected sweat from the armpits of first-time tandem skydivers as they hurtled towards the earth. The smell of their sweat was wafted under the noses of volunteers as they lay in an fMRI scanner. Even though they had no idea what they were inhaling, two separate sets of volunteers showed activation of the amygdala – the area of the brain responsible for emotion-processing, plus areas involved in vision, motor control and goal-directed behaviour. Sweat produced under non-stressed conditions didn't produce this reaction. What's more, in behavioural tests, the "stress sweat" seemed to heighten people's awareness of threat, making them 43 per cent more accurate in judging whether a face was neutral or threatening. Because the study used sweat rather than its components, this is not definitive evidence that human pheromones exist, says Johan Lundstrom, a pheromone researcher at Monell Chemical Senses Center in Philadelphia, who was not involved in the research. © Copyright Reed Business Information Ltd
Keyword: Chemical Senses (Smell & Taste); Emotions
Link ID: 13114 - Posted: 06.24.2010
By CAMILLE SWEENEY ON a recent evening, an unusual experiment took place at a lounge in downtown Manhattan. Nine blindfolded women were asked to determine, by smell alone, whether any among a group of nine men was worth pursuing. Three men had just showered using a body wash with synthesized pheromones, three had used a body wash without pheromones, and the rest had worked up a sweat and not washed at all. They then rubbed their arms on scent strips, and handed them to the women to sniff. One participant, Michelle Hotaling, 24, chose a man who had used the pheromone body wash. “In appearance and personality he was not someone I would otherwise be convinced to go out with,” she said, once her blindfold came off. “But his scent was a factor that would push my decision to say, ‘Yes.’ ” Which was just what Dial, the event’s sponsor and maker of the new “pheromone-infused” Dial for Men Magnetic Attraction Enhancing Body Wash, wanted to hear. “We don’t claim using our product you’re going to hit a home run,” said Ryan Gaspar, a brand manager. “We say, ‘We’ll get you to first base.’ ” As the science — or, as some believe, pseudo-science — of pheromones advances toward commercial applications, more manufacturers of personal-care products are dropping tinctures of synthesized pheromones into their formulas, with claims that they will boost sex appeal and confidence. The pheromone of choice for men is a family of steroids, related to testosterone, found near the axillary glands in the underarm area. For women, a commonly used compound is estratetraenol, a derivative of the sex hormone estradiol. (The formulations of these synthesized hormones are proprietary, and when asked, the makers would not reveal their ingredients.) Copyright 2009 The New York Times Company
Keyword: Chemical Senses (Smell & Taste); Sexual Behavior
Link ID: 13113 - Posted: 06.24.2010
by Bob Holmes IT IS one of the biggest mysteries in human evolution. Why did we humans evolve such big brains, making us the unrivalled rulers of the world? Some 2.5 million years ago, our ancestors' brains expanded from a mere 600 cubic centimetres to about a litre. Two new studies suggest it is no fluke that this brain boom coincided with the onset of an ice age. Cooler heads, it seems, allowed ancient human brains to let off steam and grow. For all its advantages, the modern human brain is a huge energy glutton, accounting for nearly half of our resting metabolic rate. About a decade ago, biologists David Schwartzman and George Middendorf of Howard University in Washington DC hypothesised that our modern brain could not have evolved until the Quaternary ice age started, about 2.5 million years ago. They reckoned such a large brain would have generated heat faster than it could dissipate it in the warmer climate of earlier times, but they lacked evidence to back their hypothesis. Now hints of that evidence are beginning to emerge. Climate researcher Axel Kleidon of the Max Planck Institute for Biogeochemistry in Jena, Germany, modelled present-day temperature, humidity and wind conditions around the world using an Earth-systems computer model. He used these factors to predict the maximum rate at which a modern human brain can lose heat in different regions. He found that, even today, the ability to dissipate heat should restrict the activity of people in many tropical regions (Climatic Change, vol 95, p 405). © Copyright Reed Business Information Ltd.
Keyword: Evolution; Intelligence
Link ID: 13112 - Posted: 06.24.2010