Most Recent Links

Links 16461 - 16480 of 29480

By BENEDICT CAREY. BUFFALO — Many 4-year-olds cannot count up to their own age when they arrive at preschool, and those at the Stanley M. Makowski Early Childhood Center are hardly prodigies. Most live in this city’s poorer districts and begin their academic life well behind the curve. But there they were on a recent Wednesday morning, three months into the school year, counting up to seven and higher, even doing some elementary addition and subtraction. At recess, one boy, Joshua, used a pointer to illustrate a math concept known as cardinality, by completing place settings on a whiteboard. “You just put one plate there, and one there, and one here,” he explained, stepping aside as two other students ambled by, one wearing a pair of clown pants as a headscarf. “That’s it. See?” For much of the last century, educators and many scientists believed that children could not learn math at all before the age of five, that their brains simply were not ready. But recent research has turned that assumption on its head — that, and a host of other conventional wisdom about geometry, reading, language and self-control in class. The findings, mostly from a branch of research called cognitive neuroscience, are helping to clarify when young brains are best able to grasp fundamental concepts. Copyright 2009 The New York Times Company

Keyword: Development of the Brain
Link ID: 13587 - Posted: 06.24.2010

Owen Flanagan, contributor I use the term "neuro-enthusiasta" for those given to excessive excitement over what brain science teaches. I have been warning, often in these pages, of its mostly amusing excesses and its tendency to produce newspaper headlines exclaiming that the brain "lights up" when people think and feel various things. Still, I did not foresee "neuro-" becoming a universal prefix. We have neuro-economics, neuro-theology, neuro-aesthetics and now, if Iain McGilchrist is to be believed, neuro-history. Plato, long before neuroscience, spoke of the struggle in the soul between Reason, Appetite and Temperament. This, neurologically speaking, has turned out to be the struggle between the brain's upper and lower regions. It's so last century. The new story is the battle between the brain's hemispheres. "The left hemisphere is competitive, and its concern, its prime motivation is power," McGilchrist writes. The right, in contrast, is personal, empathetic and "primary", experiencing things in the lovey-dovey way we did in the old days when we sat around campfires singing Kumbaya. (Music, predictably, is so right brain). If the left hemisphere has its way, McGilchrist warns, the world will seem "relatively mechanical, an assemblage of more or less disconnected 'parts'... utilitarian in ethic; overconfident of its own take on reality, and lacking insight to its problems". © Copyright Reed Business Information Ltd.

Keyword: Laterality
Link ID: 13586 - Posted: 06.24.2010

By GEOFF DAVIES A Dalhousie University professor and an international team of researchers have discovered what makes us kick. Dr. Rob Brownstone, along with colleagues in New York and Scotland, discovered a group of nerve cells that are critical to regulating how much force muscles use when performing movements. "We knew that they had to be there," Dr. Brownstone said Friday, roughly a week after Neuron, the world’s leading neuroscience journal, published the findings. "But we couldn’t pinpoint them and we couldn’t say exactly what their role was in a behaving animal." The researchers located a group of cells that regulate how much force is used by motor neurons, nerve cells in the spinal cord that make muscles contract. The team used genetic techniques to locate and deactivate these new-found "modulatory" cells in mice. "When we did that, that’s when we found that the animals couldn’t contract their muscles in their legs as much as they needed to swim properly," Dr. Brownstone said. "This is a fundamental discovery about how the spinal cord works to produce movement." Further on down the line, this discovery could lead to breakthroughs in the treatment of Lou Gehrig’s disease, spinal cord injuries and other conditions, Dr. Brownstone said. © 2009 The Halifax Herald Limited

Keyword: ALS-Lou Gehrig's Disease
Link ID: 13585 - Posted: 06.24.2010

By Rob Stein About one out of every 110 U.S. children has been diagnosed with autism, according to a new federal estimate released Friday. An analysis of medical records from more than 307,000 8-year-olds in 2006 found that about 1 percent -- or one out of every 110 -- had been diagnosed with an "autism spectrum disorder," which includes a range of conditions including autism, the federal Centers for Disease Control and Prevention reported. The estimate is an increase in the prevalence of the condition from a previous CDC estimate of about 1 in 150 but is consistent with another estimate the agency released in October based on a telephone survey that concluded the condition was diagnosed in about 1 out of every 100 children. "The findings in this report are in line with other recently reported estimates," said Catherine Rice, a behavioral health scientist at the CDC's National Center on Birth Defects and Developmental Disabilities, whose report was published in the agency's Morbidity and Mortality Weekly Report. The reason for the increase remains unclear, she said. It could be due at least in part to more children being diagnosed with one of the conditions rather than an actual increase in how many children are developing the disability, she said. But "a true increase cannot be ruled out," she said, calling the estimate "concerning." Other factors that could be contributing to the increase include a rise in the average age that women are giving birth, and potentially air pollution, she said. But, Rice stressed, "we have much to learn about the causes." © 2009 The Washington Post Company

Keyword: Autism
Link ID: 13584 - Posted: 06.24.2010

BY Prachi Patel No woman has yet won one of the three top mathematics awards–the Fields, the Abel, or the Wolf. It’s part of what’s often called the math gender gap, which in the United States starts early—at least twice as many boys as girls score in the 99th percentile on state-level math assessment tests. Five years ago, then Harvard president Lawrence Summers’s suggestion that women lack an ”intrinsic aptitude” for math and science drew a firestorm of protest, but he was drawing on a century-old hypothesis that males exhibit greater variability in many features, math included. By such reasoning, it is possible for girls to be as good as boys in math on average but to be less well represented in the upper (and lower) echelons. This, Summers said, is one reason there are fewer women in tenured science and engineering positions at top universities and research institutions. ”I would like nothing better than to be proved wrong,” he added. A recent study published in the Proceedings of the National Academy of Sciences might make him happy. In it, psychologists Janet Hyde and Janet Mertz, from the University of Wisconsin–Madison, used data from math aptitude tests to show that among top math performers, the gender gap doesn’t exist in some ethnic groups and in some countries. The researchers conclude that culture is the main reason more men excel at the highest math levels in most countries.

Keyword: Sexual Behavior; Learning & Memory
Link ID: 13583 - Posted: 06.24.2010
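The "greater variability" idea Summers invoked is a purely statistical point: two groups can share the same average yet differ sharply in how often they appear in the extreme tails. The short sketch below illustrates that arithmetic with normal curves; the means, spreads, and cutoff are hypothetical numbers chosen for illustration, not figures from the Hyde and Mertz study.

```python
# Illustration only: equal means, unequal spread, and what that does to the top 1%.
# The parameters below are hypothetical, not taken from any test data.
from statistics import NormalDist

group_a = NormalDist(mu=0.0, sigma=1.0)    # narrower distribution
group_b = NormalDist(mu=0.0, sigma=1.15)   # same mean, about 15% more spread

cutoff = group_a.inv_cdf(0.99)             # score marking group A's 99th percentile

share_a = 1 - group_a.cdf(cutoff)          # fraction of group A above the cutoff (1%)
share_b = 1 - group_b.cdf(cutoff)          # fraction of group B above the same cutoff

print(f"cutoff score: {cutoff:.2f}")
print(f"group A above cutoff: {share_a:.2%}")
print(f"group B above cutoff: {share_b:.2%}")
print(f"ratio B:A above cutoff: {share_b / share_a:.1f}")
```

With identical averages, a spread only about 15 percent wider puts roughly twice as many scores past the 99th-percentile cutoff, which is why tail ratios are such a sensitive and easily over-interpreted statistic; Hyde and Mertz's argument is that those ratios shift with culture rather than being fixed.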

By Melissa Lee Phillips Sometimes a difference between the sexes is not based on sex at all. Women have a finer sense of touch than men do, but a new study shows that this is simply because their fingertips tend to be smaller. Neuroscientist Daniel Goldreich of McMaster University in Hamilton, Canada, and his colleagues first became curious about the sex difference while studying differences between blind and sighted people. They found that blind people are better than those with normal vision at distinguishing fine textures but that, within each group, women are better than men. The researchers thought that the discrepancy might be the result of brain differences between men and women, but they first wanted to see if something simpler could explain it. So they tested 50 women and 50 men on a simple task: Each person touched a small, grooved surface and tried to identify the orientation of the grooves. As the grooves got closer together, it became more difficult to determine their direction. As expected, women performed better at this task than men did, but when the scientists looked at the results by finger size, they found that the sex difference disappeared: On average, men and women with the same size fingertips perform at the same level, the team reports in the 16 December issue of The Journal of Neuroscience. (Finger size does not explain all individual variability, however; there are differences between people with the same size fingers, perhaps as a result of differences in the mechanical properties of skin or in how each person's brain processes the information.) © 2009 American Association for the Advancement of Science.

Keyword: Pain & Touch; Sexual Behavior
Link ID: 13582 - Posted: 06.24.2010

A gene associated with a rare form of progressive deafness in males has been identified by an international team of researchers funded by the National Institute on Deafness and Other Communication Disorders. The gene, PRPS1, appears to be crucial in inner ear development and maintenance. The findings are published in the December 17 early online issue of the American Journal of Human Genetics. The gene is associated with DFN2, a progressive form of deafness that primarily affects males. Boys with DFN2 begin to lose their hearing in both ears roughly between the ages of 5 and 15, and over the course of several decades will experience hearing loss that can range from severe to profound. Their mothers, who carry the defective PRPS1 gene, may experience hearing loss as well, but much later in life and in a milder form. Families with DFN2 have been identified in the United States, Great Britain, and China. The NIDCD-funded researchers, led by Xue Zhong Liu, M.D., Ph.D., of the University of Miami Miller School of Medicine, discovered that the PRPS1 gene encodes the enzyme phosphoribosylpyrophosphate (PRPP) synthetase 1, which produces and regulates PRPP and appears to play a key role in inner ear development and maintenance. The four mutations identified in the PRPS1 gene cause a decrease in the production of the PRPP synthetase 1 protein that results in defects in sensory cells (called hair cells) in the inner ear, and eventually leads to progressive deafness.

Keyword: Hearing; Genes & Behavior
Link ID: 13581 - Posted: 12.19.2009

by Carl Zimmer When Samuel Morse established the first commercial telegraph, in 1844, he dramatically changed our expectations about the pace of life. One of the first telegraph messages came from that year’s Democratic National Convention in Baltimore, where the delegates had picked Senator Silas Wright as their vice presidential nominee. The president of the convention telegraphed Wright in Washington, D.C., to see if he would accept. Wright immediately wired back: No. Incredulous that a message could fly almost instantly down a wire, the delegates adjourned and sent a flesh-and-blood committee by train to confirm Wright’s response—which was, of course, the same. From such beginnings came today’s high-speed, networked society. Less famously but no less significantly, the telegraph also transformed the way we think about the pace of our inner life. Morse’s invention debuted just as researchers were starting to make sense of the nervous system, and telegraph wires were an inspiring model of how nerves might work. After all, nerves and telegraph wires were both long strands, and they both used electricity to transmit signals. Scientists knew that telegraph signals did not travel instantaneously; in one experiment, it took a set of dots and dashes a quarter of a second to travel 900 miles down a telegraph wire. Perhaps, the early brain investigators considered, it took time for nerves to send signals too. And perhaps we could even quantify that time. The notion that the speed of thought could be measured, just like the density of a rock, was shocking. Yet that is exactly what scientists did. In 1850 German physiologist Hermann von Helmholtz attached wires to a frog’s leg muscle so that when the muscle contracted it broke a circuit. He found that it took a tenth of a second for a signal to travel down the nerve to the muscle.

Keyword: Miscellaneous
Link ID: 13580 - Posted: 06.24.2010

By Nathan Seppa It looks like nearsightedness is on the rise in the United States. Researchers tapped into a wide-ranging health survey to rate vision in the population in the early 1970s and roughly 30 years later. They compared eyesight information for more than 4,400 people tested in 1971 and 1972 with data from another set of 8,300 people tested from 1999 to 2004. This broad survey showed that 25 percent of those examined in the early 1970s were deemed to be nearsighted, compared with 42 percent examined three decades later, the researchers report in the December Archives of Ophthalmology. That’s an increase of 66 percent. Myopia severity also increased, with moderate nearsightedness doubling between the two time periods and severe cases, although uncommon, also rising sharply. Mild myopia cases increased slightly, from about 13 percent to 18 percent. This group included some people who did not need corrective lenses, says study coauthor Susan Vitale, an epidemiologist at the National Eye Institute in Bethesda, Md. When analyzing the more recent eye-exam data, the scientists used only diagnoses that were made with the same technology used in the 1970s — mainly standard eye tests and trial lenses. Including diagnoses made with more advanced technology that has become available only recently might have biased the comparison, Vitale says. The cause of nearsightedness is poorly understood. Past research has linked added risk to both a genetic predisposition to nearsightedness and to excessive amounts of near work, the kind of tasks that require peering at written words or small objects. © Society for Science & the Public 2000 - 2009

Keyword: Vision
Link ID: 13579 - Posted: 06.24.2010

By Megan Talkington Many animals test their legs and totter forth only hours after they are born, but humans need a year before they take their first, hesitant steps. Is something fundamentally different going on in human babies? Maybe not. A new study shows that the time it takes for humans and all other mammals to start walking fits closely with the size of their brains. In earlier studies aimed at developing a new animal model for the brain events that support motor development, neurophysiologist Martin Garwicz of Lund University in Sweden and his colleagues discovered that the schedules by which ferrets and rats acquire various motor skills, such as crawling and walking, are strikingly similar to each other; the progress simply happens faster for rats. That made them wonder how similar the timing of motor development might be among mammals in general. They compared the time between conception and walking in 24 species and looked at how well this duration correlated with a range of variables, including gestation time, adult body mass, and adult brain mass. As they report in this week's issue of PNAS, brain mass accounts for the vast majority (94%) of the variance in walking time between species. Species with larger brains, such as humans, tend to take longer to learn to walk. Strikingly, a model based on adult brain mass and walking time in the other 23 species almost perfectly predicts when humans begin to walk. "We've always considered humans the exception," Garwicz says. "But in fact, we start walking at exactly the time that would be expected from all other walking mammals." © 2009 American Association for the Advancement of Science

Keyword: Development of the Brain; Evolution
Link ID: 13578 - Posted: 06.24.2010
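The prediction described above is essentially an allometric fit: regress the (log) time from conception to walking on (log) adult brain mass across the other species, then read off where a human-sized brain falls on that line. The sketch below shows the shape of such a fit; the function name, the use of an ordinary least-squares line, and the human brain-mass figure are assumptions for illustration, and the 24-species data set from the PNAS paper is not reproduced here.

```python
# A minimal sketch of a cross-species power-law (allometric) fit, assuming
# ordinary least squares on log-transformed values; not the authors' exact model.
import numpy as np

def predict_walking_onset(brain_mass_g, days_to_walk, target_brain_mass_g):
    """Fit log(days to walk) against log(brain mass) and predict a new species.

    brain_mass_g, days_to_walk: arrays of adult brain mass (grams) and days from
    conception to independent walking for the reference species.
    """
    slope, intercept = np.polyfit(np.log(brain_mass_g), np.log(days_to_walk), deg=1)
    return float(np.exp(intercept + slope * np.log(target_brain_mass_g)))

# Usage note: fed the 23 non-human species from the study and an adult human
# brain mass of roughly 1,300-1,400 g, a fit of this general form is the kind of
# model the authors report as predicting human walking onset almost exactly.
```

Whether a simple log-log line is the right functional form is itself a modeling choice; the point of the entry above is only that brain mass alone captures most of the between-species variance in walking onset.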

By Katherine Harmon In patients who have survived severe brain damage, judging the level of actual awareness has proved a difficult process. And the prognosis can sometimes mean the difference between life and death. New research suggests that some vegetative patients are capable of simple learning—a sign of consciousness in many who had failed other traditional cognitive tests. To determine whether patients are in a minimally conscious state (in which there is some evidence of perception or intentional movement) or have sunk into a vegetative state (in which neither exists), doctors have traditionally used a battery of tests and observations. Many of them require some subjective interpretation, such as deciding whether a patient’s movements are purposeful or just random. “We want to have an objective way of knowing whether the other person has consciousness or not,” says Mariano Sigman, who directs the Integrative Neuroscience Laboratory at the University of Buenos Aires. That desire stems in part from surprising neuroimaging work that showed that some vegetative patients, when asked to imagine performing physical tasks such as playing tennis, still had activity in premotor areas of their brains. In others, verbal cues sparked language sectors. A recent study found that about 40 percent of vegetative state diagnoses are incorrect. To explore possible tests of consciousness in patients, Sigman and his colleagues turned to classical conditioning: they sounded a tone and then sent a light puff of air to the patient’s eye. The air puff would cause a patient to blink or flinch the eye, but after repeated trials over half an hour, many patients would begin to anticipate the puff, blinking an eye after only hearing the tone. © 1996-2009 Scientific American Inc.

Keyword: Brain imaging
Link ID: 13577 - Posted: 06.24.2010

By Mark Pothier In the long and tortured debate over drug policy, one of the strangest episodes has been playing out this fall in the United Kingdom, where the country’s top drug adviser was recently fired for publicly criticizing his own government’s drug laws. The adviser, Dr. David Nutt, said in a lecture that alcohol is more hazardous than many outlawed substances, and that the United Kingdom might be making a mistake in throwing marijuana smokers in jail. His comments were published in a press release in October, and the next day he was dismissed. The buzz over his sacking has yet to subside: Nutt has become the talk of pubs and Parliament, as well as the subject of tabloid headlines like: “Drug advisor on wacky baccy?” But behind Nutt’s words lay something perhaps more surprising, and harder to grapple with. His comments weren’t the idle musings of a reality-insulated professor in a policy job. They were based on a list - a scientifically compiled ranking of drugs, assembled by specialists in chemistry, health, and enforcement, published in a prestigious medical journal two years earlier. The list, printed as a chart with the unassuming title “Mean Harm Scores for 20 Substances,” ranked a set of common drugs, both legal and illegal, in order of their harmfulness - how addictive they were, how physically damaging, and how much they threatened society. Many drug specialists now consider it one of the most objective sources available on the actual harmfulness of different substances. © 2009 NY Times Co

Keyword: Drug Abuse
Link ID: 13576 - Posted: 06.24.2010

By Emily Sohn Blue whales' songs are hauntingly deep, filled with extraterrestrial vibratos, and utterly mysterious. Despite many attempts to interpret them, scientists still don't know what the world's largest animals are saying. Now, the mystery only thickens. For decades, blue whales have been singing with increasingly deeper voices, reports a new study. In some cases, the pitch of their songs has dropped by more than 30 percent. Frustrated researchers cannot yet explain why. "It's a worldwide phenomenon," said Mark McDonald, an ocean acoustician and independent researcher in Bellvue, Colo. "All blue whales are shifting their frequencies downward. They are all going in the same direction, and we really don't understand it." "Maybe by putting this data out there," he added, "someone will have a eureka moment and see something that really explains this." McDonald first suspected something was going on about eight years ago, when he started setting up underwater detectors to study blue whales across the Pacific Ocean. To get the devices to work, he and colleagues noticed that they had to shift the detector frequencies downward every year. At the time, they didn't know if something was amiss with the detectors or with the whales. © 2009 Discovery Channel

Keyword: Language; Hearing
Link ID: 13575 - Posted: 06.24.2010

High levels of a hormone that controls appetite appear to be linked to a reduced risk of developing Alzheimer's disease, US research suggests. The 12-year study of 200 volunteers found those with the lowest levels of leptin were more likely to develop the disease than those with the highest. The JAMA study builds on work that links low leptin levels to the brain plaques found in Alzheimer's patients. The hope is leptin could eventually be used as both a marker and a treatment. The hormone leptin is produced by fat cells and tells the brain that the body is full and so reduces appetite. It has long been touted as a potential weapon in treating obesity. But there is growing evidence that the hormone also benefits brain function. Research on mice - conducted to establish why obese patients with diabetes often have long-term memory problems - found those who received doses of leptin were far more adept at negotiating their way through a maze. The latest research, carried out at Boston University Medical Center, involved regular brain scans on 198 older volunteers over a 12-year period. A quarter of those with the lowest levels of leptin went on to develop Alzheimer's disease, compared with 6% of those with the highest levels. BBC © MMIX

Keyword: Alzheimers; Hormones & Behavior
Link ID: 13574 - Posted: 06.24.2010

by Peter Aldhous Since this article was first posted, the American Psychiatric Association has announced that the publication of DSM-V will be delayed until May 2013. "Extending the timeline will allow more time for public review, field trials and revisions," says APA president Alan Schatzberg. When doctors disagree with each other, they usually couch their criticisms in careful, measured language. In the past few months, however, open conflict has broken out among the upper echelons of US psychiatry. The focus of discord is a volume called the Diagnostic and Statistical Manual of Mental Disorders, or DSM, which psychiatrists turn to when diagnosing the distressed individuals who turn up at their offices seeking help. Regularly referred to as the profession's bible, the DSM is in the midst of a major rewrite, and feelings are running high. Two eminent retired psychiatrists are warning that the revision process is fatally flawed. They say the new manual, to be known as DSM-V, will extend definitions of mental illnesses so broadly that tens of millions of people will be given unnecessary and risky drugs. Leaders of the American Psychiatric Association (APA), which publishes the manual, have shot back, accusing the pair of being motivated by their own financial interests - a charge they deny. The row is set to come to a head next month when the proposed changes will be published online. For a profession that exists to soothe human troubles, it's incendiary stuff. © Copyright Reed Business Information Ltd.

Keyword: Schizophrenia; Depression
Link ID: 13573 - Posted: 06.24.2010

Postmenopausal women who take anti-depressants face a small - but statistically significant - increased risk of a stroke, research suggests. The US study was based on 136,293 women aged 50 to 79, who were followed for an average of six years. Anti-depressant users were 45% more likely to have a stroke than women not taking the drugs. The data, published in Archives of Internal Medicine, is taken from the Women's Health Initiative Study. When overall death rates were examined, those on anti-depressants were found to have a 32% higher risk of death from all causes during the study than non-users. The researchers stressed that the overall risk of a stroke was relatively small. Even for women on anti-depressants, it was less than a one in 200 chance in any given year. However, they said that because so many women were taking anti-depressants the effect would be significant across the entire population. It is not clear whether taking anti-depressants is solely responsible for the increased risk of a stroke. Depression itself is known to be a risk factor for cardiovascular problems. BBC © MMIX

Keyword: Depression; Stroke
Link ID: 13572 - Posted: 06.24.2010
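The entry above moves between relative risk (45% more likely) and absolute risk (less than one in 200 per year), and the distinction is worth making explicit. The arithmetic below is a back-of-the-envelope reading of those two published figures only; it treats the one-in-200 number as an upper bound and does not attempt to reproduce the study's own hazard models.

```python
# Rough translation of relative risk into absolute risk, using only the two
# figures quoted in the entry above (illustrative, not the study's own analysis).
users_annual_risk = 1 / 200        # "less than a one in 200 chance in any given year"
relative_increase = 0.45           # "45% more likely to have a stroke"

non_users_annual_risk = users_annual_risk / (1 + relative_increase)
absolute_difference = users_annual_risk - non_users_annual_risk

print(f"non-users: ~{non_users_annual_risk:.2%} per year")
print(f"users:     ~{users_annual_risk:.2%} per year")
print(f"absolute difference: ~{absolute_difference:.2%} per year")
```

An absolute difference on the order of a tenth of a percentage point per year is small for any individual woman, which is the researchers' point; scaled across the very large number of women taking these drugs, it becomes the population-level effect they describe.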

by Andy Coghlan Octopuses have been observed carrying coconut shells in what researchers claim is the first recorded example of tool use in invertebrates. There is a growing record of tool use in animals and birds, from musical "instruments" made by orang-utans to sponges used by dolphins to dislodge prey from sand. Now veined octopuses, Amphioctopus marginatus, have been filmed picking up coconut halves from the seabed to use as hiding places when they feel threatened. "This octopus behaviour was totally unexpected," says Julian Finn, a marine biologist at Museum Victoria in Melbourne, Australia, who has filmed at least four individual veined octopuses performing the trick off the coast of Indonesia. People living in Indonesian coastal villages discard coconut shells into the sea after use. When the octopuses come across these on the seabed, they drape their bodies over and around the shells, hollow-side up, leaving their eight arms dangling over the edges. The octopuses then lift the shells by making their arms rigid, before tiptoeing away in a manoeuvre Finn calls stilt-walking. © Copyright Reed Business Information Ltd

Keyword: Intelligence; Evolution
Link ID: 13571 - Posted: 06.24.2010

By Rachel Saslow Two mice. One weighs 20 grams and has brown fur. The other is a hefty 60 grams with yellow fur and is prone to diabetes and cancer. They're identical twins, with identical DNA. It turns out that their varying traits are controlled by a mediator between nature and nurture known as epigenetics. A group of molecules that sit atop our DNA, the epigenome (which means "above the genome") tells genes when to turn on and off. Duke University's Randy Jirtle made one of the mice brown and one yellow by altering their epigenetics in utero through diet. The mother of the brown, thin mouse was given a dietary supplement of folic acid, vitamin B12 and other nutrients while pregnant, and the mother of the obese mouse was not. (Though the mice had different mothers, they're genetically identical as a result of inbreeding.) The supplement "turned off" the agouti gene, which gives mice yellow coats and insatiable appetites. "If you look at these animals and realize they're genetically identical but at 100 days old some of them are yellow, obese and have diabetes and you don't appreciate the importance of epigenetics in disease, there's frankly no hope for you," Jirtle says. He offers this analogy: The genome is a computer's hardware, and the epigenome is the software that tells it what to do. Epigenomes vary greatly among species, Jirtle explains, so we cannot assume that obesity in humans is preventable with prenatal vitamins. But his experiment is part of a growing body of research that has some scientists rethinking humans' genetic destinies. Is our hereditary fate -- bipolar disorder or cancer at age 70, for example -- sealed upon the formation of our double helices, or are there things we can do to change it? Are we recipients of our DNA, or caretakers of it? © 2009 The Washington Post Company

Keyword: Genes & Behavior
Link ID: 13570 - Posted: 06.24.2010

By Katherine Ellison Gulf War veteran Lynn Gibbons has awful memories of combat with her fourth-grade son, Brent. "He was an out-of-control monster whenever you asked him to do something," the former Air Force computer operations officer recalls. Brent, who had received a diagnosis of attention-deficit hyperactivity disorder, was also flailing in his classes at Saratoga Elementary School in Springfield -- unable, says his mom, to write a coherent paragraph. That was seven years ago. Today Brent is taking advanced-placement high school courses, maintaining a 3.5 grade-point average, playing guitar in a band and -- drum roll -- helping with chores. Says Gibbons: "I am no longer afraid that jail time will be in his future." What made the difference, she's convinced, is a high-tech intervention called neurofeedback, also known as EEG biofeedback. Ordinary biofeedback is a kind of mind-over-body training in which a person uses electronic equipment to monitor an involuntary physiological process such as heart rate and learns to gain some control over it. Neurofeedback operates on the same principle -- except in this case, it's mind over brain. Proponents claim neurofeedback can help alleviate a broad range of problems, including not only ADHD but anxiety, depression, autism and brain injuries. Yet the costly, time-consuming therapy has long been dogged by skeptics who call it a placebo at best, a rip-off at worst. © 2009 The Washington Post Company

Keyword: ADHD; Attention
Link ID: 13569 - Posted: 06.24.2010

Ewen Callaway, reporter People with autism, conventional wisdom goes, have trouble reading the emotions of others. However, brain scans suggest they also have difficulties getting in touch with their inner selves. In a study published yesterday in the journal Brain, Michael Lombardo at the University of Cambridge reports scanning the brains of 66 males - half with autism spectrum disorder (ASD), half developmentally normal - while they thought and made judgements about themselves and, separately, Queen Elizabeth. For the non-autistic subjects, two brain areas linked to self-reflection proved more active when they thought about themselves, compared with thinking about the queen. Not so for those with ASD. One region, the ventromedial prefrontal cortex, tended to respond similarly to regal and personal judgements, while the second region, the middle cingulate cortex, proved more active when ASD patients thought about the queen. These neurological differences correlated with social ability. According to Lombardo's team: "Individuals whose ventromedial prefrontal cortex made the largest distinction between mentalising about self and other were least socially impaired in early childhood, while those whose ventromedial prefrontal cortex made little to no distinction between mentalising about self and other were the most socially impaired in early childhood." © Copyright Reed Business Information Ltd.

Keyword: Autism; Emotions
Link ID: 13568 - Posted: 06.24.2010