Most Recent Links
Adults who had a common bariatric surgery to lose weight had a significantly higher risk of alcohol use disorders (AUD) two years after surgery, according to a study by a National Institutes of Health research consortium. Researchers investigated alcohol consumption and symptoms of alcohol use disorders in 1,945 participants from the NIH-funded Longitudinal Assessment of Bariatric Surgery (LABS), a prospective study of patients undergoing weight-loss surgery at one of 10 hospitals across the United States. Within 30 days before surgery, and again one and two years after surgery, study participants completed the Alcohol Use Disorders Identification Test (AUDIT). The test, developed by the World Health Organization, identifies symptoms of alcohol use disorders, a condition that includes alcohol abuse and dependence, commonly known as alcoholism. Study participants were categorized as having AUD if they had at least one symptom of alcohol dependence, which included not being able to stop drinking once started, or alcohol-related harm, which included not being able to remember, or if their total AUDIT score was at least 8 (out of 40). About 70 percent of the study participants had Roux-en-Y gastric bypass (RYGB) surgery, which reduces the size of the stomach and shortens the intestine, limiting food intake and the body's ability to absorb calories. Another 25 percent had laparoscopic adjustable gastric banding surgery, which makes the stomach smaller with an adjustable band. About 5 percent of the patients had other, less common weight-loss surgeries.
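The study's categorization comes down to a simple rule: a participant counts as having AUD with at least one alcohol-dependence symptom, at least one alcohol-related-harm symptom, or a total AUDIT score of 8 or more out of 40. The short Python sketch below just restates that rule for readers who find it easier to scan as logic; the function name and symptom-count inputs are assumptions for illustration, not part of the LABS study's actual analysis code.

```python
# Minimal sketch of the AUD categorization rule described in the article.
# Input names are hypothetical; only the criteria (>= 1 dependence symptom,
# >= 1 harm symptom, or a total AUDIT score of at least 8 out of 40) come
# from the study description above.

def meets_aud_criteria(dependence_symptoms: int,
                       harm_symptoms: int,
                       audit_total: int) -> bool:
    """Return True if a participant meets any of the study's AUD criteria."""
    if dependence_symptoms >= 1:   # e.g., unable to stop drinking once started
        return True
    if harm_symptoms >= 1:         # e.g., alcohol-related memory lapses
        return True
    return audit_total >= 8        # AUDIT total of at least 8 (maximum is 40)

# Example: no individual symptoms, but an AUDIT total of 9 still qualifies.
print(meets_aud_criteria(dependence_symptoms=0, harm_symptoms=0, audit_total=9))  # True
```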
Keyword: Obesity; Drug Abuse
Link ID: 16931 - Posted: 06.19.2012
By Justin Moyer, On June 9, Commerce Secretary John Bryson was hospitalized after his reported involvement in three auto accidents. Although details were not disclosed, the White House confirmed that he had a seizure. On July 30, 2007, Chief Justice John Roberts collapsed on a boat dock at his Maine summer home. Although that seizure was Roberts’s second, he offered little explanation. When Time magazine asked “Does Justice Roberts Have Epilepsy?,” Roberts didn’t answer, and he hasn’t in five years. Reading these stories, I wish public figures such as Roberts and Bryson would talk publicly about their conditions. They should do this not because they are legally compelled to or because their health may affect their work. They should do it because hiding their problems makes it seem like their problems are worth hiding. I received a diagnosis of epilepsy in 2001, at age 24. My seizures are generalized, meaning they strike my whole brain and body. Without warning, I lose consciousness for several minutes and remain disoriented for a few hours. Later, I have no memory of the episode save muscle aches and a sore mouth from biting my tongue. My seizures are idiopathic: They have no known cause. They can be controlled with levetiracetam, a medication that regulates brain neurotransmitters. © 1996-2012 The Washington Post
Keyword: Epilepsy
Link ID: 16930 - Posted: 06.19.2012
By ANDREW POLLACK Two of the front-runners in the race to develop drugs to treat mental retardation and autism are joining forces, hoping to save money and get to the market sooner. A deal, expected to be announced on Tuesday, will pool the resources of Roche, the Swiss pharmaceutical giant, and Seaside Therapeutics, a private 30-employee company based in Cambridge, Mass. “This deal will establish the biggest effort to date” in autism drugs, Luca Santarelli, head of neuroscience for Roche, said before the announcement. Financial terms are not being disclosed. There is rising excitement that drugs might be able to relieve some of the behavioral problems associated with autism and in particular a cause of autism and mental retardation known as fragile X syndrome. About 100,000 Americans have fragile X syndrome. Some parents of children being treated with new drugs in clinical trials have said they see positive changes in behavior. Becky Zorovic of Sharon, Mass., said that when she used to take her son Anders, who has fragile X, to the dentist, she would have to lie in the chair and hold him on top of her as he screamed. But after Anders started taking Seaside’s drug, arbaclofen, in a clinical trial, she said, “He sat in the chair by himself and he opened his mouth and let the dentist polish his teeth and even scrape his teeth.” Anders has also gone to birthday parties, which he once refused to do, she said. © 2012 The New York Times Company
Keyword: Autism
Link ID: 16929 - Posted: 06.19.2012
by Veronique Greenwood An average human, utterly unremarkable in every way, can perceive a million different colors. Vermilion, puce, cerulean, periwinkle, chartreuse—we have thousands of words for them, but mere language can never capture our extraordinary range of hues. Our powers of color vision derive from cells in our eyes called cones, three types in all, each triggered by different wavelengths of light. Every moment our eyes are open, those three flavors of cone fire off messages to the brain. The brain then combines the signals to produce the sensation we call color. Vision is complex, but the calculus of color is strangely simple: Each cone confers the ability to distinguish around a hundred shades, so the total number of combinations is at least 100³, or a million. Take one cone away—go from being what scientists call a trichromat to a dichromat—and the number of possible combinations drops by a factor of 100, to 10,000. Almost all other mammals, including dogs and New World monkeys, are dichromats. The richness of the world we see is rivaled only by that of birds and some insects, which also perceive the ultraviolet part of the spectrum. Researchers suspect, though, that some people see even more. Living among us are people with four cones, who might experience a range of colors invisible to the rest. It’s possible these so-called tetrachromats see a hundred million colors, with each familiar hue fracturing into a hundred more subtle shades for which there are no names, no paint swatches. And because perceiving color is a personal experience, they would have no way of knowing they see far beyond what we consider the limits of human vision. © 2012, Kalmbach Publishing Co.
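The arithmetic behind those figures is straightforward: with roughly 100 distinguishable shades per cone type, the combinations multiply, giving 100² for a dichromat, 100³ for a trichromat and 100⁴ for a tetrachromat. The Python sketch below simply restates that back-of-the-envelope estimate; it is an illustration of the article's arithmetic, not a model of color vision, and the 100-shades-per-cone figure is the article's rough assumption.

```python
# Back-of-the-envelope estimate from the article: about 100 distinguishable
# shades per cone type, combined multiplicatively across cone types.
# Illustrative only; the 100-shade figure is the article's rough assumption.

SHADES_PER_CONE = 100

def estimated_colors(cone_types: int) -> int:
    """Estimate distinguishable colors as shades per cone raised to the number of cone types."""
    return SHADES_PER_CONE ** cone_types

for label, cones in [("dichromat", 2), ("trichromat", 3), ("tetrachromat", 4)]:
    print(f"{label}: about {estimated_colors(cones):,} colors")
# dichromat: about 10,000; trichromat: about 1,000,000; tetrachromat: about 100,000,000
```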
Keyword: Vision
Link ID: 16928 - Posted: 06.19.2012
Neuroscientist and friend of the blog Bradley Voytek has a terrific piece about the relationship between popular science writing and neuroscience that’s well worth your time. In brief, he says, the primary problem is neuroscientists themselves. Or at least, some of the assumptions that neuroscientists make. The problem, he notes, is one that long-term readers here are familiar with – the confusion of cause and effect, as well as an over-reductionist view of the brain. Namely, that imaging studies showing a portion of the brain “lighting up” when something happens means that that area of the brain is directly involved in the activity. Something that he analogizes to “like how when your arms swing faster when you run that means that your arms are ‘where running happens’.” Voytek also provides what I think is one of the best analogies I’ve read about the problem inherent in trying to isolate “which part of the brain does X”: Imagine asking “where is video located in my computer?” That doesn’t make any sense. Your monitor is required to see the video. Your graphics card is required to render the video. The software is required to generate the code for the video. But the “video” isn’t located anywhere in the computer. This is something that I think is exactly right. The best neuroscientists out there are, I think, very aware of this problem, but I think part of what’s going on here is the inherent limitations of our ability to experiment. Take fMRIs, for example, which have provided an interesting look into what kinds of activity are going on in the brain while the person being imaged is doing or thinking something. It’s one of the best pieces of equipment available, but its very nature can be deceiving. Because it’s one of the few ways available to figure out what’s going on in the brain, it can be tempting to see what is measured by an fMRI image as definitive. 2012 Forbes.com LLC™
Keyword: Brain imaging
Link ID: 16927 - Posted: 06.19.2012
By Susan Milius ALBUQUERQUE — Baby bluebirds don’t survive as well near rumbling traffic and other human din as they do amid natural lullabies. In a Virginia study, 35 percent more chicks died in the noisiest nests than in the most remote ones. Researchers found that chicks didn’t adjust for the noise by begging louder or at different frequencies. So parents may not have gotten the right cues for nestling care, behavioral ecologist John Swaddle suggested June 12 at the annual meeting of the Animal Behavior Society. Until recently, most research on how human-made noise discombobulates birds has focused on how adults adjust their songs (or don’t) or on what species will nest at all among the din. Research is now turning to how noise might directly affect the success of a species. One earlier study on reproductive success, in common European birds called great tits, found smaller clutches near roaring highways. Clutch size didn’t shrink among eastern bluebirds (Sialia sialis), said Swaddle, a professor at the College of William and Mary in Williamsburg, Va. Birds settling in to the 43 nest boxes he and his colleagues monitored for two years all started with about the same number of eggs. Just what made noisier nests less successful after hatching isn’t clear, but Swaddle suspects that noise kept parents from caring for their nestlings properly. Noise might have made food harder to find, or it might have masked normal parent-chick chat. Even though baby birds have become an icon of endlessly demanding maws, parents do tune their feeding effort to begging calls, and research has confirmed the importance of communication. © Society for Science & the Public 2000 - 2012
Keyword: Animal Communication; Stress
Link ID: 16926 - Posted: 06.19.2012
By DAWN LERMAN I grew up with a fat dad — 450 pounds at his heaviest. Every week he would rotate to a new fad diet, and my family ended up eating whatever freeze-dried, saccharin-loaded concoction he was trying at that moment. By the time I was 9, I was an expert on Atkins, Pritikin and Weight Watchers, just to name a few. Did I mention spending four weeks at Duke University’s “Fat Farm” consuming only minuscule bowls of white rice, while my 10-year-old peers were home eating ice cream cones? In spite of being shorter and scrawnier than my classmates, I was eating calorie-free astronaut mystery powders and drinking diet sodas, which were the only staples in our kitchen. My dad was obsessed with his career in advertising and his weight, which was fluctuating mostly in the wrong direction. Every new diet, no matter how stringent or odd, was the potential solution for his expanding waistline. My mother, on the other hand, never understood what the big deal with food was and ate only one small meal a day while standing up and chatting on the phone. She had no interest in preparing food. Most of our meals consisted of my dad’s diet foods, a meal replacement shake, a frozen dinner, or a bagel or pizza in the car. We never had meals together as a family; in fact, we never ate sitting down. At home, we never used silverware or dishes, only plastic forks and paper plates. My mom loved the fact that in India they never used silverware at all. Of course, she missed the part that Indian families actually ate together and sat down while eating. What I remember most about those years is that I was always hungry — hungry for food, hungry for nice clean clothes, hungry for someone to notice when I ran away from home or hid in the closet for hours. I was just hungry — hungry for someone to care for me because I was a child and I yearned to be cared for. Copyright 2012 The New York Times Company
Keyword: Obesity
Link ID: 16925 - Posted: 06.16.2012
Scientists say they have identified a possible genetic link between diabetes and Alzheimer's disease. It has been known for some time that people with diabetes have a much higher risk of developing Alzheimer's, but not why this is so. Now US researchers writing in Genetics say a study of worms has indicated a known Alzheimer's gene also plays a role in the way insulin is processed. Dementia experts said more work in humans was now needed. Alzheimer's is the most common cause of dementia, which affects 820,000 people in the UK. There are medications which can slow the progress of the disease, but none that can halt it. A key indication of Alzheimer's, which can only be seen after death, is the presence of sticky plaques of amyloid protein in decimated portions of patients' brains. Scientists have already found mutations in a gene involved in the processing of amyloid protein in Alzheimer's which run in families. In this study, a team from the City College of New York looked at a similar gene in nematode worms (C. elegans). These worms are often studied because they are, perhaps surprisingly, a useful model for human research. BBC © 2012
Keyword: Alzheimers; Obesity
Link ID: 16924 - Posted: 06.16.2012
by Michael Marshall Fancy getting into a fight? Here's a tip: don't. Even if you win you'll probably get hurt, and that will mean you have to spend weeks recovering when you could be doing something worthwhile, like curing cancer or having sex. Most animals know this instinctively and are reluctant to get into all-out fights. In particular, animals don't fight with members of other species. There's just no point: they aren't sexual rivals, and they have a different diet so they're not likely to steal food either. With some exceptions, including predator-prey struggles, animals only fight their direct competitors: members of their own species. Someone needs to tell the Dalmatian wall lizard about this unwritten rule – preferably through a megaphone from a safe distance. In field tests it picks fights with a neighbouring lizard species that poses no threat to it at all. Is it just a thug, or is there a good reason for its aggressive behaviour? Dalmatian wall lizards are named after the Dalmatia region of southern Croatia – as is the notoriously fecund breed of dog. As lizards go they look quite ordinary, measuring about 6 centimetres long, not counting their tails. They spend most of their time on the ground under vegetation or on low rocks. That keeps them separate from the neighbouring sharp-snouted rock lizards (Dalmatolacerta oxycephala), which tend to hang out on higher rocks where it's cooler. © Copyright Reed Business Information Ltd.
Keyword: Aggression
Link ID: 16923 - Posted: 06.16.2012
by Carl Zimmer I dig a knife into a cardboard box, slit it open, and lift a plastic bottle of bright red fluid from inside. I set it down on my kitchen table, next to my coffee and eggs. The drink, called NeuroSonic, is labeled with a cartoon silhouette of a head, with a red circle where its brain should be. A jagged line—presumably the trace of an EKG—crosses the circle. And down at the very bottom of the bottle, it reads, “Mental performance in every bottle.” My office is full of similar boxes: Dream Water (“Dream Responsibly”), Brain Toniq (“The clean and intelligent think drink”), iChill (“helps you relax, reduce stress, sleep better”), and Nawgan (“What to Drink When You Want to Think”). These products contain mixtures of neurotransmitters, hormones, and neuroactive amino acids, but you don’t need a prescription to buy them. I ordered mine on Amazon, and you can even find them in many convenience stores. I unscrew the cap from one of them and take a gulp. NeuroSonic tastes like cherry and aluminum. I wait for my neurons to light up. While I wait I call nutrition scientist Chris Noonan, who serves as adviser to Neuro, the company that makes NeuroSonic and a line of other elixirs for the brain. The inspiration for NeuroSonic came from the huge success of energy drinks, the caffeine-loaded potions now earning over $6 billion a year in the United States. The company’s founder, Diana Jenkins, posed a question: “Instead of just having a regular caffeinated energy drink, could we also include nutrients for cognitive enhancement and cognitive health?” Her team searched the scientific literature for compounds, eventually zeroing in on L-theanine, an amino acid found in green tea. © 2012, Kalmbach Publishing Co.
Keyword: Attention; Drug Abuse
Link ID: 16922 - Posted: 06.16.2012
David Cyranoski A stem-cell biologist has had an eye-opening success in his latest effort to mimic mammalian organ development in vitro. Yoshiki Sasai of the RIKEN Center for Developmental Biology (CDB) in Kobe, Japan, has grown the precursor of a human eye in the lab. The structure, called an optic cup, is 550 micrometres in diameter and contains multiple layers of retinal cells including photoreceptors. The achievement has raised hopes that doctors may one day be able to repair damaged eyes in the clinic. But for researchers at the annual meeting of the International Society for Stem Cell Research in Yokohama, Japan, where Sasai presented the findings this week, the most exciting thing is that the optic cup developed its structure without guidance from Sasai and his team. “The morphology is the truly extraordinary thing,” says Austin Smith, director of the Centre for Stem Cell Research at the University of Cambridge, UK. Until recently, stem-cell biologists had been able to grow embryonic stem cells only into two-dimensional sheets. But over the past four years, Sasai has used mouse embryonic stem cells to grow well-organized, three-dimensional cerebral-cortex, pituitary-gland and optic-cup tissue. His latest result marks the first time that anyone has managed a similar feat using human cells. The various parts of the human optic cup grew in mostly the same order as those in the mouse optic cup. This reconfirms a biological lesson: the cues for this complex formation come from inside the cell, rather than relying on external triggers. © 2012 Nature Publishing Group,
Keyword: Stem Cells; Vision
Link ID: 16921 - Posted: 06.16.2012
Sandrine Ceurstemont, editor, New Scientist TV Be careful if you're walking in the jungle: what may seem like moving spots could actually be a cheetah. Now new animations by Stuart Anstis and his team from the University of California in San Diego illustrate the effect by showing how our brain can interpret a moving scene in different ways. The video starts with couples gazing into each other's eyes. As they rotate, you are likely to keep seeing four distinct couples as we are primed to recognise male-female pairs. However, in the next clip, where faces are replaced with dots in a similar arrangement, what initially appear to be groups of rotating dots will probably soon turn into two floating squares. A third animation demonstrates that by adding more pairs of dots, the motion of the whole takes over as most people will see two pulsing octagons. A final clip shows that linking the pairs of dots makes the smaller groupings stick out from the overall formation. These animations demonstrate that our brain can favour either the overall shape or its components depending on the arrangement. In many cases, we can perceive a scene in different ways and alternate between the configurations. However, Anstis and his team showed that over time, our brain usually favours one arrangement over the other by remembering how it has previously processed information in the scene. Typically, we see the motion of the smaller groupings first before perceiving an overarching shape. "In our view, the grouping of moving spots into local motion clusters is an early, fast, pre-attentive event, while grouping them into global motion clusters is a slower, high-level process," write Anstis and colleagues. "Reverting to the cheetah example, it is a modest visual achievement to group some of the moving spots locally into legs or a tail, but a prey's actions and survival will ultimately depend on organizing them globally into a whole cheetah." © Copyright Reed Business Information Ltd.
Keyword: Vision
Link ID: 16920 - Posted: 06.16.2012
by Michael Balter The basic questions about early European cave art—who made it and whether they developed artistic talent swiftly or slowly—were thought by many researchers to have been settled long ago: Modern humans made the paintings, crafting brilliant artworks almost as soon as they entered Europe from Africa. Now dating experts working in Spain, using a technique relatively new to archaeology, have pushed dates for the earliest cave art back some 4000 years to at least 41,000 years ago, raising the possibility that the artists were Neandertals rather than modern humans. And a few researchers say that the study argues for the slow development of artistic skill over tens of thousands of years. Figuring out the age of cave art is fraught with difficulties. Radiocarbon dating has long been the method of choice, but it is restricted to organic materials such as bone and charcoal. When such materials are lying on a cave floor near art on the cave wall, archaeologists have to make many assumptions before concluding that they are contemporary. Questions have even arisen in cases like the superb renditions of horses, rhinos, and other animals in France's Grotte Chauvet, the cave where researchers have directly radiocarbon dated artworks executed in charcoal to 37,000 years ago. Other archaeologists have argued that artists could have entered Chauvet much later and picked up charcoal that had been lying around for thousands of years. Now in a paper published online today in Science, researchers have applied a technique called uranium-series (U-series) dating to artworks from 11 Spanish caves. U-series dating has been around since the 1950s and is often used to date caves, corals, and other proxies for climate and sea level changes. But it has been used only a few times before on cave art, including by Pike and Pettit, who used it to date the United Kingdom's oldest known cave art at Creswell Crags in England. © 2010 American Association for the Advancement of Science.
Keyword: Evolution
Link ID: 16919 - Posted: 06.16.2012
By Laura Sanders In what seems like a blow for humanity, a very smart chimpanzee in Japan crushes any human challenger at a number memory game. After the numbers 1 through 9 make a split-second appearance on a computer screen, the chimp, Ayumu, gets to work. His bulky index finger flies gracefully across the screen, tapping white squares where the numbers had appeared, in order. So far, no human has topped him. Ayumu’s talent caused a stir when researchers first reported it in 2007 (SN: 12/8/2007, p. 355). Since then, the chimp’s feat has grown legendary, even earning him a starring role in a recent BBC documentary. But psychologist Nicholas Humphrey says the hype may be overblown. In an upcoming Trends in Cognitive Sciences essay, Humphrey floats a different explanation for Ayumu’s superlative performance, one that leaves humans’ memory skills unimpugned: Ayumu might have a curious brain condition that allows him to see numbers in colors. If Humphrey’s wild idea is right, the chimpanzee’s feat has nothing to do with memory. “When you get extraordinary results, you need to look for extraordinary ideas to explain them,” says Humphrey, of Darwin College at Cambridge University in England. The idea came to him after listening to two presentations at a consciousness conference in 2011. Tetsuro Matsuzawa of the Primate Research Institute at Kyoto University in Japan described his research on the memory skills of Ayumu, his mother Ai, and two other mom-offspring pairs. And neuroscientist David Eagleman of Baylor College of Medicine in Houston talked about the brain condition known as synesthesia, which causes people to attach sensory experiences to letters or numbers. A synesthete might always see the number four as blue, for instance. © Society for Science & the Public 2000 - 2012
Keyword: Learning & Memory; Evolution
Link ID: 16918 - Posted: 06.16.2012
by Helen Thomson EVER wanted to know what an invisible hand looks like? Well, it is slightly wider than a real hand, and it has shorter fingers too. For the first time, the perceived shape of a phantom limb has been measured. This should make it possible to learn more about how the brain represents what we look like. The illusion of a phantom limb can kick in after an amputation or in people missing limbs from congenital disease. The result is the sensation that the limb is, in fact, present. One theory suggests people with phantom limbs take cues from those around them to work out what their missing body part looks like. Another theory is that the sensation of an invisible limb reflects brain activity in regions that map our body in space. To clarify the sensory origins of phantom limbs, Matthew Longo at Birkbeck, University of London, and colleagues enlisted the help of CL - a 38-year-old woman born without a left arm, who periodically feels she has a phantom hand. They asked her to place her right hand beneath a board and indicate where she believed her fingertips and knuckles were. She then repeated the exercise imagining that her phantom left hand was beneath the board instead. Previous studies have shown that we tend to underestimate our finger length increasingly from thumb to little finger. This mirrors differences in the sensitivity and size of areas in the brain's somatosensory cortex that are thought to represent each digit, probably by making use of visual, mechanical and tactile feedback. The thumb is represented by a larger area of the cortex than the little finger. © Copyright Reed Business Information Ltd.
Keyword: Pain & Touch
Link ID: 16917 - Posted: 06.16.2012
By Rachel Ehrenberg Among a small number of related families from northern Pakistan, some individuals never feel pain in any part of their bodies. Scientists studying six such children found that by the age of 4, they all had injuries to the lips or tongue from repeatedly biting themselves. Bruises, cuts and broken bones were common, though fractures were diagnosed only long after the fact, when weird, painless limping or the inability to use a limb called attention to the injury. Tests showed that the pain-free children perceived sensations of warm and cold, tickling and pressure. They could feel the prick of a needle, but it didn’t hurt. Two had been scalded — painlessly — by hot liquids. And one boy who performed street theater by putting knives through his arms and walking on hot coals died after jumping off a roof on his 14th birthday. Besides their inability to feel pain, the Pakistani individuals studied by the scientists had something else in common: mutations in a gene called SCN9A. That gene encodes the instructions for a protein that forms a passageway for letting sodium ions into nerve cells. Known as Nav1.7, this particular ion channel sits on pain-sensing nerves; when a nerve is stimulated enough to warrant sending a signal to the brain, a flood of sodium ions rush into the cell. Among the pain-free Pakistanis, various mutations in SCN9A altered the blueprints for Nav1.7 in different ways, but with the same result: The channel didn’t work. Muted nerve cells could no longer alert the brain when the body encountered something painful. © Society for Science & the Public 2000 - 2012
Keyword: Pain & Touch
Link ID: 16916 - Posted: 06.16.2012
By Bill Briggs I had done all my crying weeks before. But pacing a hospital hallway -- as nurses changed the diapers of my silent, blank-faced, 20-year-old daughter in the room behind me -- I asked my wife for a hug. I don’t request many. I try to give more hugs than I get. But that August night, I yearned for the blonde girl lying in the bed 20 feet away, a respiration machine blowing oxygen through a hole cut into her trachea. “I miss her voice. I miss her laugh,” I told Nancy -- my wife and Andrea’s stepmom -- as she wrapped her arms around me. “I really just miss Andrea.” One month earlier, on July 26, my cell phone rang as I gobbled a final forkful of dinner in my living room. I didn’t recognize the number. A somber woman asked if I was the father of Andrea Briggs and told me, flatly, that Andrea was in a nearby hospital. Now standing, my knees flinched. I held a corner of my desk for support as I peppered the woman with urgent questions that she wouldn’t answer. “Is she alive? Can you just please tell me if my daughter is alive?” I demanded, my voice rising. “She is in very critical condition,” the woman said. “Come to Denver Health Medical Center as soon as possible.” The nauseous pang in my stomach blended with a strange, detached numbness and I felt like I was walking in someone else’s body. I grabbed my car keys, fully believing I was on my way to say goodbye to my only child. © 2012 msnbc.com
Keyword: Brain Injury/Concussion
Link ID: 16915 - Posted: 06.16.2012
By Tom Siegfried Arguably, and it would be a tough argument to win if you took the other side, computers have had a greater impact on civilization than any other machine since the wheel. Sure, there was the steam engine, the automobile and the airplane, the printing press and the mechanical clock. Radios and televisions also made their share of societal waves. But look around. Computers do everything TVs and radios ever did. And computers tell time, control cars and planes, and have rendered printing presses pretty darn near obsolete. Computers have invaded every realm of life, from work to entertainment to medicine to education: Reading, writing and arithmetic are now all computer-centric activities. Every nook and cranny of human culture is controlled, colored or monitored by the digital computer. Even though, merely 100 years ago, no such machine existed. In 1912, the word computer referred to people (typically women) using pencils and paper or adding machines. Coincidentally, that was the year that Alan Turing was born. If you don’t like the way computers have taken over the world, you could blame him. No one did more to build the foundation of computer science than Turing. In a paper published in 1936, he described the principle behind all of today’s computing devices, sketching out the theoretical blueprint for a machine able to implement instructions for making any calculation. Turing didn’t invent the idea of a computer, of course. Charles Babbage had grand plans for a computing machine a century earlier (and even he had precursors). George Boole, not long after Babbage, developed the underlying binary mathematics (originally conceived much earlier by Gottfried Leibniz) that modern digital computers adopted. © Society for Science & the Public 2000 - 2012
Keyword: Consciousness; Robotics
Link ID: 16914 - Posted: 06.16.2012
by Ann Gibbons Chimpanzees now have to share the distinction of being our closest living relative in the animal kingdom. An international team of researchers has sequenced the genome of the bonobo for the first time, confirming that it shares the same percentage of its DNA with us as chimps do. The team also found some small but tantalizing differences in the genomes of the three species—differences that may explain why bonobos and chimpanzees don't look or act like us even though we share about 99% of our DNA. "We're so closely related genetically, yet our behavior is so different," says team member and computational biologist Janet Kelso of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. "This will allow us to look for the genetic basis of what makes modern humans different from both bonobos and chimpanzees." Ever since researchers sequenced the chimp genome in 2005, they have known that humans share about 99% of our DNA with chimpanzees, making them our closest living relatives. But there are actually two species of chimpanzees that are this closely related to humans: bonobos (Pan paniscus) and the common chimpanzee (Pan troglodytes). This has prompted researchers to speculate about whether the ancestor of humans, chimpanzees, and bonobos looked and acted more like a bonobo, a chimpanzee, or something else—and how all three species have evolved differently since the ancestor of humans split with the common ancestor of bonobos and chimps between 5 million and 7 million years ago in Africa. © 2010 American Association for the Advancement of Science.
Keyword: Evolution
Link ID: 16913 - Posted: 06.14.2012
By Brian Alexander A CPAP device, the Darth Vader-like mask used to ease breathing in sleep apnea sufferers, might be the least attractive thing a man can wear at night, but it could wind up improving his sex life, according to a new study released today at an annual meeting of sleep experts. In yet another example of how the human penis can serve as an important health indicator, a team of doctors from the Sleep Disorders Center of the Walter Reed National Military Medical Center has found that erectile dysfunction is common in younger men with sleep apnea, but that E.D. -- and libido -- improves in men who use the CPAP, or continuous positive airway pressure machine. They presented their results today at the meeting of the Associated Professional Sleep Societies in Boston. Over the past few years, medical science has repeatedly shown that how a man’s penis is working can reflect how the rest of his body is working. E.D. can be an early sign of diabetes, cardiovascular disease, high blood pressure and poor fitness, among other ailments. So when army captain Dr. Joseph Dombrowsky looked at a small handful of studies that had linked apnea to E.D., he realized that he had access to a pool of possible test subjects -- military beneficiaries newly diagnosed with the sleep disorder -- that he could use to explore the link. Dombrowsky and his colleagues recruited 92 men with an average age of nearly 46 who had both a new diagnosis of obstructive sleep apnea, or OSA, and who were starting therapy with CPAP machines. © 2012 msnbc.com
Keyword: Sleep; Sexual Behavior
Link ID: 16912 - Posted: 06.14.2012