Most Recent Links
By ALLISON HERSH LONDON I’M in line at the supermarket holding three items close to my chest. But I might as well be juggling my Kleenex box, toothpaste tube and an orange. Because — as you’d surely notice if you were behind me in line — I’m bent forward at a sharp angle, which makes holding things difficult. I know you don’t want to stare, but you do. Maybe you think you’re being considerate when you say, apropos of nothing, “You look like you’re in pain.” Well, thanks, I am — but I’ll resist replying the way I want (“You look like you’re having a bad hair day”). I’m sorry. I know you mean well. Anyway, it’s my turn at the register, which means I’m closer to being at home where I can lie down and wait for the spasms to subside. Besides, if I told you what my issue was, you would probably shrug and reply that you’d never heard of it. There aren’t any public service announcements about it or telethons. No Angelina Jolies to bravely inform the world. Just people like me, in supermarket checkout lines. And this, I realize, is at the core of a problem that extends beyond me and my condition and that affects the way all of us respond to illnesses, some of which are the subject of public attention — and resources — and some of which are not. I have dystonia, a neurological disorder. Some years ago, for reasons no one knows, the muscles in my back and neck began to spasm involuntarily; the spasms multiply quickly, fatigue the muscles and force the body into repetitive movements and awkward postures like mine. There is no cure, only treatment options like deep brain stimulation, which requires a surgery I underwent last year as a last resort. © 2013 The New York Times Company
Keyword: Movement Disorders; Pain & Touch
Link ID: 18171 - Posted: 05.20.2013
By Bruce Bower In its idealized form, science resembles a championship boxing match. Theories square off, each vying for the gold belt engraved with “Truth.” Under the stern eyes of a host of referees, one theory triumphs by best explaining available evidence — at least until the next bout. But in the real world, science sometimes works more like a fashion show. Researchers clothe plausible explanations of experimental findings in glittery statistical suits and gowns. These gussied-up hypotheses charm journal editors and attract media coverage with carefully orchestrated runway struts, never having to battle competitors. Then there’s psychology. Even more than other social scientists — and certainly more than physical scientists — psychologists tend to overlook or dismiss hypotheses that might topple their own, says Klaus Fiedler of the University of Heidelberg in Germany. They explain experimental findings with ambiguous terms that make no testable predictions at all; they build careers on theories that have never bested a competitor in a fair scientific fight. In many cases, no one knows or bothers to check how much common ground one theory shares with others that address the same topic. Problems like these, Fiedler and his colleagues contended last November in Perspectives on Psychological Science, afflict sets of related theories about such psychological phenomena as memory and decision making. In the end, that affects how well these phenomena are understood. © Society for Science & the Public 2000 - 2013
Keyword: Attention
Link ID: 18170 - Posted: 05.20.2013
By SUSANA MARTINEZ-CONDE YOUR eyes are the sharks of the human body: they never stop moving. In the past minute alone, your eyes made as many as 240 quick movements called “saccades” (French for “jolts”). In your waking hours today, you will very likely make some 200,000 of them, give or take a few thousand. When you sleep, your eyes keep moving — though in different ways and at varying speeds, depending on the stage of sleep. A portion of our eye movements we do consciously and are at least aware of on some level: when we follow a moving bird or plane across the sky with our gaze, for instance. But most of these tiny back-and-forths and ups-and-downs — split-second moves that would make the Flying Karamazov Brothers weep with jealousy — are unconscious and nearly imperceptible to us. Our brain suppresses the feeling of our eye jumps, to avoid the sensation that the world is constantly quaking. Even when we think our gazes are petrified, in fact, we are still making eye motions, including tiny saccades called “microsaccades” — between 60 and 120 of them per minute. Just as we don’t notice most of our breathing, we are almost wholly unaware of this frenetic, nonstop ocular activity. Without it, though, we couldn’t see a thing. Humans are hardly unique in this way. Every known visual system depends on movement: we see things either because they move or because our eyes do. Some of the earliest clues to this came more than two centuries ago. Erasmus Darwin, a grandfather of Charles Darwin, observed in 1794 that staring at a small piece of scarlet silk on white paper for a long time — thereby minimizing (though not stopping) his eye movements — made it grow fainter in color, until it seemed to vanish. © 2013 The New York Times Company
Keyword: Vision
Link ID: 18169 - Posted: 05.20.2013
by Emily Underwood If you are one of the 20% of healthy adults who struggle with basic arithmetic, simple tasks like splitting the dinner bill can be excruciating. Now, a new study suggests that a gentle, painless electrical current applied to the brain can boost math performance for up to 6 months. Researchers don't fully understand how it works, however, and there could be side effects. The idea of using electrical current to alter brain activity is nothing new—electroshock therapy, which induces seizures for therapeutic effect, is probably the best known and most dramatic example. In recent years, however, a slew of studies has shown that much milder electrical stimulation applied to targeted regions of the brain can dramatically accelerate learning in a wide range of tasks, from marksmanship to speech rehabilitation after stroke. In 2010, cognitive neuroscientist Roi Cohen Kadosh of the University of Oxford in the United Kingdom showed that, when combined with training, electrical brain stimulation can make people better at very basic numerical tasks, such as judging which of two quantities is larger. However, it wasn't clear how those basic numerical skills would translate to real-world math ability. © 2010 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 18168 - Posted: 05.18.2013
by Douglas Heaven Got a memory like a fish? The first study to visualise live memory retrieval in the whole brain has not only debunked the "three-second memory" myth, but also sheds light on the brain processes involved in forming long-term memories. Even the haziest recollections have a physical basis in the brain, but the mechanisms behind the formation and retrieval of memories are not well understood. By working with zebrafish, which are small and partially transparent, Hitoshi Okamoto at the RIKEN Brain Science Institute in Wako, Japan, and colleagues were able to study the whole brain at once. This allowed them to observe the roles played by different brain regions as a memory was retrieved. The team used fish with a genetically engineered fluorescent protein in the brain that glows less brightly when calcium levels increase – which occurs when neurons fire. They were able to study the activity of these proteins under a microscope. First, the team trained a group of fish to respond to a visual cue to avoid a small electric shock. Each fish was placed in a tank containing two compartments. When a red light shone in one compartment the fish had to swim to the other to avoid the shock. The researchers then selected the fish that had learned to perform the avoidance task successfully at least 80 per cent of the time and looked at the activity in their brains while a red light was switched on and off. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 18167 - Posted: 05.18.2013
By CARL ZIMMER Imagine a wolf catching a Frisbee a dozen times in a row, or leading police officers to a stash of cocaine, or just sleeping peacefully next to you on your couch. It’s a stretch, to say the least. Dogs may have evolved from wolves, but the minds of the two canines are profoundly different. Dog brains, as I wrote last month in The New York Times, have become exquisitely tuned to our own. Scientists are now zeroing in on some of the genes that were crucial to the rewiring of dog brains. Their results are fascinating, and not only because they can help us understand how dogs turned into man’s best friend. They may also teach us something about the evolution of our own brains: Some of the genes that evolved in dogs are the same ones that evolved in us. To trace the change in dog brains, scientists have first had to work out how dog breeds are related to one another, and how they’re all related to wolves. Ya-Ping Zhang, a geneticist at the Chinese Academy of Sciences, has led an international network of scientists who have compared pieces of DNA from different canines. They’ve come to the conclusion that wolves started their transformation into dogs in East Asia. Those early dogs then spread to other parts of the world. Many of the breeds we’re most familiar with, like German shepherds and golden retrievers, emerged only in the past few centuries. © 2013 The New York Times Company
Keyword: Aggression; Genes & Behavior
Link ID: 18166 - Posted: 05.18.2013
by Sara Reardon As suicide rates climb steeply in the US, a growing number of psychiatrists are arguing that suicidal behaviour should be considered as a disease in its own right, rather than as a behaviour resulting from a mood disorder. They base their argument on mounting evidence showing that the brains of people who have committed suicide have striking similarities, quite distinct from what is seen in the brains of people who have similar mood disorders but who died of natural causes. Suicide also tends to be more common in some families, suggesting there may be genetic and other biological factors in play. What's more, most people with mood disorders never attempt to kill themselves, and about 10 per cent of suicides have no history of mental disease. The idea of classifying suicidal tendencies as a disease is being taken seriously. The team behind the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) – the newest version of psychiatry's "bible", released at the American Psychiatric Association's meeting in San Francisco this week – considered a proposal to have "suicide behaviour disorder" listed as a distinct diagnosis. It was ultimately put on probation: placed on a list of topics deemed to require further research for possible inclusion in future DSM revisions. Another argument for linking suicidal people together under a single diagnosis is that it could spur research into the neurological and genetic factors they have in common. This could allow psychiatrists to better predict someone's suicide risk, and even lead to treatments that stop suicidal feelings. © Copyright Reed Business Information Ltd.
Keyword: Depression; Aggression
Link ID: 18165 - Posted: 05.18.2013
By Tina Hesman Saey COLD SPRING HARBOR, N.Y. – Taming foxes changes not only the animals’ behavior but also their brain chemistry, a new study shows. The finding could shed light on how the foxes’ genetic cousins, wolves, morphed into man’s best friend. Lenore Pipes of Cornell University presented the results May 10 at the Biology of Genomes conference. The foxes she worked with come from a long line started in 1959 when a Russian scientist named Dmitry Belyaev attempted to recreate dog domestication, but using foxes instead of wolves. He bred silver foxes (Vulpes vulpes), which are actually a type of red fox with white-tipped black fur. Belyaev and his colleagues selected the least aggressive animals they could find at local fox farms and bred them. Each generation, the scientists picked the tamest animals to mate, creating ever friendlier foxes. Now, more than 50 years later, the foxes act like dogs, wagging their tails, jumping with excitement and leaping into the arms of caregivers for caresses. At the same time, the scientists also bred the most aggressive foxes on the farms. The descendants of those foxes crouch, flatten their ears, growl, bare their teeth and lunge at people who approach their cages. The foxes’ tame and aggressive behaviors are rooted in genetics, but scientists have not found DNA changes that account for the differences. Rather than search for changes in genes themselves, Pipes and her colleagues took an indirect approach, looking for differences in the activity of genes in the foxes’ brains. © Society for Science & the Public 2000 - 2013
Keyword: Genes & Behavior
Link ID: 18164 - Posted: 05.16.2013
by Michael Balter From the human perspective, few events in evolution were more momentous than the split among primates that led to apes (large, tailless primates such as today's gorillas, chimpanzees, and humans) and Old World monkeys (which today include baboons and macaques). DNA studies of living primates have estimated that the rift took place between 25 million and 30 million years ago, but the earliest known fossils of both groups date no earlier than 20 million years ago. Now, a team working in Tanzania has found teeth and partial jaws from what it thinks are 25-million-year-old ancestors of both groups. If the interpretations hold up, the finds would reconcile the molecular and fossil evidence and possibly provide insights into what led to the split in the first place. Researchers have long been frustrated by a paucity of fossils from this key period in evolution, which sits at the borderline between two major geological epochs: the Miocene (about 23 million to 5 million years ago) and the Oligocene (about 34 million to 23 million years ago). The earliest known fossils of early apes and Old World monkeys date from the early Miocene and have been found in just a handful of sites in Kenya, Uganda, and North Africa. Meanwhile, molecular studies of existing primates consistently suggest that these two groups arose during the Oligocene, leading scientists to wonder whether the molecular dates are wrong or if paleontologists have been looking in the wrong places. For more than a decade, researchers from the United States and Tanzania have been combing Tanzania's Rukwa Rift Basin, searching for fossils of all kinds. During the 2011 and 2012 seasons, a team led by Nancy Stevens, a vertebrate paleontologist at Ohio University in Athens, discovered fossils that it identified as belonging to two previously unknown species of primates: one, an apparent ape ancestor the team has named Rukwapithecus fleaglei; the other, a claimed Old World monkey ancestor dubbed Nsungwepithecus gunnelli. © 2010 American Association for the Advancement of Science.
Keyword: Evolution
Link ID: 18163 - Posted: 05.16.2013
By David Brown, A team of researchers said Wednesday that it had produced embryonic stem cells — a possible source of disease-fighting spare parts — from a cloned human embryo. Scientists at the Oregon Health and Science University accomplished in humans what has been done over the past 15 years in sheep, mice, cattle and several other species. The achievement is likely to, at least temporarily, reawaken worries about “reproductive cloning” — the production of one-parent duplicate humans. But few experts think that production of stem cells through cloning is likely to be medically useful soon, or possibly ever. “An outstanding issue of whether it would work in humans has been resolved,” said Rudolf Jaenisch, a biologist at MIT’s Whitehead Institute in Cambridge, Mass., who added that he thinks the feat “has no clinical relevance.” “I think part of the significance is technical and part of the significance is historical,” said John Gearhart, head of the Institute for Regenerative Medicine at the University of Pennsylvania. “Many labs attempted it, and no one had ever been able to achieve it.” A far less controversial way to get stem cells is now available. It involves reprogramming mature cells (often ones taken from the skin) so that they return to what amounts to a second childhood from which they can grow into a new and different adulthood. Learning how to make and manipulate those “induced pluripotent stem” (IPS) cells is one of biology’s hottest fields. © 1996-2013 The Washington Post
Keyword: Stem Cells; Parkinsons
Link ID: 18162 - Posted: 05.16.2013
By Jason G. Goldman There is a rich tradition in psychology and neuroscience of using animals as models for understanding humans. Humans, after all, are enormously complicated creatures to begin with, even from a strictly biological perspective. Tacking on the messiness that comes with culture makes the study of the human mind tricky, at best. So, just as biomedical scientists have relied upon the humble mouse, psychological and cognitive scientists, too, have turned to our evolutionary cousins in the animal kingdom as a means of better understanding ourselves. In her new book Animal Wise, journalist Virginia Morell recounts a conversation with one researcher who pointed out that decades of research were built upon “rats, pigeons, and college sophomores, preferably male.” The college undergrads stood in for all of humanity, the rats served as representatives of all other mammals, and pigeons served as a model for the rest of the animal kingdom. The silly part isn’t that non-human animals can be used effectively as a means of understanding more about our own species. The idea is simple: understand how a simple system works, and you can make careful inferences about the way that complex systems work. That is (or should be) obvious. In his interview with CNN today, memory research pioneer and Nobel Prize winner Eric Kandel said as much: “Rather than studying the most complex form of memory in a very complicated animal, we had to take the most simple form — an implicit form of memory — in a very simple animal.” © 2013 Scientific American
Keyword: Learning & Memory; Evolution
Link ID: 18161 - Posted: 05.16.2013
by Meera Senthilingam Malaria parasites give mosquitoes a keener sense of smell, it seems. A small-scale study in the lab finds that mosquitoes infected by the parasite are three times as likely as uninfected mosquitoes to respond to human odours. If the same results are seen in malaria-carrying mosquitoes in the wild, it could lead to new ways to combat the disease. Female Anopheles mosquitoes are attracted to the chemicals in human odours, which help them find the source of blood they need to grow their eggs. When these mosquitoes carry Plasmodium falciparum – the most lethal form of malaria parasite – the likelihood that they will target humans rises. "We knew already that mosquitoes bite more often when they're infected. They probe the skin more frequently," says James Logan from the London School of Hygiene and Tropical Medicine. To quantify the effect – and try to work out its cause – Logan and his colleagues infected some lab-grown Anopheles gambiae mosquitoes with Plasmodium parasites, while leaving others uninfected. They then tested how both groups were attracted to human smells. Mosquitoes are particularly attracted to foot odours, so Logan's team used nylon stockings containing the volatile chemicals produced by our feet. Over a period of three minutes, Plasmodium-infected mosquitoes landed and attempted to bite the stockings around 15 times on average. By contrast, the uninfected mosquitoes attempted to bite only around five times on average during that time. © Copyright Reed Business Information Ltd.
Keyword: Chemical Senses (Smell & Taste)
Link ID: 18160 - Posted: 05.16.2013
Ed Yong The US adolescents who signed up for the Study of Mathematically Precocious Youth (SMPY) in the 1970s were the smartest of the smart, with mathematical and verbal-reasoning skills within the top 1% of the population. Now, researchers at BGI (formerly the Beijing Genomics Institute) in Shenzhen, China, the largest gene-sequencing facility in the world, are searching for the quirks of DNA that may contribute to such gifts. Plunging into an area that is littered with failures and riven with controversy, the researchers are scouring the genomes of 1,600 of these high-fliers in an ambitious project to find the first common genetic variants associated with human intelligence. The project, which was launched in August 2012 and is slated to begin data analysis in the next few months, has spawned wild accusations of eugenics plots, as well as more measured objections by social scientists who view such research as a distraction from pressing societal issues. Some geneticists, however, take issue with the study for a different reason. They say that it is highly unlikely to find anything of interest — because the sample size is too small and intelligence is too complex. Earlier large studies with the same goal have failed. But scientists from BGI’s Cognitive Genomics group hope that their super-smart sample will give them an edge, because it should be enriched with bits of DNA that confer effects on intelligence. “An exceptional person gets you an order of magnitude more statistical power than if you took random people from the population — I’d say we have a fighting chance,” says Stephen Hsu, a theoretical physicist from Michigan State University in East Lansing, who acts as a scientific adviser to BGI and is one of the project’s leaders. © 2013 Nature Publishing Group,
Keyword: Genes & Behavior; Intelligence
Link ID: 18159 - Posted: 05.15.2013
20:00 13 May 2013 by Douglas Heaven Genes in cells throughout the body switch on and off throughout the day in a coordinated way. Or at least they should. In people with clinical depression, genes in their brain tissues appear to be significantly out of sync – a finding that could lead to new treatments for the condition. We know from previous studies that genes in cells elsewhere, such as the skin, follow a 24-hour cycle of activity. But identifying patterns of genetic activity in a living brain isn't easy to do. "We always assumed we would have a clock [in our brain]," says Huda Akil at the University of Michigan in Ann Arbor. "But it had never been shown before." Akil and her colleagues examined the brains of 55 people with a known time of death, looking at around 12,000 genes in tissues from six brain regions. By matching the time of death with molecular signs of genetic activity – whether each gene was actively expressing itself or not – the team identified hundreds of genes that follow a daily cycle. Akil says it was important to look at the brains of individuals who had died suddenly – through a heart attack or car accident, for example. Slower deaths can cause dramatic changes in the brain that would have obscured what they were looking for, but sudden death freezes the genetic activity. "We can capture an instant," she says. © Copyright Reed Business Information Ltd.
Keyword: Depression; Sleep
Link ID: 18158 - Posted: 05.14.2013
By Samyukta Mullangi A recent article in NYTimes [1] declared that the rising rate of suicides among our baby boomer generation now made suicides, by raw numbers alone, a bigger killer than motor vehicle accidents! Researchers quoted within the article pointed to complex reasons like the economic downturn over the past decade, the widespread availability of opioid drugs like oxycodone, and changes in marriage, social isolation and family roles. Then I scrolled down, as I always do, to peruse some of the readers’ comments, and that’s when I paused. I suppose in hindsight that I had expected readers to exclaim at the shocking statistics (suicide rates now stand at 27.3 per 100,000 for middle-aged men, 8.1 per 100,000 for women), or lament over personal stories of relatives or friends who took their own lives. While I certainly saw a few such comments, I was amazed to discover the number of readers who were sympathetic to the idea of suicide. “Molly” wrote “Why is suicide usually looked upon as a desperate and forbidden act? Can’t we accept that in addition to poverty, loneliness, alienation, ill health, life in world [sic] that is sometimes personally pointless means that death is a relief? I believe the right to die, in a time and place (and wishfully peacefully without violence) is a basic human right.” This post was ‘recommended’ by 351 other readers at the time this essay was written. © 2013 Scientific American
Keyword: Depression; Sleep
Link ID: 18157 - Posted: 05.14.2013
Brian Owens The gut is home to innumerable different bacteria — a complex ecosystem that has an active role in a variety of bodily functions. In a study published this week in Proceedings of the National Academy of Sciences1, a team of researchers finds that in mice, just one of those bacterial species plays a major part in controlling obesity and metabolic disorders such as type 2 diabetes. The bacterium, Akkermansia muciniphila, digests mucus and makes up 3–5% of the microbes in a healthy mammalian gut. But the intestines of obese humans and mice, and those with type 2 diabetes, have much lower levels. A team led by Patrice Cani, who studies the interaction between gut bacteria and metabolism at the Catholic University of Louvain in Belgium, decided to investigate the link. Mice that were fed a high-fat diet, the researchers found, had 100 times less A. muciniphila in their guts than mice fed normal diets. The researchers were able to restore normal levels of the bacterium by feeding the mice live A. muciniphila, as well as 'prebiotic' foods that encourage the growth of gut microbes. The effects of this treatment were dramatic. Compared with untreated animals, the mice lost weight and had a better ratio of fat to body mass, as well as reduced insulin resistance and a thicker layer of intestinal mucus. They also showed improvements in a host of other indicators related to obesity and metabolic disorders. “We found one specific common factor between all the different parameters that we have been investigating over the past ten years,” says Cani. © 2013 Nature Publishing Group
Keyword: Obesity
Link ID: 18156 - Posted: 05.14.2013
Linda Carroll TODAY contributor We all get lost or disoriented once in a while, but for Sharon Roseman, being lost is a way of life. A little quirk in her brain makes it impossible to recognize landmarks and find her way around neighborhoods that should have become familiar long ago. “I can literally see my house out the car window, but I have no clue that it’s my house,” Roseman told NBC’s Kristen Dahlgren. Roseman, 64, suffers from developmental topographical disorientation, or DTD, a disorder that had flown under brain researchers’ radar until very recently. DTD was first described as a single case study in a paper published online in 2008 in the journal Neuropsychologia. At the time, it was thought to be extremely rare, says the study’s lead author, Giuseppe Iaria, professor of cognitive neuroscience at the University of Calgary. But since then, Iaria has discovered nearly 1,000 other people with DTD and he thinks there may be a lot more. He currently estimates that about 2 percent of the population may be constantly coping with orientation and navigation problems caused by the disorder. DTD is a profound and disabling deficit. Nothing, not even the layout of a house you’ve lived in for decades, ever becomes familiar. And for Roseman that has made life very trying. When her kids would cry in the night, she would struggle to find her way to them.
Keyword: Attention
Link ID: 18155 - Posted: 05.14.2013
Pregnant mothers’ exposure to the flu was associated with a nearly fourfold increased risk that their child would develop bipolar disorder in adulthood, in a study funded by the National Institutes of Health. The findings add to mounting evidence of possible shared underlying causes and illness processes with schizophrenia, which some studies have also linked to prenatal exposure to influenza. “Prospective mothers should take common sense preventive measures, such as getting flu shots prior to and in the early stages of pregnancy and avoiding contact with people who are symptomatic,” said Alan Brown, M.D., M.P.H., of Columbia University and New York State Psychiatric Institute, a grantee of the NIH’s National Institute of Mental Health (NIMH). “In spite of public health recommendations, only a relatively small fraction of such women get immunized. The weight of evidence now suggests that benefits of the vaccine likely outweigh any possible risk to the mother or newborn.” Brown and colleagues reported their findings online May 8, 2013 in JAMA Psychiatry. Although there have been hints of a maternal influenza/bipolar disorder connection, the new study is the first to prospectively follow families in the same HMO, using physician-based diagnoses and structured standardized psychiatric measures. Access to unique Kaiser-Permanente, county and Child Health and Development Study databases made it possible to include more cases with detailed maternal flu exposure information than in previous studies.
Keyword: Schizophrenia; Development of the Brain
Link ID: 18154 - Posted: 05.14.2013
By ABIGAIL ZUGER, M.D. I hadn’t seen Larry in a dozen years when he reappeared in my office a few months ago, grinning. We were both grinning. I always liked Larry, even though he was a bit of a hustler, a little erratic in his appointments, a persistent dabbler in a variety of illegal substances. But he was always careful to avoid the hard stuff; he said he had a bad problem as a teenager and was going to stay out of trouble. It was to stay out of trouble that he left town all those years ago, and now he was back, grayer and thinner but still smiling. Then he pulled out a list of the medications he needed, and we both stopped smiling. According to Larry’s list, he was now taking giant quantities of one of the most addictive painkillers around, an immensely popular black-market drug most doctors automatically avoid prescribing except under the most exceptional circumstances. “I got a bad back now, Doc,” Larry said. Doctors hate pain. Let me count the ways. We hate it because we are (mostly) kindhearted and hate to see people suffer. We hate it because it is invisible, cannot be measured or monitored, and varies wildly and unpredictably from person to person. We hate it because it can drag us closer to the perilous zones of illegal practice than any other complaint. And we hate it most of all because unless we specifically seek out training in how to manage pain, we get virtually none at all, and wind up flying over all kinds of scary territory absolutely solo, without a map or a net. Copyright 2013 The New York Times Company
Keyword: Pain & Touch; Drug Abuse
Link ID: 18153 - Posted: 05.14.2013
By Scicurious Aging happens. As you get older, your body slows down, and eventually your brain slows down, too. Some things go gradually, and some go suddenly. To many people, this might seem like a pretty random process. We used to think of aging this way, as just…well, cells get old, which means we get old, too. DNA replication after a while starts making errors in repair, the errors build up, and on the whole body scale the whole thing just kind of goes downhill. It seems random. But in fact, it’s not. There are specific proteins which can help control this process. And one of these, NF-kB, in one particular brain region, may have a very important role indeed. NF-kB (which stands for nuclear factor kappa-light-chain-enhancer of activated B cells, which is why we use NF-kB) is a protein complex that has a lot of roles to play. It’s an important starting player in the immune system, where it helps to stimulate antibodies. It’s important in memory and stress responses. NF-kB is something called a transcription factor, which helps to control what DNA is transcribed to RNA, and therefore what proteins will eventually be produced. Transcription factors, as you can see, can have a very large number of functions. But in the hypothalamus, NF-kB may have the added function of helping to control aging. The hypothalamus is an area of many small nuclei (further sub-areas of neurons) located at the base of the brain. It’s been coming more and more into vogue lately among neuroscientists. In the past, we were interested in the hypothalamus mostly for its role in controlling hormone release from the dangling pituitary gland below it, but now we are learning that the hypothalamus can play roles in fear, mood, food intake, reproduction, and now…aging. © 2013 Scientific American
Keyword: Development of the Brain; Hormones & Behavior
Link ID: 18152 - Posted: 05.14.2013