Most Recent Links



Links 6241 - 6260 of 29484

Carina Storrs In the late 1960s, a team of researchers began doling out a nutritional supplement to families with young children in rural Guatemala. They were testing the assumption that providing enough protein in the first few years of life would reduce the incidence of stunted growth. It did. Children who got supplements grew 1 to 2 centimetres taller than those in a control group. But the benefits didn't stop there. The children who received added nutrition went on to score higher on reading and knowledge tests as adolescents, and when researchers returned in the early 2000s, women who had received the supplements in the first three years of life completed more years of schooling and men had higher incomes [1]. “Had there not been these follow-ups, this study probably would have been largely forgotten,” says Reynaldo Martorell, a specialist in maternal and child nutrition at Emory University in Atlanta, Georgia, who led the follow-up studies. Instead, he says, the findings made financial institutions such as the World Bank think of early nutritional interventions as long-term investments in human health. Since the Guatemalan research, studies around the world — in Brazil, Peru, Jamaica, the Philippines, Kenya and Zimbabwe — have all associated poor or stunted growth in young children with lower cognitive test scores and worse school achievement [2]. A picture slowly emerged that being too short early in life is a sign of adverse conditions — such as poor diet and regular bouts of diarrhoeal disease — and a predictor for intellectual deficits and mortality. But not all stunted growth, which affects an estimated 160 million children worldwide, is connected with these bad outcomes. Now, researchers are trying to untangle the links between growth and neurological development. Is bad nutrition alone the culprit? What about emotional neglect, infectious disease or other challenges? © 2017 Macmillan Publishers Limited

Keyword: Development of the Brain
Link ID: 23831 - Posted: 07.13.2017

By Jane C. Hu In English the sky is blue, and the grass is green. But in Vietnamese there is just one color category for both sky and grass: xanh. For decades cognitive scientists have pointed to such examples as evidence that language largely determines how we see color. But new research with four- to six-month-old infants indicates that long before we learn language, we see up to five basic categories of hue—a finding that suggests a stronger biological component to color perception than previously thought. The study, published recently in the Proceedings of the National Academy of Sciences USA, tested the color-discrimination abilities of more than 170 British infants. Researchers at the University of Sussex in England measured how long babies spent gazing at color swatches, a metric known as looking time. First the team showed infants one swatch repeatedly until their looking time decreased—a sign they had grown bored with it. Then the researchers showed them a different swatch and noted their reaction. Longer looking times were interpreted to mean the babies considered the second swatch to be a new hue. Their cumulative responses showed that they distinguished among five colors: red, green, blue, purple and yellow. The finding “suggests we’re all working from the same template,” explains lead author Alice Skelton, a doctoral student at Sussex. “You come prepackaged to make [color] distinctions, but given your culture and language, certain distinctions may or may not be used.” For instance, infants learning Vietnamese most likely see green and blue, even if their native language does not use distinct words for the two colors. © 2017 Scientific American

Keyword: Vision; Development of the Brain
Link ID: 23830 - Posted: 07.13.2017

By Alice Klein For decades, new parents have been warned against sharing a bed with their babies. While snuggling up with your newborn may seem like the most natural thing in the world, prevailing medical advice says this increases the risk of sudden infant death syndrome (SIDS), sometimes called cot death. Instead, doctors say your little ones should sleep in a separate crib in your bedroom. On the other side of the argument are anthropologists and proponents of “attachment parenting”, who believe that infant-parent separation is unnatural and at odds with our evolutionary history. They favour not just room-sharing but bed-sharing – putting them in direct conflict with paediatric advice. This debate was recently reignited by a study suggesting that room-sharing for up to nine months reduces a baby’s sleep, which in theory could have future health consequences. So what’s a sleep-deprived parent to do? Our ancestors slept in direct contact with their young in order to protect them, just as other primates do today, says Helen Ball at Durham University, UK. “Babies respond to close contact – their breathing, blood oxygen and heart rate are on a more even keel.” In Asia and Africa, most babies still share their parents’ beds. But in the West, bed-sharing fell during the industrial revolution as increased wealth let people afford separate rooms and value was placed on teaching early independence. © Copyright New Scientist Ltd.

Keyword: Sleep
Link ID: 23829 - Posted: 07.13.2017

By Giorgia Guglielmi Semen has something in common with the brains of Alzheimer’s sufferers: Both contain bundles of protein filaments called amyloid fibrils. But although amyloid accumulation appears to damage brain cells, these fibrils may be critical for reproduction. A new study suggests that semen fibrils immobilize subpar sperm, ensuring that only the fittest ones make it to the egg. “I’m sure that from the very first time scientists described semen fibrils, they must have been speculating what their natural function was,” says Daniel Otzen, an expert in protein aggregates at Aarhus University in Denmark, who did not participate in the research. “This seems to be the smoking gun.” Researchers discovered semen fibrils in 2007. At first, they seemed like mostly bad news. Scientists showed that the fibrils, found in the seminal fluid together with sperm cells and other components, can bind to HIV, helping it get inside cells. But the fibrils are found in most primates, notes Nadia Roan, a mucosal biologist at the University of California, San Francisco. “If fibrils didn’t serve some beneficial purpose, they would have been eliminated over evolutionary time.” Because the way HIV fuses to cells is reminiscent of the way a sperm fuses to the egg, she wondered whether the fibrils facilitated fertilization. © 2017 American Association for the Advancement of Science.

Keyword: Alzheimers; Sexual Behavior
Link ID: 23828 - Posted: 07.12.2017

By Linda Geddes Many dangers stalk the bushlands of Tanzania while members of the Hadza people sleep, yet no one keeps watch. There is no need because it seems that natural variation in sleep means there’s rarely a moment when someone isn’t alert enough to raise the alarm. That’s the conclusion of a study that sheds new light on why teenagers sleep late while grandparents are often up at the crack of dawn. Fifty years ago, psychologist Frederick Snyder proposed that animals who live in groups stay vigilant during sleep, by having some stay awake while others rest. However, no one had tested this sentinel hypothesis in humans until now. One way of maintaining this constant vigilance might be by the evolution of different chronotypes – individual differences in when we tend to sleep. This changes as we age, with teenagers shifting towards later bedtimes, and older people towards earlier bedtimes. Would such variability be enough to keep a community safe at night? To investigate, David Samson, then at the University of Toronto in Canada, and his colleagues turned to the Hadza, a group of hunter-gatherers in northern Tanzania. The Hadza sleep in grass huts, each containing one or two adults and often several children. They live in camps of around 30 adults, although several other camps may be close by. Samson recruited 33 adults from two nearby groups of 22 huts and asked them to wear motion sensors on their wrists to monitor sleep for 20 days. “It turned out that it was extremely rare for there to be synchronous sleep,” says Samson, now at Duke University in Durham, North Carolina. © Copyright New Scientist Ltd.

Keyword: Sleep; Development of the Brain
Link ID: 23827 - Posted: 07.12.2017

By Jessica Wright, Spectrum on July 11, 2017 Treatment with the hormone oxytocin improves social skills in some children with autism, suggest results from a small clinical trial. The results appeared today in the Proceedings of the National Academy of Sciences [1]. Oxytocin, dubbed the ‘love hormone,’ enhances social behavior in animals. This effect makes it attractive as a potential autism treatment. But studies in people have been inconsistent: Some small trials have shown that the hormone improves social skills in people with autism, and others have shown no benefit. This may be because only a subset of people with autism respond to the treatment. In the new study, researchers tried to identify this subset. The same team showed in 2014 that children with relatively high blood levels of oxytocin have better social skills than do those with low levels [2]. In their new work, the researchers examined whether oxytocin levels in children with autism alter the children’s response to treatment with the hormone. They found that low levels of the hormone prior to treatment are associated with the most improvement in social skills. “We need to be thinking about a precision-medicine approach for autism,” says Karen Parker, associate professor of psychiatry at Stanford University in California, who co-led the study. “There’s been a reasonable number of failed [oxytocin] trials, and the question is: Could they have failed because all of the kids, by blind, dumb luck, had really high baseline oxytocin levels?” The study marks the first successful attempt to find a biological marker that predicts response to the therapy. © 2017 Scientific American

Keyword: Autism; Hormones & Behavior
Link ID: 23826 - Posted: 07.12.2017

By Ryan Cross Whether caused by a car accident that slams your head into the dashboard or repeated blows to your cranium from high-contact sports, traumatic brain injury can be permanent. There are no drugs to reverse the cognitive decline and memory loss, and any surgical interventions must be carried out within hours to be effective, according to the current medical wisdom. But a compound previously used to enhance memory in mice may offer hope: Rodents who took it up to a month after a concussion had memory capabilities similar to those that had never been injured. The study “offers a glimmer of hope for our traumatic brain injury patients,” says Cesario Borlongan, a neuroscientist who studies brain aging and repair at the University of South Florida in Tampa. Borlongan, who reviewed the new paper, notes that its findings are especially important in the clinic, where most rehabilitation focuses on improving motor—not cognitive—function. Traumatic brain injuries, which cause cell death and inflammation in the brain, affect 2 million Americans each year. But the condition is difficult to study, in part because every fall, concussion, or blow to the head is different. Some result in bleeding and swelling, which must be treated immediately by drilling into the skull to relieve pressure. But under the microscope, even less severe cases appear to trigger an “integrated stress response,” which throws protein synthesis in neurons out of whack and may make long-term memory formation difficult. © 2017 American Association for the Advancement of Science.

Keyword: Learning & Memory; Brain Injury/Concussion
Link ID: 23825 - Posted: 07.11.2017

Tina Hesman Saey How well, not how much, people sleep may affect Alzheimer’s disease risk. Healthy adults built up Alzheimer’s-associated proteins in their cerebrospinal fluid when prevented from getting slow-wave sleep, the deepest stage of sleep, researchers report July 10 in Brain. Just one night of deep-sleep disruption was enough to increase the amount of amyloid-beta, a protein that clumps into brain cell‒killing plaques in people with Alzheimer’s. People in the study who slept poorly for a week also had more of a protein called tau in their spinal fluid than they did when well rested. Tau snarls itself into tangles inside brain cells of people with the disease. These findings support a growing body of evidence that lack of Zs is linked to Alzheimer’s and other neurodegenerative diseases. Specifically, “this suggests that there’s something special about deep, slow-wave sleep,” says Kristine Yaffe, a neurologist and psychiatrist at the University of California, San Francisco, who was not involved in the study. People with Alzheimer’s are notoriously poor sleepers, but scientists aren’t sure if that is a cause or a consequence of the disease. Evidence from recent animal and human studies suggests the problem goes both ways, Yaffe says. Lack of sleep may make people more prone to brain disorders. And once a person has the disease, disruptions in the brain may make it hard to sleep. Still, it wasn’t clear why not getting enough shut-eye promotes Alzheimer’s disease.

Keyword: Sleep; Alzheimers
Link ID: 23824 - Posted: 07.11.2017

Nicola Davis People who drink coffee have a lower risk of dying from a host of causes, including heart disease, stroke and liver disease, research suggests – but experts say it’s unclear whether the health boost is down to the brew itself. The connection, revealed in two large studies, was found to hold regardless of whether the coffee was caffeinated or not, with the effect higher among those who drank more cups of coffee a day. But scientists say that the link might just be down to coffee-drinkers having healthier behaviours. “It is plausible that there is something else behind this that is causing this relationship,” said Marc Gunter, a co-author of one of the studies, from the International Agency for Research on Cancer. But, he added, based on the consistency of the results he would be surprised if coffee itself didn’t play a role in reducing the risk of death. About 2.25bn cups of coffee are consumed worldwide every day. While previous studies have suggested coffee might have health benefits, the latest research involves large and diverse cohorts of participants. The first study looked at coffee consumption among more than 185,000 white and non-white participants, recruited in the early 1990s and followed up for an average of over 16 years. The results revealed that drinking one cup of coffee a day was linked to a 12% lower risk of death at any age, from any cause, while those drinking two or three cups a day had an 18% lower risk, with the association not linked to ethnicity. © 2017 Guardian News and Media Limited

Keyword: Drug Abuse; Stroke
Link ID: 23823 - Posted: 07.11.2017

Dean Burnett Antidepressants: the go-to treatment for depression or generalised anxiety. It’s incredible when you think about it, the fact that you can have a debilitating mood disorder, take a few pills, and feel better. It’s unbelievable that medical science has progressed so far that we now fully understand how the human brain produces moods and other emotions, so can manipulate them with designer drugs. That’s right, it is unbelievable. Because it isn’t the case. The fact that antidepressants are now so common is something of a mixed blessing. On one hand, anything that helps reduce stigma and lets those afflicted know they aren’t alone can only be helpful. Depression is incredibly common, so this awareness can literally save many lives. On the other hand, familiarity does not automatically mean understanding. Nearly everyone has a smartphone these days, but how many people, if pushed, could construct a touchscreen? Not many, I’d wager. And so it is with depression and antidepressants. For all the coverage and opinion pieces produced about them, the details around how they work remain somewhat murky and elusive. Actually, in the case of antidepressants, it’s more a question of why they work, rather than how. Most antidepressants, from the earliest tricyclics and monoamine oxidase inhibitors to the ubiquitous modern-day selective serotonin reuptake inhibitors (SSRIs), work by increasing the levels of specific neurotransmitters in the brain, usually by preventing them from being broken down and reabsorbed into the neurons, meaning they linger in the synapses longer, causing more activity, so “compensating” for the reduced overall levels. Antidepressants make the remaining neurotransmitters work twice as hard, so overall activity is more “normal”, so to speak. © 2017 Guardian News and Media Limited

Keyword: Depression
Link ID: 23822 - Posted: 07.11.2017

By Jennifer Ouellette Are brain-training games any better at improving your ability to think, remember and focus than regular computer games? Possibly not, if the latest study is anything to go by. Joseph Kable at the University of Pennsylvania and his colleagues have tested the popular Lumosity brain-training program from Lumos Labs in San Francisco, California, against other computer games and found no evidence that it is any better at improving your thinking skills. Brain-training is a booming market. It’s based on the premise that our brains change in response to learning challenges. Unlike computer games designed purely for entertainment, brain-training games are meant to be adaptive, adjusting challenge levels in response to a player’s changing performance. The thinking is that this should improve a player’s memory, attention, focus and multitasking skills. But there are questions over whether brain-training platforms can enhance cognitive function in a way that is meaningful for wider life. Last year, Lumos Labs paid $2 million to settle a charge from the US Federal Trade Commission for false advertising. Advertising campaigns had claimed that the company’s memory and attention games could reduce the effects of age-related dementia, and stave off Alzheimer’s disease. Most studies on the effects of brain-training games have been small and had mixed results. For this study, Kable and his colleagues recruited 128 young healthy adults for a randomised controlled trial. © Copyright New Scientist Ltd.

Keyword: Learning & Memory; Alzheimers
Link ID: 23821 - Posted: 07.11.2017

Aimee Cunningham An expectant mom might want to think twice about quenching her thirst with soda. The more sugary beverages a mom drank during mid-pregnancy, the heavier her kids were in elementary school compared with kids whose mothers consumed less of the drinks, a new study finds. At age 8, boys and girls weighed approximately 0.25 kilograms more — about half a pound — with each serving mom added per day while pregnant, researchers report online July 10 in Pediatrics. “What happens in early development really has a long-term impact,” says Meghan Azad, an epidemiologist at the University of Manitoba in Canada, who was not involved in the study. A fetus’s metabolism develops in response to the surrounding environment, including the maternal diet, she says. The new findings come out of a larger project that studies the impact of pregnant moms’ diets on their kids’ health. “We know that what mothers eat during pregnancy may affect their children’s health and later obesity,” says biostatistician Sheryl Rifas-Shiman of Harvard Medical School and Harvard Pilgrim Health Care Institute in Boston. “We decided to look at sugar-sweetened beverages as one of these factors.” Sugary drinks are associated with excessive weight gain and obesity in studies of adults and children. Rifas-Shiman and colleagues included 1,078 mother-child pairs in the study. Moms filled out a questionnaire in the first and second trimesters of their pregnancy about what they were drinking — soda, fruit drinks, 100 percent fruit juice, diet soda or water — and how often. Soda and fruit drinks were considered sugar-sweetened beverages. A serving was defined as a can, glass or bottle of a beverage. © Society for Science & the Public 2000 - 2017

Keyword: Obesity; Development of the Brain
Link ID: 23820 - Posted: 07.11.2017

Ian Sample Science editor The secret to a good night’s sleep later in life is having a good reason to get up in the morning, according to US researchers who surveyed people on their sleeping habits and sense of purpose. People who felt they had a strong purpose in life suffered from less insomnia and fewer sleep disturbances than others and claimed to rest better at night as a result, the study found. Jason Ong, a neurologist who led the research at Northwestern University in Chicago, said that encouraging people to develop a sense of purpose could help them to keep insomnia at bay without the need for sleeping pills. More than 800 people aged 60 to 100 took part in the study and answered questions on their sleep quality and motivations in life. To assess their sense of purpose, the participants were asked to rate statements such as: “I feel good when I think of what I’ve done in the past and what I hope to do in the future.” According to Ong, people who felt their lives had most meaning were less likely to have sleep apnea, a disorder in which breathing becomes shallow or occasionally stops, or restless leg syndrome, a condition that compels people to move their legs and which is often worse at night. Those who reported the most purposeful lives had slightly better sleep quality overall, according to the study in the journal Sleep Science and Practice. © 2017 Guardian News and Media Limited

Keyword: Sleep; Attention
Link ID: 23819 - Posted: 07.11.2017

Deborah Orr Most people know about SSRIs, the antidepressant drugs that stop the brain from re-absorbing too much of the serotonin we produce, to regulate mood, anxiety and happiness. And a lot of people know about these drugs first hand, for the simple reason that they have used them. Last year, according to NHS Digital, no fewer than 64.7m antidepressant prescriptions were given in England alone. In a decade, the number of prescriptions has doubled. On Tuesday I joined the throng, and popped my first Citalopram. It was quite a thing – not least because, like an idiot, I dropped my pill about 90 minutes before curtain up for the Royal Shakespeare Company’s production of The Tempest at the Barbican. That’s right. This isn’t just mental illness: this is metropolitan-elite mental illness. It was a pretty overwhelming theatrical experience. The first indication that something was up came as I approached my local tube station. I noticed that I was in a state of extreme dissociation, walking along looking as though I was entirely present in the world yet feeling completely detached from it. I had drifted into total mental autopilot. Luckily, I was able to recognise my fugue. It’s a symptom of my condition, which, as I’ve written before, is complex post-traumatic stress disorder. The drug-induced dissociation was more intense than I’m used to when it’s happening naturally. I use the word advisedly. Much of what is thought of as illness is actually an extreme and sensible protective reaction to unbearable interventions from outside the self. © 2017 Guardian News and Media Limited

Keyword: Depression; Attention
Link ID: 23818 - Posted: 07.09.2017

By Clare Wilson A patient-led movement is helping people taking psychiatric medicines to hack their dosing regimens so they can wean themselves off the drugs without any side effects. Now a Dutch website that sells kits to help people do this is about to launch an English-language site, triggering safety concerns among UK regulators and doctors. Some people find it impossible to stop taking certain antidepressants and anti-anxiety medicines such as Valium because, unless the dose is reduced very gradually, they get severe mental and physical side effects. The problem is these medicines aren’t sold in small enough tablets to allow for tapering. This has prompted some people to flout mainstream medical advice and use DIY methods for reducing their doses, such as grinding up tablets and dissolving them in water, or breaking open capsules of tiny beads and counting them out. The UK mental health charity Mind advises people who want to stop taking antidepressants of some techniques to try, but recommends they get advice from their doctor or pharmacist first. To help people taper their dose more easily, a Dutch medical charity, called Cinderella Therapeutics, creates personalised “tapering kits”, with precisely weighed out tablets in labelled packets that gradually reduce over several months. The website recommends people do this under medical supervision, and they must first receive a doctor’s prescription. © Copyright New Scientist Ltd.

Keyword: Depression
Link ID: 23817 - Posted: 07.09.2017

Robin McKie Observer science editor Scientists at Cambridge University have co-opted an unusual ally in their battle to find treatments for an incurable degenerative ailment that affects thousands of people in the UK. They have taken charge of a flock of merino sheep that have been genetically modified to carry the gene for Huntington’s disease. The research, led by neuroscientist Professor Jenny Morton, aims to understand how to pinpoint early symptoms of the brain condition, which affects more than 6,700 people in the UK. The gene responsible for Huntington’s was isolated more than 30 years ago but scientists have yet to develop drugs that might halt or even slow its development in patients. The brain’s complexity has defied attempts to understand how the condition develops. “Until now, much of our effort has been based on research on mice or rats,” said Morton. “But sheep should make better research subjects. Not only do they live much longer than rodents, their brains are larger and closer in size and structure to humans.” Huntington’s disease, which affects men and women equally, is an inherited neurological condition whose symptoms manifest themselves in adulthood, usually between 35 and 55. Initially mood, personality, coordination and memory are affected but, as the disease progresses, speech, swallowing and motor function deteriorate until death occurs 10 to 25 years after symptoms first appear. There is no known cure for Huntington’s disease although there are treatments to manage symptoms. © 2017 Guardian News and Media Limited

Keyword: Huntingtons
Link ID: 23816 - Posted: 07.09.2017

By Abby Olena For more than 50 years, scientists have taken for granted that all snakes share a ZW sex determination system, in which males have two Z chromosomes and females have one Z and one W. But a study, published today (July 6) in Current Biology, reveals that the Central American boa (Boa imperator) and the Burmese python (Python bivittatus) use an XY sex determination system, which evolved independently in the two species. “This work is a culmination of a lot of questions that we’ve had about pythons and boas for a long time,” says Jenny Marshall Graves, a geneticist at La Trobe University in Melbourne, Australia, who did not participate in the study. Some of these questions came up for Warren Booth, a geneticist and ecologist at the University of Tulsa, as he studied parthenogenesis—the growth and development of offspring in the absence of fertilization. He noticed a pattern for organisms undergoing parthenogenesis: animal species that use a ZW system have only male (ZZ) offspring, and the organisms that use an XY system have only female (XX) offspring. Except this pattern doesn’t hold true for boas and pythons, which consistently produce female offspring by parthenogenesis. Booth contacted Tony Gamble, a geneticist at Marquette University in Milwaukee, Wisconsin, who studies sex chromosomes, to begin a collaboration to investigate whether boas and pythons might actually have X and Y chromosomes. Spurred by Booth’s questions, “I went back and reread some of the early papers” on snake sex chromosomes, says Gamble. “What became clear is that they didn’t show that boas and pythons had a ZW sex chromosome system. They just said it without any evidence.” © 1986-2017 The Scientist

Keyword: Sexual Behavior; Evolution
Link ID: 23815 - Posted: 07.09.2017

Hannah Devlin A Catholic priest, a rabbi and a Buddhist walk into a bar and order some magic mushrooms. It may sound like the first line of a bad joke, but this scenario is playing out in one of the first scientific investigations into the effects of psychedelic drugs on religious experience – albeit in a laboratory rather than a bar. Scientists at Johns Hopkins University in Baltimore have enlisted two dozen religious leaders from a wide range of denominations to participate in a study in which they will be given two powerful doses of psilocybin, the active ingredient in magic mushrooms. Dr William Richards, a psychologist at Johns Hopkins University in Baltimore, Maryland, who is involved in the work, said: “With psilocybin these profound mystical experiences are quite common. It seemed like a no-brainer that they might be of interest, if not valuable, to clergy.” The experiment, which is currently under way, aims to assess whether a transcendental experience makes the leaders more effective and confident in their work and how it alters their religious thinking. Despite most organised religions frowning on the use of illicit substances, Catholic, Orthodox and Presbyterian priests, a Zen Buddhist and several rabbis were recruited. The team has yet to persuade a Muslim imam or Hindu priest to take part, but “just about all the other bases are covered,” according to Richards. After preliminary screening, including medical and psychological tests, the participants have been given two powerful doses of psilocybin in two sessions, one month apart. © 2017 Guardian News and Media Limited

Keyword: Drug Abuse; Attention
Link ID: 23814 - Posted: 07.09.2017

By Michael Price Male baboons that harass and assault females are more likely to mate with them, according to a new study, adding evidence that sexual intimidation may be a common mating strategy among promiscuous mammals. The study’s authors even argue that the findings could shed light on the evolutionary origins of our own species’ behavior, although others aren’t convinced the results imply anything about people. “I think the data and analyses in this study are first-rate,” says Susan Alberts, a biologist who studies primate behavior at Duke University in Durham, North Carolina. “[But] I also think it’s a big stretch to infer something about the origins of human male aggression towards women.” To conduct the research, Elise Huchard, a zoologist at the National Center for Scientific Research in Montpellier, France, and colleagues examined a group of chacma baboons (Papio ursinus) living in Tsaobis Nature Park in Namibia over a 9-year period. These brownish, dog-sized primates live in troops of dozens of males and females. Females will mate with multiple males throughout the year. The male chacma are about twice the size of females and aggressively fight one another and engage in howling competitions to establish dominance. The more dominant a male is, the more likely he is both to succeed in finding a mate and to sire offspring. Males rarely force females to mate, but after years spent observing the animals in the wild, Huchard noticed that a subtler form of sexual coercion appeared to be going on. “Males often chase and attack some females of their own group when meeting another group, and they generally target sexually receptive females on such occasions,” she says. “I spent a great deal of time studying female mate choice, and my main impression … was that females don't have much room to express any preference.” © 2017 American Association for the Advancement of Science

Keyword: Sexual Behavior; Aggression
Link ID: 23813 - Posted: 07.07.2017

Ewen Callaway For 18 months in the early 1980s, John Sulston spent his days watching worms grow. Working in twin 4-hour shifts each day, Sulston would train a light microscope on a single Caenorhabditis elegans embryo and sketch what he saw at 5-minute intervals, as a fertilized egg morphed into two cells, then four, eight and so on. He worked alone and in silence in a tiny room at the Medical Research Council Laboratory of Molecular Biology in Cambridge, UK, solving a Rubik's cube between turns at the microscope. “I did find myself little distractions,” the retired Nobel prize-winning biologist once recalled. His hundreds of drawings revealed the rigid choreography of early worm development, encompassing the births of precisely 671 cells, and the deaths of 111 (or 113, depending on the worm's sex). Every cell could be traced to its immediate forebear and then to the one before that in a series of invariant steps. From these maps and others, Sulston and his collaborators were able to draw up the first, and so far the only, complete ‘cell-lineage tree’ of a multicellular organism [1]. Although the desire to record an organism’s development in such exquisite detail preceded Sulston by at least a century, the ability to do so in more-complex animals has been limited. No one could ever track the fates of billions of cells in a mouse or a human with just a microscope and a Rubik’s cube to pass the time. But there are other ways. Revolutions in biologists’ ability to edit genomes and sequence them at the level of a single cell have sparked a renaissance in cell-lineage tracing. © 2017 Macmillan Publishers Limited

Keyword: Development of the Brain
Link ID: 23812 - Posted: 07.07.2017