Most Recent Links
Sandeep Ravindran In 2012, computer scientist Dharmendra Modha used a powerful supercomputer to simulate the activity of more than 500 billion neurons—more, even, than the 85 billion or so neurons in the human brain. It was the culmination of almost a decade of work, as Modha progressed from simulating the brains of rodents and cats to something on the scale of humans. The simulation consumed enormous computational resources—1.5 million processors and 1.5 petabytes (1.5 million gigabytes) of memory—and was still agonizingly slow, 1,500 times slower than the brain computes. Modha estimates that running it in biological real time would have required 12 gigawatts of power, about six times the maximum output capacity of the Hoover Dam. “And yet, it was just a cartoon of what the brain does,” says Modha, chief scientist for brain-inspired computing at IBM Almaden Research Center in northern California. The simulation came nowhere close to replicating the functionality of the human brain, which uses about the same amount of power as a 20-watt lightbulb. Since the early 2000s, improved hardware and advances in experimental and theoretical neuroscience have enabled researchers to create ever larger and more-detailed models of the brain. But the more complex these simulations get, the more they run into the limitations of conventional computer hardware, as illustrated by Modha’s power-hungry model. © 1986–2019 The Scientist
Keyword: Robotics
Link ID: 26269 - Posted: 05.28.2019
By Susan Rudnick I was a month shy of turning 16 when a red-faced man in a white coat told me I had been born without a uterus. With a huge dark desk between us, he told me I would never menstruate, and would need plastic surgery to correct the anomaly of my vaginal opening, which was a mere dimple, so that one day I would be able to have sexual intercourse. I have M.R.K.H. These four letters stand for Mayer, Rokitansky, Küster and Hauser, the names of the four doctors who discovered the syndrome over a hundred years ago. This anatomical condition occurs during the first trimester of pregnancy, when the duct that normally forms the uterus, cervix and vaginal canal fails to develop. Ovaries do develop, but there is no menstruation. Although the condition is rare, affecting just one in every 4,500 women, for over 40 years I thought I was like nobody else. At the time of the diagnosis, the doctor described my symptoms but neglected to tell me that the condition had a name. Without a name for my syndrome, I couldn’t connect with others like me. I was left to navigate my life feeling defective, marginalized and alone. I carried my difference as a secret shame, acting as if I were just like other people. In high school I learned to pretend I had my period. I even talked about having period cramps, which I never had. Even though swimming and water ballet club were my favorite activities, I skipped swimming a few times so that it would seem as if I had my period, like all the other girls. There was nothing I wouldn’t do to be like everybody else. One day in college I was lying in my dorm room bed with a gaggle of women when the conversation turned to diaphragms, something I would never need, but which seemed a rite of passage. Birth control was way beyond my knowledge base. This was in the early ’60s, when abortions were still illegal, but you could go to Planned Parenthood and get fitted for a diaphragm.
While my girlfriends laughed about looking at condoms in the drugstore, I spaced out. I felt like such an outsider carrying the pain of my secret. © 2019 The New York Times Company
Keyword: Sexual Behavior
Link ID: 26268 - Posted: 05.28.2019
Bryan Clark Last spring, a study set the internet ablaze with sensational headlines promising an early death for those with nontraditional sleep schedules. It wasn’t the conclusion of the study, or its researchers. But in the bombastic world of science reporting, it didn’t really matter. Originally published in the journal Chronobiology International, the study looked at the chronotypes — a means of classifying one’s predisposition for sleeping at certain hours — of more than 430,000 people over a six-and-a-half-year period. Scouring data from the National Health Service in England and the NHS Central Register in Scotland, researchers sought to find out what, if any, negative health impacts awaited those with a night-owl schedule. After sorting nearly half a million people into four groups — definite larks (larks are early birds, those most likely to rise with the sun), definite owls (those more likely to retire to bed with the sun than to wake with it), moderate larks and moderate owls — researchers reported some troubling findings. More than 10,000 participants died during the study period. Of those deaths, the bulk seemed to be the result of natural causes. The study didn’t necessarily seek to link death with sleep deprivation, but rather to “comorbidity” — the occurrence in one person of two or more conditions, such as psychological or neurological disorders, diabetes and the like. With each incremental shift toward a night-owl schedule, comorbidities became more common, increasing the risk of an early death. But while saying that night owls are going to die early makes for an eye-catching headline, the real story isn’t quite that simple. The story behind the study It’s evident that owls’ nontraditional schedules put them at risk of significant health problems. Nearly every study on this chronotype has returned troubling findings. © 2019 The New York Times Company
Keyword: Biological Rhythms
Link ID: 26267 - Posted: 05.24.2019
By Megan Schmidt “The women’s winter is here. The freeze is upon us,” warns a Game of Thrones parody about men and women’s office temperature preferences. If you have a Y chromosome, you probably haven’t experienced “women’s winter.” As the video explains, women’s winter is “when spring turns to summer and there’s blossom on the trees, the office air doth turns to ice and all the women freeze.” Although the skit is now a few years old, it perfectly captures women’s daily struggle with overly air-conditioned workplaces. To some people, thermostat complaints might seem trivial. But a new study has found that cold offices do more than make women shiver. Thermostat settings geared for men’s comfort — typically cooler temperatures — may actually disadvantage women by lowering their ability to perform some tasks. The study, published in PLOS One, found that women are better at math and word tests when room temperatures are warmer. The women in the study answered more questions correctly and submitted more answers overall during the timed tests. Men, on the other hand, performed marginally better on the same tests at cooler room temperatures, the researchers found. Temperature didn’t influence performance on the logic test for either gender. Study author Agne Kajackaite, a behavioral economics researcher at the WZB Berlin Social Science Center, said the research illustrates that “the battle for the thermostat is not just a complaint about comfort levels.” When it comes to women succeeding in the classroom or in the workplace, room temperatures may make a big difference.
Keyword: Sexual Behavior; Learning & Memory
Link ID: 26266 - Posted: 05.24.2019
Laura Sanders Advantages of speaking a second language are obvious: easier logistics when traveling, wider access to great literature and, of course, more people to talk with. Some studies have also pointed to the idea that polyglots have stronger executive functioning skills, brain abilities such as switching between tasks and ignoring distractions. But a large study of bilingual children in the U.S. finds scant evidence of those extra bilingual brain benefits. Bilingual children performed no better in tests measuring such thinking skills than children who knew just one language, researchers report May 20 in Nature Human Behaviour. To look for a relationship between bilingualism and executive function, researchers relied on a survey of U.S. adolescents called the ABCD study. From data collected at 21 research sites across the country, researchers identified 4,524 kids ages 9 and 10. Of these children, 1,740 spoke English and a second language (mostly Spanish, though 40 second languages were represented). On three tests that measured executive function, such as the ability to ignore distractions or quickly switch between tasks with different rules, the bilingual children performed similarly to children who spoke only English, the researchers found. “We really looked,” says study coauthor Anthony Dick, a developmental cognitive neuroscientist at Florida International University in Miami. “We didn’t find anything.” © Society for Science & the Public 2000–2019
Keyword: Language
Link ID: 26265 - Posted: 05.24.2019
By Anna Groves Bipolar patients are seven times more likely to develop Parkinson’s disease, according to a new study. Though the news may be disheartening to those suffering from the already-trying condition, the link might also lead to clues about the causes behind the two conditions. Parkinson’s is a complex disease associated with a gradual decline in dopamine levels produced by neurons, or brain cells. It eventually leads to impaired movements and other bodily functions. The causes are unknown, and there is no cure. Bipolar disorder, also known as manic-depressive illness, is characterized by episodic fluctuations in mood, concentration or energy levels. Its causes are also unknown, though some bipolar-associated genes have been identified. Researchers are still figuring out how brain structure and function change under the disease. Previous research has linked Parkinson’s with depression. So when the authors of the new study, most of whom are practicing physicians, noticed some of their bipolar patients developing Parkinson’s, they wondered if there was a connection. The study, out today in Neurology, was led by Huang Mao-Hsuan, who practices in the department of psychiatry at Taipei Veterans General Hospital. The researchers compared data from two groups of adults in the Taiwan National Health Insurance Research Database. Members of one group — over 56,000 individuals — were diagnosed with bipolar disorder between 2001 and 2009. The other — 225,000 individuals — had never been diagnosed with the disorder. No one in either cohort had received a Parkinson’s diagnosis, and all the patients were over 20. The researchers also ensured the two groups had similar ages, socioeconomic status, and other traits that might influence health.
Keyword: Parkinsons; Schizophrenia
Link ID: 26264 - Posted: 05.23.2019
By Michelle Roberts Health editor, BBC News online Patients who have had a stroke caused by bleeding in the brain can safely take aspirin to cut their risk of future strokes and heart problems, according to a new study. Aspirin thins the blood, so doctors have been cautious about giving it, fearing it could make bleeds worse. But The Lancet research suggests it does not increase the risk of new brain bleeds, and may even lower it. Experts say the "strong indication" needs confirming with more research. Only take daily aspirin if your doctor recommends it, they advise. Aspirin benefits and risks Aspirin is best known as a painkiller and is sometimes also taken to help bring down a fever. But daily low-dose (75mg) aspirin is used to make the blood less sticky and can help to prevent heart attacks and stroke. Most strokes are caused by clots in the blood vessels of the brain but some are caused by bleeds. Because aspirin thins the blood, it can sometimes make the patient bleed more easily. And aspirin isn't safe for everyone. It can also cause indigestion and, more rarely, lead to stomach ulcers. Never give aspirin to children under the age of 16 (unless their doctor prescribes it). It can make children more likely to develop a very rare but serious illness called Reye's syndrome (which can cause liver and brain damage). The study The research involved 537 people from across the UK who had had a brain bleed while taking anti-platelet medicines, to stop blood clotting, including aspirin, dipyridamole or another drug called clopidogrel. Half of the patients were chosen at random to continue on their medicine (following a short pause immediately after their brain bleed), while the other half were told to stop taking it. Over the five years of the study, 12 of those who kept taking the tablets suffered a brain bleed, compared with 23 of those who stopped. © 2019 BBC
Keyword: Stroke
Link ID: 26263 - Posted: 05.23.2019
Laura Sanders A teenager’s brain does not magically mature into its reasoned, adult form the night before his or her 18th birthday. Instead, aspects of brain development stretch into a person’s 20s — a protracted fine-tuning with serious implications for young people caught in the U.S. justice system, argues cognitive neuroscientist B.J. Casey of Yale University. In the May 22 Neuron, Casey describes the heartbreaking case of Kalief Browder, sent at age 16 to Rikers Island correctional facility in New York City after being accused of stealing a backpack. Unable to come up with the $3,000 bail, Browder spent three years in the violent jail before his case was ultimately dropped. About two-thirds of his time in custody was spent in solitary confinement — “a terrible place for a child to have to grow up,” Casey says. Two years after his 2013 release, Browder died from suicide. Casey uses the case to highlight how the criminal justice system — and the accompanying violence, stress and isolation (SN: 12/8/18, p. 11) that come with being incarcerated — can interfere with brain development in adolescents and children. Other recent stories of immigrant children being separated from their families and held in detention centers have raised similar concerns (SN Online: 6/20/18). Studies with lab animals and brain imaging experiments in people show that chronic stress and other assaults “impact the very brain circuitry that is changing so radically during adolescence,” Casey says. An abundance of science says that “the way we’re treating our young people is not the way to a healthy development.” © Society for Science & the Public 2000–2019
Keyword: Development of the Brain; Stress
Link ID: 26262 - Posted: 05.23.2019
Ed Yong In 1996, a group of European researchers found that a certain gene, called SLC6A4, might influence a person’s risk of depression. It was a blockbuster discovery at the time. The team found that a less active version of the gene was more common among 454 people who had mood disorders than in 570 who did not. In theory, anyone who had this particular gene variant could be at higher risk for depression, and that finding, they said, might help in diagnosing such disorders, assessing suicidal behavior, or even predicting a person’s response to antidepressants. Back then, tools for sequencing DNA weren’t as cheap or powerful as they are today. When researchers wanted to work out which genes might affect a disease or trait, they made educated guesses, and picked likely “candidate genes.” For depression, SLC6A4 seemed like a great candidate: It’s responsible for getting a chemical called serotonin into brain cells, and serotonin had already been linked to mood and depression. Over two decades, this one gene inspired at least 450 research papers. But a new study—the biggest and most comprehensive of its kind yet—shows that this seemingly sturdy mountain of research is actually a house of cards, built on nonexistent foundations. Richard Border of the University of Colorado at Boulder and his colleagues picked the 18 candidate genes that have been most commonly linked to depression—SLC6A4 chief among them. Using data from large groups of volunteers, ranging from 62,000 to 443,000 people, the team checked whether any versions of these genes were more common among people with depression. “We didn’t find a smidge of evidence,” says Matthew Keller, who led the project. (c) 2019 by The Atlantic Monthly Group.
Keyword: Depression; Genes & Behavior
Link ID: 26261 - Posted: 05.22.2019
By Emily Willingham As anyone who’s dealt with substance addiction can tell you, breaking the physical intimacy with the drug isn’t always the most challenging part of treatment. People trying to avoid resurrecting their addiction also must grapple with reminders of it: the sights, sounds and people who were part of their addictive behaviors. These cues can trigger a craving for the drug, creating anxiety that steers them straight back into addiction for relief. The opioid epidemic in the United States has taken more than 300,000 lives, and support for people working to keep these drugs out of their orbit has become crucial. Methadone and buprenorphine, the current medical treatment options, help break the physical craving for opioids by targeting the same pathways that opioids use. Although these drugs can ease physical need, they don’t quiet the anxiety that environmental cues can trigger, leaving open a door to addiction reentry. The cannabis compound cannabidiol (CBD), a nonpsychoactive component of cannabis, might be the key to keeping that door locked. Researchers report that among people with opioid addiction, CBD dampens cue-triggered cravings and anxiety, along with reducing stress hormone levels and heart rate. The results were published May 21 in the American Journal of Psychiatry. “These findings provide support for an effect of cannabidiol on this process,” says Kathryn McHugh, assistant professor in the department of psychiatry at Harvard Medical School’s Division of Alcohol and Drug Abuse, who was not involved in the study. However, she cautions, the results are preliminary, and behavioral therapies are also quite effective at dimming the signal from cues. © 2019 Scientific American
Keyword: Drug Abuse; Depression
Link ID: 26260 - Posted: 05.22.2019
Carolyn Wilke Here’s a downer: Pessimism seems contagious among ravens. But positivity? Not so much. When ravens saw fellow birds’ responses to a disliked food, but not the food itself, their interest in their own food options waned, researchers report May 20 in the Proceedings of the National Academy of Sciences. The study suggests that the birds pick up on and even share negative emotions, the researchers say. Ravens are “very good problem solvers … but this paper’s really highlighting their social intelligence as well,” says Andrew Gallup, a psychologist at SUNY Polytechnic Institute in Utica, N.Y., who was not involved in the study. The work paints a richer picture of how the birds’ brains work, he says. Known for their smarts, ravens act in ways that suggest a capacity for empathy, such as by appearing to console a distressed comrade. Thomas Bugnyar, a cognitive ethologist at the University of Vienna, and his colleagues wanted to look into one building block of empathy — whether animals share emotions. To be able to feel for others, an animal needs to be able to feel like others, he says. But sizing up an animal’s mood is tricky. Scientists generally rely on behavioral or physiological cues to clue into a creature’s emotional state. More challenging is assessing how one animal’s mood might influence another’s: Similar actions appearing to stem from kindred emotions may just be mimicry. © Society for Science & the Public 2000–2019
Keyword: Emotions; Evolution
Link ID: 26259 - Posted: 05.22.2019
By Kenneth Miller A model of Ben Barres’ brain sits on the windowsill behind his desk at Stanford University School of Medicine. To a casual observer, there’s nothing remarkable about the plastic lump, 3-D-printed from an MRI scan. Almost lost in the jumble of papers, coffee mugs, plaques and trophies that fill the neurobiologist’s office, it offers no hint about what Barres’ actual gray matter has helped to accomplish: a transformation of our understanding of brains in general, and how they can go wrong. Barres is a pioneer in the study of glia. This class of cells makes up 90 percent of the human brain, but gets far less attention than neurons, the nerve cells that transmit our thoughts and sensations at lightning speed. Glia were long regarded mainly as a maintenance crew, performing such unglamorous tasks as ferrying nutrients and mopping up waste, and occasionally mounting a defense when the brain faced injury or infection. Over the past two decades, however, Barres’ research has revealed that they actually play central roles in sculpting the developing brain, and in guiding neurons’ behavior at every stage of life. “He has made one shocking, revolutionary discovery after another,” says biologist Martin Raff, emeritus professor at University College London, whose own work helped pave the way for those advances. Recently, Barres and his collaborators have made some discoveries that may revolutionize the treatment of neurodegenerative ailments, from glaucoma and multiple sclerosis to Alzheimer’s disease and stroke. What drives such disorders, their findings suggest, is a process in which glia turn from nurturing neurons to destroying them. Human trials of a drug designed to block that change are just beginning.
Keyword: Glia; Learning & Memory
Link ID: 26258 - Posted: 05.22.2019
By Gretchen Reynolds Skipping breakfast before exercise might reduce how much we eat during the remainder of the day, according to a small but intriguing new study of fit young men. The study finds that the choice to eat or omit a meal before an early workout could affect our relationship to food for the rest of the day, in complicated and sometimes unexpected ways. Weight management is, of course, one of the great public — and private — health concerns of our time. But the role of exercise in helping people to maintain, lose or, in some instances, add pounds is problematic. Exercise burns calories, but in many past studies, people who begin a new exercise program do not lose as much weight as would be expected, because they often compensate for the energy used during exercise by eating more later or moving less. These compensations, usually subtle and unintended, indicate that our brains are receiving internal communiqués detailing how much energy we used during that last workout and, in response, sending biological signals that increase hunger or reduce our urge to move. Our helpful brains do not wish us to sustain an energy deficit and starve. Previous studies show that many aspects of eating and exercise can affect how much people compensate for the calories burned during exercise, including the type and length of the exercise and the fitness and weight of the exercisers. Skipping or consuming breakfast also can matter. When we eat a meal, our bodies rely on the carbohydrates in those foods as a primary source of energy. Some of those carbohydrates are stored in our bodies, but those internal stores of carbohydrates are small compared to the stores of fat. Some researchers believe that our brains may pay particular attention to any reductions in our carbohydrate levels and rush to replace them. © 2019 The New York Times Company
Keyword: Obesity
Link ID: 26257 - Posted: 05.22.2019
Ian Sample Science editor Male bonobos living with their mothers are three times more likely to father offspring, research suggests. Their mothers are so keen for them to father children that they usher them in front of promising partners, shield them from violent competitors and dash the chances of other males by charging them while they are at it. For a bonobo mother, it is all part of the parenting day, and analysis finds the hard work pays off. Males of the species that live with their mothers are three times more likely to father offspring than those whose mothers are absent. Martin Surbeck, a primatologist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, said: “We wanted to see if the mothers’ behaviour changes the odds of their sons’ success, and it does. The mothers have a strong influence on the number of grandchildren they get.” Bonobo mothers seize every opportunity to give their sons a leg-up. In bonobo society, the lower ranks tend to be gender balanced, but females dominate the top ranks. Many mothers have social clout and chaperone their sons to huddles with fertile females, ensuring them better chances to mate. “The mothers tend to be a social passport for their sons,” said Surbeck. © 2019 Guardian News & Media Limited
Keyword: Sexual Behavior; Evolution
Link ID: 26256 - Posted: 05.21.2019
Alison Abbott Pharmacologists gave mescaline a fair trial. In the early and mid-twentieth century, it seemed more than plausible that the fashionable hallucinogen could be tamed into a therapeutic agent. After all, it had profound effects on the human body, and had been used for centuries in parts of the Americas as a gateway to ceremonial spiritual experience. But this psychoactive alkaloid never found its clinical indication, as science writer Mike Jay explains in Mescaline, his anthropological and medical history. In the 1950s, the attention of biomedical researchers abruptly switched to a newly synthesized molecule with similar hallucinogenic properties but fewer physical side effects: lysergic acid diethylamide, or LSD. First synthesized by Swiss scientist Albert Hofmann in 1938, LSD went on to become a recreational drug of choice in the 1960s hippy era. And, like mescaline, it teased psychiatrists without delivering a cure. Jay traces the chronology of mescaline use. The alkaloid is found in the fast-growing San Pedro cactus (Echinopsis pachanoi) that towers above the mountainous desert scrub of the Andes, and the slow-growing, ground-hugging peyote cactus (Lophophora williamsii) native to Mexico and the southwestern United States. Archaeological evidence suggests that the use of these cacti in rites of long-vanished cultures goes back at least 5,000 years. © 2019 Springer Nature Publishing AG
Keyword: Drug Abuse; Depression
Link ID: 26255 - Posted: 05.21.2019
by C.L. Lynch Everyone knows that autism is a spectrum. People bring it up all the time. “My son is on the severe end of the autism spectrum.” “We’re all a little autistic – it’s a spectrum.” “I’m not autistic but I’m definitely ‘on the spectrum.’” If only people knew what a spectrum is… because they are talking about autism all wrong. Let’s use the visible spectrum as an example. The various parts of the visible spectrum are noticeably different from each other. Blue looks very different from red, but they are both on the visible light spectrum. Red is not “more blue” than blue is. Red is not “more spectrum” than blue is. When people discuss colours, they don’t talk about how “far along” the spectrum a colour is. They don’t say “my walls are on the high end of the spectrum” or “I look best in colours that are on the low end of the spectrum.” But when people talk about autism they talk as if it were a gradient, not a spectrum at all. People think you can be “a little autistic” or “extremely autistic,” the way a paint colour could be a little red or extremely red. [Image: a colour gradient moving from white to red, captioned “How people think the spectrum looks.”] But autism isn’t that simple. Autism isn’t a set of defined symptoms that collectively worsen as you move “up” the spectrum.
Keyword: Autism
Link ID: 26254 - Posted: 05.21.2019
By Jane E. Brody One of the most widely prescribed prescription drugs, gabapentin, is being taken by millions of patients despite little or no evidence that it can relieve their pain. In 2006, I wrote about gabapentin after discovering accidentally that it could counter hot flashes. The drug was initially approved 25 years ago to treat seizure disorders, but it is now commonly prescribed off-label to treat all kinds of pain, acute and chronic, in addition to hot flashes, chronic cough and a host of other medical problems. The F.D.A. approves a drug for specific uses and doses if the company demonstrates it is safe and effective for its intended uses, and its benefits outweigh any potential risks. Off-label means that a medical provider can legally prescribe any drug that has been approved by the Food and Drug Administration for any condition, not just the ones for which it was approved. This can leave patients at the mercy of what their doctors think is helpful. Thus, it can become a patient’s job to try to determine whether a medication prescribed off-label is both safe and effective for their particular condition. This is no easy task even for well-educated doctors, let alone for desperate patients in pain. Two doctors recently reviewed published evidence for the benefits and risks of off-label use of gabapentin (originally sold under the trade name Neurontin) and its brand-name cousin Lyrica (pregabalin) for treating all kinds of pain. (There is now also a third drug, gabapentin encarbil, sold as Horizant, approved only for restless leg syndrome and postherpetic neuralgia, which can follow a shingles outbreak.) © 2019 The New York Times Company
Keyword: Pain & Touch; Drug Abuse
Link ID: 26253 - Posted: 05.21.2019
By Nathaniel Scharping Don’t get a big head, your mother may have told you. That’s good advice, but it comes too late for most of us. Humans have had big heads, relatively speaking, for hundreds of thousands of years, much to our mothers’ dismay. Our oversize noggins are a literal pain during childbirth. Babies have to twist and turn as they exit the birth canal, sometimes leading to complications that necessitate surgery. And while big heads can be painful for the mother, they can be downright transformative for babies: A fetus’ pliable skull deforms during birth like putty squeezed through a tube to allow it to pass into the world. This cranial deformation has been known about for a long time, but in a new study, scientists from France and the U.S. actually watched it happen using an MRI machine during labor. The images, published in a study in PLOS One, show how the skulls (and brains) of seven infants squished and warped during birth to pass through the birth canal. They also shine new light on how much our skulls change shape as we’re born. The researchers recruited pregnant women in France to undergo an MRI a few weeks before giving birth and another in the minutes before labor began. In total, seven women were scanned in the second stage of labor, when the baby begins to make its way out of the uterus. They were then rushed to the maternity ward to complete giving birth.
Keyword: Development of the Brain; Brain imaging
Link ID: 26252 - Posted: 05.20.2019
Before he was born, his parents knew their boy was in trouble. That was clear from what their doctors saw in their baby's ultrasound. And tragically, the boy died when he was only ten months old. But in his short life, he left behind a valuable legacy by helping scientists understand a crucial type of brain cell. That's because — as it turned out — the child had none. "One of the things about being a pediatric geneticist is on any given day you can see a patient you could spend the rest of your life or your career thinking about," Dr. James Bennett told Quirks & Quarks host Bob McDonald. Dr. Bennett is a physician and researcher from Seattle Children's Hospital and assistant professor of pediatric genetics at the University of Washington. Devastating problems with brain development On the first day he met the child — the boy's very first day of life — Dr. Bennett said he could tell this baby needed a lot of support. The baby was having difficulty breathing and had an enlarged head as well as some very significant abnormalities of his brain. "Every single part of his brain was affected. There was no connection between the left side and the right side of his brain. And there was too much fluid on the brain — the spaces that hold fluid around the brain were enlarged. And the white matter, which is the part of the brain that sort of connects the neurons — you can think of it as sort of the wires connecting things in the brain — was decreased and abnormal," said Dr. Bennett. Scientists had never seen a medical mystery like this before, so Dr. Bennett was determined to figure out what was wrong with the infant. He undertook a "diagnostic odyssey" to identify the cause of this extremely rare condition. ©2019 CBC/Radio-Canada
Keyword: Development of the Brain; Glia
Link ID: 26251 - Posted: 05.20.2019
By John Horgan In a previous post I summarized my remarks at “Souls or Selfish Genes,” a conversation at Stevens Institute of Technology about religious versus scientific views of humanity. I represented the agnostic position and David Lahti, a biologist and philosopher at the City University of New York, a position more friendly to theism. Below is Lahti’s summary of his opening comments. –John Horgan I’ve been asked to deal with the question of “Souls vs. Selfish Genes”. And whereas I am sure this is a false dichotomy, I’m not quite sure how exactly to fit the two parts of the truth together. But I’ll give you a few thoughts I’ve had about it, which can at least start us off. First, selfish genes. This of course is a reference to Richard Dawkins’ 1976 book of the same name, which is a popular and sensational description of a revolution in our understanding of the way evolution by natural selection operates. Briefly, we discovered in the 1960s-70s that the organismic individual was generally the most important level at which natural selection operates, meaning that evolution by natural selection proceeds primarily via certain individuals in a population reproducing more successfully than others. In fact, this is too simplistic. Hamilton’s theory of kin selection showed that it’s actually below the level of the individual where we really have to concentrate in order to explain certain traits, such as the self-sacrificial stinging of bees and the fact that some young male birds help their mother raise her next brood instead of looking for a mate. Those individuals are not being as selfish as we might predict. © 2019 Scientific American
Keyword: Consciousness; Genes & Behavior
Link ID: 26250 - Posted: 05.20.2019



