Most Recent Links
The plant compound apigenin improved the cognitive and memory deficits usually seen in a mouse model of Down syndrome, according to a study by researchers at the National Institutes of Health and other institutions. Apigenin is found in chamomile flowers, parsley, celery, peppermint and citrus fruits. The researchers fed the compound to pregnant mice carrying fetuses with Down syndrome characteristics and then to the animals after they were born and as they matured. The findings raise the possibility that a treatment to lessen the cognitive deficits seen in Down syndrome could one day be offered to pregnant women whose fetuses have been diagnosed with Down syndrome through prenatal testing. The study appears in the American Journal of Human Genetics. Down syndrome is a set of symptoms resulting from an extra copy or piece of chromosome 21. The intellectual and developmental disabilities accompanying the condition are believed to result from decreased brain growth caused by increased inflammation in the fetal brain. Apigenin is not known to have any toxic effects, and previous studies have indicated that it is an antioxidant that reduces inflammation. Unlike many compounds, it is absorbed through the placenta and the blood-brain barrier, the cellular layer that prevents potentially harmful substances from entering the brain. Compared to mice with Down syndrome characteristics whose mothers were not fed apigenin, those exposed to the compound showed improvements in tests of developmental milestones and had improvements in spatial and olfactory memory. Tests of gene activity and protein levels showed the apigenin-treated mice had less inflammation and increased blood vessel and nervous system growth. Guedj, F. et al. Apigenin as a candidate prenatal treatment for Trisomy 21: effects in human amniocytes and the Ts1Cje mouse model. American Journal of Human Genetics. 2020.
Keyword: Development of the Brain; Genes & Behavior
Link ID: 27546 - Posted: 10.24.2020
Jon Hamilton Medical research was an early casualty of the COVID-19 pandemic. After cases began emerging worldwide, thousands of clinical trials unrelated to COVID-19 were paused or canceled amid fears that participants would be infected. But now some researchers are finding ways to carry on in spite of the coronavirus. "It's been a struggle of course," says Joshua Grill, who directs the Institute for Memory Impairments and Neurological Disorders at the University of California, Irvine. "But I think there's an imperative for us to find ways to move forward." Grill got a close-up view of the challenge in July when COVID-19 cases were spiking nationwide as he was trying to launch a study. UC Irvine and dozens of other research centers had just begun enrolling participants in the AHEAD study, a global effort that will test whether an investigational drug can slow down the earliest brain changes associated with Alzheimer's disease. Finding individuals willing and able to sign up for this sort of research is difficult even without a pandemic, says Grill, who also co-directs recruitment for the Alzheimer's Clinical Trials Consortium, funded by the National Institute on Aging. "We're asking people to do a lot, including enroll in long studies that require numerous visits," he says, "and in the AHEAD study, taking an investigational drug or placebo that's injected into a vein." Participants will receive either a placebo or a drug called BAN2401, made by Eisai, which is meant to reduce levels of amyloid, a toxic protein associated with Alzheimer's. People in the study will also have positron emission tomography, or PET, scans of their brains to measure changes in amyloid and another toxic protein called tau. © 2020 npr
Keyword: Alzheimers
Link ID: 27545 - Posted: 10.24.2020
By Jeremy Hsu Artificial intelligence could soon help screen for Alzheimer’s disease by analyzing writing. A team from IBM and Pfizer says it has trained AI models to spot early signs of the notoriously stealthy illness by looking at linguistic patterns in word usage. Other researchers have already trained various models to look for signs of cognitive impairments, including Alzheimer’s, by using different types of data, such as brain scans and clinical test results. But the latest work stands out because it used historical information from the multigenerational Framingham Heart Study, which has been tracking the health of more than 14,000 people from three generations since 1948. If the new models’ ability to pick up trends in such data holds up in forward-looking studies of bigger and more diverse populations, researchers say they could predict the development of Alzheimer’s a number of years before symptoms become severe enough for typical diagnostic methods to pick up. And such a screening tool would not require invasive tests or scans. The results of the Pfizer-funded and IBM-run study were published on Thursday in EClinicalMedicine. The new AI models provide “an augmentation to expert practitioners in how you would see some subtle changes earlier in time, before the clinical diagnosis has been achieved,” says Ajay Royyuru, vice president of health care and life sciences research at IBM. “It might actually alert you to some changes that [indicate] you ought to then go do a more complete exam.” To train these models, the researchers used digital transcriptions of handwritten responses from Framingham Heart Study participants who were asked to describe a picture of a woman who is apparently preoccupied with washing dishes while two kids raid a cookie jar behind her back. These descriptions did not preserve the handwriting from the original responses, says Rhoda Au, director of neuropsychology at the Framingham study and a professor at Boston University. 
© 2020 Scientific American
Keyword: Alzheimers; Language
Link ID: 27544 - Posted: 10.24.2020
By Bruce Bower A type of bone tool generally thought to have been invented by Stone Age humans got its start among hominids that lived hundreds of thousands of years before Homo sapiens evolved, a new study concludes. A set of 52 previously excavated but little-studied animal bones from East Africa’s Olduvai Gorge includes the world’s oldest known barbed bone point, an implement probably crafted by now-extinct Homo erectus at least 800,000 years ago, researchers say. Made from a piece of a large animal’s rib, the artifact features three curved barbs and a carved tip, the team reports in the November Journal of Human Evolution. Among the Olduvai bones, biological anthropologist Michael Pante of Colorado State University in Fort Collins and colleagues identified five other tools from more than 800,000 years ago as probable choppers, hammering tools or hammering platforms. The previous oldest barbed bone points were from a central African site and dated to around 90,000 years ago (SN: 4/29/95), and were assumed to reflect a toolmaking ingenuity exclusive to Homo sapiens. Those implements include carved rings around the base of the tools where wooden shafts were presumably attached. Barbed bone points found at H. sapiens sites were likely used to catch fish and perhaps to hunt large land prey. The Olduvai Gorge barbed bone point, which had not been completed, shows no signs of having been attached to a handle or shaft. Ways in which H. erectus used the implement are unclear, Pante and his colleagues say. © Society for Science & the Public 2000–2020.
Keyword: Evolution; Learning & Memory
Link ID: 27543 - Posted: 10.24.2020
By James Gorman It’s good to have friends, for humans and chimpanzees. But the nature and number of those friends change over time. In young adulthood, humans tend to have a lot of friendships. But as they age, social circles narrow, and people tend to keep a few good friends around and enjoy them more. This trend holds across many cultures, and one explanation has to do with awareness of one’s own mortality. Zarin P. Machanda, an anthropologist at Tufts University, and her own good friend, Alexandra G. Rosati, a psychologist and anthropologist at the University of Michigan, wondered whether chimpanzees, which they both study, would show a similar pattern even though they don’t seem to have anything like a human sense of their own inevitable death. The idea, in humans, Dr. Machanda said, is that as we get older we think, “I don’t have time for these negative people in my life, or I don’t want to waste my time with all of this negativity.” So we concentrate on a few good friends and invest in them. This explanation is called socioemotional selectivity theory. Dr. Rosati and Dr. Machanda, who is the director of long-term research at the Kibale Chimpanzee Project in Uganda, drew on many years of observations of chimps at Kibale. Along with several colleagues, they reported Thursday in the journal Science that male chimps, at least, display the very same inclinations as humans. The team looked only at interactions of male chimpanzees because males are quite gregarious and form a lot of friendships, whereas females are more tied to family groups. So male relationships were easier to analyze. The finding doesn’t prove or disprove anything about whether knowledge of death is what drives the human behavior. But it does show that our closest primate relative displays the same bonding habits for some other reason, perhaps something about aging that the two species have in common. At the very least, the finding raises questions about humans. © 2020 The New York Times Company
Keyword: Aggression; Stress
Link ID: 27542 - Posted: 10.24.2020
By Meagan Cantwell Although bird brains are tiny, they’re packed with neurons, especially in areas responsible for higher level thinking. Two studies published last month in Science explore the structure and function of avian brains—revealing they are organized similarly to mammals’ and are capable of conscious thought. © 2020 American Association for the Advancement of Science.
Keyword: Evolution; Learning & Memory
Link ID: 27541 - Posted: 10.24.2020
Ashley Yeager Tiroyaone Brombacher sat in her lab at the University of Cape Town watching a video of an albino mouse swimming around a meter-wide tub filled with water. The animal, which lacked an immune protein called interleukin 13 (IL-13), was searching for a place to rest but couldn’t find the clear plexiglass stand that sat at one end of the pool, just beneath the water’s surface. Instead, it swam and swam, crisscrossing the tub several times before finally finding the platform on which to stand. Over and over, in repeated trials, the mouse failed to learn where the platform was located. Meanwhile, wildtype mice learned fairly quickly and repeatedly swam right to the platform. “When you took out IL-13, [the mice] just could not learn,” says Brombacher, who studies the intersection of psychology, neuroscience, and immunology. Curious as to what was going on, Brombacher decided to dissect the mice’s brains and the spongy membranes, called the meninges, that separate neural tissue from the skull. She wanted to know if the nervous system and the immune system were communicating using proteins such as IL-13. While the knockout mice had no IL-13, she reported in 2017 that the meninges of wildtype mice were chock full of the cytokine. Sitting just outside the brain, the immune protein did, in fact, seem to be playing a critical role in learning and memory, Brombacher and her colleagues concluded. As far back as 2004, studies in rodents suggested that neurons and their support cells release signals that allow the immune system to passively monitor the brain for pathogens, toxins, and debris that might form during learning and memory-making, and that, in response, molecules of the immune system could communicate with neurons to influence learning, memory, and social behavior. 
Together with research on the brain’s resident immune cells, called microglia, the work overturned a dogma, held since the 1940s, that the brain was “immune privileged,” cut off from the immune system entirely. © 1986–2020 The Scientist.
Keyword: Neuroimmunology
Link ID: 27540 - Posted: 10.21.2020
By Rachel Nuwer With their bright saucer eyes, button noses and plump, fuzzy bodies, slow lorises — a group of small, nocturnal Asian primates — resemble adorable, living stuffed animals. But their innocuous looks belie a startling aggression: They pack vicious bites loaded with flesh-rotting venom. Even more surprising, new research reveals that the most frequent recipients of their toxic bites are other slow lorises. “This very rare, weird behavior is happening in one of our closest primate relatives,” said Anna Nekaris, a primate conservationist at Oxford Brookes University and lead author of the findings, published Monday in Current Biology. “If the killer bunnies on Monty Python were a real animal, they would be slow lorises — but they would be attacking each other.” Even before this new discovery, slow lorises already stood out as an evolutionary oddity. Scientists know of just five other types of venomous mammals: vampire bats, two species of shrew, platypuses and solenodons (an insectivorous mammal found in Cuba, the Dominican Republic and Haiti). Researchers are just beginning to untangle the many mysteries of slow loris venom. One key component resembles the protein found in cat dander that triggers allergies in humans. But other unidentified compounds seem to lend additional toxicity and cause extreme pain. Strangely, to produce the venom, the melon-sized primates raise their arms above their head and quickly lick venomous oil-secreting glands located on their upper arms. The venom then pools in their grooved canines, which are sharp enough to slice into bone. “The result of their bite is really, really horrendous,” Dr. Nekaris says. “It causes necrosis, so animals may lose an eye, a scalp or half their face.” © 2020 The New York Times Company
Keyword: Aggression; Neurotoxins
Link ID: 27539 - Posted: 10.21.2020
By Jake Buehler Naked mole-rats — with their subterranean societies made up of a single breeding pair and an army of workers — seem like mammals trying their hardest to live like insects. Nearly 300 of the bald, bucktoothed, nearly blind rodents can scoot along a colony’s labyrinth of tunnels. New research suggests there’s brute power in those numbers: Like ants or termites, the mole-rats go to battle with rival colonies to conquer their lands. Wild naked mole-rats (Heterocephalus glaber) will invade nearby colonies to expand their territory, sometimes abducting pups to incorporate them into their own ranks, researchers report September 28 in the Journal of Zoology. This behavior may put smaller, less cohesive colonies at a disadvantage, potentially supporting the evolution of bigger colonies. Researchers stumbled across this phenomenon by accident while monitoring naked mole-rat colonies in Kenya’s Meru National Park. The team was studying the social structure of this extreme form of group living among mammals (SN: 6/20/06). Over more than a decade, the team trapped and marked thousands of mole-rats from dozens of colonies by either implanting small radio-frequency transponder chips under their skin, or clipping their toes. One day in 1994, while marking mole-rats in a new colony, researchers were surprised to find in its tunnels mole-rats from a neighboring colony that had already been marked. The queen in the new colony had wounds on her face from the ravages of battle. It looked like a war was playing out down in the soil. © Society for Science & the Public 2000–2020.
Keyword: Evolution; Sexual Behavior
Link ID: 27538 - Posted: 10.21.2020
Catherine Offord Overactivation of the brain’s immune cells, called microglia, may play a role in cognitive impairments associated with Down syndrome, according to research published today (October 6) in Neuron. Researchers in Italy identified elevated numbers of the cells in an inflammation-promoting state in the brains of mice with a murine version of the syndrome as well as in postmortem brain tissue from people with the condition. The team additionally showed that drugs that reduce the number of activated microglia in juvenile mice could boost the animals’ performance on cognitive tests. “This is a fabulous study that gives a lot of proof of principle to pursuing some clinical trials in people,” says Elizabeth Head, a neuroscientist at the University of California, Irvine, who was not involved in the work. “The focus on microglial activation, I thought, was very novel and exciting,” she adds, noting that more research will be needed to see how the effects of drugs used in the study might translate from mice to humans. Down syndrome is caused by an extra copy of part or all of human chromosome 21, and is the most commonly occurring chromosomal condition in the US. Children with Down syndrome often experience cognitive delays compared to typically developing children, although there’s substantial variation and the effects are usually mild or moderate. People with the syndrome also have a higher risk of certain medical conditions, including Alzheimer’s disease. © 1986–2020 The Scientist.
Keyword: Development of the Brain; Glia
Link ID: 27537 - Posted: 10.21.2020
By Sundas Hashmi It was the afternoon of Jan. 31. I was preparing for a dinner party and adding final touches to my cheese platter when everything suddenly went dark. I woke up feeling baffled in a hospital bed. My husband filled me in: Apparently, I had suffered a massive seizure a few hours before our guests were to arrive at our Manhattan apartment. Our children’s nanny found me and I was rushed to the hospital. That had been three days earlier. My husband and I were both mystified: I was 37 years old and had always been in excellent health. In due course, a surgeon dropped by and told me I had a glioma, a type of brain tumor. It was relatively huge but operable. I felt sick to my stomach. Two weeks later, I was getting wheeled to the operating theater. I wouldn’t know the pathology until much later. I said my goodbyes to everyone — most importantly to my children, Sofia, 6, and Nyle, 2 — and prepared to die. But right before the surgery, in a very drugged state, I asked the surgeon to please get photos of me and my brother from my husband. I wanted the surgeon to see them. My brother had died two decades earlier from a different kind of brain tumor — a glioblastoma. I was 15 at the time, and he was 18. He died within two years of being diagnosed. Those two years were the worst period of my life. Doctors in my home country of Pakistan refused to take him, saying his case was fatal. So, my parents gathered their savings and flew him to Britain, where he was able to get a biopsy (his tumor was in an inoperable location) and radiation. Afterward, we had to ask people for donations so he could get the gamma knife treatment in Singapore that my parents felt confident would save him. In the end, nothing worked, and he died, taking 18 years of memories with him. © 2020 The New York Times Company
Keyword: Glia
Link ID: 27536 - Posted: 10.21.2020
Shawna Williams In Greek mythology, Orpheus descends to the underworld and persuades Hades to allow him to take his dead wife, Eurydice, back to the realm of the living. Hades agrees, but tells Orpheus that he must not look back until he has exited the underworld. Despite the warning, Orpheus glances behind him on his way out to check whether Eurydice is indeed following him—and loses her forever. The story hints at a dark side to curiosity, a drive to seek certain kinds of knowledge even when doing so is risky—and even if the information serves no practical purpose at the time. In fact, the way people pursue information they’re curious about can resemble the drive to attain more tangible rewards such as food—a parallel that hasn’t been lost on scientists. To investigate the apparent similarity between curiosity and hunger, researchers led by Kou Murayama of the University of Reading in the UK recently devised an experiment to compare how the brain processes desires for food and knowledge, and the risks people are willing to take to satisfy those desires. Beginning in 2016, the team recruited 32 volunteers and instructed them not to eat for at least two hours before coming into the lab. After they arrived, the volunteers’ fingers were hooked up to electrodes that could deliver a weak current, and researchers calibrated the level of electricity to what each participant reported was uncomfortable, but not painful. Then, still hooked up to the electrodes, the volunteers were asked to gamble: they viewed either a photo of a food item or a video of a magician performing a trick, followed by a visual depiction of their odds of “winning” that round (which ranged from 1:6 to 5:6). © 1986–2020 The Scientist.
Keyword: Attention; Obesity
Link ID: 27535 - Posted: 10.21.2020
By Nicholas Bakalar A mother’s psychological distress during pregnancy may increase the risk for asthma in her child, a new study suggests. Researchers had the parents of 4,231 children fill out well-validated questionnaires on psychological distress in the second trimester of pregnancy, and again three years later. The mothers also completed questionnaires at two and six months after giving birth. The study, in the journal Thorax, found that 362 of the mothers and 167 of the fathers had clinically significant psychological distress during the mothers’ pregnancies. When the children were 10 years old, parents reported whether their child had ever been diagnosed with asthma. As an extra measure, the researchers tested the children using forced expiratory volume, or FEV, a standard clinical test of lung function. After controlling for age, smoking during pregnancy, body mass index, a history of asthma and other factors, they found that maternal depression and anxiety during pregnancy were significantly associated with both diagnoses of asthma and poorer lung function in their children. There was no association between childhood asthma and parents’ psychological distress in the years after pregnancy, and no association with paternal psychological distress at any time. “Of course, this could be only one of many causes of asthma,” said the lead author, Dr. Evelien R. van Meel of Erasmus University in Rotterdam, “but we corrected for many confounders, and we saw the effect only in mothers. This seems to suggest that there’s something going on in the uterus. But this is an observational study, and we can’t say that it’s a causal effect.” © 2020 The New York Times Company
Keyword: Depression; Development of the Brain
Link ID: 27534 - Posted: 10.21.2020
By Pam Belluck A potential therapy for amyotrophic lateral sclerosis, a fatal neurological disorder, may allow patients to live several months longer than they otherwise would have, according to a study published Friday. The two-drug combination, dreamed up by two college students, is one of several potential treatments raising the hopes of patients with A.L.S., also known as Lou Gehrig’s disease. The paralytic condition steals people’s ability to walk, speak, eat and ultimately breathe, typically causing death within two to five years. There are only two approved A.L.S. medications, neither tremendously effective. But advocacy efforts by patients and organizations, along with the Ice Bucket Challenge, a highly successful fundraising campaign, have galvanized research into more than 20 therapies that are currently in clinical trials. The two-drug combination, called AMX0035, was conceived seven years ago by Joshua Cohen and Justin Klee, then a junior and senior at Brown University, with the goal of preventing the destruction of neurons that occurs in many brain disorders. It is a combination of an existing supplement and a medication for a pediatric urea disorder. Last month, a study of 137 patients reported that AMX0035 slowed progression of A.L.S. paralysis by about 25 percent more than a placebo. Measuring patients using a scale of physical function, researchers found that those receiving a placebo declined in 18 weeks to a level that patients receiving the treatment didn’t reach until 24 weeks, according to the study’s principal investigator, Dr. Sabrina Paganoni. But because that trial was conducted for only 24 weeks, it left unanswered a crucial question of whether the treatment extended survival for the patients receiving the therapy. After that study ended, 98 of the participants, who had not been told whether they had received placebo or therapy, were given the option of taking the therapy for up to 30 months, a format called an open-label extension study. 
© 2020 The New York Times Company
Keyword: ALS-Lou Gehrig's Disease
Link ID: 27533 - Posted: 10.19.2020
By Laurie Archbald-Pannone The number of cases of dementia in the United States is rising as baby boomers age, raising questions for boomers themselves and also for their families, caregivers and society. Dementia, which is not technically a disease but a term for impaired ability to think, remember or make decisions, is one of the most feared impairments of old age. Incidence increases dramatically as people move into their 90s. About 5 percent of those 71 to 79 have dementia, and about 37 percent of those 90 and older live with it. Older people may worry about their own loss of function as well as the cost and toll of caregiving for someone with dementia. A 2018 study estimated the lifetime cost of care for a person with Alzheimer’s, the most common form of dementia, to be $329,360. That figure, too, will no doubt rise, putting even more burdens on family, Medicare and Medicaid. There’s also been a good deal of talk and reporting about dementia in recent months because of the presidential election. Some voters have asked whether one or both candidates might have dementia. But is this even a fair question to ask? When these types of questions are posed — adding further stigma to people with dementia — it can unfairly further isolate them and those caring for them. We need to understand dementia and the impact it has on the more than 5 million people in the United States who now live with dementia, and on their caregivers. That number is expected to triple by 2060. First, it is important to know that dementia cannot be diagnosed from afar or by someone who is not a doctor. A person needs a detailed doctor’s exam for a diagnosis. Sometimes, brain imaging is required. And, forgetting an occasional word — or even where you put your keys — does not mean a person has dementia. There are different types of memory loss and they can have different causes, such as other medical conditions, falls or even medication, including herbals, supplements and anything over-the-counter. 
© 1996-2020 The Washington Post
Keyword: Alzheimers
Link ID: 27532 - Posted: 10.19.2020
By John Horgan One of the most impressive, disturbing works of science journalism I’ve encountered is Anatomy of an Epidemic: Magic Bullets, Psychiatric Drugs, and the Astonishing Rise of Mental Illness in America, published in 2010. In the book, which I review here, award-winning journalist Robert Whitaker presents evidence that medications for mental illness, over time and in the aggregate, cause net harm. In 2012, I brought Whitaker to my school to give a talk, in part to check him out. He struck me as a smart, sensible, meticulous reporter whose in-depth research had led him to startling conclusions. Since then, far from encountering persuasive rebuttals of Whitaker’s thesis, I keep finding corroborations of it. If Whitaker is right, modern psychiatry, together with the pharmaceutical industry, has inflicted iatrogenic harm on millions of people. Reports of surging mental distress during the pandemic have me thinking once again about Whitaker’s views and wondering how they have evolved. Below he answers some questions. —John Horgan Horgan: When and why did you start reporting on mental health? Whitaker: It came about in a very roundabout way. In 1994, I had co-founded a publishing company called CenterWatch that covered the business aspects of the “clinical trials industry,” and I soon became interested in writing about how financial interests were corrupting drug trials. Risperdal and Zyprexa had just come to market, and after I used a Freedom of Information request to obtain the FDA’s review of those two drugs, I could see that psychiatric drug trials were a prime example of that corruption. In addition, I had learned of NIMH-funded research that seemed abusive of schizophrenia patients, and in 1998, I co-wrote a series for the Boston Globe on abuses of patients in psychiatric research. My interest was in that broader question of corruption and abuse in research settings, and not specific to psychiatry. © 2020 Scientific American
Keyword: Depression; Schizophrenia
Link ID: 27531 - Posted: 10.19.2020
Moles have a pretty tough life. They live underground, in the dark, burrowing through heavy dirt. And when faced with an enemy, there's nowhere to turn — they have to fight. In most mammals, females tend to be at a disadvantage when it comes to face-to-face combat, because they tend to be smaller and less aggressive than males. But female moles have evolved a secret weapon: a hybrid organ made up of both ovarian and testicular tissue. This effectively makes them intersex, giving them an extra dose of testosterone to make them just as muscular and aggressive as male moles. "As a consequence, basically the whole body of the female, they get masculinized," geneticist Darío Lupiáñez told Quirks & Quarks host Bob McDonald. "They become the body builders of nature." Lupiáñez co-led a study to understand how the moles' genes facilitated this advantage, which was recently published in the journal Science. The research was part of a collaboration between the Max Planck Institute for Molecular Genetics and the Max Delbrück Center for Molecular Medicine in the Helmholtz Association in Germany. Same genes, different instructions The team worked with Iberian moles, commonly found in Spain and Portugal; however, this intersex adaptation has been documented in at least six mole species. "We know that intersexuality happens in species like humans, dogs or cats. But the difference actually in moles, it happens all the time, so all the females are intersexual. And this is really something unique among mammals," said Lupiáñez. To understand how moles evolved these intersexual traits, the researchers fully mapped the genome of the Iberian mole. ©2020 CBC/Radio-Canada.
Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 27530 - Posted: 10.19.2020
By Benedict Carey Scott Lilienfeld, an expert in personality disorders who repeatedly disturbed the order in his own field, questioning the science behind many of psychology’s conceits, popular therapies and prized tools, died on Sept. 30 at his home in Atlanta. He was 59. The cause was pancreatic cancer, his wife, Candice Basterfield, said. Dr. Lilienfeld’s career, most of it spent at Emory University in Atlanta, proceeded on two tracks: one that sought to deepen the understanding of so-called psychopathic behavior, the other to expose the many faces of pseudoscience in psychology. Psychopathy is characterized by superficial charm, grandiosity, pathological lying and a lack of empathy. Descriptions of the syndrome were rooted in research in the criminal justice system, where psychopaths often end up. In the early 1990s, Dr. Lilienfeld worked to deepen and clarify the definition. In a series of papers, he anchored a team of psychologists who identified three underlying personality features that psychopaths share, whether they commit illegal acts or not: fearless dominance, meanness and impulsivity. The psychopath does what he or she wants, without anxiety, regret or regard for the suffering of others. “When you have these three systems interacting, it’s a bad brew, and it creates the substrate for what can become psychopathy,” said Mark F. Lenzenweger, a professor of psychology at the State University of New York at Binghamton. “This was Scott’s great contribution: He helped change the thinking about psychopathy, in a profound way, by focusing on aspects of personality, rather than on a list of bad behaviors.” Dr. Lilienfeld’s parallel career encompassed clinical psychology and aimed to shake it free of empty theorizing, softheadedness and bad practice. 
In the late 1990s and early 2000s, he led a loose group of researchers who began to question the validity of some of the field’s favored constructs, like repressed memories of abuse and multiple personality disorder. The Rorschach inkblot test took a direct hit as largely unreliable. The group also attacked treatments including psychological debriefing and eye movement desensitization and reprocessing, or E.M.D.R., both of which are used for trauma victims. © 2020 The New York Times Company
Keyword: Aggression; Learning & Memory
Link ID: 27529 - Posted: 10.19.2020
By Lydia Denworth and Brendan Borrell, Spectrum. Allyson Berent is a specialty veterinarian in New York City. She treats animals that other doctors cannot help. When no good therapies are available, she invents one. Cats and dogs consumed almost all of her time—until 6 years ago, when her second daughter was born. As a baby, Quincy appeared healthy and happy, smiling at an early age and giggling frequently. But during her first few months of life, she missed many developmental milestones: At 10 weeks, she was not making eye contact. When her parents waved toys in front of her, she stared blankly. She had trouble feeding. And when she was lying on her stomach, she could not lift her head. Doctors kept telling Berent and her husband to give it time, but the couple insisted on genetic testing: At 7 months old, their daughter was diagnosed with Angelman syndrome, a neurodevelopmental condition that affects as many as one in 12,000 people. Most people with Angelman syndrome have severe intellectual disability. They never talk or live an independent life. They experience seizures, gut issues, and sleeping and feeding difficulties. Due to balance and motor problems, they are usually unable or barely able to walk. Many also meet the diagnostic criteria for autism. Within days of learning her daughter’s diagnosis, Berent set herself a new goal: curing Quincy. With her medical background, she had no trouble parsing the scientific research on Angelman syndrome. She learned that it stems from a missing or mutated copy of a gene called UBE3A, which generates a protein essential for healthy brain activity. People inherit two copies of UBE3A, one from each parent, but the paternal copy is typically silent. In about 70% of people with Angelman, the maternal copy is absent, and they produce none of the protein. Many others with the syndrome have a small mutation in the mother’s copy, rendering it ineffective. © 2020 American Association for the Advancement of Science.
Keyword: Autism; Development of the Brain
Link ID: 27528 - Posted: 10.16.2020
Jon Hamilton Researchers appear to have shown how the brain creates two different kinds of thirst. The process involves two types of brain cells: one responds to a decline in fluid in our bodies, while the other monitors levels of salt and other minerals, a team reports in the journal Nature. Together, these specialized thirst cells seem to determine whether animals and people crave pure water or something like a sports drink, which contains salt and other minerals. "Our brain can detect these two distinct stimuli with different cell types," says Yuki Oka, a professor of biology at Caltech and the study's lead author. The finding appears to help answer "this question that we've been trying to ask for decades and decades and decades," says Sean Stocker, a professor at the University of Pittsburgh who studies water and salt balance in the body. Stocker was not involved in the study. Oka's research is part of an effort to understand the brain biology underlying behavior that's seen in people and many animals. For example, people who've just finished a long, sweaty workout often experience a special kind of thirst. "Pure water doesn't do it, right? It's not enough," Oka says. "You need water and salt to recover. And we can easily imagine that under such condition, we crave [a] sport drink." Sports drinks like Gatorade generally include a mix of salt and sugar, as well as water. To understand what triggers this type of thirst, Oka's team studied cells in two regions of mouse brains. Both regions are known to contain neurons involved in the sensation of thirst. © 2020 npr
Keyword: Miscellaneous
Link ID: 27527 - Posted: 10.16.2020