Most Recent Links




By PAM BELLUCK “Has the person become agitated, aggressive, irritable, or temperamental?” the questionnaire asks. “Does she/he have unrealistic beliefs about her/his power, wealth or skills?” Or maybe another kind of personality change has happened: “Does she/he no longer care about anything?” If the answer is yes to one of these questions — or others on a new checklist — and the personality or behavior change has lasted for months, it could indicate a very early stage of dementia, according to a group of neuropsychiatrists and Alzheimer’s experts. They are proposing the creation of a new diagnosis: mild behavioral impairment. The idea is to recognize and measure something that some experts say is often overlooked: Sharp changes in mood and behavior may precede the memory and thinking problems of dementia. The group made the proposal on Sunday at the Alzheimer’s Association International Conference in Toronto, and presented a 38-question checklist that may one day be used to identify people at greater risk for Alzheimer’s. “I think we do need something like this,” said Nina Silverberg, the director of the Alzheimer’s Disease Centers program at the National Institute on Aging, who was not involved in creating the checklist or the proposed new diagnosis. “Most people think of Alzheimer’s as primarily a memory disorder, but we do know from years of research that it also can start as a behavioral issue.” Under the proposal, mild behavioral impairment (M.B.I.) would be a clinical designation preceding mild cognitive impairment (M.C.I.), a diagnosis created more than a decade ago to describe people experiencing some cognitive problems but who can still perform most daily functions. © 2016 The New York Times Company

Keyword: Alzheimers
Link ID: 22480 - Posted: 07.26.2016

By Sharon Begley, STAT For the first time ever, researchers have managed to reduce people’s risk for dementia — not through a medicine, special diet, or exercise, but by having healthy older adults play a computer-based brain-training game. The training nearly halved the incidence of Alzheimer’s disease and other devastating forms of cognitive and memory loss in older adults a decade after they completed it, scientists reported on Sunday. If the surprising finding holds up, the intervention would be the first of any kind — including drugs, diet, and exercise — to do that. “I think these results are highly, highly promising,” said George Rebok of the Johns Hopkins Bloomberg School of Public Health, an expert on cognitive aging who was not involved in the study. “It’s exciting that this intervention pays dividends so far down the line.” The results, presented at the Alzheimer’s Association International Conference in Toronto, come from the government-funded ACTIVE (Advanced Cognitive Training for Independent and Vital Elderly) study. Starting in 1998, ACTIVE’s 2,832 healthy older adults (average age at the start: 74) received one of three forms of cognitive training, or none, and were evaluated periodically in the years after. In actual numbers, 14 percent of ACTIVE participants who received no training had dementia 10 years later, said psychologist Jerri Edwards of the University of South Florida, who led the study. Among those who completed up to ten 60-to-75-minute sessions of computer-based training in speed-of-processing — basically, how quickly and accurately they can pay attention to, process, and remember brief images on a computer screen — 12.1 percent developed dementia. Of those who completed all ten initial training sessions plus four booster sessions a few years later, 8.2 percent developed dementia. © 2016 Scientific American
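A quick arithmetic check of the “nearly halved” claim, sketched in Python. The incidence figures are the ones quoted in the excerpt; the relative-risk calculation itself is our illustration, not part of the study report.

```python
# Sanity check on "nearly halved", using only the incidence figures
# reported above (14%, 12.1%, 8.2%).
no_training = 0.140       # dementia incidence after 10 years, no training
partial_training = 0.121  # up to ten speed-of-processing sessions
full_training = 0.082     # all ten sessions plus four boosters

def relative_risk_reduction(control: float, treated: float) -> float:
    """Fractional drop in incidence relative to the untrained group."""
    return (control - treated) / control

print(f"partial training: {relative_risk_reduction(no_training, partial_training):.0%} lower risk")
print(f"full training:    {relative_risk_reduction(no_training, full_training):.0%} lower risk")
# full training comes out at ~41% lower risk, i.e. "nearly halved"
```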

Keyword: Alzheimers; Learning & Memory
Link ID: 22479 - Posted: 07.26.2016

By Tim Page When I returned to California, I brought my diaries into the back yard every afternoon and read them through sequentially, with the hope of learning more about the years before my brain injury. I remembered much of what I’d done professionally, and whatever additional information I needed could usually be found on my constantly vandalized Wikipedia page. Here was the story of an awkward, imperious child prodigy who made his own films and became famous much too early; a music explainer who won a Pulitzer Prize; a driven and obsessive loner whose fascinations led to collaborations with Glenn Gould, Philip Glass and Thomas Pynchon. In 2000, at age 45, I was diagnosed with Asperger’s syndrome. In retrospect, the only surprise is that it took so long. But the diaries offered a more intimate view. Reading them was slow going, and I felt as though my nose was pressed up against the windowpane of my own life. The shaggy-dog accretion of material — phone numbers, long-ago concert dates, coded references to secret loves — all seemed to belong to somebody else. My last clear memory was of a muggy, quiet Sunday morning in July, three months earlier, as I waited for a train in New London, Conn. It was 11:13 a.m., and the train was due to arrive two minutes later. I was contented, proud of my punctuality and expecting an easy ride to New York in the designated “quiet car,” with just enough time to finish whatever book I was carrying. There would be dinner in Midtown with a magical friend, followed by overnight family visits in Baltimore and Washington, and then a flight back to Los Angeles and the University of Southern California, at which point a sabbatical semester would be at an end.

Keyword: Stroke; Learning & Memory
Link ID: 22478 - Posted: 07.26.2016

Dean Burnett On July 31st 2016, this blog will have been in existence for four years exactly. A huge thanks to everyone who’s made the effort to read it in that time (an alarming number of you). Normally there’d be a post on the day to mark the occasion, but this year the 31st is a) a Sunday, and b) my birthday, so even if I could be bothered to work that day, it’s unlikely anyone would want to read it. However, today also marks the ridiculously-unlikely-but-here-we-are American release of my book. How did it get to this point? I’ve been a “professional” science writer now for four years, and I’ve been involved in neuroscience, in one guise or another, since 2000, the year I started my undergraduate degree. In that time, I’ve heard/encountered some seriously bizarre claims about how the brain works. Oftentimes it was me not understanding what was being said, or misinterpreting a paper, or just my own lack of competence. Sometimes, it was just a media exaggeration. However, there have been occasions when a claim made about the brain thwarts all my efforts to find published evidence or even a rational basis for it, leaving me scratching my head and wondering “where the hell did THAT come from?” Here are some of my favourites. In the past, one terabyte of storage capacity would have seemed incredibly impressive. But Moore’s law has put paid to that. My home desktop PC presently has 1.5 TB of storage space, and that’s over seven years old. Could my own clunky desktop be, in terms of information capacity, smarter than me? Apparently. Some estimates put the capacity of the human brain as low as 1TB. A lifetime’s worth of memories wouldn’t fill a modern-day hard drive? That seems far-fetched, at least at an intuitive level.
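To see why such capacity estimates vary so wildly, here is a rough back-of-envelope sketch in Python. Every constant is an order-of-magnitude assumption (86 billion neurons is a common textbook figure; the synapse counts and bits-per-synapse are illustrative guesses, with 4.7 bits taken from upper-end published estimates), so treat the output as a range, not a fact:

```python
# Back-of-envelope brain "storage" estimate: capacity ~ synapses x bits/synapse.
# All constants are order-of-magnitude assumptions, not measurements.
NEURONS = 8.6e10                  # ~86 billion neurons
SYNAPSES_PER_NEURON = [1e3, 1e4]  # plausible range of estimates
BITS_PER_SYNAPSE = [0.1, 4.7]     # noisy binary switch vs. upper-end estimates

for syn in SYNAPSES_PER_NEURON:
    for bits in BITS_PER_SYNAPSE:
        tb = NEURONS * syn * bits / 8 / 1e12  # bits -> bytes -> terabytes
        print(f"{syn:.0e} synapses/neuron at {bits} bits each: ~{tb:,.0f} TB")
# The answer swings from ~1 TB to ~500 TB depending on the assumptions,
# which is how both "1 TB" and "bigger than any hard drive" claims arise.
```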

Keyword: Development of the Brain
Link ID: 22477 - Posted: 07.26.2016

By Dave Dormer, Transporting babies deprived of oxygen at birth to a neonatal intensive care unit in Calgary will soon be safer thanks to a new portable cooling device. The Foothills hospital is one of the first facilities in Canada to acquire one, and doctors hope it will help prevent brain injuries: reducing a baby's temperature, a procedure called therapeutic hypothermia, can prevent damage to brain tissue and promote healing. "The period immediately following birth is critical. We have about a six-hour window to lower these babies' temperatures to prevent neurological damage," said Dr. Khorshid Mohammad, the neonatal neurocritical care project lead who spearheaded the initiative. "The sooner we can do so, and the more consistent we can make the temperature, the more protective it is and the better their chances of surviving without injury." Since about 2008, doctors have used cooling blankets and gel packs to lower a baby's temperature to 33.5 C from the normal 37 C for 72 hours in order to prevent brain damage. "With those methods, it can be difficult to maintain a stable temperature," said Mohammad. ©2016 CBC/Radio-Canada.

Keyword: Development of the Brain
Link ID: 22476 - Posted: 07.26.2016

By Ann Grisold, Oscar, 6, sits at the family dinner table and endures the loneliest hour of his day. The room bustles with activity: Oscar’s sister passes plates and doles out broccoli florets. His father and uncle exchange playful banter. Oscar’s mother emerges from the kitchen carrying a platter of carved meat; a cousin pulls up an empty chair. “Chi fan le!” shouts Oscar’s older sister, in Mandarin Chinese. Time for dinner! “Hao,” her grandfather responds from the other room. Okay. Family members tell stories and rehash the day, all in animated Chinese. But when they turn to Oscar, who has autism, they speak in English. “Eat rice,” Oscar’s father says. “Sit nice.” Except there is no rice on the table. In Chinese, ‘eat rice’ can refer to any meal, but its meaning is lost in translation. Pediatricians, educators and speech therapists have long advised multilingual families to speak one language — the predominant one where they live — to children with autism or other developmental delays. The reasoning is simple: These children often struggle to learn language, so they’re better off focusing on a single one. However, there are no data to support this notion. In fact, a handful of studies show that children with autism can learn two languages as well as they learn one, and might even thrive in multilingual environments. Lost in translation: It’s not just children with autism who miss out when parents speak only English at home — their families, too, may experience frustrating miscommunications. Important instructions, offhand remarks and words of affection are often lost in translation when families swap their heritage language for English, says Betty Yu, associate professor of special education and communicative disorders at San Francisco State University. © 2016 Scientific American

Keyword: Autism; Language
Link ID: 22475 - Posted: 07.26.2016

By Andy Coghlan The final brain edit before adulthood has been observed for the first time. MRI scans of 300 adolescents and young adults have shown how the teenage brain upgrades itself to become quicker – but that errors in this process may lead to schizophrenia in later life. The editing process that takes place in teen years seems to select the brain’s best connections and networks, says Kirstie Whitaker at the University of Cambridge. “The result is a brain that’s sleeker and more efficient.” When Whitaker and her team scanned brains from people between the ages of 14 and 24, they found that two major changes take place in the outer layer of the brain – the cortex – at this time. As adolescence progresses, this layer of grey matter gets thinner – probably because unwanted or unused connections between neurons – called synapses – are pruned back. At the same time, important neurons are upgraded. The parts of these cells that carry signals down towards synapses are given a sheath that helps them transmit signals more quickly – a process called myelination. “It may be that pruning and myelination are part of the maturation of the brain,” says Steven McCarroll at Harvard Medical School. “Pruning involves removing the connections that are not used, and myelination takes the ones that are left and makes them faster,” he says. McCarroll describes this as a trade-off – by pruning connections, we lose some flexibility in the brain, but the proficiency of signal transmission improves. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 22474 - Posted: 07.26.2016

By Lizzie Wade Neandertals and modern humans had a lot in common—at least enough to have babies together fairly often. But what about their brains? To answer that question, scientists have looked at how Neandertal and modern human brains developed during the crucial time of early childhood. In the first year of life, modern human infants go through a growth spurt in several parts of the brain: the cerebellum, the parietal lobes, and the temporal lobes—key regions for language and social interaction. Past studies suggested baby Neandertal brains developed more like the brains of chimpanzees, without concentrated growth in any particular area. But a new study casts doubt on that idea. Scientists examined 15 Neandertal skulls, including one newborn and a pair of children under the age of 2. By carefully imaging the skulls, the team determined that Neandertal temporal lobes, frontal lobes, and cerebellums did, in fact, grow faster than the rest of the brain in early life, a pattern very similar to modern humans, they report today in Current Biology. Scientists had overlooked that possibility, the researchers say, because Neandertals and Homo sapiens have such differently shaped skulls. Modern humans’ rounded skull is a telltale marker of the growth spurt, for example, whereas Neandertals’ skulls were relatively flat on the top. If Neandertals did, in fact, have fast developing cerebellums and temporal and frontal lobes, they might have been more skilled at language and socializing than assumed, scientists say. This could in turn explain how the children of Neandertal–modern human pairings fared well enough to pass down their genes to so many of us living today. © 2016 American Association for the Advancement of Science

Keyword: Evolution; Development of the Brain
Link ID: 22473 - Posted: 07.26.2016

By Jessica Boddy Ever wonder what it looks like when brain cells chat up a storm? Researchers have found a way to watch the conversation in action without ever cracking open a skull. This glimpse into the brain’s communication system could open new doors to diagnosing and treating disorders from epilepsy to Alzheimer’s disease. Being able to see where—and how—living brain cells are working is “the holy grail in neuroscience,” says Howard Federoff, a neurologist at Georgetown University in Washington, D.C., who was not involved with the work. “This is a possible new tool that could bring us closer to that.” Neurons, which are only slightly longer than the width of a human hair, are laid out in the brain like a series of tangled highways. Signals must travel down these highways, but there’s a catch: The cells don’t actually touch. They’re separated by tiny gaps called synapses, where messages, with the assistance of electricity, jump from neuron to neuron to reach their destinations. The number of functional synapses that fire in one area—a measure known as synaptic density—tends to be a good way to figure out how healthy the brain is. Higher synaptic density means more signals are being sent successfully. If there are significant interruptions in large sections of the neuron highway, many signals may never reach their destinations, leading to disorders like Huntington disease. The only way to look at synaptic density in the brain, however, is to biopsy nonliving brain tissue. That means there’s no way for researchers to investigate how diseases like Alzheimer’s progress—something that could hold secrets to diagnosis and treatment. © 2016 American Association for the Advancement of Science

Keyword: Brain imaging
Link ID: 22472 - Posted: 07.23.2016

By ANNA WEXLER EARLIER this month, in the journal Annals of Neurology, four neuroscientists published an open letter to practitioners of do-it-yourself brain stimulation. These are people who stimulate their own brains with low levels of electricity, largely for purposes like improved memory or learning ability. The letter, which was signed by 39 other researchers, outlined what is known and unknown about the safety of such noninvasive brain stimulation, and asked users to give careful consideration to the risks. For the last three years, I have been studying D.I.Y. brain stimulators. Their conflict with neuroscientists offers a fascinating case study of what happens when experimental tools normally kept behind the closed doors of academia — in this case, transcranial direct current stimulation — are appropriated for use outside them. Neuroscientists began experimenting in earnest with transcranial direct current stimulation about 15 years ago. In such stimulation, electric current is administered at levels that are hundreds of times less than those used in electroconvulsive therapy. To date, more than 1,000 peer-reviewed studies of the technique have been published. Studies have suggested, among other things, that the stimulation may be beneficial for treating problems like depression and chronic pain as well as enhancing cognition and learning in healthy individuals. The device scientists use for stimulation is essentially a nine-volt battery attached to two wires that are connected to electrodes placed at various spots on the head. A crude version can be constructed with just a bit of electrical know-how. Consequently, as reports of the effects of the technique began to appear in scientific journals and in newspapers, people began to build their own devices at home. By late 2011 and early 2012, diagrams, schematics and videos began to appear online. © 2016 The New York Times Company
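For rough scale, published tDCS protocols typically deliver about 1 to 2 milliamps, while electroconvulsive therapy uses currents on the order of 800 milliamps. A minimal sketch of that comparison (the figures are typical literature values, not taken from this article, and this is not a guide to building or operating any device):

```python
# Scale comparison between tDCS and electroconvulsive therapy (ECT).
# Typical literature values, used only to illustrate "hundreds of times less".
TDCS_CURRENT_MA = 2.0   # common tDCS protocols run at 1-2 mA
ECT_CURRENT_MA = 800.0  # ECT currents are on the order of 800 mA

ratio = ECT_CURRENT_MA / TDCS_CURRENT_MA
print(f"ECT delivers roughly {ratio:.0f}x the current of a 2 mA tDCS session")
# -> roughly 400x, consistent with the "hundreds of times less" comparison
```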

Keyword: ADHD
Link ID: 22471 - Posted: 07.23.2016

By Knvul Sheikh Although millions of women use hormone therapy, those who try it in hopes of maintaining sharp memory and preventing the fuzzy thinking sometimes associated with menopause may be disappointed. A new study indicates that taking estrogen does not significantly affect verbal memory and other mental skills. “There is no change in cognitive abilities associated with estrogen therapy for postmenopausal women, regardless of their age,” says Victor Henderson, a neurologist at Stanford University and the study’s lead author. Evidence of positive and negative effects of such hormone therapy has ping-ponged over the years, with some observational studies in postmenopausal women and research in animal models suggesting it improves cognitive function and memory. But other previous research, including a long-term National Institutes of Health Women’s Health Initiative memory study published in 2004, has suggested that taking estrogen increases the risk of cognitive impairment and dementia in women over 65 years old. Henderson says one explanation for these contradictory findings may be that after menopause begins there is a “critical period” in which hormone therapy could still benefit relatively young women—if they start early enough. So in their study, which appears in the July 20 online issue of Neurology, Henderson and his team recruited 567 healthy women, between ages 41 and 84, to examine how estrogen affected one group whose members were within six years of their last menstrual period and another whose members had started menopause at least 10 years earlier. © 2016 Scientific American

Keyword: Hormones & Behavior; Attention
Link ID: 22470 - Posted: 07.23.2016

By Emma Bryce In 1999, neuroscientist Gero Miesenböck dreamed of using light to expose the brain's inner workings. Two years later, he invented optogenetics, a technique that fulfils this goal: by genetically engineering cells to contain proteins that make them light-responsive, Miesenböck found he could shine light at the brain and trigger electrical activity in those cells. This technique gave scientists the tools to activate and control specific cell populations in the brain, for the first time. For example, Miesenböck, who directs the Centre for Neural Circuits and Behaviour at the University of Oxford, first used optogenetics to activate courtship responses in fruit flies, and even make headless flies take flight - groundbreaking experiments that allowed him to examine, in unprecedented detail, how neurons drive behaviour. Gero Miesenböck: There was almost a "eureka" moment. As is often the case, you tend to have your best ideas when you're not trying to have them: suddenly I had this idea - which I must have been incubating for a long time, because I was thinking about manipulating neurons in the brain genetically to emit light so I could visualise their activity. Suddenly I thought, "What if we just turn the thing upside down, and instead of reading activity, write activity using light and genetics?" That was the real breakthrough idea, and then of course came the big challenge of having to make it work. Brains are composed of many different kinds of nerve cells, and they are genetically distinct from one another. To deconstruct how the brain works we need to pinpoint the roles these individual classes of cells play in processing information. Optogenetics uses the genetic signatures that define individual cell types to address them selectively in the intact brain - that's the "genetics" component. The "opto" component is to use these genetic signatures to place light-sensitive molecules that are encoded in DNA within these cells.

Keyword: Sleep
Link ID: 22469 - Posted: 07.23.2016

By NATALIE ANGIER Their word is their bond, and they do what they say — even if the “word” on one side is a loud trill and grunt, and, on the other, the excited twitterings of a bird. Researchers have long known that among certain traditional cultures of Africa, people forage for wild honey with the help of honeyguides — woodpecker-like birds that show tribesmen where the best beehives are hidden, high up in trees. In return for revealing the location of natural honey pots, the birds are rewarded with the leftover beeswax, which they eagerly devour. Now scientists have determined that humans and their honeyguides communicate with each other through an extraordinary exchange of sounds and gestures, which are used only for honey hunting and serve to convey enthusiasm, trustworthiness and a commitment to the dangerous business of separating bees from their hives. The findings cast fresh light on one of only a few known examples of cooperation between humans and free-living wild animals, a partnership that may well predate the love affair between people and their domesticated dogs by hundreds of thousands of years. Claire N. Spottiswoode, a behavioral ecologist at Cambridge University, and her colleagues reported in the journal Science that honeyguides advertise their scout readiness to the Yao people of northern Mozambique by flying up close while emitting a loud chattering cry. For their part, the Yao seek to recruit and retain honeyguides with a distinctive vocalization, a firmly trilled “brrr” followed by a grunted “hmm.” In a series of careful experiments, the researchers then showed that honeyguides take the meaning of the familiar ahoy seriously. The birds were twice as likely to offer sustained help to Yao foragers who walked along while playing recordings of the proper brrr-hmm signal as they were to participants with recordings of normal Yao words or the sounds of other animals. © 2016 The New York Times Company

Keyword: Animal Communication; Evolution
Link ID: 22468 - Posted: 07.23.2016

By Tanya Lewis Scientists have made significant progress toward understanding how individual memories are formed, but less is known about how multiple memories interact. Researchers from the Hospital for Sick Children in Toronto and colleagues studied how memories are encoded in the amygdalas of mice. Memories formed within six hours of each other activate the same population of neurons, whereas distinct sets of brain cells encode memories formed farther apart, in a process whereby neurons compete with their neighbors, according to the team’s study, published today (July 21) in Science. “Some memories naturally go together,” study coauthor Sheena Josselyn of the Hospital for Sick Children told The Scientist. For example, you may remember walking down the aisle at your wedding ceremony and, later, your friend having a bit too much to drink at the reception. “We’re wondering about how these memories become linked in your mind,” Josselyn said. When the brain forms a memory, a group of neurons called an “engram” stores that information. Neurons in the lateral amygdala—a brain region involved in memory of fearful events—are thought to compete with one another to form an engram. Cells that are more excitable or have higher expression of the transcription factor CREB—which is critical for the formation of long-term memories—at the time the memory is being formed will “win” this competition and become part of a memory. © 1986-2016 The Scientist

Keyword: Learning & Memory
Link ID: 22467 - Posted: 07.23.2016

Carl Zimmer The brain looks like a featureless expanse of folds and bulges, but it’s actually carved up into invisible territories. Each is specialized: Some groups of neurons become active when we recognize faces, others when we read, others when we raise our hands. On Wednesday, in what many experts are calling a milestone in neuroscience, researchers published a spectacular new map of the brain, detailing nearly 100 previously unknown regions — an unprecedented glimpse into the machinery of the human mind. Scientists will rely on this guide as they attempt to understand virtually every aspect of the brain, from how it develops in children and ages over decades, to how it can be corrupted by diseases like Alzheimer’s and schizophrenia. “It’s a step towards understanding why we’re we,” said David Kleinfeld, a neuroscientist at the University of California, San Diego, who was not involved in the research. Scientists created the map with advanced scanners and computers running artificial intelligence programs that “learned” to identify the brain’s hidden regions from vast amounts of data collected from hundreds of test subjects, a far more sophisticated and broader effort than had been previously attempted. While an important advance, the new atlas is hardly the final word on the brain’s workings. It may take decades for scientists to figure out what each region is doing, and more will be discovered in coming decades. “This map you should think of as version 1.0,” said Matthew F. Glasser, a neuroscientist at Washington University School of Medicine and lead author of the new research. “There may be a version 2.0 as the data get better and more eyes look at the data. We hope the map can evolve as the science progresses.” © 2016 The New York Times Company

Keyword: Brain imaging
Link ID: 22466 - Posted: 07.21.2016

Ian Sample Science editor When the German neurologist Korbinian Brodmann first sliced and mapped the human brain more than a century ago he identified 50 distinct regions in the crinkly surface called the cerebral cortex that governs much of what makes us human. Now researchers have updated the 100-year-old map in a scientific tour de force which reveals that the human brain has at least 180 different regions that are important for language, perception, consciousness, thought, attention and sensation. The landmark achievement hands neuroscientists their most comprehensive map of the cortex so far, one that is expected to supersede Brodmann’s as the standard researchers use to talk about the various areas of the brain. Scientists at Washington University in St Louis created the map by combining highly-detailed MRI scans from 210 healthy young adults who had agreed to take part in the Human Connectome Project, a massive effort that aims to understand how neurons in the brain are connected. Most previous maps of the human brain have been created by looking at only one aspect of the tissues, such as how the cells look under a microscope, or how active areas become when a person performs a certain task. But maps made in different ways do not always look the same, which casts doubt on where one part of the brain stops and another starts. Writing in the journal Nature, Matthew Glasser and others describe how they combined scans of brain structure, function and connectivity to produce the new map, which confirmed the existence of 83 known brain regions and added 97 new ones. Some scans were taken while patients simply rested in the machine, while others were recorded as they performed maths tasks, listened to stories, or categorised objects, for example by stating whether an image was of a tool or an animal. © 2016 Guardian News and Media Limited

Keyword: Brain imaging
Link ID: 22465 - Posted: 07.21.2016

By Minaz Kerawala, For years, gamers, athletes and even regular people trying to improve their memory have resorted, with electrified enthusiasm, to "brain zapping" to gain an edge. The procedure, called transcranial direct current stimulation (tDCS), uses a battery and electrodes to deliver electrical pulses to the brain, usually through a cap or headset fitted close to the scalp. Proponents say these currents are beneficial for a range of neurological conditions like Alzheimer's and Parkinson's diseases, stroke and schizophrenia, but experts are warning that too little is known about the safety of tDCS. "You might end up with a placement of electrodes that doesn't do what you think it does and could potentially have long-lasting effects," said Matthew Krause, a neuroscientist at the Montreal Neurological Institute. All functions of the brain—thought, emotion and coordination—are carried out by neurons using pulses of electricity. "The objective of all neuroscience is to influence these electrical processes," Krause said. The brain's activity can be influenced by drugs that alter its electrochemistry or by external electric fields. While mind-altering headsets may seem futuristic, tDCS is not a new procedure. Much of the pioneering work in the field was done in Montreal by Dr. Wilder Penfield in the 1920s and 30s. ©2016 CBC/Radio-Canada.

Keyword: Alzheimers
Link ID: 22464 - Posted: 07.21.2016

You drift off to dreamland just fine but then something, a noise, a partner's tossing and turning, jars you awake. Now your mind races with an ever-expanding to-do list of worries that you can't shut off. When the alarm buzzes, you start the day feeling grouchy and slightly dazed. Nearly six in 10 Canadians say they wake up feeling tired. About 40 per cent of Canadians will exhaust themselves with a sleep disorder at some point in their lifetime, studies suggest. It's common for people to wake up in the middle of the night. What's important is not to let it snowball, sleep specialists say. Our sleep cycles include brief periods of wakefulness but deep sleep makes us forget about these awakenings. "It's normal to have one or two a night," said Dr. Brian Murray, a sleep neurologist at Sunnybrook Health Sciences Centre and a professor at the University of Toronto. "It's when it's multiple that I worry." Sleep experts say if someone wakes up multiple times a night, it's a red flag. Chronic sleep problems are linked to heart disease, high blood pressure and some cancers. They can also affect hormone levels, which increases the risk of obesity and Type 2 diabetes, sleep specialists say. Julie Snyder of Toronto said she has stretches of days or weeks when she'll consistently wake up at 1:15 a.m., and again at 4 a.m. The broken sleep leaves her feeling short on patience. ©2016 CBC/Radio-Canada.

Keyword: Sleep
Link ID: 22463 - Posted: 07.21.2016

By David Levine Almost seven percent of U.S. adults—about 15.7 million people—are diagnosed with major depressive disorder, according to the National Institute of Mental Health (NIMH). The Centers for Disease Control and Prevention report that depression causes 200 million lost workdays each year at a cost to employers of between $17 billion and $44 billion. The statistics for anxiety disorders are not great either. The most common mental illnesses in the U.S., they affect 40 million adults age 18 and older, costing the economy more than $42 billion a year. In my twenties, I developed panic disorder. I failed to get better on most medications and therapy. As I reported in an article earlier this year, it took me years to find a medication that worked. Because it took me so long to be diagnosed and treated properly, I have always been interested in alternative treatments for depression and anxiety. Two years ago I attended two sessions at the World Science Festival on the use of electrical therapy to treat depression and anxiety. The first event was Spark of Genius? Awakening a Better Brain, a panel discussion moderated by ABC News Chief Health & Medical Editor Richard Besser. The panel discussed what is known about treating the brain and the ethical and legal complications of brain enhancement. (You can watch it online at the World Science Festival website.) The second panel, "Electric Medicine and the Brain," was moderated by John Rennie, former editor in chief of Scientific American. His panel focused on the use of "electroceuticals," a term coined by researchers at GlaxoSmithKline to refer to all implantable devices being used to treat mental illnesses and being explored in the treatment of metabolic, cardiovascular and inflammatory disorders. © 2016 Scientific American

Keyword: Depression
Link ID: 22462 - Posted: 07.20.2016

Davide Castelvecchi People can detect flashes of light as feeble as a single photon, an experiment has demonstrated — a finding that seems to conclude a 70-year quest to test the limits of human vision. The study, published in Nature Communications on 19 July [1], “finally answers a long-standing question about whether humans can see single photons — they can!” says Paul Kwiat, a quantum optics researcher at the University of Illinois at Urbana–Champaign. The techniques used in the study also open up ways of testing how quantum properties — such as the ability of photons to be in two places at the same time — affect biology, he adds. “The most amazing thing is that it’s not like seeing light. It’s almost a feeling, at the threshold of imagination,” says Alipasha Vaziri, a physicist at the Rockefeller University in New York City, who led the work and tried out the experience himself. Experiments on cells from frogs have shown that sensitive light-detecting cells in vertebrate eyes, called rod cells, do fire in response to single photons [2]. But, in part because the retina processes its information to reduce ‘noise’ from false alarms, researchers hadn’t been able to confirm whether the firing of one rod cell would trigger a signal that would be transmitted all the way to the brain. Nor was it clear whether people would be able to consciously sense such a signal if it did reach the brain. Experiments to test the limits of human vision have also had to wait for the arrival of quantum-optics technologies that can reliably produce one photon of light at a time. © 2016 Macmillan Publishers Limited
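For a sense of just how feeble that stimulus is, a single photon’s energy is E = hc/λ. A minimal sketch, assuming green light near 500 nm (close to the peak sensitivity of rod cells; the excerpt does not state the wavelength used):

```python
# Energy of a single photon: E = h * c / wavelength.
# 500 nm is an assumption (rod cells are most sensitive near ~500 nm).
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
WAVELENGTH = 500e-9  # metres

energy_joules = H * C / WAVELENGTH
print(f"One 500 nm photon carries ~{energy_joules:.2e} J")
# ~4e-19 J: a 1 mW laser pointer emits on the order of 10^15 such photons
# per second, which is what makes single-photon detection so remarkable.
```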

Keyword: Vision
Link ID: 22461 - Posted: 07.20.2016