Links for Keyword: Intelligence



Links 21 - 40 of 315

By Ruth Williams | The sun’s ultraviolet (UV) radiation is a major cause of skin cancer, but it offers some health benefits too, such as boosting production of essential vitamin D and improving mood. Today (May 17), a report in Cell adds enhanced learning and memory to UV’s unexpected benefits. Researchers have discovered that, in mice, exposure to UV light activates a molecular pathway that increases production of the brain chemical glutamate, heightening the animals’ ability to learn and remember. “The subject is of strong interest, because it provides additional support for the recently proposed theory of ultraviolet light’s regulation of the brain and central neuroendocrine system,” dermatologist Andrzej Slominski of the University of Alabama, who was not involved in the research, writes in an email to The Scientist. “It’s an interesting and timely paper investigating the skin-brain connection,” notes skin scientist Martin Steinhoff of University College Dublin’s Center for Biomedical Engineering, who also did not participate in the research. “The authors make an interesting observation linking moderate UV exposure to . . . [production of] the molecule urocanic acid. They hypothesize that this molecule enters the brain, activates glutaminergic neurons through glutamate release, and that memory and learning are increased.” While the work is “fascinating, very meticulous, and extremely detailed,” says dermatologist David Fisher of Massachusetts General Hospital and Harvard Medical School, “it does not imply that UV is actually good for you. . . . Across the board, for humanity, UV really is dangerous.” © 1986-2018 The Scientist

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 7: Vision: From Eye to Brain
Link ID: 24993 - Posted: 05.18.2018

Mariarosaria Taddeo and Luciano Floridi. Cyberattacks are becoming more frequent, sophisticated and destructive. Each day in 2017, the United States suffered, on average, more than 4,000 ransomware attacks, which encrypt computer files until the owner pays to release them [1]. In 2015, the daily average was just 1,000. In May last year, when the WannaCry virus crippled hundreds of IT systems across the UK National Health Service, more than 19,000 appointments were cancelled. A month later, the NotPetya ransomware cost pharmaceutical giant Merck, shipping firm Maersk and logistics company FedEx around US$300 million each. Global damages from cyberattacks totalled $5 billion in 2017 and may reach $6 trillion a year by 2021 (see go.nature.com/2gncsyg). Countries are partly behind this rise. They use cyberattacks both offensively and defensively. For example, North Korea has been linked to WannaCry, and Russia to NotPetya. As the threats escalate, so do defence tactics. Since 2012, the United States has used ‘active’ cyberdefence strategies, in which computer experts neutralize or distract viruses with decoy targets, or break into a hacker’s computer to delete data or destroy the system. In 2016, the United Kingdom announced a 5-year, £1.9-billion (US$2.7-billion) plan to combat cyber threats. NATO also began drafting principles for active cyberdefence, to be agreed by 2019. The United States and the United Kingdom are leading this initiative. Denmark, Germany, the Netherlands, Norway and Spain are also involved (see go.nature.com/2hebxnt). © 2018 Macmillan Publishers Limited

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 20: ; Chapter 13: Memory and Learning
Link ID: 24873 - Posted: 04.17.2018

By TARA PARKER-POPE Today’s teenagers have been raised on cellphones and social media. Should we worry about them or just get out of their way? A recent wave of student protests around the country has provided a close-up view of Generation Z in action, and many adults have been surprised. While there has been much hand-wringing about this cohort, also called iGen or the Post-Millennials, the stereotype of a disengaged, entitled and social-media-addicted generation doesn’t match the poised, media-savvy and inclusive young people leading the protests and gracing magazine covers. There’s 18-year-old Emma González, whose shaved head, impassioned speeches and torn jeans have made her the iconic face of the #NeverAgain movement, which developed after the 17 shooting deaths in February at Marjory Stoneman Douglas High School in Parkland, Fla. Naomi Wadler, just 11, became an overnight sensation after confidently telling a national television audience she represented “African-American girls whose stories don’t make the front page of every national newspaper.” David Hogg, a high school senior at Stoneman Douglas, has weathered numerous personal attacks with the disciplined calm of a seasoned politician. Sure, these kids could be outliers. But plenty of adolescent researchers believe they are not. “I think we must contemplate that technology is having the exact opposite effect than we perceived,” said Julie Lythcott-Haims, the former dean of freshmen at Stanford University and author of “How to Raise an Adult.” “We see the negatives of not going outside, can’t look people in the eye, don’t have to go through the effort of making a phone call. There are ways we see the deficiencies that social media has offered, but there are obviously tremendous upsides and positives as well.” “I am fascinated by the phenomenon we are seeing in front of us, and I don’t think it’s unique to these six or seven kids who have been the face of the Parkland adolescent cohort,” says Lisa Damour, an adolescent psychologist and author of “Untangled: Guiding Teenage Girls Through the Seven Transitions Into Adulthood.” © 2018 The New York Times Company

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 4: Development of the Brain
Link ID: 24805 - Posted: 03.31.2018

By David Z. Hambrick, Madeline Marquardt There are advantages to being smart. People who do well on standardized tests of intelligence—IQ tests—tend to be more successful in the classroom and the workplace. Although the reasons are not fully understood, they also tend to live longer, healthier lives, and are less likely to experience negative life events such as bankruptcy. Now there’s some bad news for people in the right tail of the IQ bell curve. In a study just published in the journal Intelligence, Pitzer College researcher Ruth Karpinski and her colleagues emailed a survey with questions about psychological and physiological disorders to members of Mensa. A “high IQ society”, Mensa requires that its members have an IQ in the top two percent. For most intelligence tests, this corresponds to an IQ of about 132 or higher. (The average IQ of the general population is 100.) The survey of Mensa’s highly intelligent members found that they were more likely to suffer from a range of serious disorders. The survey covered mood disorders (depression, dysthymia, and bipolar), anxiety disorders (generalized, social, and obsessive-compulsive), attention-deficit hyperactivity disorder, and autism. It also covered environmental allergies, asthma, and autoimmune disorders. Respondents were asked to report whether they had ever been formally diagnosed with each disorder, or suspected they suffered from it. With a return rate of nearly 75%, Karpinski and colleagues compared the percentage of the 3,715 respondents who reported each disorder to the national average. © 2017 Scientific American

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 1: Introduction: Scope and Outlook
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 20:
Link ID: 24397 - Posted: 12.06.2017

By Alexander P. Burgoyne, David Z. Hambrick More than 60 years ago, Francis Crick and James Watson discovered the double-helical structure of deoxyribonucleic acid—better known as DNA. Today, for the cost of a Netflix subscription, you can have your DNA sequenced to learn about your ancestry and proclivities. Yet, while it is an irrefutable fact that the transmission of DNA from parents to offspring is the biological basis for heredity, we still know relatively little about the specific genes that make us who we are. That is changing rapidly through genome-wide association studies—GWAS, for short. These studies search for differences in people’s genetic makeup—their “genotypes”—that correlate with differences in their observable traits—their “phenotypes.” In a GWAS recently published in Nature Genetics, a team of scientists from around the world analyzed the DNA sequences of 78,308 people for correlations with general intelligence, as measured by IQ tests. The major goal of the study was to identify single nucleotide polymorphisms—or SNPs—that correlate significantly with intelligence test scores. Found in most cells throughout the body, DNA is made up of four molecules called nucleotides, referred to by their organic bases: cytosine (C), thymine (T), adenine (A), and guanine (G). Within a cell, DNA is organized into structures called chromosomes­. Humans normally have 23 pairs of chromosomes, with one in each pair inherited from each parent. © 2017 Scientific American

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 4: Development of the Brain
Link ID: 23986 - Posted: 08.23.2017

Susan Milius Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful for working a puzzle box and getting food later. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science. “The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle, of the University of Cambridge, said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science. In the wild, ravens cache some of their food, but that apparent foresight could be more of a specific adaptation that evolved with diet instead of as some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 23835 - Posted: 07.14.2017

People with higher IQs are less likely to die before the age of 79. That’s according to a study of over 65,000 people born in Scotland in 1936. Each of the people in the study took an intelligence test at the age of 11, and their health was then followed for 68 years, until the end of 2015. When Ian Deary, of the University of Edinburgh, UK, and his team analysed data from the study, they found that a higher test score in childhood was linked to a 28 per cent lower risk of death from respiratory disease, a 25 per cent reduced risk of coronary heart disease, and a 24 per cent lower risk of death from stroke. These people were also less likely to die from injuries, digestive diseases, and dementia – even when factors like socio-economic status were taken into account. Deary’s team say there are several theories for why more intelligent people live longer, such as people with higher IQs being more likely to look after their health and less likely to smoke. They also tend to do more exercise and seek medical attention when ill. “I’m hoping it means that if we can find out what smart people do and copy them, then we have a chance of a slightly longer and healthier life,” says Deary. But there’s evidence genetics is involved too. A recent study suggests that very rare genetic variants can play an important role in lowering intelligence, and that these may also be likely to impair a person’s health. Journal reference: British Medical Journal, DOI: 10.1136/bmj.j2708 © Copyright New Scientist Ltd.

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 20: ; Chapter 13: Memory and Learning
Link ID: 23786 - Posted: 06.29.2017

By Katie Langin | No one likes a con artist. People avoid dealing with characters who have swindled them in the past, and—according to new research—birds avoid those people, too. Ravens, known more for their intelligence, but only slightly less for their love of cheese, were trained by researchers to trade a crust of bread for a morsel of cheese with human partners. When the birds then tried to broker a trade with “fair” and “unfair” partners—some completed the trade as expected, but others took the raven’s bread and kept (and ate) the cheese—the ravens avoided the tricksters in separate trials a month later. This suggests that ravens can not only differentiate between “fair” and “unfair” individuals but also retain that ability for at least a month, the researchers write this month in Animal Behaviour. Ravens have a complex social life involving friendships and rivalries. Their ability to recognize and punish dishonest individuals, even after a single encounter, may help explain how cooperation evolved in this group of birds. For people, though, the moral of the story is simple: Be nice to ravens. © 2017 American Association for the Advancement of Science.

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 23709 - Posted: 06.06.2017

By David Z. Hambrick | Physical similarities aside, we have a lot in common with our primate relatives. For example, as Jane Goodall famously documented, chimpanzees form lifelong bonds and show affection in much the same way as humans. Chimps can also solve novel problems, use objects as tools, and may possess “theory of mind”—an understanding that others may have different perspectives than oneself. They can even outperform humans in certain types of cognitive tasks. These commonalities may not seem all that surprising given what we now know from the field of comparative genomics: We share nearly all of our DNA with chimpanzees and other primates. However, social and cognitive complexity is not unique to our closest evolutionary cousins. In fact, it is abundant in species with which we would seem to have very little in common—like the spotted hyena. For more than three decades, the Michigan State University zoologist Kay Holekamp has studied the habits of the spotted hyena in Kenya’s Masai Mara National Reserve, once spending five years straight living in a tent among her oft-maligned subjects. One of the world’s longest-running studies of a wild mammal, this landmark project has revealed that spotted hyenas not only have social groups as complex as those of many primates, but are also capable of some of the same types of problem solving. This research sheds light on one of science’s greatest mysteries—how intelligence has evolved across the animal kingdom. According to the social brain hypothesis, intelligence has evolved to meet the demands of social life. The subject of many popular articles and books, this hypothesis posits that the complex information processing that goes along with coexisting with members of one’s own species—forming coalitions, settling disputes, trying to outwit each other, and so on—selects for larger brains and greater intelligence. By contrast, the cognitive buffer hypothesis holds that intelligence emerges as an adaptation to dealing with novelty in the environment, in whatever form it presents itself. © 2017 Scientific American

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 20: ; Chapter 13: Memory and Learning
Link ID: 23685 - Posted: 05.31.2017

Carl Zimmer In a significant advance in the study of mental ability, a team of European and American scientists announced on Monday that they had identified 52 genes linked to intelligence in nearly 80,000 people. These genes do not determine intelligence, however. Their combined influence is minuscule, the researchers said, suggesting that thousands more are likely to be involved and still await discovery. Just as important, intelligence is profoundly shaped by the environment. Still, the findings could make it possible to begin new experiments into the biological basis of reasoning and problem-solving, experts said. They could even help researchers determine which interventions would be most effective for children struggling to learn. “This represents an enormous success,” said Paige Harden, a psychologist at the University of Texas, who was not involved in the study. For over a century, psychologists have studied intelligence by asking people questions. Their exams have evolved into batteries of tests, each probing a different mental ability, such as verbal reasoning or memorization. In a typical test, the tasks might include imagining an object rotating, picking out a shape to complete a figure, and then pressing a button as fast as possible whenever a particular type of word appears. Each test-taker may get varying scores for different abilities. But over all, these scores tend to hang together — people who score low on one measure tend to score low on the others, and vice versa. Psychologists sometimes refer to this similarity as general intelligence. It’s still not clear what in the brain accounts for intelligence. Neuroscientists have compared the brains of people with high and low test scores for clues, and they’ve found a few. Brain size explains a small part of the variation, for example, although there are plenty of people with small brains who score higher than others with bigger brains. © 2017 The New York Times Company

Related chapters from BN: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 4: Development of the Brain
Link ID: 23650 - Posted: 05.23.2017

By Ian Randall René Descartes began with doubt. “We cannot doubt of our existence while we doubt. … I think, therefore I am,” the 17th century philosopher and scientist famously wrote. Now, modern scientists are trying to figure out what made the genius’s mind tick by reconstructing his brain. Scientists have long wondered whether the brains of geniuses (especially the shapes on their surfaces) could hold clues about their owners’ outsized intelligences. But most brains studied to date—including Albert Einstein’s—were actual brains. Descartes’s had unfortunately decomposed by the time scientists wanted to study it. So with techniques normally used for studying prehistoric humans, researchers created a 3D image of Descartes’s brain (above) by scanning the impression it left on the inside of his skull, which has been kept for almost 200 years now in the National Museum of Natural History in Paris. For the most part, his brain was surprisingly normal—its overall dimensions fell within regular ranges, compared with 102 other modern humans. But one part stood out: an unusual bulge in the frontal cortex, in an area which previous studies have suggested may process the meaning of words. That’s not to say this oddity is necessarily indicative of genius, the scientists report online in the Journal of the Neurological Sciences. Even Descartes might agree: “It is not enough to have a good mind,” he wrote. “The main thing is to use it well.” © 2017 American Association for the Advancement of Science

Related chapters from BN: Chapter 19: Language and Lateralization; Chapter 1: Introduction: Scope and Outlook
Related chapters from MM: Chapter 15: Language and Lateralization; Chapter 20:
Link ID: 23579 - Posted: 05.06.2017

Ian Sample, Science editor | Tempting as it may be, it would be wrong to claim that with each generation humans are becoming more stupid. As scientists are often so keen to point out, it is a bit more complicated than that. A study from Iceland is the latest to raise the prospect of a downwards spiral into imbecility. The research from deCODE, a genetics firm in Reykjavik, finds that groups of genes that predispose people to spend more years in education became a little rarer in the country from 1910 to 1975. The scientists used a database of more than 100,000 Icelanders to see how dozens of gene variants that affect educational attainment appeared in the population over time. They found a shallow decline over the 65-year period, implying a downturn in the natural inclination to rack up qualifications. But the genes involved in education affected fertility too. Those who carried more “education genes” tended to have fewer children than others. This led the scientists to propose that the genes had become rarer in the population because, for all their qualifications, better educated people had contributed less than others to the Icelandic gene pool. Spending longer in education and the career opportunities that provides is not the sole reason that better educated people tend to start families later and have fewer children, the study suggests. Many people who carried lots of genes for prolonged education left the system early and yet still had fewer children than the others. “It isn’t the case that education, or the career opportunities it provides, prevents you from having more children,” said Kari Stefansson, who led the study. “If you are genetically predisposed to have a lot of education, you are also predisposed to have fewer children.” © 2017 Guardian News and Media Limited

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 23113 - Posted: 01.17.2017

By PETER GODFREY-SMITH Around 2008, while snorkeling and scuba diving in my free time, I began watching the unusual group of animals known as cephalopods, the group that includes octopuses, cuttlefish and squid. The first ones I encountered were giant cuttlefish, large animals whose skin changes color so quickly and completely that swimming after them can be like following an aquatic, multi-armed television. Then I began watching octopuses. Despite being mollusks, like clams and oysters, these animals have very large brains and exhibit a curious, enigmatic intelligence. I followed them through the sea, and also began reading about them, and one of the first things I learned came as a shock: They have extremely short lives — just one or two years. I was already puzzled by the evolution of large brains in cephalopods, and this discovery made the questions more acute. What is the point of building a complex brain like that if your life is over in a year or two? Why invest in a process of learning about the world if there is no time to put that information to use? An octopus’s or cuttlefish’s life is rich in experience, but it is incredibly compressed. The particular puzzle of octopus life span opens up a more general one. Why do animals age? And why do they age so differently? A scruffy-looking fish that inhabits the same patch of sea as my cephalopods has relatives who live to 200 years of age. This seems extraordinarily unfair: A dull-looking fish lives for centuries while the cuttlefish, in their chromatic splendor, and the octopuses, in their inquisitive intelligence, are dead before they are 2? There are monkeys the size of a mouse that can live for 15 years, and hummingbirds that can live for over 10. Nautiluses (who are also cephalopods) can live for 20 years. A recent Nature paper reported that despite continuing medical advances, humans appear to have reached a rough plateau at around 115 years, though a few people will edge beyond it. The life spans of animals seem to lack all rhyme or reason. © 2016 The New York Times Company

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 4: Development of the Brain
Link ID: 22951 - Posted: 12.05.2016

James Gorman The Goffin’s cockatoo is a smart bird, so smart it has been compared to a 3-year-old human. But even for this species, a bird named Figaro stands out for his creativity with tools. Hand-raised at the Veterinary University of Vienna, the male bird was trying to play with a pebble that fell outside his aviary onto a wooden beam about four years ago. First he used a piece of bamboo to try to rake the stone back in. Impressed, scientists in the university Goffin’s lab, which specializes in testing the thinking abilities of the birds, put a cashew nut where the pebble had been. Figaro extended his beak through the wire mesh to bite a splinter off the wooden beam. He used the splinter to fish the cashew in, a fairly difficult process because he had to work the splinter through the mesh and position it at the right angle. In later trials, Figaro made his tools much more quickly, and also picked a bamboo twig from the bottom of the aviary and trimmed it to make a similar tool. Cockatoos don’t do anything like this in nature, as far as anyone knows. They don’t use tools. They don’t even build nests, so they are not used to manipulating sticks. And they have curved bills, unlike the straight beaks of crows and jays that make manipulating tools a bit easier. Blue jays have been observed creating tools from newspaper to pull food pellets to them. Alice M.I. Auersperg, a researcher at the Veterinary University of Vienna who studies cognition in animals, and her colleagues reported those first accomplishments by Figaro in 2012. Since then, they have continued to test Figaro and other birds in the lab that were able to learn tool use or tool making, sometimes both, by watching Figaro. © 2016 The New York Times Company

Related chapters from BN: Chapter 1: Introduction: Scope and Outlook; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 20:
Link ID: 22892 - Posted: 11.21.2016

Bruce Bower | Apes understand what others believe to be true. What’s more, they realize that those beliefs can be wrong, researchers say. To make this discovery, researchers devised experiments involving a concealed, gorilla-suited person or a squirreled-away rock that had been moved from their original hiding places — something the apes knew, but a person looking for King Kong or the stone didn’t. “Apes anticipated that an individual would search for an object where he last saw it, even though the apes knew that the object was no longer there,” says evolutionary anthropologist Christopher Krupenye. If this first-of-its-kind finding holds up, it means that chimpanzees, bonobos and orangutans can understand that others’ actions sometimes reflect mistaken assumptions about reality. Apes’ grasp of others’ false beliefs roughly equals that of human 2-year-olds tested in much the same way, say Krupenye of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and his colleagues. Considering their targeted gazes during brief experiments, apes must rapidly assess others’ beliefs about the world in wild and captive communities, the researchers propose in the October 7 Science. Understanding the concept of false beliefs helps wild and captive chimps deceive their comrades, such as hiding food from those who don’t share, Krupenye suggests. © Society for Science & the Public 2000 - 2016.

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 22733 - Posted: 10.08.2016

By David Z. Hambrick, Fredrik Ullén, Miriam Mosing Elite-level performance can leave us awestruck. This summer, in Rio, Simone Biles appeared to defy gravity in her gymnastics routines, and Michelle Carter seemed to harness super-human strength to win gold in the shot put. Michael Phelps, meanwhile, collected 5 gold medals, bringing his career total to 23. In everyday conversation, we say that elite performers like Biles, Carter, and Phelps must be “naturals” who possess a “gift” that “can’t be taught.” What does science say? Is innate talent a myth? This question is the focus of the new book Peak: Secrets from the New Science of Expertise by Florida State University psychologist Anders Ericsson and science writer Robert Pool. Ericsson and Pool argue that, with the exception of height and body size, the idea that we are limited by genetic factors—innate talent—is a pernicious myth. “The belief that one’s abilities are limited by one’s genetically prescribed characteristics....manifests itself in all sorts of ‘I can’t’ or ‘I’m not’ statements,” Ericsson and Pool write. The key to extraordinary performance, they argue, is “thousands and thousands of hours of hard, focused work.” To make their case, Ericsson and Pool review evidence from a wide range of studies demonstrating the effects of training on performance. In one study, Ericsson and his late colleague William Chase found that, through over 230 hours of practice, a college student was able to increase his digit span—the number of random digits he could recall—from a normal 7 to nearly 80. In another study, the Japanese psychologist Ayako Sakakibara enrolled 24 children from a private Tokyo music school in a training program designed to train “perfect pitch”—the ability to name the pitch of a tone without hearing another tone for reference. With a trainer playing a piano, the children learned to identify chords using colored flags—for example, a red flag for CEG and a green flag for DGH. Then, the children were tested on their ability to identify the pitches of individual notes until they reached a criterion level of proficiency. By the end of the study, the children had seemed to acquire perfect pitch. Based on these findings, Ericsson and Pool conclude that the “clear implication is that perfect pitch, far from being a gift bestowed upon only a lucky few, is an ability that pretty much anyone can develop with the right exposure and training.” © 2016 Scientific American

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 4: Development of the Brain
Link ID: 22674 - Posted: 09.21.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE | Are you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 22666 - Posted: 09.19.2016

By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 22659 - Posted: 09.15.2016

By Virginia Morell | Fourteen years ago, a bird named Betty stunned scientists with her humanlike ability to invent and use tools. Captured from the wild and shown a tiny basket of meat trapped in a plastic tube, the New Caledonian crow bent a straight piece of wire into a hook and retrieved the food. Researchers hailed the observation as evidence that these crows could invent new tools on the fly—a sign of complex, abstract thought that became regarded as one of the best demonstrations of this ability in an animal other than a human. But a new study casts doubt on at least some of Betty’s supposed intuition. Scientists have long agreed that New Caledonian crows (Corvus moneduloides), which are found only on the South Pacific island of the same name, are accomplished toolmakers. At the time of Betty’s feat, researchers knew that in the wild these crows could shape either stiff or flexible twigs into tools with a tiny, barblike hook at one end, which they used to lever grubs from rotting logs. They also make rakelike tools from the leaves of the screw pine (Pandanus) tree. But Betty appeared to take things to the next level. Not only did she fashion a hook from a material she’d never previously encountered—a behavior not observed in the wild—she seemed to know she needed this specific shape to solve her particular puzzle. © 2016 American Association for the Advancement of Science.

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 22538 - Posted: 08.10.2016

By Helen Thompson | Pinky and The Brain's smarts might not be so far-fetched. Some mice are quicker on the uptake than others. While it might not lead to world domination, wits have their upside: a better shot at staying alive. Biologists Audrey Maille and Carsten Schradin of the University of Strasbourg in France tested reaction time and spatial memory in 90 African striped mice (Rhabdomys pumilio) over the course of a summer. For this particular wild rodent, surviving harsh summer droughts means making it to mating season in the early fall. The team saw some overall trends: Females were more likely to survive if they had quick reflexes, and males were more likely to survive if they had good spatial memory. Cognitive traits like reacting quickly and remembering the best places to hide are key to eluding predators during these tough times but may come with trade-offs for males and females. The results show that an individual mouse’s cognitive strengths are linked to its survival odds, suggesting that the pressure to survive can shape basic cognition, Maille and Schradin write August 3 in Biology Letters. © Society for Science & the Public 2000 - 2016

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory and Learning
Link ID: 22511 - Posted: 08.04.2016