"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » » Make It Stick: The Science of Successful Learning

Add to favorite Make It Stick: The Science of Successful Learning

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

by another group who read the same paragraph about a girl named Carol Harris.6

Imagination inflation refers to the tendency of people who are asked to imagine an event vividly to later begin to believe that the event actually occurred. Adults who were asked “Did you ever break a window with your hand?” were more likely on a later life inventory to report that they believed this event occurred during their lifetimes. It seems that asking the question led them to imagine the event, and the act of having imagined it made them more likely, later, to think it had occurred (relative to another group who answered the question without having previously imagined it occurring).

Hypothetical events that are imagined vividly can seat themselves in the mind as firmly as memories of actual events. For instance, when it is suspected that a child is being sexually abused and he is interviewed and questioned about it, he may imagine experiences that the interviewer describes and then later come to “remember” them as having occurred.7 (Sadly, of course, many memories of childhood sexual abuse are absolutely true, usually ones reported soon after the occurrence.) Another type of memory illusion is one caused by suggestion, which may arise simply in the way a question is asked. In one example, people watched a video of a car running a stop sign at an intersection and colliding with another car passing through. Those who were later asked to judge the speed of the vehicles when they “contacted” each other gave an average estimate of thirty-two miles per hour. Those who were asked to judge the speed when the two vehicles “smashed” into each other estimated on average forty-one miles per hour. If the speed limit was thirty miles per hour, asking the question the second way rather than the first could lead to the driver’s being charged with speeding. Of course, the legal system knows the danger of witnesses being asked “leading questions” (ones that encourage a particular answer), but such questions are difficult to avoid completely, because suggestibility can be very subtle. After all, in the case just discussed, the two cars did “smash together.”8

Some witnesses to crimes who are struggling to recall them are instructed to let their minds roam freely, to generate whatever comes to mind, even if it is a guess. However, the act of guessing about possible events causes people to provide their own misinformation, which, if left uncorrected, they may later come to retrieve as memories. That is one reason why people who have been interviewed after being hypnotized are barred from testifying in court in almost all states and Canadian provinces. The hypnotic interview typically encourages people to let their thoughts roam freely and produce everything that comes to mind, in hopes that they will retrieve information that would not otherwise be produced. However, this process causes them to produce much erroneous information, and studies have shown that when they are tested later, under instructions only to tell exactly what they remember of the actual events, their guesses made while under hypnosis cloud their memories about what truly happened. In particular, they remember events they produced under hypnosis as actual experiences, even under conditions (in the laboratory) when it is known that the events in question did not occur.9

Interference from other events can distort memory. Suppose the police interview a witness shortly after a crime, showing pictures of possible suspects. Time passes, but eventually the police nab a suspect, one whose picture had been viewed by the witness. If the witness is now asked to view a lineup, he may mistakenly remember one of the suspects whose photo he saw as having been present at the crime. A particularly vivid example of a related process happened to the Australian psychologist Donald M. Thomson. A woman in Sydney was watching television at midday when she heard a knock at the door. When she answered it, she was attacked, raped, and left unconscious. When she awoke and dialed the police, they came to her aid, got a description of her assailant, and launched a search. They spotted Donald Thomson walking down a Sydney street, and he matched the description. They arrested him on the spot. It turns out that Thomson had an airtight alibi: at the exact time of the rape, he was being interviewed on a live television show. The police did not believe him and sneered at him while he was being interrogated. However, the story was true. The woman had been watching the show when she heard the knock on the door. The description she gave the police was apparently of the man she saw on television, Donald Thomson, rather than the rapist. Her System 1 reaction, quick but sometimes mistaken, provided the wrong description, probably due to her extreme emotional state.10

What psychologists call the curse of knowledge is our tendency to underestimate how long it will take another person to learn something new or perform a task that we have already mastered. Teachers often suffer this illusion: the calculus instructor who finds calculus so easy that she can no longer place herself in the shoes of the student who is just starting out and struggling with the subject. The curse-of-knowledge effect is close kin to hindsight bias, or what is often called the knew-it-all-along effect, in which we view events after the fact as having been more predictable than they were before they occurred. Stock market pundits will confidently announce on the evening news why the stock market behaved as it did that day, even though they could not have predicted the movements that morning.11

Accounts that sound familiar can create the feeling of knowing and be mistaken for the truth. This is one reason that political or advertising claims that are not factual but are repeated can gain traction with the public, particularly if they have emotional resonance. Something you once heard that you hear again later carries a warmth of familiarity that can be mistaken for memory, a shred of something you once knew and cannot quite place but are inclined to believe. In the world of propaganda, this is called “the big lie” technique: even a big lie told repeatedly can come to be accepted as truth.

Fluency illusions result from our tendency to mistake fluency with a text for mastery of its content. For example, if you read a particularly lucid presentation of a difficult concept, you can get the idea that it is actually pretty simple and perhaps even that you knew it all along. As discussed earlier, students who study by rereading their texts can mistake their fluency with a text, gained from rereading, for possession of accessible knowledge of the subject and consequently overestimate how well they will do on a test.

Our memories are also subject to social influence and tend to align with the memories of the people around us. If you are in a group reminiscing about past experiences and someone adds a wrong detail to the story, you will tend to incorporate this detail into your own memory and later remember the experience with the erroneous detail. This process is called “memory conformity” or the “social contagion of memory”: one person’s error can “infect” another person’s memory. Of course, social influences are not always bad. If someone recalls details of a joint memory on which you are somewhat hazy, your subsequent memory will be updated and will hold a more accurate record of the past event.12

In the obverse of the social influence effect, humans are predisposed to assume that others share their beliefs, a process called the false consensus effect. We generally fail to recognize the idiosyncratic nature of our personal understanding of the world and interpretation of events, and that ours differ from others’. Recall how surprised you were recently, on commiserating with a friend about the general state of affairs, to discover that she sees in an entirely different light matters on which you thought the correct view was fundamental and obvious: climate change, gun control, fracking of gas wells, or perhaps something very local, such as whether to pass a bond issue for a school building or to oppose construction of a big box store in the neighborhood.13

Confidence in a memory is not a reliable indication of its accuracy. We can have utmost faith in a vivid, nearly literal memory of an event and yet find that we actually have it all wrong. National tragedies, like the assassination of President John Kennedy or the events surrounding 9/11, create what psychologists call “flashbulb” memories, named for the vivid images that we retain: where we were when we got the news, how we learned it, how we felt, what we did. These memories are thought to be indelible, burned into our minds, and it is true that the broad outlines of such catastrophes, thoroughly reported in the media, are well remembered, but your memory of your personal circumstances surrounding the events may not necessarily be accurate. There have been numerous studies of this phenomenon, including surveys of fifteen hundred Americans’ memories of the September 11 attacks. In this study, the respondents’ memories were surveyed a week after the attacks, again a year later, and then again three years and ten years later. Respondents’ most emotional memories of their personal details at the time they learned of the attacks are also those of which they are most confident and, paradoxically, the ones that have most changed over the years relative to other memories about 9/11.14

Mental Models

As we develop mastery in the various areas of our lives, we tend to bundle together the incremental steps that are required to solve different kinds of problems. To use an analogy from a previous chapter, you could think of them as something like smartphone apps in the brain. We call them mental models. Two examples in police work are the choreography of the routine traffic stop and the moves to take a weapon from an assailant at close quarters. Each of these maneuvers involves a set of perceptions and actions that cops can adapt with little conscious thought in response to context and situation. For a barista, a mental model would be the steps and ingredients to produce a perfect sixteen-ounce decaf frappuccino. For the receptionist at urgent care, it’s triage and registration.


The better you know something, the more difficult it becomes to teach it. So says physicist and educator Eric Mazur of Harvard. Why? As you get more expert in complex areas, your models in those areas grow more complex, and the component steps that compose them fade into the background of memory (the curse of knowledge). A physicist, for example, will create a mental library of the principles of physics she can use to solve the various kinds of problems she encounters in her work: Newton’s laws of motion, for example, or the laws of conservation of momentum. She will tend to sort problems based on their underlying principles, whereas a novice will group them by similarity of surface features, like the apparatus being manipulated in the problem (pulley, inclined plane, etc.). One day, when she goes to teach an intro physics class, she explains how a particular problem calls for something from Newtonian mechanics, forgetting that her students have yet to master the underlying steps she long ago bundled into one unified mental model. This presumption by the professor that her students will readily follow something complex that appears fundamental in her own mind is a metacognitive error, a misjudgment of the matchup between what she knows and what her students know. Mazur says that the person who knows best what a student is struggling with in assimilating new concepts is not the professor but another student.15 This problem is illustrated by a very simple experiment in which one person plays a common tune inside her head and taps the rhythm with her knuckles, and another person, hearing the rhythmic taps, must guess the tune. Each tune comes from a fixed set of twenty-five, so the statistical chance of guessing it is 4 percent. Tellingly, the participants who have the tune in mind estimate that the other person will guess correctly 50 percent of the time, but in fact the listeners guess correctly only 2.5 percent of the time, no better than chance.16

Like Coach Dooley’s football players memorizing their playbooks, we all build mental libraries of myriad useful solutions that we can call on at will to help us work our way from one Saturday game to the next. But we can be tripped up by these models, too, when we fail to recognize that a new problem that appears to be a familiar one is actually something quite different, and we pull out a solution that doesn’t work or makes things worse. The failure to recognize when your solution doesn’t fit the problem is another form of faulty self-observation that can lead you into trouble.

Mike Ebersold, the neurosurgeon, was called into the operating room one day to help a surgical resident who, in the midst of removing a brain tumor, was losing the patient. The usual model for cutting out a tumor calls for taking your time, working carefully around the growth, getting a clean margin, saving the surrounding nerves. But when the growth is in the brain, and if you get bleeding behind it, pressure on the brain can turn fatal. Instead of slow-and-careful, you need just the opposite: cutting the growth out very quickly so the blood can drain, and then working to repair the bleeding. “Initially you might be a little timid to take the big step,” Mike says. “It’s not pretty, but the patient’s survival depends on your knowing to switch gears and do it fast.” Mike assisted, and the surgery was successful.

Like the infant who calls the stranger Dada, we must cultivate the ability to discern when our mental models aren’t working: when a situation that seems familiar is actually different and requires that we reach for a different solution and do something new.


Unskilled and Unaware of It

Incompetent people lack the skills to improve because they are unable to distinguish between incompetence and competence.

This phenomenon, of particular interest for metacognition, has been named the Dunning-Kruger effect after the psychologists David Dunning and Justin Kruger. Their research showed that incompetent people overestimate their own competence and, failing to sense a mismatch between their performance and what is desirable, see no need to try to improve. (The title of their initial paper on the topic was “Unskilled and Unaware of It.”) Dunning and Kruger have also shown that incompetent people can be taught to raise their competence by learning the skills to judge their own performance more accurately; in short, to make their metacognition more accurate. In one series of studies that demonstrate this finding, they gave students a test of logic and asked them to rate their own performance. In the first experiment the results confirmed expectations that the least competent students were the most out of touch with their performance: students who scored at the twelfth percentile on average believed that their general logical reasoning ability fell at the sixty-eighth percentile.

In a second experiment, after taking an initial test and rating their own performance, the students were shown the other students’ answers and then their own answers and asked to reestimate the number of test questions they had answered correctly. The students whose performance was in the bottom quartile failed to judge their own performance more accurately after seeing the more competent choices of their peers and in fact tended to raise their already inflated estimates of their own ability.

A third experiment explored whether poor performers could learn to improve their judgment. The students were given ten problems in logical reasoning and after the test were asked to rate their logical reasoning skills and test performance. Once again, the students in the bottom quartile grossly overestimated their performance. Next, half the students received ten minutes of training in logic (how to test the accuracy of a syllogism); the other half were given an unrelated task. All the students were then asked to estimate again how well they had performed on the test. Now the students in the bottom quartile who had received the training were much more accurate estimators of the number of questions they got right and of how they had performed compared to the other students. Those in the bottom quartile who didn’t receive the training held to their mistaken conviction that they had performed well.

How is it that incompetent people fail to learn through experience that they are unskilled? Dunning and Kruger offer several theories. One is that people seldom receive negative feedback about their skills and abilities from others in everyday life, because people don’t like to deliver the bad news.

Even if people get negative feedback, they must come to an accurate understanding of why the failure occurred. For success, everything must go right; failure, by contrast, can be attributed to any number of external causes: it’s easy to blame the tool for what the hand cannot do. Finally, Dunning and Kruger suggest that some people are just not astute at reading how other people are performing and are therefore less able to spot competence when they see it, making them less able to make comparative judgments of their own performance.

These effects are more likely to occur in some contexts and with some skills than with others. In some domains, the revelation of one’s incompetence can be brutally frank. The authors can all remember from their childhoods when a teacher would appoint two boys to pick other kids for softball teams. The good players are picked first, the worst last. You learn your peers’ judgments of your softball abilities in a very public manner, so it would be hard for the last-picked player to think “I must be really good at softball.” However, most realms of life do not render such stark judgments of ability.17

To sum up, the means by which we navigate the world, Daniel Kahneman’s Systems 1 and 2, rely on our perceptual systems, intuition, memory, and cognition, with all their tics, warts, biases, and flaws. Each of us is an astounding bundle of perceptual and cognitive abilities, coexisting with the seeds of our own undoing. When it comes to learning, what we choose to do is guided by our judgments of what works and what doesn’t, and we are easily misled.

Our susceptibility to illusion and misjudgment should give us all pause, and especially so to the advocates of “student-directed learning,” a theory now current among some parents and educators. This theory holds that students know best what they need to study to master a subject, and what pace and methods work best for them. For example, at Manhattan Free School in East Harlem, opened in 2008, students “do not receive grades, take tests or have to do anything they do not feel like doing.” The Brooklyn Free School, which opened in 2004, along with a new crop of homeschooling families who call themselves “unschoolers,” follows the precept that whatever intrigues the learner is what will result in the best learning.18

The intent is laudatory. We know that students need to take more control of their own learning by employing strategies like those we have discussed. For example, they need to test themselves, both to attain the direct benefits of increased retention and to determine what they know and don’t know, to more accurately judge their progress and focus on material
