"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » » Make It Stick: The Science of Successful Learning

Add to favorite Make It Stick: The Science of Successful Learning

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

During the summer of 2008, three stickup artists in Minneapolis had a system going of phoning in large fast-food orders and then relieving the delivery man of all the goods and cash he carried. As a livelihood it was a model of simplicity. They kept at it, failing to consider the wisdom of always placing their orders from the same two cell phones and taking delivery at the same two addresses.

David Garman, a Minneapolis cop, was working undercover that summer. “It was getting more aggressive. At the beginning, it was ‘maybe they had a gun,’ then all of a sudden there were a couple of guns, and then they were hurting the people when they were robbing them.”

It was a night in August when Garman got a call about a large order phoned in to a Chinese restaurant. He organized a small team on short notice and prepared to pose as the delivery guy. He pulled on a bulletproof vest, covered it with a casual shirt, and shoved his .45 automatic into his pants. While his colleagues staked out positions near the delivery address, Garman picked up the food, drove there, and parked with his brights shining on the front door. He’d cut a slit in the bottom of the food bag and tucked a .38 inside to rest in his hand as he carried the package. “The .38 has a covered hammer on it, so I can shoot it in a bag. If I were to put the automatic in there, it’d jam and I’d be screwed.”

So I walk up with the package and I say, “Hey, sir, did you order some food?” He says, “Yup,” and I’m thinking this guy’s really just going to pay me and I’m going to be out of here, and this is going to be the dumbest thing we’ve ever done. I’m thinking if he hands me $40, I don’t even know how much this food is. But he turns his head to look halfway back and two other guys start to come up, and as they’re walking towards me they flip hoods over their heads. That’s when I know it’s game time. The first guy whips a gun out of his pocket and racks it and puts it to my head all in one motion, saying, “Give me everything you’ve got motherfucker or I’ll kill you.” I ended up shooting him through the bag. It was four rounds.2

Not such a great livelihood after all. The guy was hit low and survived, although he is a lesser man as a result. Garman would have aimed higher if the food package hadn’t been so heavy, and he took a lesson from the experience: he’s better prepared for the next time, though he’d rather we didn’t describe just how.

We like to think we’re smarter than the average doodle, and even if we’re not, we feel affirmed in this delusion each year when the newest crop of Darwin Awards circulates by email, that short list of self-inflicted fatalities caused by spectacularly poor judgment, as in the case of the attorney in Toronto who was demonstrating the strength of the windows in his twenty-two-story office tower by throwing his shoulder against the glass when he broke it and fell through. The truth is that we’re all hardwired to make errors in judgment. Good judgment is a skill one must acquire, becoming an astute observer of one’s own thinking and performance. We start at a disadvantage for several reasons. One is that when we’re incompetent, we tend to overestimate our competence and see little reason to change. Another is that, as humans, we are readily misled by illusions, cognitive biases, and the stories we construct to explain the world around us and our place within it. To become more competent, or even expert, we must learn to recognize competence when we see it in others, become more accurate judges of what we ourselves know and don’t know, adopt learning strategies that get results, and find objective ways to track our progress.

Two Systems of Knowing

In his book Thinking, Fast and Slow, Daniel Kahneman describes our two analytic systems. What he calls System 1 (or the automatic system) is unconscious, intuitive, and immediate. It draws on our senses and memories to size up a situation in the blink of an eye. It’s the running back dodging tackles in his dash for the end zone. It’s the Minneapolis cop, walking up to a driver he’s pulled over on a chilly day, taking evasive action even before he’s fully aware that his eye has seen a bead of sweat run down the driver’s temple.

System 2 (the controlled system) is our slower process of conscious analysis and reasoning. It’s the part of thinking that considers choices, makes decisions, and exerts self-control.

We also use it to train System 1 to recognize and respond to particular situations that demand reflexive action. The running back is using System 2 when he walks through the moves in his playbook. The cop is using it when he practices taking a gun from a shooter. The neurosurgeon is using it when he rehearses his repair of the torn sinus.

System 1 is automatic and deeply influential, but it is susceptible to illusion, and you depend on System 2 to help you manage yourself: by checking your impulses, planning ahead, identifying choices, thinking through their implications, and staying in charge of your actions. When a guy in a restaurant walks past a mother with an infant and the infant cries out “Dada!” that’s System 1. When the blushing mother says, “No, dear, that’s not Dada, that’s a man,” she is acting as a surrogate System 2, helping the infant refine her System 1.

System 1 is powerful because it draws on our accumulated years of experience and our deep emotions. System 1 gives us the survival reflex in moments of danger, and the astonishing deftness earned through thousands of hours of deliberate practice in a chosen field of expertise. In the interplay between Systems 1 and 2—the topic of Malcolm Gladwell’s book Blink—your instantaneous ability to size up a situation plays against your capacity for skepticism and thoughtful analysis. Of course, when System 1’s conclusions arise out of misperception or illusion, they can steer you into trouble.

Learning when to trust your intuition and when to question it is a big part of how you improve your competence in the world at large and in any field where you want to be expert.

It’s not just the dullards who fall victim. We all do, to varying degrees. Pilots, for example, are susceptible to a host of perceptual illusions. They are trained to beware of them and to use their instruments to know that they’re getting things right.

A frightening example with a happy ending is China Airlines Flight 006 on a winter day in 1985. The Boeing 747 was 41,000 feet above the Pacific, almost ten hours into its eleven-hour flight from Taipei to LA, when engine number 4 lost power. The plane began to lose airspeed. Rather than taking manual control and descending below 30,000 feet to restart the engine, as prescribed in the flight book, the crew held at 41,000 with the autopilot engaged and attempted a restart.

Meanwhile, loss of the outboard engine gave the plane asymmetrical thrust. The autopilot tried to correct for this and keep the plane level, but as the plane continued to slow it also began to roll to the right. The captain was aware of the deceleration, but not the extent to which the plane had entered a right bank; his System 1 clue would have been his vestibular reflex—how the inner ear senses balance and spatial orientation—but because of the plane’s trajectory, he had the sensation of flying level. His System 2 clues would have been a glimpse at the horizon and his instruments. Correct procedure called for applying left rudder to help raise the right wing, but his System 2 focus was on the airspeed indicator and on the efforts of the first officer and engineer to restart the engine.

As its bank increased, the plane descended through 37,000 feet into high clouds, which obscured the horizon. The captain switched off the autopilot and pushed the nose down to get more speed, but the plane had already rolled beyond 45 degrees and now turned upside down and fell into an uncontrolled descent. The crew were confused by the situation. They understood the plane was behaving erratically but were unaware they had overturned and were in a dive. They could no longer discern thrust from engines 1–3 and concluded those engines had quit as well. The plane’s dive was evident from their flight gauges, but the angle was so unlikely the crew decided the gauges had failed. At 11,000 feet they broke through the clouds, astonished to see that they were roaring toward earth. The captain and first officer both pulled back hard on the stick, exerting enormous forces on the plane but managing to level off. Landing gear hung from the plane’s belly, and they’d lost one of their hydraulic systems, but all four engines came to life, and the captain was able to fly on, diverting successfully to San Francisco. An inspection revealed just how severe their maneuver had been. Strains five times the force of gravity had bent the plane’s wings permanently upward, broken two landing gear struts, and torn away two landing gear doors and large parts of the rear horizontal stabilizers.


“Spatial disorientation” is the aeronautical term for a deadly combination of two elements: losing sight of the horizon and relying on human sensory perception that doesn’t jibe with reality but is so convincing that pilots conclude their cockpit instruments have failed. As Kahneman says, System 1, the instinctual, reflexive system that detects danger and keeps us safe, can be very hard to overrule. Flight 006’s initial incident, the loss of an engine cruising at altitude, is not considered an emergency, but it quickly became one as a result of the captain’s actions. Rather than following prescribed procedure, and rather than fully engaging his System 2 analytic resources by monitoring all his instruments, he let himself become preoccupied with the engine restart and with a single flight indicator, airspeed. Then, when things spiraled out of control, he trusted his senses over his gauges, in effect trying to construct his own narrative of what was happening to the plane.

There’s a long list of illusions to which pilots can fall prey (some with mordant names like “the leans,” “graveyard spin,” and “the black hole approach”) and sites on the Internet where you can listen to the chilling last words of pilots struggling and failing to understand and correct what’s gone wrong in the sky. Spatial disorientation was deemed the probable cause of the crash that killed Mel Carnahan, the governor of Missouri, while being flown through a thunderstorm one night in October 2000, and the probable cause of the crash that killed John F. Kennedy Jr. and his wife and her sister off the shore of Martha’s Vineyard on a hazy night in July 1999. Fortunately, the China Airlines incident came to a good end, but the National Transportation Safety Board report of that incident reveals just how quickly training and professionalism can be hijacked by System 1 illusion, and therefore why we need to cultivate a disciplined System 2, conscious analysis and reasoning, that always keeps one eye on the flight instruments.3

Illusions and Memory Distortions

The filmmaker Errol Morris, in a series of articles on illusion in the New York Times, quotes the social psychologist David Dunning on humans’ penchant for “motivated reasoning,” or, as Dunning put it, the “sheer genius people have at convincing themselves of congenial conclusions while denying the truth of inconvenient ones.”4 (The British prime minister Benjamin Disraeli once said of a political opponent that his conscience was not his guide but his accomplice.) There are many ways that our System 1 and System 2 judgments can be led astray: perceptual illusions like those experienced by pilots, faulty narrative, distortions of memory, failure to recognize when a new kind of problem requires a new kind of solution, and a variety of cognitive biases to which we’re prone. We describe a number of these hazards here, and then we offer measures you can take, akin to scanning the cockpit instruments, to help keep your thinking aligned with reality.

Our understanding of the world is shaped by a hunger for narrative that rises out of our discomfort with ambiguity and arbitrary events. When surprising things happen, we search for an explanation. The urge to resolve ambiguity can be surprisingly potent, even when the subject is inconsequential. In a study where participants thought they were being measured for reading comprehension and their ability to solve anagrams, they were exposed to the distraction of a background phone conversation. Some heard only one side of a conversation, and others heard both sides. The participants, not knowing that the distraction itself was the subject of the study, tried to ignore what they were hearing so as to stay focused on the reading and anagram solutions. The results showed that overhearing one side of a conversation proved more distracting than overhearing both sides, and the content of those partial conversations was better recalled later by the unintentional eavesdroppers. Why was this? Presumably, those overhearing half a conversation were strongly compelled to try to infer the missing half in a way that made for a complete narrative. As the authors point out, the study may help explain why we find one-sided cell phone conversations in public spaces so intrusive, but it also reveals the ineluctable way we are drawn to imbue the events around us with rational explanations.

The discomfort with ambiguity and arbitrariness is equally powerful, or more so, in our need for a rational understanding of our own lives. We strive to fit the events of our lives into a cohesive story that accounts for our circumstances, the things that befall us, and the choices we make. Each of us has a different narrative that has many threads woven into it from our shared culture and experience of being human, as well as many distinct threads that explain the singular events of one’s personal past. All these experiences influence what comes to mind in a current situation and the narrative through which you make sense of it: Why nobody in my family attended college until me. Why my father never made a fortune in business. Why I’d never want to work in a corporation, or, maybe, Why I would never want to work for myself. We gravitate to the narratives that best explain our emotions. In this way, narrative and memory become one. The memories we organize meaningfully become those that are better remembered. Narrative provides not only meaning but also a mental framework for imbuing future experiences and information with meaning, in effect shaping new memories to fit our established constructs of the world and ourselves. No reader, when asked to account for the choices made under pressure by a novel’s protagonist, can keep her own life experience from shading her explanation of what must have been going on in the character’s interior world. The success of a magician or politician, like that of a novelist, relies on the seductive powers of narrative and on the audience’s willing suspension of disbelief. Nowhere is this more evident than in the national political debate, where like-minded people gather online, at community meetings, and in the media to find common purpose and expand the story they feel best explains their sense of how the world works and how humans and politicians should behave.

You can see how quickly personal narrative is invoked to explain emotions when you read an article online whose author has argued a position on almost any subject—for example, an op-ed piece supporting the use of testing as a powerful tool for learning. Scan the comments posted by readers: some sing hallelujah while others can scarcely contain their umbrage, each invoking a personal story that supports or refutes the column’s main argument. The psychologists Larry Jacoby, Bob Bjork, and Colleen Kelley, summing up studies on illusions of comprehension, competence, and remembering, write that it is nearly impossible to avoid basing one’s judgments on subjective experience. Humans do not give greater credence to an objective record of a past event than to their subjective remembering of it, and we are surprisingly insensitive to the ways our particular construals of a situation are unique to ourselves. Thus the narrative of memory becomes central to our intuitions regarding the judgments we make and the actions we take.5


It is a confounding paradox, then, that the changeable nature of our memory not only can skew our perceptions but also is essential to our ability to learn. As will be familiar to you by now, every time we call up a memory, we make the mind’s routes to that memory stronger, and this capacity to strengthen, expand, and modify memory is central to how we deepen our learning and broaden the connections to what we know and what we can do. Memory has some similarities to a Google search algorithm, in the sense that the more you connect what you learn to what you already know, and the more associations you make to a memory (for example, linking it with a visual image, a place, or a larger story), the more mental cues you have through which to find and retrieve the memory again later. This capacity expands our agency: our ability to take action and be effective in the world. At the same time, because memory is a shape-shifter, reconciling the competing demands of emotion, suggestions, and narrative, it serves you well to stay open to the fallibility of your certainties: even your most cherished memories may not represent events in the exact way they occurred.

Memory can be distorted in many ways. People interpret a story in light of their world knowledge, imposing order where none had been present so as to make a more logical story.

Memory is a reconstruction. We cannot remember every aspect of an event, so we remember those elements that have greatest emotional significance for us, and we fill in the gaps with details of our own that are consistent with our narrative but may be wrong.

People remember things that were implied but not specifically stated. The literature is full of examples. In one, many people who read a paragraph about a troubled girl named Helen Keller later mistakenly recalled the phrase “deaf, dumb, and blind” as being in the text. This mistake was rarely made
