In Praise of Small Miracles – by David Brooks

Too many people die in auto accidents. When governments try to reduce highway deaths, they generally increase safety regulations. But in Kenya, stickers were placed inside buses and vans urging passengers to scream at drivers they saw driving dangerously.

The heckling discouraged dangerous driving by an awesome amount. Insurance claims involving injury or death fell to half of their previous levels.

These are examples of a new kind of policy-making that is sweeping the world. The old style was based on the notion that human beings are rational actors who respond in straightforward ways to incentives. The new style, which supplements but does not replace the old style, is based on the obvious point that human beings are not always rational actors. Sometimes we’re mentally lazy, or stressed, or we’re influenced by social pressure and unconscious biases. It’s possible to take advantage of these features to enact change.

For example, people hate losing things more than they like getting things, a phenomenon known as loss aversion. In some schools, teachers were offered a bonus at the end of the year if they could improve student performance. This kind of merit pay didn’t improve test scores. But, in other schools, teachers were given a bonus at the beginning of the year, which would effectively be taken away if their students didn’t improve. This loss-framed bonus had a big effect.

People are also guided by decision-making formats. The people who administer the ACT college admissions test used to allow students to send free score reports to three colleges. Many people thus applied to three colleges. But then the ACT folks changed the form so there were four lines where you could write down prospective colleges. That tiny change meant that many people applied to four colleges instead of three. Some got into more prestigious schools they wouldn’t have otherwise. This improved the expected earnings of low-income students by about $10,000.

The World Bank has just issued an amazingly good report called “Mind, Society and Behavior” on how the insights of behavioral economics can be applied to global development and global health. The report, written by a team led by Karla Hoff and Varun Gauri, lists many policies that have already been tried and points the way to many more.

Sugar cane farmers in India receive most of their income once a year, at harvest time. In the weeks before harvest, when they are poor and stressed, they score 10 points lower on I.Q. tests than in the weeks after. If fertilizer purchases and their children’s school enrollment decisions are scheduled for the weeks after harvest, farmers make more farsighted choices than they would at other times of the year. This simple policy change is based on an understanding of how poverty depletes mental resources.

In Zambia, hairdressers were asked to sell female condoms to their clients. Some were offered financial incentives to do so, but these produced no results. In other salons, top condom sellers had a gold star placed next to their names on a poster that all could see. More than twice as many condoms were sold. This simple change was based on an understanding of the human desire for status and admiration.

The policies informed by behavioral economics are delicious because they show how cheap changes can produce big effects. Policy makers in this mode focus on discrete opportunities to exploit, not vast problems to solve.

This corrects for a bias in the way governments often work. They tend to gravitate toward the grand and the abstract. For example, the United Nations is now replacing the Millennium Development Goals, which expire in 2015, with the Sustainable Development Goals.

“The Millennium Development Goals are concrete, measurable and have an end-date, so they could serve as a rallying point,” says Suprotik Basu, the chief executive of the MDG Health Alliance. “One good thing about the Sustainable Development Goals is that they’re being written through a bottom-up consensus process. But sometimes the search for consensus leads you higher and higher into the clouds. The jury is out on whether we will wind up with goals concrete enough to help ministers make decisions and decide priorities.”

Behavioral economics policies are beautiful because they are small and concrete but powerful. They remind us that when policies are rooted in actual human behavior and specific day-to-day circumstances, even governments can produce small miracles.

How To Say “This Is Crap” In Different Cultures – by Erin Meyer

When it was the turn of Willem, one of the Dutch participants, he recounted an uncomfortable snafu from working with Asian clients. “How can I fix this relationship?” Willem asked his group of international peers.

Maarten, the other Dutch participant who knew Willem well, jumped in with his perspective. “You are inflexible and can be socially ill-at-ease. That makes it difficult for you to communicate with your team,” he asserted. As Willem listened, I could see his ears turning red (with embarrassment or anger? I wasn’t sure) but that didn’t seem to bother Maarten, who calmly continued to assess Willem’s weaknesses in front of the entire group. Meanwhile, the other participants — all Americans, British and Asians — awkwardly stared at their feet.

That evening, we had a group dinner at a cozy restaurant.  Entering a little after the others, I was startled to see Willem and Maarten sitting together, eating peanuts, drinking champagne, and laughing like old friends. They waved me over, and it seemed appropriate to comment, “I’m glad to see you together. I was afraid you might not be speaking to each other after the feedback session this afternoon.”

Willem, with a look of surprise, reflected, “Of course, I didn’t enjoy hearing those things about myself. It doesn’t feel good to hear what I have done poorly. But I so much appreciated that Maarten would be transparent enough to give me that feedback honestly. Feedback like that is a gift. Thanks for that, Maarten,” he added with an appreciative smile.

I thought to myself, “This Dutch culture is . . . well . . . different from my own.”

Managers in different parts of the world are conditioned to give feedback in drastically different ways. The Chinese manager learns never to criticize a colleague openly or in front of others, while the Dutch manager learns always to be honest and to give the message straight. Americans are trained to wrap positive messages around negative ones, while the French are trained to criticize passionately and provide positive feedback sparingly.

One way to begin gauging how a culture handles negative feedback is by listening to the types of words people use. More direct cultures tend to use what linguists call upgraders, words preceding or following negative feedback that make it feel stronger, such as absolutely, totally, or strongly: “This is absolutely inappropriate,” or “This is totally unprofessional.”

By contrast, more indirect cultures use more downgraders, words that soften the criticism, such as kind of, sort of, a little, a bit, maybe, and slightly. Another type of downgrader is a deliberate understatement, such as “We are not quite there yet” when you really mean “This is nowhere close to complete.” The British are masters at it.  The “Anglo-Dutch Translation Guide”, which has been circulating in various versions on the Internet, illustrates the miscommunication that can result.

[Table: Anglo-Dutch Translation Guide]

Germans are rather like the Dutch in respect of directness and interpret British understatement very similarly. Marcus Klopfer, a German client, described to me how a misunderstanding with his British boss almost cost him his job:

In Germany, we typically use strong words when complaining or criticizing in order to make sure the message registers clearly and honestly. Of course, we assume others will do the same. My British boss during a one-on-one “suggested that I think about” doing something differently. So I took his suggestion: I thought about it, and decided not to do it. Little did I know that his phrase was supposed to be interpreted as “change your behavior right away or else.” And I can tell you I was pretty surprised when my boss called me into his office to chew me out for insubordination!

I learned to ignore all of the soft words surrounding the message when listening to my British teammates. Of course, the other lesson was to consider how my British staff might interpret my messages, which I had been delivering as “purely” as possible with no softeners whatsoever. I realize now that when I give feedback in my German way, I may actually use words that make the message sound as strong as possible without thinking much about it. I’ve been surrounded by this “pure” negative feedback since I was a child.

All this can be interesting, surprising, and sometimes downright painful when you are leading a global team: as you Skype with your employees in different cultures, your words will be magnified or minimized significantly based on your listener’s cultural context. So you have to work to understand how your own way of giving feedback is viewed in other cultures. As Klopfer reported:

Now that I better understand these cultural tendencies, I … soften the message when working with cultures less direct than my own. I start by sprinkling the ground with a few light positive comments and words of appreciation. Then I ease into the feedback with “a few small suggestions.” As I’m giving the feedback, I add words like “minor” or “possibly.” Then I wrap up by stating that “This is just my opinion, for whatever it is worth,” and “You can take it or leave it.” The elaborate dance is quite humorous from a German’s point of view … but it certainly gets [the] desired results!

What about you? Where do you think your own culture falls in this regard?   If I need to tell you your work is total crap, how would you like me to deliver the message?


Erin Meyer is a professor specializing in cross-cultural management at INSEAD, where she is the program director for two executive education programs: Managing Global Virtual Teams and Management Skills for International Business.  She is the author of The Culture Map: Breaking Through the Invisible Boundaries of Global Business (PublicAffairs, June 2014).  Follow her on Twitter: @ErinMeyerINSEAD

Why Some Teams Are Smarter Than Others – by Anita Woolley, Thomas W. Malone and Christopher F. Chabris

Psychologists have known for a century that individuals vary in their cognitive ability. But are some groups, like some people, reliably smarter than others?

Working with several colleagues and students, we set out to answer that question. In our first two studies, which we published with Alex Pentland and Nada Hashmi of M.I.T. in 2010 in the journal Science, we grouped 697 volunteer participants into teams of two to five members. Each team worked together to complete a series of short tasks, which were selected to represent the varied kinds of problems that groups are called upon to solve in the real world. One task involved logical analysis, another brainstorming; others emphasized coordination, planning and moral reasoning.

Individual intelligence, as psychologists measure it, is defined by its generality: People with good vocabularies, for instance, also tend to have good math skills, even though we often think of those abilities as distinct. The results of our studies showed that this same kind of general intelligence also exists for teams. On average, the groups that did well on one task did well on the others, too. In other words, some teams were simply smarter than others.

We next tried to define what characteristics distinguished the smarter teams from the rest, and we were a bit surprised by the answers we got. We gave each volunteer an individual I.Q. test, but teams with higher average I.Q.s didn’t score much higher on our collective intelligence tasks than did teams with lower average I.Q.s. Nor did teams with more extroverted people, or teams whose members reported feeling more motivated to contribute to their group’s success.

Instead, the smartest teams were distinguished by three characteristics.

First, their members contributed more equally to the team’s discussions, rather than letting one or two people dominate the group.

Second, their members scored higher on a test called Reading the Mind in the Eyes, which measures how well people can read complex emotional states from images of faces with only the eyes visible.

Third, teams with more women outperformed teams with more men. It was not “diversity” (having equal numbers of men and women) that mattered, but simply having more women; the effect was explained in part by the fact that women, on average, scored higher on the mind-reading test.

In a new study that we published with David Engel and Lisa X. Jing of M.I.T. last month in PLoS One, we replicated these earlier findings, but with a twist. We randomly assigned each of 68 teams to complete our collective intelligence test in one of two conditions. Half of the teams worked face to face, like the teams in our earlier studies. The other half worked online, with no ability to see any of their teammates. Online collaboration is on the rise, with tools like Skype, Google Drive and old-fashioned email enabling groups that never meet to execute complex projects. We wanted to see whether groups that worked online would still demonstrate collective intelligence, and whether social ability would matter as much when people communicated purely by typing messages into a browser.

And they did. Online and off, some teams consistently worked smarter than others. More surprisingly, the most important ingredients for a smart team remained constant regardless of its mode of interaction: members who communicated a lot, participated equally and possessed good emotion-reading skills.

This last finding was another surprise. Emotion-reading mattered just as much for the online teams whose members could not see one another as for the teams that worked face to face. What makes teams smart must be not just the ability to read facial expressions, but a more general ability, known as “Theory of Mind,” to consider and keep track of what other people feel, know and believe.

A new science of effective teamwork is vital not only because teams do so many important things in society, but also because so many teams operate over long periods of time, confronting an ever-widening array of tasks and problems that may be much different from the ones they were initially convened to solve. General intelligence, whether in individuals or teams, is especially crucial for explaining who will do best in novel situations or ones that require learning and adaptation to changing circumstances. We hope that understanding what makes groups smart will help organizations and leaders in all fields create and manage teams more effectively.

Lecture Me. Really. – by Molly Worthen

BEFORE the semester began earlier this fall, I went to check out the classroom where I would be teaching an introductory American history course. Like most classrooms at my university, this one featured lots of helpful gadgets: a computer console linked to an audiovisual system, a projector screen that deploys at the touch of a button and USB ports galore. But one thing was missing. The piece of technology that I really needed is centuries old: a simple wooden lectern to hold my lecture notes. I managed to obtain one, but it took a week of emails and phone calls.

Perhaps my request was unusual. Isn’t the old-fashioned lecture on the way out? A 2014 study showed that test scores in science and math courses improved after professors replaced lecture time with “active learning” methods like group work — prompting Eric Mazur, a Harvard physicist who has long campaigned against the lecture format, to declare that “it’s almost unethical to be lecturing.” Maryellen Weimer, a higher-education blogger, wrote: “If deep understanding is the objective, then the learner had best get out there and play the game.”

In many quarters, the active learning craze is only the latest development in a long tradition of complaining about boring professors, flavored with a dash of that other great American pastime, populist resentment of experts. But there is an ominous note in the most recent chorus of calls to replace the “sage on the stage” with student-led discussion. These criticisms intersect with a broader crisis of confidence in the humanities. They are an attempt to further assimilate history, philosophy, literature and their sister disciplines to the goals and methods of the hard sciences — fields whose stars are rising in the eyes of administrators, politicians and higher-education entrepreneurs.

In the humanities, there are sound reasons for sticking with the traditional model of the large lecture course combined with small weekly discussion sections. Lectures are essential for teaching the humanities’ most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship.

Today’s vogue for active learning is nothing new. In 1852, John Henry Newman wrote in “The Idea of a University” that true learning “consists, not merely in the passive reception into the mind of a number of ideas hitherto unknown to it, but in the mind’s energetic and simultaneous action upon and towards and among those new ideas.” The lecture course, too, has always had skeptics. In his 1869 inaugural address as president of Harvard University, Charles Eliot warned that “the lecturer pumps laboriously into sieves. The water may be wholesome, but it runs through. A mind must work to grow.”

Eliot was a chemist, so perhaps we should take his criticisms with a grain of salt. In the humanities, a good lecture class does just what Newman said: It keeps students’ minds in energetic and simultaneous action. And it teaches a rare skill in our smartphone-app-addled culture: the art of attention, the crucial first step in the “critical thinking” that educational theorists prize.

Those who want to abolish the lecture course do not understand what a lecture is. A lecture is not the declamation of an encyclopedia article. In the humanities, a lecture “places a premium on the connections between individual facts,” Monessa Cummins, the chairwoman of the classics department and a popular lecturer at Grinnell College, told me. “It is not a recitation of facts, but the building of an argument.”

Absorbing a long, complex argument is hard work, requiring students to synthesize, organize and react as they listen. In our time, when any reading assignment longer than a Facebook post seems ponderous, students have little experience doing this. Some research suggests that minority and low-income students struggle even more. But if we abandon the lecture format because students may find it difficult, we do them a disservice. Moreover, we capitulate to the worst features of the customer-service mentality that has seeped into the university from the business world. The solution, instead, is to teach those students how to gain all a great lecture course has to give them.

When Kjirsten Severson first began teaching philosophy at Clackamas Community College in Oregon, she realized that she needed to teach her students how to listen. “Where I needed to start was by teaching them how to create space in their inner world, so they could take on this argument on a clean canvas,” she told me. She assigns an excerpt from Rebecca Shafir’s “The Zen of Listening” to help students learn to clear their minds and focus. This ability to concentrate is not just a study skill. As Dr. Cummins put it, “Can they listen to a political candidate with an analytical ear? Can they go and listen to their minister with an analytical ear? Can they listen to one another? One of the things a lecture does is build that habit.”

Listening continuously and taking notes for an hour is an unusual cognitive experience for most young people. Professors should embrace — and even advertise — lecture courses as an exercise in mindfulness and attention building, a mental workout that counteracts the junk food of nonstop social media. More and more of my colleagues are banning the use of laptops in their classrooms. They say that despite initial grumbling, students usually praise the policy by the end of the semester. “I think the students value a break from their multitasking lives,” Andrew Delbanco, a professor of American Studies at Columbia University and an award-winning teacher, told me. “The classroom is an unusual space for them to be in: Here’s a person talking about complicated ideas and challenging books and trying not to dumb them down, not playing for laughs, requiring 60 minutes of focused attention.”

Holding their attention is not easy. I lecture from detailed notes, which I rehearse before each class until I know the script well enough to riff when inspiration strikes. I pace around, wave my arms, and call out questions to which I expect an answer. When the hour is done, I’m hot and sweaty. A good lecturer is “someone who conveys that there’s something at stake in what you’re talking about,” Dr. Delbanco said. Or as Ms. Severson told me, “I’m a pretty shy person, but when I lecture, there’s a certain charisma. This stuff matters to me — it saved my life.”

Good lecturers communicate the emotional vitality of the intellectual endeavor (“the way she lectured always made you make connections to your own life,” wrote one of Ms. Severson’s students in an online review). But we also must persuade students to value that aspect of a lecture course often regarded as drudgery: note-taking. Note-taking is important partly for the record it creates, but let’s be honest. Students forget most of the facts we teach them not long after the final exam, if not sooner. The real power of good notes lies in how they shape the mind.

“Note-taking should be just as eloquent as speaking,” said Medora Ahern, a recent graduate of New Saint Andrews College in Idaho. I tracked her down after a visit there persuaded me that this tiny Christian college has preserved some of the best features of a traditional liberal arts education. She told me how learning to take attentive, analytical notes helped her succeed in debates with her classmates. “Debate is really all about note-taking, dissecting your opponent’s idea, reducing it into a single sentence. There’s something about the brevity of notes, putting an idea into a smaller space, that allows you psychologically to overcome that idea.”

Technology can be a saboteur. Studies suggest that taking notes by hand helps students master material better than typing notes on a laptop, probably because most find it impossible to take verbatim notes with pen and paper. Verbatim transcription is never the goal: Students should synthesize as they listen.

This is not a “passive” learning experience, and it cannot be replicated by asking students to watch videotaped lectures online: the temptations of the Internet, the safeguard of the rewind button and the comforts of the dorm-room sofa are deadly to the attention span. But note-taking is not a skill professors can take for granted. We must teach it. Dr. Cummins assigns one student in each day’s class the task of not only taking notes, but also presenting a critique of her argument at the next class meeting.

This kind of work prepares students to succeed in the class format that so many educators, parents and students fetishize: the small seminar discussion. A lecture course teaches students that listening is not the same thing as thinking about what you plan to say next — and that critical thinking depends on mastery of facts, not knee-jerk opinions. “We don’t want to pretend that all we have to do is prod the student and the truth will come out,” Dr. Delbanco told me.

Such words of caution are deeply unfashionable. But humanists have been beating back calls to update our methods, to follow the lead of the sciences, for a very long time. One hundred and sixty years ago, when education reformers proposed training students only in the sciences or “temporal callings,” John Henry Newman defended the humanities as a repository of moral and cultural knowledge, but also as crucial disciplines for teaching a student how to think, “to disentangle a skein of thought, to detect what is sophistical, and to discard what is irrelevant.” Such a student learns “when to speak and when to be silent,” Newman wrote. “He is able to converse, he is able to listen.”

Molly Worthen is the author, most recently, of “Apostles of Reason: The Crisis of Authority in American Evangelicalism,” an assistant professor of history at the University of North Carolina, Chapel Hill, and a contributing opinion writer.

How to separate learning myths from reality – by Artin Atabaki, Stacey Dietsch, and Julia M. Sperling

Over the years, you have probably gained some insight into how your brain works. You may have taken a course or read a book that promised to reveal the secret of maximizing your mental capacity—a common sales pitch of leadership coaches these days. In the process, you may have read that after a critical period in childhood there is no hope for significant learning, that half of your brain is inactive at any given time, or that you’re capable of learning properly only in your preferred style.

Each of these claims is what we call a “neuromyth,” a misconception based on incorrect interpretations of neuroscientific research. Our experience advising companies on their lifelong-learning initiatives suggests that such misunderstandings remain embedded in many corporate training programs. As companies increasingly pour money into developing their employees, they can no longer afford to invest in training programs based on inaccurate and out-of-date assumptions. In recent years, for example, US businesses alone spent more than $164 billion annually on employee learning.1 The stakes are high and getting higher.

Bridging the gap between popular neuromyths and the scientific insights gathered in the past few decades is a growing challenge. As modern brain-imaging techniques, such as functional magnetic resonance imaging (fMRI), have advanced scientific knowledge, these misleading lay interpretations by business practitioners have advanced as well. Unless such misconceptions are eliminated, they will continue to undermine both personal- and organizational-learning efforts. In this article, we’ll address the three most prominent neuromyths in light of the latest research and explore some of the implications for corporate learning.

Myth #1: The critical window of childhood

Most of us have heard about critical learning periods—the first years of life, when the vast majority of the brain’s development is thought to occur. After this period, or so the assumption too often goes, the trajectory of human development is deemed to be more or less fixed. That, however, is an exaggeration. Recent neuroscientific research indicates that experience can change both the brain’s physical structure and its functional organization—a phenomenon described as neuroplasticity.

Researchers studying the plasticity of the brain are increasingly interested in mindfulness. Practicing simple meditation techniques, such as concentrated breathing, helps build denser gray matter in parts of the brain associated with learning and memory, controlling emotions, and compassion. A team led by Harvard scientists has shown that just eight weeks of mindful meditation can produce structural brain changes significant enough to be picked up by MRI scanners.2

Organizations from General Mills in consumer foods to digital bellwethers such as Facebook and Google increasingly give their employees opportunities to benefit from mindfulness and meditation. Most such programs have garnered enthusiastic support from employees, who often see a marked improvement in their mind-sets and job performance. For example, employees at the health insurer Aetna who have participated in the company’s free yoga and meditation classes report, on average, a 28 percent decrease in their levels of stress and a productivity increase of 62 minutes a week—an added value of approximately $3,000 per employee a year. CEO Mark Bertolini, who started the program a few years ago, marvels at the level of interest generated across the company; to date, more than a quarter of Aetna’s 50,000 employees have taken at least one class. Leaders like Bertolini understand that providing employees with the tools to become more focused and mindful can foster a better working environment conducive to development and high performance.
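
As a quick sanity check on those figures (a rough sketch only; the working weeks and hourly cost below are assumptions, not numbers from the article), 62 minutes a week does work out to roughly $3,000 a year once an employee hour is valued at about $60:

    # Back-of-the-envelope check of the Aetna productivity figure.
    # Assumed inputs (not from the article): ~48 working weeks per year
    # and a fully loaded employee cost of ~$60 per hour.
    minutes_per_week = 62
    working_weeks_per_year = 48   # assumption
    cost_per_hour = 60.0          # assumption, in USD

    hours_per_year = minutes_per_week * working_weeks_per_year / 60
    annual_value = hours_per_year * cost_per_hour

    print(f"{hours_per_year:.1f} hours/year, about ${annual_value:,.0f} per employee")
    # prints: 49.6 hours/year, about $2,976 per employee, consistent with the ~$3,000 cited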

Myth #2: The idle-brain theory

A recent European survey discovered that nearly 50 percent of teachers surveyed in the United Kingdom and the Netherlands believed that the idle-brain theory has been proved scientifically.4 This misunderstanding originally stemmed from inaccurate interpretations of activation hot spots in brain-imaging studies. By now, more carefully interpreted functional brain scans have shown that, irrespective of what a person is doing, the entire brain is generally active and that, depending on the task, some areas are more active than others. People can always learn new ideas and new skills, not by tapping into some unused part of the brain, but by forming new or stronger connections between nerve cells.

This insight into the brain’s capacity becomes particularly relevant for the environment and context in which learning typically occurs. Everybody knows, all too well, about the habit of quickly checking e-mails or planning for the next meeting in the middle of a training session. The problem is that such multitasking engages large parts of the brain’s working memory. Without freeing that up, we cannot successfully memorize and learn new information. In short, multitasking and learning cannot occur effectively at the same time.

Some organizations, recognizing this problem, are working to build immersive learning environments where distractions are eliminated. At McKinsey, we’ve created a model factory that participants can walk through to see operating conditions in action. But first, everyone is asked to place their phones and other distracting belongings in a locker, so they can fully concentrate on the learning exercise at hand. At many companies, removing the temptation of using mobile devices during learning sessions is becoming commonplace.

Myth #3: Learning styles and the left/right brain hypothesis

Almost everyone has encountered the theory that most people are either dominantly analytical (and left brained) or more creative (and right brained). However, this either/or dichotomy is false. The two hemispheres of the brain are linked and communicate extensively together; they do not work in isolation. The simplistic notion of a false binary has led, in many businesses, to the misconception that each one of us has a strictly preferred learning style and channel. Recent studies have flatly disproved this idea, suggesting instead that engaging all the senses in a variety of ways (for instance, audiovisual and tactile) can help employees retain new content.

One organization that puts this idea into practice is KFC, which uses multiple forms of learning in customer-service training. Sessions begin with an after-hours board game placing the entire team of a store in the role of the customer. This is followed up by “gamified” learning that fits into roughly 15-minute windows during shifts. These video game–like modules put the employees behind the cash register to handle a number of typical customer experiences, including responding to audio and visual cues of satisfaction. At the end of the online modules, employees physically reconvene at the front of the store to hear feedback, report on what they’ve learned, and receive live coaching as reinforcement.

Although significant progress has been made, much remains to be done to eradicate neuromyths from the philosophy of corporate-training programs. Neuroscience research has confirmed some of the approaches that learning professionals already use, such as on-the-job reinforcement and engagement without distractions. But that research has also contradicted other approaches. Companies should draw on the newly substantiated insights and may need to rethink their training programs accordingly. At the very least, they need to improve their dialogue with, and understanding of, the scientific community.

About the authors

Artin Atabaki is a consultant in McKinsey’s Stuttgart office; Stacey Dietsch is an associate principal in the Washington, DC, office; and Julia M. Sperling is a principal in the Dubai office.

The authors wish to thank McKinsey’s Jennifer May, Michael Rennie, and Kristina Wollschlaeger for their support of and contributions to this article.