Caught In The Act!… of doing something right: A neurobiological approach to high performance management – by Michael McIntosh

There is no shortage of theories on how to manage and lead – every day we witness new insights from the famous, talented or deceased. But how might we enact good management on a daily, hands-on basis – and, just as importantly, why? It turns out there is genuine science behind this question, and if we understand and apply some basic neurobiological principles, the gap between mediocre and high performance becomes bridgeable quite quickly. Here’s why and how:

Adrenaline


A good place to start is with three naturally occurring chemicals that our bodies produce instantly and one that takes a few moments more to generate but lasts a LOT longer. The first two are pretty good – epinephrine (adrenaline) and norepinephrine (noradrenaline) work together to prepare and mobilise the body for action. Increasing blood flow, attention and concentration, they help to sharpen the mind, improving task focus while blocking out, at least to some extent, distractions such as sounds, pain, fatigue and so on. In practice, they allow me to achieve a state of highly productive flow in my work while not hearing a single track on the CD that’s playing in the background. They also give me the energy to finish a cycling or kayaking trip without much discomfort, whereas a short while later, when their effects subside, my body complains in the most vociferous terms about the harsh punishment it has just endured.

Dopamine


In a professional context, these two hormones are important for motivating us to tackle challenging tasks, providing the energy and focus to perform them to a high standard. Ideally, they are accompanied by dopamine, the chemical that makes us feel good by triggering our internal reward systems, giving us a natural high, increasing positive feelings, optimism, camaraderie and sociability while reducing fear sensitivity and (some) inhibitions. Dopamine is a key ingredient in fostering our social drives and behaviours.

If you have ever enjoyed the feeling of having achieved something difficult, whether in sport or at work, that’s dopamine doing its thing – and it can be even better if you were part of a close-knit team at the time. Our intrinsic desire for those addictive dopamine pleasure hits means that, unless fear of failure or other potential physical or psychological risks is stronger, most people continually seek new challenges and greater achievements, finding them, despite the difficulties involved, to be more stimulating than the uninspiring predictability of repeating the things we’ve done many times before. Together, the focus and energy provided by epinephrine and norepinephrine allow us to perform at a higher level, and if we take a positive view of those challenges, dopamine is likely to be present throughout as well – it’s a naturally occurring behaviour-shaping system that lives within all of us.

Cortisol


To mess up this massively enjoyable party, the fourth guest is cortisol – commonly known as the stress hormone. Cortisol responds to danger, just as epinephrine and norepinephrine do, but one of its primary roles is to protect and repair us. Slower to take effect, it helps us to be more alert to potential sources of danger, reducing our optimism, appetite for risk and sociability. Amongst other things, cortisol also prepares parts of the circulatory system for repair after the physical exertion that used to be an appropriate response to most causes of fear and danger. In a modern context, the chronic exposure to cortisol that results from sustained high levels of mental stress has been shown to be very damaging to physical and mental health, increasing the frequency and severity of illness and reducing both quality and length of life. And whereas dopamine’s effects often last around 2 to 4 hours, cortisol’s can last up to 24 hours – or even longer when prolonged dwelling on negativity extends its influence.

Brain chemistry and fairness

Research shows that, potentially due to the power and longevity of the danger-sensitive cortisol in comparison to the pleasure-rewarding but shorter-lived effects of dopamine, employees need to feel they receive more positive than negative feedback. To put a number on the ideal proportion, it seems that around 5 positive things (eg compliments, good news, encouragement, positive social interactions) for every piece of negative feedback is likely to create a balance where people feel appreciated and fairly treated.
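
To see why the balance must be so lopsided, a crude back-of-the-envelope sketch may help. This is a toy model only: the effect durations are the rough figures quoted above, and the idea of weighting feedback by duration is my own illustrative assumption, not established science.

```python
# Toy model: weight each piece of feedback by the rough duration of its
# chemical afterglow as quoted in this article (dopamine ~2-4 hours,
# cortisol up to ~24 hours). Purely illustrative, not established science.
POSITIVE_HOURS = 3   # midpoint of the 2-4 hour dopamine window
NEGATIVE_HOURS = 24  # upper bound quoted for cortisol

def felt_hours(positives: int, negatives: int) -> tuple[int, int]:
    """Return total (positive, negative) 'felt hours' for a run of feedback."""
    return positives * POSITIVE_HOURS, negatives * NEGATIVE_HOURS

# Even at the recommended ~5:1 ratio, one negative still lingers longer
# than five positives combined on this crude model (15 vs 24 hours),
# hinting at why positives must so heavily outnumber negatives.
print(felt_hours(5, 1))  # (15, 24)
```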

Management Practice

How does this inform management practice? The first thing is to understand that meaningful challenges provide motivation, energy and connection. This means that if managers want their teams to do “more” or “better”, those teams had better feel challenged – preferably by something that connects with their own values and interests, furthers their own development and success, and strikes them as socially worthwhile. As noted earlier, raising the bar on challenges also raises the bar on motivation and intrinsic reward – provided that psychological and physical safety exists and employees feel empowered, competent, supported and sincerely appreciated for taking on that challenge and achieving those new goals. This is contrary to the view of some managers who hesitate to ask for “more” for fear that team members will be displeased, or whose demands for “more” are met with active or passive resistance. But in most cases this lack of employee task engagement is an artificial construct – most people want to do great things and feel great about what they achieve in a field that interests them, while feeling they are contributing to something worthwhile – it’s why most people choose their jobs, careers and hobbies. It’s also why, every year, a little over a third of all Australians perform volunteer work for no financial reward. (If those intrinsic connections appear to be absent from an employee’s normal habits, it may be more useful to view the employee within the systemic context of the workplace than simplistically on his or her own.)

It also suggests that managers need to be aware of how their own natural danger-aversion instincts hold them, and their teams, back – just as evolution taught them to. A million years ago, those who recognised danger and reacted fastest survived, and so in an environment where we were just another item on the menu our ancestors evolved to treat sensitivity to danger as their highest priority – a natural behavioural trait that remains today, often manifesting itself as fear or anxiety (fear of the future) despite the absence of such predatory threats. As a part of this sensitivity, we are very good at ignoring the normal and spotting the exceptional, with dangerous exceptions prioritised over pleasurable ones.

In a work context, this means that conscientious managers are alert to problems, mistakes, conflict and anything else “bad” (or their potential), and react according to their own perceptions, biases and habits around dealing with that kind of threat. This is nothing for those in supervisory roles to be ashamed of; it’s simply a lifetime of learning and a few million years of evolutionary instinct in practice – nothing could be more natural. It’s not necessarily “wrong” either – a manager who is not alert to, or able to react to, threats is likely to be an incapable one.

Most managers, when not overly burdened with stressful concerns, also notice exceptionally good things, reacting with praise and sincere appreciation. But with the stronger influence of cortisol arising from the negatives, the overall impact is to create an environment where positives feel outnumbered and outweighed by negatives, commonly leaving employees feeling stressed about their work, unappreciated and, as a part of a prolonged pattern, disengaged.

The fix, however, is amazingly simple. For instance, let’s assume that an employee has performed six tasks on a particular day: one executed very well, one unacceptably poorly, and the other four unremarkably, ignorably, invisibly average. On the assumption that the four average performances met a perfectly acceptable standard of proficiency, then surely the employee should be recognised not only for the single exceptional performance but also, albeit to a lesser extent, for all four of the “average” outcomes. If we now add in the single poor performance, then as long as the manager ensures that all six pieces of feedback are honest, sincere and consistent, and that the five good achievements are not short-changed in time or appreciation on the way to an intense focus on the single poor outcome, an ideal feedback ratio is automatically achieved.
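
As a minimal sketch of that tally (the outcomes and grading labels are hypothetical, taken only from this worked example):

```python
# One hypothetical day of six tasks, graded as in the example above.
day = ["exceptional", "average", "average", "average", "average", "poor"]

# Anything at or above the acceptable standard earns some recognition.
positives = sum(1 for outcome in day if outcome != "poor")
negatives = sum(1 for outcome in day if outcome == "poor")

print(f"feedback ratio {positives}:{negatives}")  # feedback ratio 5:1
```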

Example

This fairly common type of conversation:

  • “That was a really good job you did with the apples but the bananas ended up bruised and damaged – the customer’s not happy. What happened?” (Likely to be met with a defensive, blame-shedding “not-my-fault” or situational victim response.)

Might easily become this more collaborative, but rarer, conversation:

  • “How was your day? I heard about how you solved the apple problem – how did you manage that?” (Allow employee to share the story of success – the manager might even learn something about the problem-solving capability of the employee and/or there may be lessons for continual process improvement.)
  • “I also see you managed the oranges, tomatoes, potatoes and pineapples to plan – were there any challenges?” (Allow employee to be and feel heard and appreciated again.)
  • “And I heard from the customer that there was a difficulty with the bananas – what was your take on that?” (Allow employee to lead discussion on the problem and suggest own improvements, with manager acting as a collaborative supporter for the employee’s efforts to correct his or her own performance without avoiding the problem or lowering expectations.)

Bonuses

With this “fairer” feedback practice as a normal, everyday management habit, employees are more likely to perform most tasks well, raising the bar on “average” thanks to the “addiction” to the dopamine motivation and reward in positive feedback that validates feelings of accomplishment. Employees are also more likely to volunteer problems rather than wait to have them brought up, feeling that it is safe to do so in the prevailing “fair” environment. In fostering this behaviour, the good feelings (dopamine) from a trusting and positive conversation and relationship (a learned reward from this management practice) become preferable to the stress (cortisol) of attempting to hide, minimise, blame or avoid (a learned coping mechanism from other life experiences). Through repeated application as a matter of management habit, these universal, powerful, chemically-fuelled neurobiological rewards and penalties teach either problem-avoidance or challenge-seeking as employees’ default behaviour, and there is a good chance that, if widely and consistently practised, they will also shape the dominant organisational culture.

As an extra bonus, there is another dimension to this study of chemical cocktails – the effect on the manager. It turns out the very same chemicals act upon the feedback giver as well as the receiver – meaning that managers who look for good news and sincerely compliment and support others more often are also likely to be more motivated, more engaged, more responsible and more satisfied with their work and professional relationships. And with employees who are more proactive about fearlessly identifying and solving problems, there is every chance the actual number of problems managers have to deal with will reduce over time – turning perceptions into aspirational behaviours, and aspirational behaviours into a new normality.

So it seems that the manager who catches people in the act of doing something right is also doing the same for themselves, with the same powerfully positive benefits. And that’s not just my opinion – it’s our neurobiology.

©Michael McIntosh April 2016

Notes:

  • The accuracy of some of the research by Heaphy and Losada behind an “ideal” praise-to-criticism ratio of 5.6:1 has been questioned, but there is nonetheless common agreement that, notwithstanding differences between individuals, sincere, positive good news and praise serve as an effective incentive for, and reinforcement of, desirable behaviours. To foster positive intentions, behaviours and outcomes over a sustained period of time, a ratio that strongly favours positive feedback over negative is most likely to be effective.
  • In the short term, the threat of loss is generally more effective than the promise of reward, but over time people act in more permanent ways (eg resignation, sabotage, enlisting trade or labour union support) to minimise threats, including threats to self-esteem from a lack of appreciation. In any case, what does it say about a supervisory manager if the main tool used (disciplinary or otherwise) is to threaten employees with a loss of some kind? Similarly, what does it say about a supervisory manager if there is an aversion to showing regular, honest and sincere appreciation?
  • Underlying the effectiveness of this type of approach are themes like integrity, intention and connection. Through non-verbal cues and behavioural consistency, employees are likely, either immediately or over time, to “see through” disingenuous comments and insincere behaviours, resulting in a severe breach of trust and substantial disengagement – certainly with the supervisor involved and potentially with the organisation on a larger scale. However, where a supervisory manager’s intention is to help an employee to be successful in their work, for the employee’s own benefit as much as anyone else’s, then, external influences and pre-existing baggage aside, this process is likely to be highly effective in preventing or removing misunderstandings and in enhancing relationships, performance and job satisfaction for all concerned. In this vein, I believe the primary role of supervisory managers is to assist their subordinates to succeed, as both a moral purpose and a practical one – for how can a supervisory manager be successful in his or her role if his or her subordinates are not successful in theirs?

Nerdy stuff –

  • Dopamine is a neurohormone that acts as a neurotransmitter. It is produced in a few different areas of the brain and released by the hypothalamus, with effects that include influencing emotions (eg motivation, reward, sociability) as well as energising the body through increased pulse and blood pressure.
  • Cortisol is a steroid hormone produced by the adrenal gland on instruction from the pituitary gland, which is located near the brain stem. Like dopamine, its effects are numerous and include those listed in this article.
  • Adrenaline is produced in the medulla of the adrenal glands as well as in some neurons within the central nervous system, and also has numerous effects on the body as part of its major fight-or-flight survival purpose. Problems commonly occur where the energy released is not used, which can result in irritability, nervousness, insomnia and even heart damage; for these reasons I recommend physical exercise not only for its inherent benefits, but as a preferred natural use of the energy released by adrenaline and as part of a balanced, healthy and sustainable mental and physical lifestyle.

How To Say “This Is Crap” In Different Cultures – by Erin Meyer

It was Willem’s turn. One of the Dutch participants, he recounted an uncomfortable snafu when working with Asian clients. “How can I fix this relationship?” Willem asked his group of international peers.

Maarten, the other Dutch participant who knew Willem well, jumped in with his perspective. “You are inflexible and can be socially ill-at-ease. That makes it difficult for you to communicate with your team,” he asserted. As Willem listened, I could see his ears turning red (with embarrassment or anger? I wasn’t sure) but that didn’t seem to bother Maarten, who calmly continued to assess Willem’s weaknesses in front of the entire group. Meanwhile, the other participants — all Americans, British and Asians — awkwardly stared at their feet.

That evening, we had a group dinner at a cozy restaurant.  Entering a little after the others, I was startled to see Willem and Maarten sitting together, eating peanuts, drinking champagne, and laughing like old friends. They waved me over, and it seemed appropriate to comment, “I’m glad to see you together. I was afraid you might not be speaking to each other after the feedback session this afternoon.”

Willem, with a look of surprise, reflected, “Of course, I didn’t enjoy hearing those things about myself. It doesn’t feel good to hear what I have done poorly. But I so much appreciated that Maarten would be transparent enough to give me that feedback honestly. Feedback like that is a gift. Thanks for that, Maarten,” he added with an appreciative smile.

I thought to myself, “This Dutch culture is . . . well . . . different from my own.”

Managers in different parts of the world are conditioned to give feedback in drastically different ways. The Chinese manager learns never to criticize a colleague openly or in front of others, while the Dutch manager learns always to be honest and to give the message straight. Americans are trained to wrap positive messages around negative ones, while the French are trained to criticize passionately and provide positive feedback sparingly.

One way to begin gauging how a culture handles negative feedback is by listening to the types of words people use. More direct cultures tend to use what linguists call upgraders, words preceding or following negative feedback that make it feel stronger, such as absolutely, totally, or strongly: “This is absolutely inappropriate,” or “This is totally unprofessional.”

By contrast, more indirect cultures use more downgraders, words that soften the criticism, such as kind of, sort of, a little, a bit, maybe, and slightly. Another type of downgrader is a deliberate understatement, such as “We are not quite there yet” when you really mean “This is nowhere close to complete.” The British are masters at it.  The “Anglo-Dutch Translation Guide”, which has been circulating in various versions on the Internet, illustrates the miscommunication that can result.

[Anglo-Dutch Translation Guide table]
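
The upgrader/downgrader distinction is mechanical enough to sketch in code. The following is a toy illustration only, with word lists limited to the examples quoted above; real linguistic analysis is of course far subtler.

```python
# Toy sketch: flag the upgraders and downgraders in a piece of feedback.
# Word lists contain only the examples quoted in this article.
UPGRADERS = ("absolutely", "totally", "strongly")
DOWNGRADERS = ("kind of", "sort of", "a little", "a bit", "maybe", "slightly")

def tone_markers(feedback: str) -> dict[str, list[str]]:
    """Return the upgrader and downgrader phrases found in the text."""
    text = feedback.lower()
    return {
        "upgraders": [w for w in UPGRADERS if w in text],
        "downgraders": [w for w in DOWNGRADERS if w in text],
    }

print(tone_markers("This is absolutely unprofessional."))
# {'upgraders': ['absolutely'], 'downgraders': []}
print(tone_markers("Maybe we are not quite there yet."))
# {'upgraders': [], 'downgraders': ['maybe']}
```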

Germans are rather like the Dutch in respect of directness and interpret British understatement very similarly. Marcus Klopfer, a German client, described to me how a misunderstanding with his British boss almost cost him his job:

In Germany, we typically use strong words when complaining or criticizing in order to make sure the message registers clearly and honestly. Of course, we assume others will do the same. My British boss during a one-on-one “suggested that I think about” doing something differently. So I took his suggestion: I thought about it, and decided not to do it. Little did I know that his phrase was supposed to be interpreted as “change your behavior right away or else.” And I can tell you I was pretty surprised when my boss called me into his office to chew me out for insubordination!

I learned to ignore all of the soft words surrounding the message when listening to my British teammates. Of course, the other lesson was to consider how my British staff might interpret my messages, which I had been delivering as “purely” as possible with no softeners whatsoever. I realize now that when I give feedback in my German way, I may actually use words that make the message sound as strong as possible without thinking much about it. I’ve been surrounded by this “pure” negative feedback since I was a child.

All this can be interesting, surprising, and sometimes downright painful when you are leading a global team: as you Skype with your employees in different cultures, your words will be magnified or minimized significantly based on your listener’s cultural context. So you have to work to understand how your own way of giving feedback is viewed in other cultures. As Klopfer reported:

Now that I better understand these cultural tendencies, I … soften the message when working with cultures less direct than my own. I start by sprinkling the ground with a few light positive comments and words of appreciation. Then I ease into the feedback with “a few small suggestions.” As I’m giving the feedback, I add words like “minor” or “possibly.” Then I wrap up by stating that “This is just my opinion, for whatever it is worth,” and “You can take it or leave it.” The elaborate dance is quite humorous from a German’s point of view … but it certainly gets [the] desired results!

What about you? Where do you think your own culture falls in this regard?   If I need to tell you your work is total crap, how would you like me to deliver the message?


Erin Meyer is a professor specializing in cross-cultural management at INSEAD, where she is the program director for two executive education programs: Managing Global Virtual Teams and Management Skills for International Business.  She is the author of The Culture Map: Breaking Through the Invisible Boundaries of Global Business (PublicAffairs, June 2014).  Follow her on Twitter: @ErinMeyerINSEAD

Lecture Me. Really. – by Molly Worthen

BEFORE the semester began earlier this fall, I went to check out the classroom where I would be teaching an introductory American history course. Like most classrooms at my university, this one featured lots of helpful gadgets: a computer console linked to an audiovisual system, a projector screen that deploys at the touch of a button and USB ports galore. But one thing was missing. The piece of technology that I really needed is centuries old: a simple wooden lectern to hold my lecture notes. I managed to obtain one, but it took a week of emails and phone calls.

Perhaps my request was unusual. Isn’t the old-fashioned lecture on the way out? A 2014 study showed that test scores in science and math courses improved after professors replaced lecture time with “active learning” methods like group work — prompting Eric Mazur, a Harvard physicist who has long campaigned against the lecture format, to declare that “it’s almost unethical to be lecturing.” Maryellen Weimer, a higher-education blogger, wrote: “If deep understanding is the objective, then the learner had best get out there and play the game.”

In many quarters, the active learning craze is only the latest development in a long tradition of complaining about boring professors, flavored with a dash of that other great American pastime, populist resentment of experts. But there is an ominous note in the most recent chorus of calls to replace the “sage on the stage” with student-led discussion. These criticisms intersect with a broader crisis of confidence in the humanities. They are an attempt to further assimilate history, philosophy, literature and their sister disciplines to the goals and methods of the hard sciences — fields whose stars are rising in the eyes of administrators, politicians and higher-education entrepreneurs.

In the humanities, there are sound reasons for sticking with the traditional model of the large lecture course combined with small weekly discussion sections. Lectures are essential for teaching the humanities’ most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship.

Today’s vogue for active learning is nothing new. In 1852, John Henry Newman wrote in “The Idea of a University” that true learning “consists, not merely in the passive reception into the mind of a number of ideas hitherto unknown to it, but in the mind’s energetic and simultaneous action upon and towards and among those new ideas.” The lecture course, too, has always had skeptics. In his 1869 inaugural address as president of Harvard University, Charles Eliot warned that “the lecturer pumps laboriously into sieves. The water may be wholesome, but it runs through. A mind must work to grow.”

Eliot was a chemist, so perhaps we should take his criticisms with a grain of salt. In the humanities, a good lecture class does just what Newman said: It keeps students’ minds in energetic and simultaneous action. And it teaches a rare skill in our smartphone-app-addled culture: the art of attention, the crucial first step in the “critical thinking” that educational theorists prize.

Those who want to abolish the lecture course do not understand what a lecture is. A lecture is not the declamation of an encyclopedia article. In the humanities, a lecture “places a premium on the connections between individual facts,” Monessa Cummins, the chairwoman of the classics department and a popular lecturer at Grinnell College, told me. “It is not a recitation of facts, but the building of an argument.”

Absorbing a long, complex argument is hard work, requiring students to synthesize, organize and react as they listen. In our time, when any reading assignment longer than a Facebook post seems ponderous, students have little experience doing this. Some research suggests that minority and low-income students struggle even more. But if we abandon the lecture format because students may find it difficult, we do them a disservice. Moreover, we capitulate to the worst features of the customer-service mentality that has seeped into the university from the business world. The solution, instead, is to teach those students how to gain all a great lecture course has to give them.

When Kjirsten Severson first began teaching philosophy at Clackamas Community College in Oregon, she realized that she needed to teach her students how to listen. “Where I needed to start was by teaching them how to create space in their inner world, so they could take on this argument on a clean canvas,” she told me. She assigns an excerpt from Rebecca Shafir’s “The Zen of Listening” to help students learn to clear their minds and focus. This ability to concentrate is not just a study skill. As Dr. Cummins put it, “Can they listen to a political candidate with an analytical ear? Can they go and listen to their minister with an analytical ear? Can they listen to one another? One of the things a lecture does is build that habit.”

Listening continuously and taking notes for an hour is an unusual cognitive experience for most young people. Professors should embrace — and even advertise — lecture courses as an exercise in mindfulness and attention building, a mental workout that counteracts the junk food of nonstop social media. More and more of my colleagues are banning the use of laptops in their classrooms. They say that despite initial grumbling, students usually praise the policy by the end of the semester. “I think the students value a break from their multitasking lives,” Andrew Delbanco, a professor of American Studies at Columbia University and an award-winning teacher, told me. “The classroom is an unusual space for them to be in: Here’s a person talking about complicated ideas and challenging books and trying not to dumb them down, not playing for laughs, requiring 60 minutes of focused attention.”

Holding their attention is not easy. I lecture from detailed notes, which I rehearse before each class until I know the script well enough to riff when inspiration strikes. I pace around, wave my arms, and call out questions to which I expect an answer. When the hour is done, I’m hot and sweaty. A good lecturer is “someone who conveys that there’s something at stake in what you’re talking about,” Dr. Delbanco said. Or as Ms. Severson told me, “I’m a pretty shy person, but when I lecture, there’s a certain charisma. This stuff matters to me — it saved my life.”

Good lecturers communicate the emotional vitality of the intellectual endeavor (“the way she lectured always made you make connections to your own life,” wrote one of Ms. Severson’s students in an online review). But we also must persuade students to value that aspect of a lecture course often regarded as drudgery: note-taking. Note-taking is important partly for the record it creates, but let’s be honest. Students forget most of the facts we teach them not long after the final exam, if not sooner. The real power of good notes lies in how they shape the mind.

“Note-taking should be just as eloquent as speaking,” said Medora Ahern, a recent graduate of New Saint Andrews College in Idaho. I tracked her down after a visit there persuaded me that this tiny Christian college has preserved some of the best features of a traditional liberal arts education. She told me how learning to take attentive, analytical notes helped her succeed in debates with her classmates. “Debate is really all about note-taking, dissecting your opponent’s idea, reducing it into a single sentence. There’s something about the brevity of notes, putting an idea into a smaller space, that allows you psychologically to overcome that idea.”

Technology can be a saboteur. Studies suggest that taking notes by hand helps students master material better than typing notes on a laptop, probably because most find it impossible to take verbatim notes with pen and paper. Verbatim transcription is never the goal: Students should synthesize as they listen.

This is not a “passive” learning experience, and it cannot be replicated by asking students to watch videotaped lectures online: the temptations of the Internet, the safeguard of the rewind button and the comforts of the dorm-room sofa are deadly to the attention span. But note-taking is not a skill professors can take for granted. We must teach it. Dr. Cummins assigns one student in each day’s class the task of not only taking notes, but also presenting a critique of her argument at the next class meeting.

This kind of work prepares students to succeed in the class format that so many educators, parents and students fetishize: the small seminar discussion. A lecture course teaches students that listening is not the same thing as thinking about what you plan to say next — and that critical thinking depends on mastery of facts, not knee-jerk opinions. “We don’t want to pretend that all we have to do is prod the student and the truth will come out,” Dr. Delbanco told me.

Such words of caution are deeply unfashionable. But humanists have been beating back calls to update our methods, to follow the lead of the sciences, for a very long time. One hundred and sixty years ago, when education reformers proposed training students only in the sciences or “temporal callings,” John Henry Newman defended the humanities as a repository of moral and cultural knowledge, but also as crucial disciplines for teaching a student how to think, “to disentangle a skein of thought, to detect what is sophistical, and to discard what is irrelevant.” Such a student learns “when to speak and when to be silent,” Newman wrote. “He is able to converse, he is able to listen.”

Molly Worthen is the author, most recently, of “Apostles of Reason: The Crisis of Authority in American Evangelicalism,” an assistant professor of history at the University of North Carolina, Chapel Hill, and a contributing opinion writer.

Decoding the Rules of Conversation – by Pamela Druckerman

My kids have recently picked up a worrying French slang word: bim (pronounced “beam”). It’s what children say in the schoolyard here after they’ve proved someone wrong, or skewered him with a biting remark. English equivalents like “gotcha” or “booyah” don’t carry the same sense of gleeful vanquish, and I doubt British or American kids use them quite as often.

As an American married to an Englishman and living in France, I’ve spent much of my adult life trying to decode the rules of conversation in three countries. Paradoxically, these rules are almost always unspoken. So much bubbles beneath what’s said, it’s often hard to know what anyone means.

I had a breakthrough on French conversation recently, when a French sociologist suggested I watch “Ridicule,” a 1996 French movie (it won the César award for best film) about aristocrats at the court of Versailles, on the eve of the French Revolution.

Life at Versailles was apparently a protracted battle of wits. You gained status if you showed “esprit” — clever, erudite and often caustic wit, aimed at making rivals look ridiculous. The king himself kept abreast of the sharpest remarks, and granted audiences to those who made them. “Wit opens every door,” one courtier explained.

If you lacked “esprit” — or suffered from “l’esprit de l’escalier” (thinking of a comeback only once you had reached the bottom of the staircase) — you’d look ridiculous yourself.

Granted, France has changed a bit since Versailles. But many modern-day conversations — including the schoolyard cries of “Bim!” — make more sense once you realize that everyone around you is in a competition not to look ridiculous. When my daughter complained that a boy had insulted her during recess, I counseled her to forget about it. She said that just wouldn’t do: To save face, she had to humiliate him.

Many children train for this at home. Where Americans might coo over a child’s most inane remark, to boost his confidence, middle-class French parents teach their kids to be concise and amusing, to keep everyone listening. “I force him or her to discover the best ways of retaining my attention,” the anthropologist Raymonde Carroll wrote in her 1987 book “Cultural Misunderstandings: The French-American Experience.”

This is probably worse in Paris, and among the professional classes. But a lot of French TV involves round-table discussions in which well-dressed people attempt to land zingers on one another. Practically every time I speak up at a school conference, a political event or my apartment building association’s annual meeting, I’m met with a display of someone else’s superior intelligence. (Adults don’t actually say “bim,” they just flash you a satisfied smile.) Jean-Benoît Nadeau, a Canadian who co-wrote a forthcoming book on French conversation, told me that the penchant for saying “no” or “it’s not possible” is often a cover for the potential humiliation of seeming not to know something. Only once you trust someone can you turn down the wit and reveal your weaknesses, he said. (I think the French obsession with protecting private life comes from the belief that everyone’s entitled to a humiliation-free zone.)

It’s dizzying to switch to the British conversational mode, in which everyone’s trying to show they don’t take themselves seriously. The result is lots of self-deprecation and ironic banter. I’ve sat through two-hour lunches in London waiting for everyone to stop exchanging quips so the real conversation could begin. But “real things aren’t supposed to come up,” my husband said. “Banter can be the only mode of conversation you ever have with someone.”

Even British courtships can be conducted ironically. “ ‘You’re just not my type,’ uttered in the right tone and in the context of banter, can be tantamount to a proposal of marriage,” the anthropologist Kate Fox writes in “Watching the English.”

Being ridiculous is sometimes required. The classic British hen night — a bachelor party for brides — involves groups of women wearing feather boas to a bar, then daring one another to “kiss a bald man” or “remove your bra without leaving the room.” Stumbling around drunk with friends — then recounting your misadventures for months afterward — is a standard bonding ritual.

After being besieged by British irony and French wit, I sometimes yearn for the familiar comfort of American conversations, where there are no stupid questions. Among friends, I merely have to provide reassurance and mirroring: No, you don’t look fat, and anyway, I look worse.

It might not matter what I say, since some American conversations resemble a succession of monologues. A 2014 study led by a psychologist at Yeshiva University found that when researchers crossed two unrelated instant-message conversations, as many as 42 percent of participants didn’t notice. A lot of us — myself included — could benefit from a basic rule of improvisational comedy: Instead of planning your next remark, just listen very hard to what the other person is saying. Call it “mindful conversation,” if you like. That’s what the French tend to do — even if it ends with “bim.”