In his latest book, “The Power of Noticing: What the Best Leaders See,” Harvard's Max Bazerman delves into “motivated blindness,” a term that refers to a systemic failure to notice unethical behavior in others when it’s not in our interest to do so.

Stephanie Mitchell/Harvard Staff Photographer

Seeing what leaders miss

Behavioral psychologist Bazerman goes beyond appearances in new book

Although they may seem disparate at first glance, crises like the Catholic Church clergy sex-abuse scandal, investor Bernie Madoff’s multibillion-dollar Ponzi scheme, and the 2008 global financial meltdown all have at least one commonality: For years, some very smart people failed to notice or act on critical information that could have limited the damage.

More recently, executives at the Veterans Administration and General Motors have been criticized for failing to see and cure corrupt organizational cultures that led to accusations of criminal harm done to patients and consumers by negligent employees.

So why didn’t the leaders of these organizations, or others in similar straits, identify key problems and act before things turned catastrophic?

“They don’t want to see, they can’t see, the organization isn’t designed to see, and there’s other people who are doing their best to keep us from seeing,” said Max Bazerman, the Jesse Isidor Straus Professor of Business Administration at Harvard Business School and co-director of Harvard Kennedy School’s Center for Public Leadership.

Social scientists have long identified our tendency to overlook bad news when it suits us as “motivated blindness,” a term that refers to a systemic failure to notice unethical behavior in others when it’s not in our interest to do so. The condition affects virtually everyone. Even leaders who have gained tremendous success through focus and application in one arena sometimes lack the self-awareness to routinely question whether information on which they’re basing decisions is reliable.

“Research in the field of behavioral ethics has found that when we have a vested self-interest in a situation, we have difficulty approaching that situation without bias, no matter how well-calibrated we believe our moral compass to be. We want to think the best of our kids and spouses and we’re disinclined to speak against those with influence in our offices and occupations,” Bazerman writes in his latest book, “The Power of Noticing: What the Best Leaders See.”

Even with his expertise in behavioral psychology, Bazerman only recently realized that his own noticing skills were “truly terrible.” Hired a few years ago as an expert witness for the Department of Justice in what was to be the largest-ever lawsuit against the tobacco industry, Bazerman says that just before he was due to testify, he felt pressured by the government to water down written testimony he had submitted to the court in which he recommended structural changes to the tobacco industry.

While the request seemed odd and vaguely unsettling, Bazerman, distracted by other stresses and uncertain whether the request was corrupt, didn’t act on those feelings at the time. It wasn’t until six weeks later, after reading that another expert in the case said he too had been pressured to alter his testimony, that Bazerman realized he had failed to grasp the gravity of the situation, possible witness tampering, and that it had called for decisive action.

“We see something that we don’t quite know what to make out of it, we don’t know how to interpret it, we’re already very busy, we don’t think that we would actually be happier if we learned some bad news and we just don’t learn more,” he said about why people tend to brush off difficult information. “So the question is, did I not notice or did I notice and not act? I think that the answer’s often somewhere in the middle.”

The failure to anticipate and then head off what Bazerman calls “predictable surprises” until after trouble has reared its head, as demonstrated by the U.S. airline security breakdown of 9/11 or the New Orleans levee failures during Hurricane Katrina, often stems from a mix of cognitive, organizational, and political causes. A leader may be overconfident in his or her ability to understand and fix a problem, or deliberately ignore warning signs because of financial or political expediency.

One way leaders can overcome a tendency to miss critical clues, Bazerman said, is to develop a “noticing mindset,” frequently asking themselves and others inside and outside their organizations, “What are the critical threats and challenges that we’re ignoring?” Another is to design internal systems, such as auditing, human resources, or sales, to deliver the most useful and accurate data.

“I think lots of organizations make the mistake of hiring a McKinsey or a Bain or a [Boston Consulting Group] and they keep on hiring the same old people over time,” he said. “If you’re going to use outside consultants, don’t allow them to become insiders. One of the things you want from them is to be outsiders, so after a fairly limited amount of time … there may be some wisdom of getting rid of them and bringing in a different consultant for the next project so that you maintain that fresh outlook or perspective.”