Why Experienced Managers Make Bad Talent Decisions

It seems no one in HR has escaped the story of Moneyball: The Art of Winning an Unfair Game.

For those few who haven’t, however, it is the story of how Billy Beane, as the general manager of the Oakland Athletics baseball team, challenged conventional methods of player evaluation. Supported by carefully interpreted statistical data, Beane focused on landing talent the “old school” experts had overlooked.

In the end, the team won with a payroll less than half that of their division rivals, a true story chronicled by Michael Lewis in the book-turned-Hollywood film.

Yes, the story is about effective baseball team management, and most readers have focused on the team building, performance management, or Big Data aspects of the story. But it also demonstrates how the human brain is wired to misjudge talent.

As academics Richard Thaler and Cass Sunstein write in this book review, Lewis is actually illustrating a central finding in cognitive psychology. When Lewis states that the “professionals were unduly affected by how a player had performed most recently, even though recent performance is not always a good guide,” he is really talking about heuristics (more on that later).

Even the best, most experienced managers can make poor judgment calls about individuals. It’s only human nature, after all: Our natural propensity for bias can explain why an applicant is less likely to be admitted to medical school when interviewed on a rainy day or why taller presidential candidates win.

This post will cover the science behind faulty decision-making, and tips for adopting a more analytical approach as a way of overcoming cognitive biases.

The Human Brain: A Tale of Two Systems

As Nobel Prize-winning psychologist Daniel Kahneman writes in Thinking, Fast and Slow, the brain is dominated by two major “characters”: System 1 (fast thinking) is the domain of intuitive responses, and System 2 (slow thinking) is the domain of conscious, effortful thought.

To understand how System 1 works, take a look at the following image:

[Image: an angry face]

With minimal effort, you probably came to the conclusion that this person is angry and is about to say — or do — something quite negative. That’s System 1 at work: it’s automatic, and can help you efficiently read other people in social situations. It includes innate skills and activities that have become automatic through repeated practice (like driving a car or riding a bike).

Now, to understand how the other system works, do this simple math problem without a calculator:

4 + 4 − 2 × 5 + 9 ÷ 2 × 3 + 1 = X

To solve this math problem, you likely engaged System 2, which required you to construct thoughts in an orderly series of steps.
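(If you worked it through with the standard order of operations: 2 × 5 = 10 and 9 ÷ 2 × 3 = 13.5, so X = 4 + 4 − 10 + 13.5 + 1 = 12.5.)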

When everything is working well, the intuitive response system (a cluster of mental shortcuts) calls on the domain of conscious thought for help when needed. However, System 1 is prone to biases that can lead to systematic deviations from logic, probability, or rational choice. It cannot be turned off, and it causes problems when it is the only mechanism used for high-stakes, complex decision-making.

How “Quick and Dirty Rules of Thumb” Impact Talent Decisions

Intuitive judgments can be wrong because they are often driven by one of several heuristics (quick and dirty rules of thumb), as identified by Kahneman and his colleague Amos Tversky.

Heuristics can have an impact on the decisions managers make about people in several ways:

Representativeness Heuristic
The representativeness heuristic is used when making judgments about the probability of an event under uncertainty. We estimate the likelihood of an event by comparing it to a prototype that already exists in our minds.

This heuristic can help explain gender bias — why people think a man or a woman would be better in a certain role. In fact, major symphony orchestras have found that more women are hired when they conduct auditions with the candidates hidden behind a screen.

Availability Heuristic
Are you more concerned about dying in an airplane crash than getting in a car accident? That’s the availability heuristic at work, which makes people overestimate the likelihood of an infrequent, yet highly-publicized event.

This heuristic is one of the reasons why people are more easily influenced by a sensational story than by a large body of statistical evidence. It can explain why a manager wants to promote a salesperson based on the most recent successful deal while ignoring a person in the same role with a stronger, steadier history of success.

Anchoring Heuristic
The anchoring effect happens when people consider a particular value for an unknown quantity before estimating what the quantity should be. For example, people are often influenced by the asking price for a home before they assess the true value of the property.

When crafting a counteroffer for an employee who has been offered a job with another firm, managers often focus solely on whether they can match the gap between the current salary and the offered salary. The offered salary increase “anchors” the response on that number alone, rather than on alternatives such as increased variable pay, promotions, special projects or other highly valued — but less costly — options.

At the end of the day, why do these heuristics fail us? Combined with our tendency to draw on past experience, they take whatever information is readily available. They lead people to jump to conclusions, constructing coherent stories out of very little evidence. And they work behind the scenes, without people consciously knowing it.

Overcoming Cognitive Bias With Big Data

Every day, managers make costly, time-sensitive people decisions: An employee has been offered a job at a different company; what pay raise is justified to retain her? Which employee should I consider for the new manager position? I’m worried that an employee is a flight risk – is he?

When these decisions are made based on intuition alone, the result is wasted spending, as well as performance, productivity, and engagement issues. Total human capital costs average nearly 70% of the operating expenses of most organizations. The stakes are simply too high to leave these decisions up to cognitive bias.

This is where the domain of conscious, effortful thought as described by Kahneman comes in.

For example, let’s say a manager is concerned that the organization is leaking talent because one person just resigned. As Ian Cook describes in this blog post, instead of jumping to conclusions that turnover is running rampant due to compensation or other reasons, this is an opportunity to engage in more analytical thinking.

Essentially, it involves the following steps:

  • identify the problem
  • ask key questions
  • uncover trends
  • analyze the situation
  • develop options
  • evaluate alternatives
  • select preferred alternative

The best way to support this kind of thinking is to look for and include data in the process. Workforce data is inherently Big Data — it has high volume, variety, and velocity. Gathering and analyzing this data can take time. But in-memory analytics advances have resulted in vastly shortened query response times (about 250,000 times faster). People don’t have to let reports run overnight anymore.
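To make the steps above concrete, here is a minimal sketch of what “uncover trends” and “analyze the situation” can look like in practice. It uses Python with pandas, and the department names and numbers are entirely hypothetical; the point is simply that a few lines of analysis can show whether turnover is genuinely rampant, and where it is concentrated, before anyone jumps to a conclusion.

```python
import pandas as pd

# Hypothetical exit records and headcounts; the departments and figures
# below are illustrative only, not real data.
exits = pd.DataFrame({
    "department": ["Sales", "Sales", "Engineering", "Sales", "Support"],
    "exit_month": ["2016-01", "2016-02", "2016-02", "2016-03", "2016-03"],
})
headcount = pd.DataFrame({
    "department": ["Sales", "Engineering", "Support"],
    "employees": [40, 60, 25],
})

# Uncover trends / analyze the situation: is turnover concentrated anywhere?
exit_counts = exits.groupby("department").size().rename("exits")
rates = headcount.set_index("department").join(exit_counts).fillna(0)
rates["exit_rate"] = rates["exits"] / rates["employees"]

print(rates.sort_values("exit_rate", ascending=False))
```

A small table like the one this prints is often enough to replace a gut reaction (“people are leaving!”) with a specific, testable question (“why is turnover higher in this one team?”).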

One challenge, as identified by Kahneman, is that people often use System 2 approaches to reinforce what they already think they know, instead of using the evidence to reach new conclusions. It is important to combine the wisdom of experts with a culture of hypothesis testing to counter this natural tendency; I cover this topic in more detail in this previous blog post on strategic thinking skills.

The Case For Rational Thinking in a Volatile World

There is something to be said for experience: when a manager has encountered a situation before, he is in a much better position to deal with a similar event efficiently. The problem is that the brain often tricks people into thinking events are similar when they aren’t, and into thinking they know the whole story when they don’t.

Furthermore, there is evidence that, as the pace of change quickens, more and more managers will find themselves in new situations, guiding people who are using skills they themselves have never used before. As described in this McKinsey report, the combination of emerging economies, technology advances, and other forces “will produce change so significant that much of the management intuition that has served us in the past will become irrelevant.”

So, instead of making snap judgments based on little evidence, it will become increasingly important for managers to become aware of bias and to make sound decisions based on evidence, with a little help from their colleagues… and System 2.

Dave Weisbeck

Dave enjoys problems that require both logical and creative solutions, and thus exercise both his left- and right-brain. He started out his career in the 90s writing code as a computer programmer, and then moved on to product management, marketing and general management roles. Dave has a strong background in analytics, having played a key role in the analytics businesses at SAP, Business Objects, and Crystal Decisions. At Visier, he looks after product and market strategy. A proficient do-it-yourselfer (he made his own PVR for fun), Dave’s hobbies include the logical and creative challenges of cooking, home brewing, and photography.
