What lack of diversity has to say about estimates

What would be your guess for the weight of the ox below? You’re likely not a livestock specialist, so your guess may be way off, and you may think that relying on a specialist would be the most accurate way to estimate its weight. I’d go a bit further and say that averaging the specialists’ guesses with those of regular people would result in an estimate worse than the specialist’s alone, right? Well, that may not be the case.

[Photo of an ox by Macau Photo Agency on Unsplash]

In his book “The Wisdom of Crowds”, James Surowiecki describes an experiment from 1906 in which averaging all 787 guesses of an ox’s weight produced a better estimate than the guesses of so-called specialists. The crowd’s average was only 1 pound off! Surowiecki states that “under the right circumstances, groups are remarkably intelligent, and are often smarter than the smartest people in them.”

The average of a large number of forecasts reliably outperforms the average individual forecast. However, that depends on how independent the participants’ opinions are, a concept known in statistics as “signal independence”. Independence in this context means that the errors in people’s estimates are uncorrelated.

The more correlated people are, the less independent they are, and the more their collective opinion will be biased toward a specific value that may differ from the actual result. Diversity can play an important role in driving that independence.
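
To make the effect concrete, here is a minimal Python sketch. The error sizes and the shared-bias scenario are made-up numbers for illustration; only the 787 guesses and the roughly 1,198-pound ox come from accounts of the 1906 contest. The intuition: averaging n independent errors shrinks them roughly by a factor of √n, but a bias shared by the whole crowd survives averaging untouched.

```python
import numpy as np

rng = np.random.default_rng(42)

TRUE_WEIGHT = 1198   # pounds, per accounts of Galton's 1906 contest
N_GUESSERS = 787     # number of valid guesses in the contest

# Independent crowd: each guesser's error is drawn separately,
# so individual errors tend to cancel out in the average.
independent = TRUE_WEIGHT + rng.normal(0, 150, size=N_GUESSERS)

# Correlated crowd: everyone shares a common bias (say, they all
# heard the same confident opinion beforehand) plus an individual
# error. Averaging cannot remove the shared component.
shared_bias = rng.normal(0, 150)  # one draw, common to the whole crowd
correlated = TRUE_WEIGHT + shared_bias + rng.normal(0, 150, size=N_GUESSERS)

print(f"Independent crowd average: {independent.mean():7.1f} lb "
      f"(off by {abs(independent.mean() - TRUE_WEIGHT):.1f})")
print(f"Correlated crowd average:  {correlated.mean():7.1f} lb "
      f"(off by {abs(correlated.mean() - TRUE_WEIGHT):.1f})")
```

Rerun it with different seeds: the independent crowd lands within a few pounds of the true weight every time, while the correlated crowd’s average wanders wherever the shared bias takes it, no matter how many guessers are added.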

Correlation can originate from different sources. If people discuss a subject in advance, their opinions may not be independent during a meeting with other team members. If they always talk to the same people, even about different topics, their opinions may be correlated. If team members have the same background, that’s another source of correlation.

Diversity in a team brings a variety of ways of thinking, multiple cultural experiences, conflicting ideas that stimulate discussion, and a learning environment that is not possible when people are all alike. That reduces correlation. Diversity is also a key factor when running surveys that support strategic business decisions. Including people of different races, genders, experience levels, religions, sexual orientations, nationalities, and many other dimensions, combined with the proper selection of a large sample, reduces bias and improves accuracy.
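
The sampling point can be illustrated with another short sketch. The population, group sizes, and satisfaction scores below are entirely hypothetical; the point is only that a sample drawn from one homogeneous group is precise yet systematically off, while a sample drawn across the whole population is not.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical population made of two groups with different true opinions
# (e.g. average satisfaction on a 0-10 scale).
group_a = rng.normal(6.0, 1.0, size=8_000)
group_b = rng.normal(8.0, 1.0, size=2_000)
population = np.concatenate([group_a, group_b])

# Survey 1: respondents all come from the same group.
homogeneous = rng.choice(group_a, size=500, replace=False)

# Survey 2: respondents are drawn at random from the whole population.
representative = rng.choice(population, size=500, replace=False)

print(f"True population mean:  {population.mean():.2f}")
print(f"Homogeneous sample:    {homogeneous.mean():.2f}")
print(f"Representative sample: {representative.mean():.2f}")
```

No matter how many respondents the homogeneous survey collects, it keeps estimating group A’s opinion rather than the population’s; diversity in the sample is what lets the estimate converge to the right value.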

While diversity by itself does not guarantee independent opinions, neglecting it will most likely affect your organization’s capacity to come up with estimates that are closer to reality.

References

  • The Wisdom of Crowds, James Surowiecki, 2005
  • People Analytics, Coursera, Wharton School of the University of Pennsylvania

You can also find my other articles on Medium or LinkedIn.

2 must-haves to build a culture of learning

Success can come in many different forms, but if your organization wants to have a chance to continue succeeding, learning must be part of its DNA. Similarly, learning can be achieved by taking different paths (training, coaching, sharing, teaching, …), but most of the opportunities come from day-to-day work since that’s where people spend most of their time. However, practice by itself doesn’t guarantee learning.

Allowing team members to voice their thoughts and building an environment that encourages risk-taking are required to convert experience and practice into learning. However, these can only be achieved if psychological safety and blamelessness are present.

Psychological Safety

Can you think of an occasion at work when you held back your thoughts? If so, why did you do that? Didn’t you feel comfortable sharing them? Were you afraid of saying something ‘wrong’, of some sort of retaliation, or of sounding incompetent? While several factors (introversion, the people in the room, …) can contribute to people not voicing their opinions, lack of psychological safety is definitely a common one.

Psychological safety is defined as a shared belief held by members of a team that the team is safe for interpersonal risk-taking. That means people should feel comfortable sharing their ideas, suggestions, mistakes, and concerns with their teams without fear of being punished. Sometimes reactions as apparently harmless as a giggle or cutting someone off can erode the psychological safety of a team.

Workplaces with low psychological safety can jeopardize teams’ ability to innovate and to learn from each other. New ideas usually come from tiny increments of work through a sequence of several, maybe unintentional, events. A comment may trigger a discussion, which surfaces a problem, which sparks ideas, which may lead to a brand-new product. However, that flow will not happen unless people feel safe to speak up, and without it many learning opportunities are missed. The way an organization handles incidents tells a lot about how much psychological safety is valued in its culture.

Blamelessness

All the services are down and you start receiving tons of notifications. A war room is created and then the ‘fun’ starts. If that gives you goosebumps, you may have had a traumatic experience. If it brings back memories of collaboration, teamwork, and synergy, you may have experienced an incident where the focus was solely on addressing the problem and on preventing it from happening again.

Blamelessness is the notion of shifting responsibility from people to systems and processes, and it fosters psychological safety. It means switching the question from ‘who’ to ‘what’. Teams should assume that people did their best with the knowledge and tools they had at hand. That’s similar to the Prime Directive for Retrospectives, and bringing it up in the first incidents and post-mortems can set the scene for a learning experience instead of a blaming one.

The Bad Apple Theory (or Old View) maintains that:

  • A complex system would be fine assuming you don’t have unreliable people (Bad Apples)
  • Humans are the dominant contributor to accidents
  • Failures come as unpleasant surprises and are introduced by unreliable people

Instead of thinking that we have bad people in safe systems, we should think that we have well-intentioned people in imperfect systems. That mindset will drive your team to learn from failures and build the foundation for people to focus on problem-solving instead of covering their tracks to avoid being blamed later.

Another aspect to watch out for to ensure blameless incidents is hindsight bias. If somebody says “I knew that was happening. It was so obvious!”, that should ring a bell. Hindsight bias is the common tendency to perceive past events as having been more predictable than they actually were. Think of a friend who claims after a game that they knew from the beginning their team would win: there was no way to know that for sure, but they genuinely believe they did. This sort of attitude needs to be purged from any team. A way to handle it without calling someone out in front of others is a 1:1 session explaining why such comments damage the team’s capacity to handle and learn from an incident.

Fundamental attribution error is another call to action if noticed. That’s the tendency to assume that somebody’s actions depend more on the type of person they are than on the environment that influenced those actions. For instance, somebody pushes an update that breaks a system, and a colleague concludes that the person is not reliable because they didn’t graduate from a renowned university. Again, that behavior deviates from the main focus: understanding the problem, fixing it, and learning from it.

Being able to identify and fix these common counterproductive behaviors is required to build an environment where people know they will not be blamed and will feel safe to thrive, take risks, and learn.