Dig Deeper: Using Data to Drive Organizational Design Decisions

Data can be intimidating. But with the right story, it can make all the difference in gaining stakeholder buy-in for your proposed organizational transformation.

Two Clearing consultants, Sonya Patel and Nathan Toronto, help organizations optimize their structure around their goals. Today, they tell us how smart use of data is the key to unlocking the right Organizational Design path.

What Is Organizational Design?

Sonya: Organizational Design (OD) is how we think about the design of an organization as it relates to its people and processes. It’s how you structure an organization in a way that reflects its vision and supports its ability to achieve its mission. At The Clearing, we believe there are six factors that contribute to good Organizational Design.

Taken as a whole, these factors provide a framework both for ground-up Organizational Design and for diagnosing and optimizing the design of an established entity – and each factor is awash in data.

Nathan: That last point is important. When we work with clients on OD, one of the challenges we often see is that we’re not presented with a blank canvas. There are certain path dependencies that we must confront. For example, there may be data-related mandates in place that can’t be changed. When such factors come into play, having the right data on hand helps us create novel solutions to optimize Organizational Design despite these limitations.

How Does The Clearing Integrate Data Into OD?

Nathan: It’s so easy for us as humans to reason from the gut – there are a lot of evolutionary processes ingrained in us to push us in that direction. In fact, thinking in terms of data is not a natural human activity. That is why it’s both so hard and so important for organizations and leaders to do so. The first challenge to getting over the data hump is simply understanding how data can make the case for where the organization needs to go. The second challenge is recognizing how data can serve as a shared language between stakeholders of differing backgrounds and experiences.

Of course, data doesn’t tell a story on its own. That’s why effective use of data requires a mix of head and heart (or gut). This requires an observant human to make sense of and distill data down to a relatable story – or, in today’s case, an Organizational Design that resonates with stakeholders.

Sonya: Echoing Nathan, I want to emphasize the importance of ascribing meaning to your data, because the context of how you apply it to Organizational Design is critical. Let’s consider three scenarios around moving a set of employees to a new team. These employees work in your widget factory in the tooling department. However, your machining department is currently a bottleneck. The intention of this theoretical Organizational Design move is to increase machining capacity and overall productivity.

Here’s how data and context help make the case:

Without data, your case for moving these people simply looks like shifting them from one organizational box to another. Without the data, your “why” – or gut feeling – can be picked apart.
With data but without context, your case may appear muddy. Stakeholders may fixate on a certain data point at the expense of the bigger picture.
With data and context, you can tie your reasoning for the move to specific statistics: you can show stakeholders that, following a recent product launch, tooling demand has decreased by 50% while machining demand has risen 50%. Shifting these team members out of tooling won’t leave tooling short of required capacity, and it will meet the machining demand.

In short, scenario three enables you to justify your Organizational Design proposal in a way the heads of tooling, machining, and sales can all understand.
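To make scenario three concrete, here is a minimal sketch of the capacity-versus-demand comparison that data and context make possible. All of the numbers – baseline demand, capacity per person, and the size of the move – are hypothetical illustrations, not figures from the scenario above.

```python
# Toy sketch (hypothetical numbers): compare each department's capacity with
# post-launch demand, before and after moving four people from tooling to machining.
CAPACITY_PER_PERSON = 10  # units of work per person, per week (assumed)

# Baseline demand was 80 units per department; tooling fell 50%, machining rose 50%.
demand = {"tooling": 40, "machining": 120}

headcount = {
    "before move": {"tooling": 8, "machining": 8},
    "after move": {"tooling": 4, "machining": 12},  # the proposed reallocation
}

for scenario, counts in headcount.items():
    for dept, people in counts.items():
        capacity = people * CAPACITY_PER_PERSON
        status = "meets demand" if capacity >= demand[dept] else "bottleneck"
        print(f"{scenario:11s} | {dept:9s}: capacity {capacity:3d} vs demand {demand[dept]:3d} -> {status}")
```

Run as-is, the sketch shows machining as a bottleneck before the move and both departments meeting demand afterward – the kind of simple, shared picture that keeps stakeholders focused on the bigger story rather than a single data point.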

Nathan: We also set clear parameters on how we’ll use a given dataset when undertaking OD work with clients. This is important because there is always the temptation to search for evidence in the data that confirms existing decisions instead of looking at data with an objective eye. So, yes, data can serve as evidence, but it can also be misused as after-the-fact justification. That makes it critical to have institutional controls in place.

This approach allows you to address a problem with the understanding of what the data REALLY means – not what someone wants it to mean. To be blunt, it keeps people from jumping to conclusions and then finding evidence to justify them.

How Does The Clearing Determine Data Needs and Sourcing?

Nathan: We start with the problem and determine what data we need instead of simply focusing on what data we have on hand. This often takes people out of their comfort zone. Here’s an anecdote that speaks to our instinct to stay where we’re comfortable when problem-solving.

There is a man searching for his keys in the cone of lamplight on the street. Somebody comes up and asks him, “Hey, what are you looking for?” The man says, “My keys – I must’ve dropped them.” The person then asks, “Why are you looking there?”  The man answers, “This is where the light is.” But what if his keys are in the dark, 50 feet away?

We have the tendency as humans to economize on cognitive energy when getting to the bottom of problems. One of the ways we do that is by just going to the data that we already have (the lamplight) and finding comfort in that data. However, does that data help us solve the actual problem at hand? Or do we need to get uncomfortable and out in the dark to find the data we really need?

Sonya: I absolutely agree. Organizational Design is an iterative process. Once we understand the problem, we try to understand the data elements. We catalog what data we have and use that to help unpack further what other data elements would be useful. We also endeavor to evaluate a mix of qualitative and quantitative data, which I believe differentiates The Clearing in the OD space.

Nathan: The blending of qualitative and quantitative data is important because different people respond differently to different sorts of evidence. There are some people who, when faced with a decision, will think, “What is the most similar case to this situation, and why did it play out the way it did?” That person will use that case as a script or map for how they should behave or how they should decide. Those types of people tend to be the ones for whom qualitative data is meaningful. There’s an impactful story, they see how it unfolds, and they can rely on it in their next decision.

There’s another type of thinking, which tends to be more structured and skeptical of one-off anecdotes. These people look for hard and fast statistics that lead to answers through numbers instead of the relatively limited experiences of a small group of people. They tend to find value in quantitative data, where they can use large amounts of data to take the macro view of a problem and use that information to inform their decision.

However, as we discussed earlier, data can serve as the shared language that unites these differing stakeholders. Combining the qualitative and quantitative allows us to create a rich theoretical story that resonates with both types of thinkers and informs how they make decisions.

How Does The Clearing Blend Qualitative and Quantitative Data in OD?

Sonya: First, let’s take a look at real-world examples of each type of data in the context of OD. Consider an organization that wants to optimize around customer experience. Qualitative data to meet this goal could include customer focus groups, testimonials, or user product reviews. Quantitative data could include customer satisfaction scores or survey results. By blending the two, we can help this organization develop the right OD strategy to drive customer experience.

Nathan: Now that we have the types of data established, let’s walk through an example from our own work. We worked with a client that had an average of more than 100 days per hiring action. In other words, it took more than three months to hire someone. If you’re not familiar with that cycle, three months is incredibly long. They came to us because they wanted to shorten that turnaround.

As always, we started with understanding the problem. Our first step was gathering qualitative data. We began systematically interviewing department heads, hiring managers, and other employees involved in the hiring process. Through these interviews, we gathered anecdotes pointing to the speed of the customer account managers who handled hiring actions. We heard that some were slower than others, and interviewees saw this as the main cause of delays in the hiring cycle.

Our second step was bringing in the quantitative data. We reviewed the time to hire by the individual customer account manager, which seemed to validate what we heard in our interviews. Some were indeed slower than others. However, we weren’t convinced that was telling the whole story. In fact, it felt like we were only looking in our cone of light instead of the uncomfortable darkness.

To validate our initial findings, we built a regression model using the time to hire by account manager along with a number of other factors, including who the customer hiring manager was, whether a security clearance was involved, whether actions required hiring manager approvals, and more. Multivariate regression let us look at these factors at once and estimate the average effect of each on hiring time while controlling for all the others simultaneously.
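For readers who want to see what this kind of analysis can look like in practice, here is a minimal sketch in Python using statsmodels. The file name, column names, and model specification are hypothetical stand-ins chosen to mirror the factors described above, not the client’s actual data or tooling.

```python
# Minimal sketch of a multivariate regression on hiring-cycle data (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

# Each row is one completed hiring action.
df = pd.read_csv("hiring_actions.csv")  # hypothetical dataset

# days_to_hire regressed on the account manager, the hiring manager,
# whether a security clearance was required, and whether extra
# hiring-manager approvals were needed. C() treats a variable as
# categorical, so each manager gets their own coefficient.
model = smf.ols(
    "days_to_hire ~ C(account_manager) + C(hiring_manager)"
    " + requires_clearance + needs_hm_approval",
    data=df,
).fit()

# Each coefficient is the average effect of that factor on hiring time,
# holding the other factors constant.
print(model.summary())
```

The value of a model like this is exactly the point of the story: a factor that looks decisive in a simple average (account manager speed) can shrink once you control for everything else at the same time.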

What we found was that the biggest factor in the slow hiring cycle was actually the amount of time specific actions sat with the hiring manager, not the speed of the account manager. That blew us away. The opportunity this analysis revealed was that our client could educate hiring managers to understand how the hiring process worked—essentially empowering the customer instead of treating them as a series of transactions. Nobody had thought about that. Nobody expected that. But that’s what the solution was. And it was powerful because we had this deeply ingrained story about account managers – intuition seemingly backed up by data. It would’ve been easy to say we had found the answer. But by venturing into the darkness and taking our client outside their comfort zone, we discovered the true issue.

If we had created an OD solution based on that first finding, it may have helped, but it wouldn’t have solved the issue. That’s why looking beyond the existing data is often where real gains are made.

Evaluating Your Organizational Design

If you’ve identified an issue or believe your organization is ready for a redesign, remember to take the data into account – and review it through a neutral lens. If you have questions or want to know more about how The Clearing approaches OD, please reach out. Nathan Toronto, Ph.D. is available at Nathan.toronto@dev2021.theclearing.com; Sonya Patel can be reached at sonya.patel@dev2021.theclearing.com.