Our Half-Baked Adoption of OKRs

Many companies adopt the Objectives and Key Results (OKRs) framework.

They create goals but struggle to achieve them.

Only some of this failure can be blamed on the goals themselves.

Part of this failure can be attributed to the absence of an analytic culture and an analytic mindset among your key employees.

Is your culture delivery-first or analytics-first?

Without a change in mindset and culture, any adoption of OKRs (or any other framework) is half-baked.

Look for these symptoms that your adoption of OKRs is not on track:

  • Layering OKRs on top of roadmaps with deadlines

  • Teams not involved in data analysis

  • Unable to embrace uncertainty

  • More time spent crafting key results than discussing data

  • Negative employee feedback around analytics

  • Poor execution of OKRs

  • How to change

Layering OKRs on top of roadmaps with deadlines

A symptom of a half-baked OKR adoption is creating key results but continuing to focus more on roadmaps with deadlines.

Adopting OKRs requires adopting uncertainty into company culture.

But uncertainty is often unacceptable to company leaders.

To avoid uncertainty, companies usually keep their existing processes and layer OKRs on top.

Unfortunately, roadmaps with deadlines and achieving key results are not compatible.

Leaders will still spend most of their time monitoring progress to launch (output) instead of progress to the key result (outcome).

Teams, when given a deadline, will choose to hit that deadline.

It’s much safer to say:

  • “We hit the deadline”

than…

  • “We are delaying the launch to build the feature we think will achieve the key result”

or…

  • “We need to keep iterating to make more progress towards our key result”

Achieving a key result often takes multiple launches.

In half-baked OKR adoptions, leaders and teams don’t plan for the necessary iterations to achieve a key result.

The definition of done is more complicated when using OKRs.

Are you finished when the team achieves 80% of their key result?

These are the hard questions that the OKRs framework is meant to drive.

Sticking to the simple question of whether a team hit a deadline is the easy way out.

Teams not involved in data analysis

To create an analytic mindset in your key employees, it is important to have them participate in analyzing data rather than just receive findings from a data analyst.

We tend to believe ideas that we come up with.

We tend to discount ideas that are given to us.

Knowing this, the process of analyzing data becomes just as important as understanding the findings from the analysis itself.

Here’s how the analysis process builds up analytics awareness.

When a team strives for a key result, they usually realize they need to collect more data.

To collect data, they need to allocate engineering time towards data tasks (think about spending 10% of engineering time on analytics instrumentation, collection, and analysis).

Once a team collects the data, they need to get access to the data (logins, permissions, data dumps, etc).

With access, they analyze this new data and realize there are problems with both the quality and quantity of the data to resolve (too much? too little? not quite what they wanted…).

A team that owns their data analysis is more likely to prioritize fixing these data issues.
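As a sketch of what owning the collection and analysis can look like in practice — the event names and helper functions here are hypothetical, not a specific analytics product:

```python
from collections import Counter
from datetime import datetime, timezone

# Hypothetical in-memory event log; a real team would send these
# events to an analytics pipeline or data warehouse instead.
EVENTS = []

def track(user_id, event_name):
    """Record one analytics event with a timestamp."""
    EVENTS.append({
        "user": user_id,
        "event": event_name,
        "ts": datetime.now(timezone.utc).isoformat(),
    })

def event_counts():
    """The kind of first-pass analysis a team can run on its own data."""
    return Counter(e["event"] for e in EVENTS)

# Simulated usage: instrument two product actions.
track("u1", "onboarding_started")
track("u1", "onboarding_completed")
track("u2", "onboarding_started")

print(event_counts())
```

A team that writes and runs even this much of its own instrumentation quickly notices what is missing — here, for example, nothing captures *why* the second user dropped off — which is exactly the prioritization effect described above.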

Most people envision the process of achieving OKRs as a simple loop: set a key result, ship, and measure.

However, it’s more involved and requires more commitment: instrumenting the product, collecting the data, getting access to it, fixing quality issues, and only then analyzing progress toward the key result.

As teams take responsibility for all the steps in data analysis, they are more likely to take an analytics-first approach to decision making.

They are more likely to make decisions based on analysis they do themselves rather than on analysis handed to them by someone else.

Unable to embrace uncertainty

As noted above, adopting the Objectives and Key Results (OKRs) framework requires adopting uncertainty into company culture, but uncertainty is often unacceptable to company leaders.

To avoid it, companies usually keep their existing processes and add OKRs on top.

Unfortunately, this leaves no room in the schedule to make sure OKRs succeed.

To avoid a half-baked adoption of OKRs, leaders need to shift how everyone spends their time and energy.

Leaders need to listen to individuals and watch closely how well OKRs are being adopted.

More time spent crafting key results than discussing data

The initial excitement of adopting OKRs starts with crafting key results to measure outcomes.

Once created, inertia starts to take over and the energy around OKRs starts to fade.

Then, everyone goes back to focusing more on roadmaps with deadlines.

Reflecting on how we spend our time, what questions we ask others, and what agendas we set for meetings can all help fully adopt OKRs.

Try making these changes to fully embed OKRs in your culture:

  • Take 10% of minutes in existing meetings and devote them to discussing current and past data points

  • Take another 10% of minutes in existing meetings and devote them to discussing future data points and needs

  • Every presentation or narrative created by your team must include a slide or paragraph that speaks to the impact/outcome of the product/feature

  • Require all sprint plans and backlogs to include tasks specifically for collecting, fixing, and analyzing data

  • Mandate 10% of the engineers’ time in every technology launch to be spent on adding/improving analytics technology to collect more and better data

These changes will shift everyone’s focus to outcomes (data) instead of outputs (delivery).

The most common analogy for this shift is when teams moved to mobile-first design.

Before mobile phones dominated Internet usage, most websites were built in a landscape orientation for a large screen.

The inertia of developing for the desktop experience persisted even when the majority of website visits were happening on mobile devices with much smaller, portrait-oriented screens.

Many technology leaders needed to make internal mandates in order for teams to make that switch from desktop-first design to mobile-first design.

Switching from output orientation to outcome orientation requires similar mandates.

Negative employee feedback around analytics

“Analytics are unknown to me”

“I’m not a data person”

These are common feelings among Product and Design professionals.

In organizations where delivery is prized, analytics skills wither.

In large companies, analytics skills are often concentrated in a few employees such as data analysts and data scientists. This is short-sighted.

Leaders should use coaching and training to democratize analytics skills and knowledge within the organization. (I am offering an Analytics Master Class for Product and Design Professionals)

“We don’t have any analytics embedded in our products. We are flying blind.”

“We have analytics tags but they are obscurely named and no one understands what they mean”

In the digital world, analytics act as our eyes and ears.

If data isn’t collected, decisions will be made using “anecdata” and other less reliable forms of data.

Many teams have analytics debt.

Like technical debt, analytics debt accumulates when products lack data collection or have problems in the existing collection.

As engineers advocate for solving technical debt, leaders should advocate for solving analytics debt.

“Too many analytics. Don’t know where to start”

“We’re sitting on top of a big reservoir of data… what do we do with it?”

Too much data can be overwhelming.

Challenge individuals to start slow and find a simple metric to analyze in their subject matter area.

Avoid complicated calculations such as “Activated User” and choose to measure something simpler such as “Number of Visits” or similar.

As they get more comfortable (and accurate), move them toward more complex metrics.
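To make the contrast concrete — with hypothetical event data, and defining a “visit” simply as a distinct user-day, one of several reasonable definitions — a simple metric like Number of Visits needs almost no judgment calls, unlike a multi-condition metric such as “Activated User”:

```python
# Hypothetical raw event log, e.g., an export of page views.
events = [
    {"user": "u1", "day": "2024-03-01"},
    {"user": "u1", "day": "2024-03-01"},  # same visit, second page view
    {"user": "u1", "day": "2024-03-02"},
    {"user": "u2", "day": "2024-03-01"},
]

def number_of_visits(events):
    """Count visits as distinct (user, day) pairs."""
    return len({(e["user"], e["day"]) for e in events})

print(number_of_visits(events))  # 3 distinct user-days
```

A metric this simple lets people practice the full loop — collect, query, sanity-check — before the definition itself becomes a source of debate.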

Poor execution of OKRs

Poor execution often creates a half-baked adoption of OKRs.

Look for these obvious symptoms of poor execution of OKRs:

Disconnected OKRs

  • Leaders don’t create any metrics to connect the top financial goals (revenue, deals closed) to the actual metrics teams can impact (Activated users, etc).

Poorly created OKRs

  • Renaming a feature launch into a Key Result → “Launch new onboarding flow in Q2”

  • Using “X” and “Y” in place of actual numbers → “Increase activated users from X to Y”

  • Using a percentage/% → “Reduce customer churn by 13%” (But what is churn today?)

Percentages sound good until you realize at the end of the quarter that you never measured a baseline.
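A few lines of arithmetic, with made-up numbers, show why a percentage key result is ambiguous without a measured baseline — “reduce churn by 13%” could mean two very different targets:

```python
# Hypothetical baseline, measured before the quarter starts.
customers_start = 1000
churned = 150

baseline_churn = churned / customers_start        # 0.15, i.e., 15%

# "Reduce customer churn by 13%" has two readings:
relative_target = baseline_churn * (1 - 0.13)     # a 13% relative cut
absolute_target = baseline_churn - 0.13           # 13 percentage points off

print(f"baseline churn:  {baseline_churn:.1%}")   # 15.0%
print(f"relative target: {relative_target:.2%}")  # 13.05%
print(f"absolute target: {absolute_target:.1%}")  # 2.0%
```

Without the measured 15% baseline, neither target can even be computed — which is the trap the key result above falls into.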

Conflicts of interest

  • Each functional group has OKRs → Engineering has their own OKRs. Product has their own OKRs. Individuals have their own OKRs

Which OKR takes precedence?

OKRs should be set only for a cross-functional team in order to avoid conflicts.

Too many OKRs

  • Setting multiple OKRs at the same time

What are they supposed to work on first?

Try achieving one OKR before you ever do two at once.

There are situations where multiple OKRs are appropriate such as setting a customer satisfaction OKR while also having a growth OKR. This way you can ensure sustainable growth.

Long time frames

  • Set OKRs for a year

Leaders set yearly goals then change their minds every 3 months.

What a waste.

Use this wasted planning time for customer discovery and improvements in collection and measurement.

How to change

Fully implementing OKRs and creating an analytics-focused culture is not free.

But it has a compounding payback when teams collect more data, improve that data collection, get smarter in analyzing data and make decisions with data, not opinions.

If you want to:

  • Build analytics skills

  • Develop an analytics mindset

  • Shift to an analytics culture

then…

Send individuals to my public Analytics Master Class or have me run a private Analytics Master Class for your company.


Jim coaches Product Management organizations in startups, growth-stage companies, and Fortune 100s.

He's a Silicon Valley founder with over two decades of experience including an IPO ($450 million) and a buyout ($168 million). These days, he coaches Product leaders and teams to find product-market fit and accelerate growth across a variety of industries and business models.

Jim graduated from Stanford University with a BS in Computer Science and currently lectures at University of California, Berkeley in Product Management.
