Choose the Right Metrics

What gets measured gets managed. Make sure you measure the right things, identify vanity metrics, and perform sanity checks.
December 15, 2018

Chances are good that your business is awash in data, especially if we at Fivetran have done our job. You use data to produce actionable insights, measuring and assessing various events and activities related to the conduct of your business. Maybe you have so thoroughly mastered the use of data that you use it to train machine learning models. Product roadmaps, hiring decisions and other pursuits should all be dictated by the numbers because numbers don’t lie. Right?

Be Careful What You Ask for

Suppose you find that, over the past year, your website’s unique monthly visitors have grown from 30,000 to 250,000, and you observe similar trends in social media likes and follows and in time on-site. You’d be ecstatic — shouldn’t a wider mouth of the funnel always be a good thing?

What we have just described are vanity metrics — numbers that superficially indicate success but are not tied to your business’s revenues or customer satisfaction. Growth in vanity metrics is not necessarily a good thing; apparent growth without regard for sustainability and retention can be dangerous. More fundamentally, it is dangerous to fail to connect your metrics to revenues or, worse still, to misconstrue the problem your team is supposed to solve.

Choosing bad metrics isn’t just a rookie mistake. One of the better-known stories of metrics misfires is that of Microsoft’s Bing, which used searches per user session as its key performance indicator (KPI). On the surface, such a measure seems entirely reasonable. Industry leader Google processes over 40,000 searches per second. Shouldn’t more searches per session mean more engagement, as well as more interactions with ads?

In practice, designating searches per user session as the team’s KPI perversely incentivized the developers to add features that kept users clicking, sometimes at the expense of exposing the most important search results. The Bing team forgot that the goal of a search engine is to return good results the first time, not to keep users searching. Google’s success stems from the fact that users seldom look past the first page because the search ends in a find.

The Bing debacle wasted a lot of time, money and effort, but it’s easy to imagine even higher stakes in sensitive industries like healthcare, education, finance and insurance. History is replete with examples of poorly conceived metrics and perverse incentives leading to ecological and humanitarian catastrophe. This phenomenon is often called the “Cobra Effect,” after an apocryphal story about a cobra infestation in Delhi. British colonial administrators instituted a bounty on cobra skins, prompting enterprising Indians to breed cobras for the reward. When the administrators recognized their error, they compounded it by suspending the bounty, leading to an “excess inventory” of venomous snakes turned loose in the city, worsening the problem.

Sanity Check Using Customer Satisfaction

Eventually, the Bing team was steered in the right direction by the stubborn refusal of customer satisfaction ratings to rise alongside the KPI. Customer satisfaction is typically measured using a metric called Net Promoter Score (NPS), which asks customers how likely they are, on a scale of 0 to 10, to recommend a product. NPS is calculated by subtracting the percentage of detractors (ratings of 0 to 6) from the percentage of promoters (ratings of 9 or 10). Because it captures customer sentiment directly, it is almost never conceptually flawed as a measure.
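
To make the arithmetic concrete, here is a minimal sketch of the NPS calculation in Python; the survey responses and the function name are illustrative assumptions, not output from any particular survey tool.

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 survey ratings.

    Promoters rate 9 or 10, detractors rate 0 through 6; NPS is the
    percentage of promoters minus the percentage of detractors.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical survey: 5 promoters, 2 passives (7-8), 3 detractors
responses = [10, 9, 9, 10, 9, 7, 8, 3, 5, 6]
print(net_promoter_score(responses))  # 50% promoters - 30% detractors = 20.0
```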

Though it might pain a quantitatively minded person to hear, Bing could have avoided its problems early on by observing people as they used search engines, or simply asking them about their experiences.

That said, there was a large kernel of truth in the Bing team’s initial KPI. For a search engine, it is better for users to perform more searches, but only if they are successful searches, and successful searches are unlikely to pile up within a single session, because a successful search ends the visit. The correct thing to measure, as the Bing team eventually determined, is the number of sessions per user, a subtle but extremely important difference.
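
To see how the two measures diverge, here is a rough sketch over a made-up search log; the user IDs, session IDs and log structure are assumptions for illustration, not Bing’s actual telemetry.

```python
from collections import defaultdict

# Hypothetical search log: one (user_id, session_id) pair per search.
events = [
    ("alice", "s1"), ("alice", "s1"), ("alice", "s1"),  # one frustrating session
    ("bob", "s2"), ("bob", "s3"), ("bob", "s4"),        # three quick, successful visits
]

searches_per_session = defaultdict(int)
sessions_per_user = defaultdict(set)
for user, session in events:
    searches_per_session[session] += 1
    sessions_per_user[user].add(session)

avg_searches_per_session = sum(searches_per_session.values()) / len(searches_per_session)
avg_sessions_per_user = sum(len(s) for s in sessions_per_user.values()) / len(sessions_per_user)

print(avg_searches_per_session)  # 1.5 -- inflated by Alice's struggle
print(avg_sessions_per_user)     # 2.0 -- lifted by Bob's repeat visits
```

Under the original KPI, Alice’s frustrating session looks like engagement; under sessions per user, it is Bob’s habit of coming back that gets rewarded.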

Your Job Is to Solve Problems and Make Money

The metrics you use should generally provide evidence that your team is making money, or more importantly, successfully solving a specific problem for your customers. Metrics that are not clearly related to the two objectives of making money and solving problems will foster incentives that are at best tangential to your goals, and at worst in complete opposition to them.

Vanity metrics aren’t all bad — they are at least loosely associated with scale and success, and can provide something of a “wow” factor to a casual observer. More is generally better than less. But to have actionable insights, you have to think beyond raw magnitudes and connect your metrics to the bottom line.
