Your analysis provides clear data that the campaign was a (glorious) failure.
It could not be clearer.
The KPI you chose for your brand campaign was Trust, with a pre-set target of +5. The post-campaign analysis comparing performance across Test and Control cells shows that Trust did not move at all. (Suspiciously, there are indications that in a handful of Test DMAs it might have gone down!)
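The Test-vs.-Control read described above boils down to a simple subtraction against the pre-set target. Here is a minimal sketch of that calculation; all the numbers are hypothetical stand-ins, not figures from the campaign:

```python
# Hypothetical post-campaign Test vs. Control read for a Brand Trust KPI.
# The scores below are illustrative assumptions, not data from the article.

def trust_lift(test_score: float, control_score: float) -> float:
    """Incremental movement in Trust attributable to the campaign."""
    return test_score - control_score

pre_set_target = 5.0   # the +5 target mentioned above
test_trust = 41.8      # mean Trust score in exposed (Test) DMAs
control_trust = 42.1   # mean Trust score in holdout (Control) DMAs

lift = trust_lift(test_trust, control_trust)
print(f"Trust lift: {lift:+.1f} points (target: +{pre_set_target:.0f})")
if lift < pre_set_target:
    print("Campaign missed its Success KPI target.")
```

With these illustrative numbers the lift is slightly negative — exactly the "it might have gone down" scenario the analysis flagged.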
Every so often, the story is just as simple as that.
You do the best you can with a marketing campaign (creative, audience, targeting, channels, media plan elements like duration, reach, and frequency, media delivery quality elements like AVOC, Viewability, etc.), and sometimes the dice do not roll your way when you measure impact.
You would be surprised to know just how frequently the cause of failure is something that has nothing to do with the elements I mentioned above. In future Premium editions we'll cover a bunch of these causes; today I want to cover one cause that is in your control but is often a root cause of failure:
Judging a fish by its ability to climb a tree!
AKA: You picked the wrong KPI for the campaign.
[Note 1: I’m going to use the phrase Success KPI a lot. To ensure clear focus, clear postmortems, and clear accountability, I recommend identifying one single solitary metric as the Success KPI for the initiative. You can measure seven additional metrics – say, for diagnostic purposes – but there has to be just one Success KPI. Close accountability escape hatches.]
[Note 2: Although the guidance in this article applies to companies/analytics teams of all sizes, it applies in particular to larger companies and large agencies. It is there that the
If there is one thing the universe agrees on, it is that you should not just provide data… You should provide INSIGHTS!!!
In the 807,150 (!) words I’ve written on this blog thus far, at least 400,000 have been dedicated to helping you find insights.
In posts about advanced segmentation, in posts about how to build strategic dashboards that don’t suck, in encouraging you to reimagine how you pick metrics to obsess about using the magnificent Impact Matrix, and on and on and on.
Go for insights!
In time, I’ve come to hate the word insights.
In our world – marketing research and analytics – that word has come to represent data puking.
It has come to represent telling people, with dozens of reports or eighty slides, that water is wet.
I’ve observed, during my work across the world, that when we deliver insights, we mostly deliver to our audiences things in-sight – things they can already see!
As in: the blue line is 20% above the red line. I CAN SEE THAT! Or: the lifetime value of California purchasers is 3x that of those who reside in Georgia. Oh, please, I can also see that in the table with my own eyes.
This, unsurprisingly, ends up being a massive waste of your incredible talent, and an insult to the intelligence of our audience (the people who pay your salary).
This blog post was originally published as an edition of my newsletter TMAI Premium. It is published 50x/year, and shares bleeding-edge thinking about Marketing, Analytics, and Leadership. You can sign up here – all revenues are donated to charity.
The last time I changed jobs, I wanted to change the aspiration of what our talented team and I should shoot
I was reading a paper by a respected industry body that started by flagging head fake KPIs. I love that moniker, head fake.
Likes. Sentiment/Comments. Shares. Yada, yada, yada.
This is great. We can all use the head fake moniker to call out useless activity metrics.
[I would add other head fake KPIs to the list: Impressions. Reach. CPM. Cost Per View. Others of the same ilk. None of them are KPIs, most barely qualify to be a metric because of the profoundly questionable measurement behind them.]
The respected industry body quickly pivoted to lamenting their findings that demonstrate eight of the top 12 KPIs being used to measure media effectiveness are exposure-counting KPIs.
A very good lament.
But then they quickly pivoted to making the case that the Most Important KPIs for Media are ROAS, Exposed ROAS, “Direct Online Sales Conversions from Site Visit” (what?!), Conversion Rate, IVT Rate (invalid traffic rate), etc.
Wait a minute.
Most important KPI?
No siree, Bob! No way.
Take IVT as an example. It is such a niche obsession.
Consider that Display advertising is a tiny part of your budget. A tiny part of that tiny part is likely invalid. It is not a leap to suggest that anointing this barely-a-metric as a KPI is a big distraction from what’s important. Oh, and if your display traffic were so stuffed with invalid traffic that it became a burning platform requiring executive attention… any outcome KPI you are measuring (even something as basic as Conversion Rate) would have told you that already!
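The "tiny part of a tiny part" point is easy to make concrete with back-of-the-envelope arithmetic. All numbers below are hypothetical assumptions chosen only to illustrate the scale:

```python
# Illustrative arithmetic (all numbers hypothetical): how much of the
# total media budget does invalid display traffic actually represent?

total_budget = 1_000_000   # total media budget, $ (assumed)
display_share = 0.10       # display as a share of the budget (assumed)
ivt_rate = 0.02            # share of display traffic that is invalid (assumed)

wasted = total_budget * display_share * ivt_rate
print(f"${wasted:,.0f} of ${total_budget:,.0f} "
      f"— {wasted / total_budget:.1%} of the total budget")
# A tiny part of a tiny part: 0.2% of the total in this scenario.
```

Even doubling both assumed rates leaves invalid traffic under 1% of spend — hardly the Most Important KPI.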
Conversion Rate is obviously a fine metric. Occasionally, I might call it a KPI, but I have never anointed it the Most Important KPI.
In my experience, Most Important
Almost all metrics you currently use have one common thread: They are almost all backward-looking.
If you want to deepen the influence of data in your organization – and your personal influence – 30% of your analytics efforts should be centered around the use of forward-looking metrics.
But first, let’s take a small step back. What is a metric?
Here’s the definition of a metric from my first book:
A metric is a number.
Conversion Rate. Number of Users. Bounce Rate. All metrics.
[Note: Bounce Rate has been banished from Google Analytics 4 and replaced with a compound metric called Engaged Sessions – the number of sessions that lasted 10 seconds or longer, or had 1 or more conversion events or 2 or more page views.]
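The compound definition in the note above can be expressed as a simple predicate. This is a sketch of that logic over a hypothetical session record type of my own invention, not an actual Google Analytics 4 API object:

```python
# Sketch of the Engaged Sessions definition from the note above: a session
# counts as engaged if it lasted 10+ seconds, OR had 1+ conversion events,
# OR had 2+ page views. The Session class is a hypothetical stand-in.

from dataclasses import dataclass

@dataclass
class Session:
    duration_seconds: float
    conversion_events: int
    page_views: int

def is_engaged(s: Session) -> bool:
    return (
        s.duration_seconds >= 10
        or s.conversion_events >= 1
        or s.page_views >= 2
    )

sessions = [
    Session(duration_seconds=4, conversion_events=0, page_views=1),   # not engaged
    Session(duration_seconds=45, conversion_events=0, page_views=1),  # engaged: time
    Session(duration_seconds=3, conversion_events=1, page_views=1),   # engaged: conversion
]
engaged_sessions = sum(is_engaged(s) for s in sessions)
print(engaged_sessions)  # → 2
```

Note how the first session — a classic "bounce" — is simply the complement of an engaged session under this definition.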
The three metrics above are backward-looking. They are telling us what happened in the past. You’ll recognize now that that is true for almost everything you are reporting (if not everything).
But, who does not want to see the future?
Yes. I see your hand up.
The problem is that the future is hard to predict. What’s the quote… No one went broke predicting the past. 🙂
Why use Predictive Metrics? As Analysts, we convert data into insights every day. Awesome. Only some of those insights get transformed into action – for any number of reasons (your influence, quality of insights, incomplete stories, etc. etc.). Sad face.
One of the most effective ways of ensuring your insights will be converted into high-impact business actions is to predict the future.
Consider this insight derived from data:
The Conversion Rate from our Email campaigns is 4.5%, 2x that of Google Search.
Now consider this one:
The Conversion Rate from our Email campaign is
One of the business side effects of the pandemic is that it has put a very sharp light on Marketing budgets. This is a very good thing under all circumstances, but particularly beneficial in times when most companies are not doing so well financially.
There is a sharper focus on Revenue/Profit.
From there, it is a hop, skip, and a jump to, hey, am I getting all the credit I should for the Conversions being driven by my marketing tactics? AKA: Attribution!
Right then and there, your VP of Finance steps in with a, hey, how many of these conversions that you are claiming are ones that we would not have gotten anyway? AKA Incrementality!
Two of the holiest of holy grails in Marketing: Attribution, Incrementality.
Analysts have died in their quests to get to those two answers. So much sand, so little water.
Hence, you can imagine how irritated I was when someone said:
Yes, we know the incrementality of Marketing. We are doing attribution analysis.
You did not just say that.
I’m not so much upset as I’m just disappointed.
Attribution and Incrementality are not the same thing. Chalk and cheese.
Incrementality identifies the Conversions that would not have occurred without various marketing tactics.
Attribution is simply the science (sometimes, wrongly, art) of distributing credit for Conversions.
None of those Conversions might have been incremental. Correction: it is almost always true that a very, very large percentage of the Conversions driven by your Paid Media efforts are not incremental.
Attribution ≠ Incrementality.
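The chalk-and-cheese distinction can be made concrete with numbers. In the hypothetical sketch below (every figure is an illustrative assumption), attribution distributes credit for all observed conversions across channels, while a holdout test reveals how few of those conversions were actually incremental:

```python
# Hypothetical illustration of Attribution ≠ Incrementality.
# Attribution distributes credit for ALL observed conversions;
# incrementality estimates how many would not have happened anyway.
# All numbers are invented for illustration.

observed_conversions = 1000

# Last-touch attribution: every conversion is assigned to some channel,
# so the credits always sum to 100% of observed conversions.
attributed = {"Paid Search": 600, "Email": 250, "Display": 150}
assert sum(attributed.values()) == observed_conversions

# Incrementality from a holdout test: the control group, which saw no
# marketing, still converted at a baseline rate.
test_rate = 0.050      # conversions per user, exposed group
control_rate = 0.042   # conversions per user, holdout group
incremental_share = 1 - control_rate / test_rate

print(f"Attribution credits all {observed_conversions} conversions to channels,")
print(f"but only ~{incremental_share:.0%} of conversions were incremental.")
```

Under these assumed rates, attribution hands out credit for 1,000 conversions while only about 16% of them would not have happened anyway — which is exactly why the two answers must never be conflated.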
In my newsletter, TMAI Premium, we’ve covered how to solve the immense challenge of identifying the true incrementality delivered by your Marketing budget. (Sign up, then email me for a link to that newsletter.)
Today, let me unpack the crucial
Analytics teams are named for the silos and limitations within which they trap themselves.
Paid Media. Owned Media. SEO. BI. Customer Service. Data Warehousing. Email. And, a thousand other silos (depending on your company size).
One outcome of this reality is that while every team works hard to do their very best work, it is rare that they earn strategic influence from it. That’s not really surprising: if your view of your scope is narrow, your impact will be narrow as well.
The other dimension to consider is that most Analytics teams kick into gear after the campaign has concluded, after the customer interaction has taken place in the call center, and after the budgeted funds have already been spent. When you only look backwards, you limit your ability to have an impact.
Finally, few analytics teams obsess about predictive analytics in a way that allows them to dictate future action. This is a huge miss… Left to their own devices, how many companies will make the same decisions the data would recommend? Astonishingly few.
Transforming Data’s Strategic Influence.
The above-observed realities were on my mind as I took on a new role to lead Global Strategic Analytics. This time around, my goal was for the analytics team to chart a very different path… To solve for expansive influence, before, during, after, money is spent by the organization.
A key part of how this manifested in our work was doing truly super-advanced, machine-learning-powered analysis to answer hard questions that few can answer successfully. This is, of course, exciting and very cool.
But the difference in the team’s impact comes from the combination of an audacious vision and putting together the people-process-structure that powers our desire for data
Like you, I consume a whole lot of reports every day – company data, public data.
Many are acceptable, some are very good and all the rest leave me extremely frustrated with both the ink and the think.
People make so many obvious mistakes. Sometimes repeatedly.
Just yesterday I was quietly seething because none of the visuals included in the report contained any context to understand whether the performance I was looking at was good or bad.
The graph could be going up, down, or all around, and I as a consumer had the job of figuring out whether something was good, bad, or worth ignoring.
The heartbreaking part is that most executives will take a look, realize the difficulty of interpretation in 15–20 seconds, and go back to shooting from the gut. Even if the report has hidden gold.
In a move that might not surprise you, I sat down with the person for 90 minutes, going visual by visual, table by table, directing changes that would ensure everything had context.
A report usually has a hard time explaining why something is going awry or going really well. (That is why you have job security as an Analyst!)
A report can usually be very good at clearly highlighting what is going well or badly.
Your #1 job is to make sure your reports don’t fail at this straightforward responsibility.
So today a simple collection of tips that you can use to up-level your reports – to allow them to speak with a clear, and influential, voice.
For many of you a reminder of what you might have let slip, for others a set of new things to implement as you aim for your next promotion.
#1. Context, Context, Context.
The gap between a bad and good data visualization is small.
The gap between a good and great data visualization is a vast chasm!
The challenge is that we, and our HiPPOs, bring opinions and feelings and our perceptions of what will go viral to the conversation. This is entirely counterproductive to distinguishing between bad, good, and great.
What we need instead is a rock-solid understanding of the headwinds we face in our quest for greatness, and a standard framework that can help us dispassionately assess quality.
Let’s do that today: learn how to separate bad from good and good from great, and do so using examples that we can all relate to instantly.
We’ll start by looking at the two sets of humans who are at the root of the conflict of obsessions and then learn to assess how effective any data visualization is in an entirely new way. If you adopt it, I guarantee the impact on your work will be transformative.
The Conflict of Obsessions.
There are two parties involved in any data visualization.
1. Analyst/Data Visualizer. As I’ve passionately shared frequently on this blog, we, Analysts, are all in the business of persuasion. We work against that desired outcome because when we work on creating a data visualization, here are our top-of-mind concerns/desires/perspectives:
How can I cram as much as I can into the graphic?
What can I include to ensure everyone clearly gets just how much work I did?
How much of my agenda do I need to make overt, and how much can I make covert?
Is there something I can add to increase the chances that this will go viral and result in fame and glory?
Ok. I’m only teasing.
But, as an
Some moments in time are perfect to reflect on where you are, what your priorities are, and then consider what you should start-stop-continue. In those moments, you are not thinking of delivering incremental change… You are driven by a desire to deliver a step change (a large or sudden discontinuous change, especially one that makes things better – I’m borrowing the concept from mathematics and technology, from “step function”).
In those moments – common around new years or new annual planning cycles – the difference between delivering an incremental change vs. a step change is the quality of ideas you are considering. In this post, my hope is to both enrich your consideration set and encourage the breadth of your goals.
My professional areas of interest cover Customer Service, User Experience and Finance, though here on Occam’s Razor my focus is on influencing incredible Marketing through the use of innovative Analytics. To help kick-start your 2019 step change, I’ve written two “Top 10” lists, one for Marketing and one for Analytics – consisting of things I recommend you obsess about.
Each chosen obsession is very much in the spirit of my beloved principle of the aggregation of marginal gains. My recommendation is that you deeply reflect on the impact of the 10 x 2 obsessions in your unique circumstances, and then distill the ten you’ll focus on over the next twelve months. Regardless of the ten you choose, I’m confident you’ll end up working on challenging things that will push your professional growth forward and bring new joy to the work you do for your employer.
First… The Analytics top ten things to focus on to elevate your game this year…
The universe of digital analytics is massive and can seem as complex as the cosmic universe.
With such big, complicated subjects, we can get lost in the vast wilderness or become trapped in a silo. We can wander aimlessly, or feel a false sense of either accomplishment or frustration. Consequently, we lose sight of where we are, how we are doing and which direction is true north.
I have experienced these challenges on numerous occasions myself. Even simple questions like “How effective is our analytics strategy?” elicit a complicated set of answers, instead of a simple picture the CxO can internalize. That’s because we have to talk about tools (so many!), work (collection, processing, reporting, analysis), processes, org structure, governance models, last-mile gaps, metrics ladders of awesomeness, and… so… much… more.
Soon, the digital analytics strategic framework you hoped would provide a true north to the analytics strategy question looks like this…
The frameworks above cover just one dimension of the assessment (!). There is another critical framework to figure out how you can take your analytics sophistication from wherever it is at the moment to nirvanaland.
A quick search query will illustrate that that looks something like this…
It is important to stress that none of these frameworks/answers exist in a vacuum.
Both pictures above are frighteningly complex because the analytics world we occupy is complex. Remember, tools, work, processes, org structure, governance models, last-mile gaps, metrics ladders of awesomeness, and… so… much… more.
The Implications of Complexity.
There are two deeply painful outcomes of the approaches you see in the pictures above (in which my own work is represented as well).
No CxO understands the