Your analysis provides clear data that the campaign was a (glorious) failure.
It could not be clearer.
The KPI you chose for your brand campaign was Trust, with a pre-set target of +5. The post-campaign analysis comparing performance across Test & Control cells shows that Trust did not move at all. (Suspiciously, there are indications that in a handful of Test DMAs it might have gone down!)
Every so often, the story is just as simple as that.
You do the best you can with a marketing campaign (creative, audience, targeting, channels, media plan elements like duration, reach, frequency, and media delivery quality elements like AVOC, Viewability, etc.), and sometimes the dice don't roll your way when you measure impact.
You would be surprised to know just how frequently the cause of failure is something that has nothing to do with the elements I mentioned above. In future Premium editions we'll cover a bunch of these causes; today I want to cover one that is in your control yet is often a root cause of failure:
Judging a fish by its ability to climb a tree!
AKA: You picked the wrong KPI for the campaign.
[Note 1: I’m going to use the phrase Success KPI a lot. To ensure clear focus, clear postmortems, and clear accountability, I recommend identifying one single solitary metric as the Success KPI for the initiative. You can measure seven additional metrics – say, for diagnostic purposes – but there has to be just one Success KPI. Close accountability escape hatches.]
[Note 2: Although the guidance in this article applies to companies/analytics teams of all sizes, it applies in particular to larger companies and large agencies. It is there that the problems described below are most acute.]
If there is one thing the universe agrees on, it is that you should not just provide data… You should provide INSIGHTS!!!
In the 807,150 (!) words I’ve written on this blog thus far, at least 400,000 have been dedicated to helping you find insights.
In posts about advanced segmentation, in posts about how to build strategic dashboards that don’t suck, in encouraging you to reimagine how you pick metrics to obsess about using the magnificent Impact Matrix, and on and on and on.
Go for insights!
In time, I’ve come to hate the word insights.
In our world – marketing research and analytics – that word has come to represent data puking.
It has come to represent telling people, with dozens of reports or eighty slides, that water is wet.
I’ve observed, during my work across the world, that when we deliver insights, we mostly deliver to our audiences things in-sight – things they can already see!
As in, the blue line is 20% above the red line. I CAN SEE THAT! Or, the lifetime value of California purchasers is 3x that of those who reside in Georgia. Oh, please, I can see that in the table with my own eyes.
This, unsurprisingly, ends up being a massive waste of your incredible talent, and an insult to the intelligence of our audience (the people who pay your salary).
This blog post was originally published as an edition of my newsletter TMAI Premium. It is published 50x/year, and shares bleeding-edge thinking about Marketing, Analytics, and Leadership. You can sign up here – all revenues are donated to charity.
The last time I changed jobs, I wanted to change the aspiration of what our talented team and I should shoot for.
I was reading a paper by a respected industry body that started by flagging head fake KPIs. I love that moniker, head fake.
Likes. Sentiment/Comments. Shares. Yada, yada, yada.
This is great. We can all use the head fake label to call out useless activity metrics.
[I would add other head fake KPIs to the list: Impressions. Reach. CPM. Cost Per View. Others of the same ilk. None of them are KPIs, most barely qualify to be a metric because of the profoundly questionable measurement behind them.]
The respected industry body quickly pivoted to lamenting their findings that demonstrate eight of the top 12 KPIs being used to measure media effectiveness are exposure-counting KPIs.
A very good lament.
But then they quickly pivoted to making the case that the Most Important KPIs for Media are ROAS, Exposed ROAS, “Direct Online Sales Conversions from Site Visit” (what?!), Conversion Rate, IVT Rate (invalid traffic rate), etc.
Wait a minute.
Most important KPI?
No siree, Bob! No way.
Take IVT as an example. It is such a niche obsession.
Consider that Display advertising is a tiny part of your budget, and a tiny part of that tiny part is likely invalid. It is not a leap to suggest that anointing this barely-a-metric as a KPI is a big distraction from what’s important. Oh, and if your display traffic were so stuffed with invalid traffic that it became a burning platform requiring executive attention… any outcome KPI you are measuring (even something as basic as Conversion Rate) would have told you that already!
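To make the "tiny part of a tiny part" point concrete, here is a back-of-the-envelope sketch in Python. Every number here is a hypothetical placeholder, not a benchmark:

```python
# Hypothetical figures: a small display slice of the budget, with a
# plausible-sounding invalid-traffic rate applied to that slice.
total_media_budget = 1_000_000   # hypothetical annual media budget ($)
display_share = 0.10             # display is a small slice of the budget
ivt_rate = 0.05                  # 5% of display traffic assumed invalid

# Wasted spend is a fraction (IVT) of a fraction (display) of the total.
wasted_spend = total_media_budget * display_share * ivt_rate
wasted_share_of_total = wasted_spend / total_media_budget

print(f"Wasted spend: ${wasted_spend:,.0f}")                   # $5,000
print(f"Share of total budget: {wasted_share_of_total:.1%}")   # 0.5%
```

Half a percent of the budget. Worth cleaning up, certainly, but nowhere near Most Important KPI territory.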
Conversion Rate obviously is a fine metric. Occasionally, I might call it a KPI, but I have never anointed it as the Most Important KPI.
In my experience, Most Important
Almost all metrics you currently use have one common thread: They are almost all backward-looking.
If you want to deepen the influence of data in your organization – and your personal influence – 30% of your analytics efforts should be centered around the use of forward-looking metrics.
But first, let’s take a small step back. What is a metric?
Here’s the definition of a metric from my first book:
A metric is a number.
Conversion Rate. Number of Users. Bounce Rate. All metrics.
[Note: Bounce Rate has been banished from Google Analytics 4 and replaced with a compound metric called Engaged Sessions – the number of sessions that lasted 10 seconds or longer, or had 1 or more conversion events or 2 or more page views.]
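As a quick sketch of that compound definition, here is how the Engaged Sessions rule from the note could be expressed in code (the session records are made up for illustration):

```python
# GA4's "engaged session" rule, per the note above: a session counts as
# engaged if it lasted 10 seconds or longer, OR had 1 or more conversion
# events, OR had 2 or more page views.
def is_engaged(session):
    return (
        session["duration_seconds"] >= 10
        or session["conversion_events"] >= 1
        or session["page_views"] >= 2
    )

# Hypothetical session records.
sessions = [
    {"duration_seconds": 4,  "conversion_events": 0, "page_views": 1},  # bounce
    {"duration_seconds": 45, "conversion_events": 0, "page_views": 1},  # engaged
    {"duration_seconds": 3,  "conversion_events": 1, "page_views": 1},  # engaged
]
engaged = sum(is_engaged(s) for s in sessions)
print(f"Engaged sessions: {engaged} of {len(sessions)}")  # 2 of 3
```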
The three metrics above are backward-looking. They tell us what happened in the past. You’ll recognize that this is true for almost everything you are reporting (if not everything).
But, who does not want to see the future?
Yes. I see your hand up.
The problem is that the future is hard to predict. What’s the quote… No one went broke predicting the past. 🙂
Why use Predictive Metrics? As Analysts, we convert data into insights every day. Awesome. Only some of those insights get transformed into action – for any number of reasons (your influence, quality of insights, incomplete stories, etc. etc.). Sad face.
One of the most effective ways of ensuring your insights will be converted into high-impact business actions is to predict the future.
Consider this insight derived from data:
The Conversion Rate from our Email campaigns is 4.5%, 2x that of Google Search.
Now consider this one:
The Conversion Rate from our Email campaign is
A story where data is the hero, followed by two mind-challenging business-shifting ideas.
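A minimal sketch of what powers a forward-looking version of that insight: fit a simple trend to past conversion rates and project the next period. The monthly figures and the plain least-squares fit are illustrative assumptions, not a recommended forecasting method:

```python
# Fit a least-squares trend line to a series and project one step ahead.
def linear_trend(ys):
    """Slope and intercept of the least-squares line over x = 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    return slope, mean_y - slope * mean_x

# Hypothetical monthly email conversion rates.
monthly_cr = [0.038, 0.040, 0.041, 0.043, 0.045]
slope, intercept = linear_trend(monthly_cr)
next_month = slope * len(monthly_cr) + intercept
print(f"Projected next-month conversion rate: {next_month:.3%}")  # 4.650%
```

The backward-looking number says where you are; the projected one says where you are headed, which is what earns the meeting with Senior Leadership.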
At a previous employer, customer service on the phone was a huge part of the operation. Qualitative surveys were giving the company a read that customers were unhappy with the service being provided. As bad customer service is a massive long-term cost – and a short-term pain – it was decided that the company would undertake a serious re-training effort for all customer service reps so that problems would get solved faster. To ensure customer delight was delivered in a timely manner, it was also decided that Average Call Time (ACT) would now be the success metric. It would even be tied to each customer service rep’s compensation, creating an overlap between their personal success and the company’s success.
What do you think happened?
There is such a thing as employees who don’t really give a frek about their job or company; they just come to work. You’ll be surprised how small that number is. (Likewise, the number of employees who go well above the call of duty and constantly push personal and company boundaries is also quite small.) Most employees work diligently to deliver against set expectations.
Reflecting that, in our story, most customer service reps, freshly re-trained, took phone calls with the goal of driving down Average Call Time. They worked as quickly as they could to resolve issues. But pretty quickly, customers with painful problems became a personally painful problem for the individual customer service rep. They hurt ACT, and comp. The solution? If the rep felt a call was going on too long, self-preservation kicked in and they would hang up on the customer. Another
Ten years, and the 944,357 words, are proof that I love purposeful data, collecting it, pouring smart strategies into analyzing it, and using the insights identified to transform organizations.
In the quest for that last important bit, I am insanely obsessive about 1. simplification and 2. pressing the right emotional buttons.
The reasons: we all like complexity (it gives us energy :)), we tend to be logical, and we often treat data output as the end when in reality it is just the start of the process that results in actions that deliver business impact.
Very often the output of our work with Big Data or Small Data, Google Analytics or R, will end up in a few cells of a spreadsheet or a table in Word/Keynote/PowerPoint. The stakes for this output are higher when we are in front of the Senior Leadership of a company, where we have but a few minutes to communicate what we have to. Hence my two obsessions above.
In this post, with lots of pictures and real-world data examples, I want to share 6 different strategies you can leverage in service of simplification and pressing the right emotional buttons. Along our journey, I’ve also sprinkled in 15 universal truths that will bring you joy.
Here are the sections in this post:
An important assumption.
Death at the last-mile.
1. Rebel against crapification via cluttering.
2. Don’t fragment data, don’t forget higher order bits.
3. Obsess with deleting information provided.
4. Don’t run away, make the tough choices.
5. So what? So What?? So WHAT!
6. Sell smarter,
If you don’t have goals, you are not doing digital analytics. You are doing i am wasting earth’s precious oxygenalytics.
Let’s back up. Let me start with a story.
We were brainstorming about the next cluster of coolness for Analytics, and the conversation quickly went to what Analysts need to look at on a daily, weekly, and monthly basis. I started to outline a simple framework: no one should look at anything daily (that should all be automated and run off automated or custom-set thresholds – things don’t really change materially on a daily basis); weekly should be based on stuff that borders reporting squirrel work with pinches of analysis ninja work; and monthly… well, super analysis ninja stuff. And then I started to redefine what daily, weekly, and monthly even mean. From there, it is only a hop, skip, and jump to the most deadly question in analytics…
What’s the business solving for?
Everything came to a screeching halt. This beautiful daily, weekly, monthly blog post I was drafting in my head to share my excitement with you about thinking analysis differently went poof.
It pains me how critical it is to know what the heck we are solving for with our analytics, and how few people identify goals for their website (mobile or desktop). The reason is simple: If you don’t know where you are going, you’ll get somewhere and you’ll be miserable.
We see this every day. “Analysts” spewing data out left, right, and center, after spending so much time tagging and re-tagging and Google Tag Managering. Yet, few Marketers or executives take them seriously (because they don’t know what the heck all that means to the business).
For the last decade (#omg!), I’ve consistently complained about a fundamental flaw in Web Analytics tools: They incentivize one night stands, rather than engagements matching customer-intent.
This leads to owners of digital experiences (insanely) expecting all visitors to their websites to convert right away – anything less than that is a failure. Damn the intent the customer is expressing.
It also results in Marketers obsessing about awful things like last-click conversions (die, last-click attribution, die!). They make silly user experience decisions (Searching for car insurance options? We will remove every single thing from the page except a GET QUOTE button. Ha! Sucks to be you, Visitor!). They never consider Think or Care intent; all they obsess about is Do intent (See-Think-Do-Care business framework). Not even all of the Do, just the strongest commercial intent. The very bottom of the Do! It really is quite crazy.
You’ll agree all of this sounds quite insane. Not just insane, so visibly insane that everyone should see through it and fix their minds/reports/strategies. So, why are we still so obviously wrong and still on the insane path?
Simple. It is just how all of the Digital Analytics tools are configured at their very core.
Every standard report in every standard tool is configured off Visits (or in Google Analytics language, Sessions), rather than Visitors (GA language, Users). The specific metric I’ve been mad about since day one of this blog (May 14th, 2006!) is Conversion Rate. It is measured as Orders/Visits. [Or, its variation Outcomes/Sessions]
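A tiny sketch of the difference between the two calculations, using a hypothetical visit log where people visit more than once before buying:

```python
# Contrast the session-based Conversion Rate baked into the tools
# (Orders / Visits) with a visitor-based view (Orders / Users).
# The visit log below is made up for illustration.
visits = [
    {"user": "A", "ordered": False},
    {"user": "A", "ordered": False},
    {"user": "A", "ordered": True},   # A converts on the 3rd visit
    {"user": "B", "ordered": False},
    {"user": "B", "ordered": True},   # B converts on the 2nd visit
    {"user": "C", "ordered": False},  # C is still in Think/Care mode
]

orders = sum(v["ordered"] for v in visits)
session_cr = orders / len(visits)                    # Orders / Visits
user_cr = orders / len({v["user"] for v in visits})  # Orders / Users

print(f"Orders/Visits: {session_cr:.0%}")  # 33% — four "failed" visits
print(f"Orders/Users:  {user_cr:.0%}")     # 67% — two of three people bought
```

Same data, two very different stories: the session-based view brands most of the visits failures, while the visitor-based view shows most of the humans actually converted.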
Built into that is the mental model that if you visit a website, then every Visit has to result in money for the site owner. Else, it is a failed visit. Scroll
The difference between a Reporting Squirrel and Analysis Ninja? Insights.
As in, the former is in the business of providing data, the latter in the business of understanding the performance implied by the data. That understanding leads to insights about why the performance occurred, which leads to so what we should do.
Do you see how far away a Reporting Squirrel’s job is from that of an Analysis Ninja?
For one, I hope you see the massive investment in self-development of business skills required to build the foundation needed to get to the why and, even more, the so what.
Pause. Reflect on the implication of that why and so what on your current skills/career.
I’m sure you came up with a set of actions you can take to evolve from a squirrel to a ninja, or, if you are already a ninja, how to become even more awesome at ninja’ness.
One of the actions that both clusters will come up with is improving the ability to communicate the insights you discover. Even if you have a really amazing why and so what, I’ve observed many Analysts die at the last mile: presenting their whys and so whats in the form of stories.
In fact 86.4% of all Analyst careers fail due to a lack of this critical last mile skill!
Ok, ok. I kid. I kid.
It is really 88%. : )
Tom Fishburne’s wonderful cartoon is here for another purpose.
We send out our multi-tab spreadsheets, our best Google Analytics custom reports, our great dashboards full of data, and more to the tactical layer of data clients: the Directors, the Marketers, the Optimization employees, and our resident social media gurus. The valiant hope is that they will
Standard reports stink. Custom reports rock!
If you are a regular reader of this blog, you are quite familiar with this sentiment. I’ve expressed it often. 🙂
The primary reason is simple: You are unique. Your business is unique. Why would a report created for everyone work for the special someone that you are?
There are other great reasons as well.
Custom reports allow you to focus deeply (by eliminating the riff-raff metrics and dimensions, they save time and show just what you want). When shared, custom reports allow you to deliver deeper relevance. Custom reports allow you to package up entire datasets for deeper analysis.
I’ve shared a whole bunch of custom reports in the past. You can download them into your Google Analytics account via one click (along with some lovely Advanced Segments and a Dashboard). Just go to the GA Solutions Gallery and click Import: Occam’s Razor Awesomeness.
You can download a bunch more that are not yet in the bundle above by following the links at the end of this post. Seven more! They include single custom reports that replace all or most of the current standard reports in GA for Mobile, Content, Paid Search, and Acquisition. Your life will be simpler. Grab the above, then grab the ones at the end of this post.
Today, I want to share a few of my recent favorites that solve day-to-day challenges in clever ways.
But before we go there, I want to share an important concept. Many custom reports are wrong because we mess up the fundamental data model in analytics. We mis-align metrics and dimensions across Users, Sessions, and Hits. If you want to create accurate custom reports (or apply advanced segments), this post is mandatory reading:
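The scope idea can be sketched as a simple compatibility check. The scope assignments below are a simplified illustration of the Users/Sessions/Hits model, not a complete map of any tool:

```python
# Simplified, illustrative scope map: each dimension and metric lives at
# one scope (hit, session, or user). Mixing scopes in a custom report is
# the classic way to get numbers that look right but are wrong.
SCOPES = {
    # dimensions
    "Page": "hit",
    "Event Category": "hit",
    # metrics
    "Pageviews": "hit",
    "Sessions": "session",
    "Bounce Rate": "session",
    "Users": "user",
}

def compatible(dimension, metric):
    """A dimension/metric pairing is safe when both share a scope."""
    return SCOPES[dimension] == SCOPES[metric]

print(compatible("Page", "Pageviews"))  # True  — both hit-scoped
print(compatible("Page", "Sessions"))   # False — hit vs. session scope
```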