Your Email Metrics Are Wrong
Oct 04, 2016
Data is an essential ingredient in gaining knowledge, insight, wisdom...it's what you need to identify opportunities for improvement and the potential solutions that will turn those opportunities into a profitable reality.
But as we all know: "garbage in, garbage out." And there has been a great deal of "discussion" and attention placed on the quality of data in recent weeks.
For years, digital marketers have attempted to prove their value with statements such as this: "Digital media offers a quality of data that traditional media cannot touch." Source: Adelsberger Marketing
Then, every once in a while, someone would stand up and call the quality of data into question...such as Julie Fleischer, head of CRM at Kraft, who gained fame by stating that "...90% of data is crap..."
Last week, my colleague, Dudley Stevenson published "Digital Metrics: How Much Is Real & How Much Is Bullshit?"
Then we had the Facebook "metrics mistake".
So let's dive into the weeds for a little overkill...and look at those all too commonly referenced metrics tied to your email campaigns.
Your Open Rate is Crap!
Open rate is a measure of how many people on an email list open (or view) a particular email campaign. The open rate is normally expressed as a percentage, and we calculate it as follows: Open Rate = (Unique Opens / Emails Delivered) x 100. So a 20% open rate would mean that of every 10 emails delivered to the inbox, 2 were actually opened. Source: Campaign Monitor
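The open-rate arithmetic described above can be sketched in a few lines of Python (the function name and sample numbers are illustrative, not from any particular ESP):

```python
def open_rate(unique_opens: int, delivered: int) -> float:
    """Open rate as a percentage of emails delivered to the inbox."""
    return unique_opens / delivered * 100

# Campaign Monitor's example: 2 opens out of every 10 delivered emails
print(open_rate(2, 10))  # 20.0
```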
Well, that's not actually the case...
The technology used to track an “open” uses an HTML IMG tag embedded in the outgoing emails. This is a tiny, transparent tracking image that tries to determine when a person’s email browser displays the email. Problem is, this reporting mechanism has no idea whether or not a human actually opened the message. Many browsers these days open messages automatically. Outlook, for instance, has a preview pane that records emails as “opened.”
This reporting mechanism has no idea whatsoever if an opened message (by the browser or a human) was actually seen or read by the target/prospect. Source: MarketSmart
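To make the mechanism concrete: the "open" pixel is just an image tag whose URL identifies the recipient, and any HTTP request for that URL counts as an open. A minimal sketch of how such a tag might be generated (the domain and parameter names here are hypothetical, for illustration only):

```python
def tracking_pixel(campaign_id: str, recipient_id: str) -> str:
    """Build the tiny transparent image tag embedded in an outgoing email.

    Any fetch of this URL is logged as an 'open' -- the server cannot
    tell whether a human, a preview pane, or a spam filter requested it.
    """
    url = f"https://track.example.com/open?c={campaign_id}&r={recipient_id}"
    return f'<img src="{url}" width="1" height="1" alt="" style="display:none">'

print(tracking_pixel("oct-newsletter", "abc123"))
```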
And from the folks at MailChimp
Aggressive spam filters will click links in incoming mail before delivering them to make sure there isn't any malicious content. Our system automatically tracks clicks as opens, and we have no way of differentiating when a click is from a spam filter. This can sometimes lead to an unusually high number of opens from a single domain. If you notice this on your report, it's likely the result of a spam filter.
And Your Click Rate is Crap Too...
Click through rate (CTR) percentage: Click through rate is the percentage of recipients that have clicked on any link in your email message. A click is tracked by a tracking code appended automatically to the email links by your ESP. Click through rate calculation is Unique clicks / delivered number x 100 = CTR. Source: KickDynamic
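KickDynamic's calculation can be sketched the same way (names and numbers are illustrative):

```python
def click_through_rate(unique_clicks: int, delivered: int) -> float:
    """Unique clicks / delivered number x 100, per the definition above."""
    return unique_clicks / delivered * 100

# e.g. 25 unique clicks on 1,000 delivered emails
print(click_through_rate(25, 1000))  # 2.5
```

Note that the tracked "click" is a request to the ESP's redirect link, which is exactly why a spam filter probing that link inflates the number.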
Mmm well, to be honest...re-read the piece from MailChimp above that states "Aggressive spam filters will click links in incoming mail before delivering them..."
Let's Not Overlook 'Delivered' Because It's Crap Too....
Delivered: Date and time the campaign completed sending from MailChimp servers. Source: MailChimp
Well, then there is ...
Deliverability refers to ensuring email messages are delivered to the inbox and aren’t blocked or rerouted by spam filters. This is an ongoing battle for email marketers. Successful deliverability depends on a combination of best practices, including authentication and email reputation. Source: Email Vendor Selection
Disclaimer: Definition of Terms Is a Mess Too!
Hopefully a couple of you are reading this and verifying my terms...and if you are, chances are pretty good that you're finding different definitions of the terms used above. First off, congratulations...you just spent more time checking terms than you probably do making sure the data you're reporting is not crap!
Second off, that's part of the problem - if we don't have a written data dictionary, then we should have little confidence that we are speaking the same language. And if we're not speaking the same language, we can't be confident that when someone says a "Lead" is "Qualified," we know what the heck they're talking about.
The Solution? Challenge The Data, Sheeple!
To be completely honest, I turn to what my colleague wrote last week:
From a marketing perspective, when it gets right down to it, the only measurements that you are sure mean anything are that someone actually purchases something, or god forbid picks up the phone and actually calls you.
But if you want to focus on other pieces of data, make sure you know what you have - including the strengths and weaknesses - so you can use it with confidence. (Check out this article regarding the Association of National Advertisers calling for metrics to be audited and accredited by the Media Rating Council as one example.)
And keep on challenging the data. Push to ensure the data is still being gathered the same way, and that it still comes with the same "pros" and "cons"; then you can determine how much weight/value to put on it.
But to sit there and accept all data, without question or any real understanding of what it really means...is unacceptable. Don't fall into that trap!
For more on marketing analytics and metrics that mean something…
Also consider this book for a real understanding of direct marketing analytics and metrics….
Patrick McGraw is VP of Higher Education Marketing Services and has more than 25 years of experience in market research, competitive intelligence, business intelligence including database marketing and CRM, strategic planning, brand development and management, as well as operations/campaign management. His work has consistently helped his clients and employers develop and implement more efficient ways to attract and retain profitable customers, enter new markets and launch new products. His areas of focus include the education, hospitality, travel and tourism, hi-tech, telecommunications, financial services, and retail industries on both the agency and customer sides.