We can all agree that in 2013 web analytics is still a nightmare, right?
The last few years have brought about an enormous expansion at the top of the web analytics information-overload funnel, and today I can discover just about any aspect of my web traffic that piques my curiosity.
I know how much traffic I’m getting, who told them to come here, how they got here, how long they’re staying, what they’re looking at, what they’re using to look at it, where they’re from, and just about anything else I want to know about them. If I don’t like what I’m looking at, I can customize everything from my dashboard to reports to parameters within those reports.
What none of this tells me is how I can be more successful at turning the words I put on the Internet into dollars in my pocket.
Now, I know what you’re thinking: “It’s all there! More information than you could ever figure out what to do with.”
The problem with that is that it’s all there. It’s more information than I could ever figure out what to do with. I’m not a web analytics analyst; I run a business, and I don’t have the time or inclination to try to find and pay someone to tell me something that reactive. If I could get the guy who does all those Google Panda updates to analyze my web stats, I’d pay him. But he’s probably way too busy trying to make sure no one ever comes to my website.
Web analytics still feels like it's governed by dozens of unwritten rules that can dramatically change the course of your entire revenue stream. I'm constantly worried that if I shift one pixel, or use too many or too few buzzwords, my traffic and ad revenue will plummet into a hole I'll never be able to dig out of.
Furthermore, these concerns are not unwarranted. I can’t tell you how many times my traffic has reached what I was convinced was a new plateau, only to — for seemingly no reason whatsoever — drop right back to where it was before the plateau started. My SEO friends tell me things like “Google did something to the search algorithms, go look at (insert acronym here) and see if that changed.” And when I finally find that acronym and it has or hasn’t changed, I still don’t know what to do.
Let’s take a look at an example.
Here's the anonymized data for a website as presented in dashboard format for December 29, 2012, which appeared to be an extremely good day for said website. Since any anonymized name I would give you would probably be taken, let's call the website joeprocopio.com. Don't hurt your eyes; I'll go over the good bits.
I mentioned earlier that web analytics is a reactive process. Like most big data analysis, the goal is to take a ton of information, run algorithms against it, and return a smaller, more actionable set of data to the user. This is what dashboards purport to accomplish — taking this vast, disparate data and crunching it and spitting it back out in actionable chunks.
The only problem is it’s rarely actionable because it’s hardly an analysis.
In order for it to be actionable, an analysis has to be three things:
- Timely — I need to know exactly what happened on December 29th and how that fits into the story. I’ll live with the assumption that 7-day and 30-day lookbacks are the most important. I can set this dashboard to compare the day against the last 7 days or 30, but not both, and I only get totals, not averages, unless those averages are explicit, as in Average Visit Duration.
- Thorough — The basic widgets here show me traffic, goal conversions, traffic type, country, mobile, and alerts. I can replace these widgets with other widgets, but it’s still just a snapshot — “Here’s today’s data.” This is what you hand the analyst so they can begin their research.
- Suggestive — I’ve set the dashboard to compare to yesterday, but none of what it tells me is rooted in the expected performance of my site. Who knows (except for me) whether a jump from 1,346 visits to 3,152 was a good thing?
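To make that last point concrete, here's a minimal sketch of what "rooted in expected performance" could look like: compare a day against both its trailing 7-day and 30-day averages at once, which is exactly what the dashboard above won't do. The visit counts here are made-up stand-ins, not the site's real data:

```python
from statistics import mean

# Hypothetical daily visit counts, oldest first (not the dashboard's real data).
visits = [1290, 1402, 1188, 1350, 1346, 3152]

def baselines(series, today):
    """Compare today's value against trailing 7- and 30-day averages."""
    week = mean(series[-7:])    # trailing 7-day average (or all history if shorter)
    month = mean(series[-30:])  # trailing 30-day average (or all history if shorter)
    return {
        "vs_week": (today - week) / week * 100,    # percent change vs weekly average
        "vs_month": (today - month) / month * 100, # percent change vs monthly average
    }

print(baselines(visits[:-1], visits[-1]))
```

The point is that the comparison uses averages over both windows simultaneously, not a single lookback's totals.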
My company, Automated Insights, produces cloud-based, machine-generated, personalized narrative analysis of big data. Here's how those three factors affect the ability to act on an analysis:
- Cloud-based — We can get ridiculously close to real time. In our work with Yahoo Fantasy Football this season, we were producing 500- to 1,000-word recaps with notes and charts at a rate of up to 1,100 reports per second — from data receipt to actual public web page.
- Machine generated — Since we’re fast and since much of the understanding of structured datasets can be machine replicated and pre-programmed, we can run thousands of scenarios to determine what the most important results of the analysis might be. Think of it as graduating from “Here’s today’s data” to “Here are the five most important things to know about today’s data.”
- Personalized — We can determine, resolve, and store the context of the data as it applies to you. In an example as simple as this, we can look at all of your data and figure out that, yes, a jump of 1,346 visits to 3,152 is a really good thing. We can also figure out that there’s a “but” coming.
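The "five most important things" idea can be sketched simply: score every tracked metric by how far today's value deviates from that metric's own history, then keep the biggest outliers. The metrics and numbers below are hypothetical stand-ins, and a z-score is just one plausible scoring choice, not how our engine necessarily does it:

```python
from statistics import mean, pstdev

# Hypothetical recent history per metric (not the real engine's inputs).
history = {
    "visits":       [1290, 1402, 1188, 1350, 1346],
    "twitter_refs": [12, 9, 15, 11, 14],
    "duration_sec": [40, 39, 44, 41, 42],
    "ctr_pct":      [1.10, 1.05, 0.92, 0.80, 0.76],
}
today = {"visits": 3152, "twitter_refs": 60, "duration_sec": 34, "ctr_pct": 0.49}

def top_insights(history, today, n=5):
    """Rank metrics by how unusual today is relative to each metric's history."""
    scored = []
    for name, series in history.items():
        mu, sigma = mean(series), pstdev(series)
        z = (today[name] - mu) / sigma if sigma else 0.0
        scored.append((abs(z), name, z))
    scored.sort(reverse=True)  # most anomalous metrics first
    return [(name, round(z, 1)) for _, name, z in scored[:n]]

print(top_insights(history, today))
```

With these invented numbers, the visit spike ranks first, which is the "here are the five most important things" output in miniature.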
Now, let’s look at the same data as output by our automated content engine in 369 words. It starts with a headline, which summarizes completely what I need to know about what happened on December 29th.
12/29/12: A New Page and Twitter Spike Boost joeprocopio.com to Second Best Traffic Day of the Year But Value Plummets
And there’s that “but” I mentioned earlier. By all accounts, this was a banner day for the website in terms of traffic, its second best day out of 364. However, the fact that this traffic resulted in far fewer dollars per visitor doesn’t show on the dashboard.
Moving into the narrative, we reveal just how awesome those traffic numbers are by comparing them to last week and last month, and by verifying that they're impressive even over the last year:
joeprocopio.com saw a massive increase in traffic today on its way to recording the second best day of the year for Visits (3152), Unique Visitors (2997), and Page Views (3614). The Visits represent a 134% increase in traffic from yesterday as well as a 66% pop from the weekly average and a 60% jump from the monthly average. The record for 2012 was 7821 Visits on December 7th, a date which falls into the monthly average.
Here, in a few words, we’ve told the user everything they need to know about the impact of the traffic on the story of the site. Now we’ll discuss the why:
The spike was primarily due to this new page: http://joeprocopio.com/brandnewpage, which accounted for 84% of the visits. That page was the basis of 60 Twitter referrals, which represents a new daily high, and 35 Facebook referrals, a 48% increase on the monthly average.
The page "brandnewpage" was a clear winner, more than doubling traffic overnight. However, the value is not only in what the report says, but in what it doesn't. This site already gets a large amount of traffic organically and directly, and since those percentages didn't change much, even with the jump in social referrals, there's no need to waste my time poring over that data. In the same vein, there was a 412% increase in the number of referrals from Google+, but since that was a jump to four clicks from a paltry 0.97 monthly average, it too wasn't worth mentioning.
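That editorial judgment (report the Twitter high, suppress the Google+ blip) can be captured by a rule requiring a change to be large both relatively and in absolute terms before it earns a sentence. The thresholds below are invented purely for illustration:

```python
def worth_mentioning(today, monthly_avg, min_abs=10, min_rel=0.25):
    """Flag a change only if it is big both relatively AND absolutely.
    Thresholds are made up for illustration; a real engine would tune them."""
    if monthly_avg == 0:
        return today >= min_abs
    rel_change = abs(today - monthly_avg) / monthly_avg
    return today >= min_abs and rel_change >= min_rel

# Google+: a 412% jump, but only four clicks, so it gets suppressed.
print(worth_mentioning(4, 0.97))   # False
# Twitter: 60 referrals against a ~14/day average, so it gets reported.
print(worth_mentioning(60, 14.2))  # True
```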
Now here’s the rub:
Ad revenue was down only slightly from yesterday to $13.54. However, the CTR dropped substantially to 0.49% today and has been trending steadily downward from your monthly average of 1.12%. There was a similar plunge in eCPM, down 59% from yesterday to $4.40 and off 57% from the monthly average of $10.23. While CPM had been up to $10.49 for the weekly average, yesterday’s $4.30 appears to be settling to the monthly average of $3.76.
A quick comparison of today's revenue to yesterday's might have missed the story entirely had a couple of extra parameters not been checked. The site had been pulling in less than $4 a day in revenue up until December, so a drop of 46 cents from yesterday to a still lucrative (for our purposes) $13.54 is no big deal.
But the click-through rate, which had been steadily declining as traffic was rising, dropped as precipitously as the traffic exploded, from 0.76% to 0.49% in a single day. Furthermore, the CPM is returning to where it was before the traffic started increasing. This tells me, for planning purposes, not to count on revenue spiking to the same degree as the traffic, which the report then states, just in case I didn't put two and two together:
In summary, even though yesterday’s revenue was up 97% over the monthly average, it appears the new traffic is not nearly as valuable as the existing traffic on a per visitor basis.
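For readers who don't live in ad metrics, here's the arithmetic behind that conclusion. CTR is clicks over impressions; eCPM is revenue per thousand impressions. The impression and click counts below are assumptions chosen to roughly reproduce the rates the report shows (the report itself only exposes the derived percentages):

```python
def ctr_pct(clicks, impressions):
    """Click-through rate as a percentage."""
    return clicks / impressions * 100

def ecpm(revenue, impressions):
    """Effective revenue per thousand ad impressions."""
    return revenue / impressions * 1000

def revenue_per_visitor(revenue, visits):
    """Dollars earned per visit, the 'value' the headline says plummeted."""
    return revenue / visits

# Assumed counts, reverse-engineered to match the reported rates.
impressions, clicks, revenue = 3077, 15, 13.54
print(round(ctr_pct(clicks, impressions), 2))  # ~0.49, the reported CTR
print(round(ecpm(revenue, impressions), 2))    # ~4.40, the reported eCPM
print(revenue_per_visitor(13.54, 3152))        # today's dollars per visit
print(revenue_per_visitor(14.00, 1346))        # yesterday's dollars per visit
```

Run the last two lines and the per-visitor value drops by more than half even though total revenue barely moved, which is exactly the "but" in the headline.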
Once the analysis is complete, other important insights can be delivered in bullet format with the report:
- This is the 10th time this year and the 2nd time this month that Visits have increased at least 33% over the previous day, the weekly average, and the monthly average.
- Duration (0:34) dropped 19% from yesterday (0:42) and is now below the monthly average (0:37).
- U.S.-based traffic has been trending steadily upward from 65% last month to 72% last week to 83% today.
- Returning traffic continues to trend downward from 8.3% last month to 6.6% last week to 5.8% today.
- Mobile views have spiked this month, from 34% last month to 41% last week to 58% yesterday.
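Bullets like those fall out of a simple three-window trend check: if a metric moves in the same direction from its monthly average to its weekly average to the latest day, it's a trend worth surfacing. A minimal version, using the percentages from the bullets above:

```python
def trend(month, week, latest):
    """Label a metric that moves the same direction across all three windows."""
    if month < week < latest:
        return "trending up"
    if month > week > latest:
        return "trending down"
    return "mixed"

# The bullets above, expressed as (monthly, weekly, latest) percentages.
print(trend(65, 72, 83))     # US-based traffic
print(trend(8.3, 6.6, 5.8))  # returning visitors
print(trend(34, 41, 58))     # mobile views
```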
In short, this kind of narrative is a powerful tool that can encapsulate the most widely used parameters for analysis but tailor that analysis to the needs of the user. And it’s small. It’s a 5K email on my phone or half-a-browser-page when I log in to my reporting tool.
In the real-time world of the web, time is extremely valuable, and trends often end before they’re even discovered. Armed with this report, which I can digest in a matter of a few minutes, I can spend my time proactively capitalizing on the information, rather than reactively measuring the data.