Why I Don’t Put Recommendations on Dashboards

Published by Tim Wilson on September 10, 2013

WARNING: Gilligan contrarianism alert! The following post posits a thesis that runs contrary to popular opinion in the analytics community.

Many companies these days rely on some form of internal dashboard(s). That’s a good thing. Even better is when those companies have actually automated these dashboards – pulling data from multiple data sources, structuring it in a way that directly maps to business objectives, and delivering the information in a clean, easy-to-digest format. That’s nirvana.

[Image: dashboard]

Reality, often, is that the dashboards can only be partially automated. They wind up being something an analyst needs to at least lightly touch to bridge inevitable API gaps before delivering them on some sort of recurring schedule: through email, through an intranet, or even in person in a regularly scheduled meeting.
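
To make “partially automated” concrete, here is a minimal sketch (Python, with made-up numbers and hypothetical function names, not any particular vendor’s API) of the shape such a pipeline tends to take: automate the pulls you can, and keep the manual “API gap” as an explicit, visible step rather than burying it in a spreadsheet.

```python
# A hedged sketch of partial dashboard automation. Every source and figure
# here is illustrative; swap in real API calls where they exist.
from datetime import date

def pull_web_analytics(start: date, end: date) -> dict:
    # Stand-in for an automated pull from a web analytics reporting API.
    return {"visits": 118_450, "orders": 2_312}

def pull_paid_search(start: date, end: date) -> dict:
    # Stand-in for a second automated source (e.g., an ad platform API).
    return {"paid_search_visits": 24_902, "spend": 18_700.00}

def manual_gap() -> dict:
    # The "API gap": a number an analyst still keys in by hand because the
    # source system has no usable API. Isolating it keeps the manual touch
    # small and obvious.
    return {"email_opens": 41_388}

def build_dashboard_inputs(start: date, end: date) -> dict:
    """Consolidate all sources into one structure keyed to the KPIs."""
    combined = {}
    for source in (pull_web_analytics(start, end),
                   pull_paid_search(start, end),
                   manual_gap()):
        combined.update(source)
    return combined

if __name__ == "__main__":
    print(build_dashboard_inputs(date(2013, 8, 1), date(2013, 8, 31)))
```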

So, what is the purpose of these dashboards? Here’s where a lack of clarity (clearly communicated clarity) becomes a slippery slope faster than Miley Cyrus can trigger a TV viewer’s gag reflex. Dashboards are, first and foremost, performance measurement tools. They are a mechanism for quickly (at a glance!) answering a single question:

“Are we achieving the goals we set out to achieve?”

They can provide some minimal context around performance, but everything beyond answering that question is a distant second purpose-wise.

It’s easy enough to wax sophomoric on this. It doesn’t change the fact, though, that one of the top complaints dashboard-delivering analysts hear is: “I get the [weekly/monthly/quarterly] dashboard from the analyst, but it doesn’t have recommendations on it. It’s just data!”

I get it. And, my response? When that complaint is leveled, it’s a failure on the part of the analyst to educate (communicate), and a failure of process — a failure to have mechanisms in place to deliver actionable analytical results in a timely and effective manner.

But…here…I’m just going to lay out the various reasons that dashboards are not the place to expect to deliver recommendations, because, in my experience, analysts hear that complaint and respond by trying to introduce recommendations to their dashboards. Why shouldn’t they? I can give four reasons!

Reason No. 1: Dashboards Can’t Wait

Another complaint analysts often hear is that dashboards aren’t delivered quickly enough at the end of the reporting period. Well, no one, as far as I know, has found a way to stop time. It marches on inexorably, with every second taking exactly one second, every minute having a duration of 60 seconds, and every hour having a duration of 60 minutes (crappy Adam Sandler movies, pardon the adjectival redundancy, notwithstanding).

[Image: time flies (source: aussiegal)]

Given that, let’s step back and plot out a timeline for what it takes in an “insights and recommendations delivered with the dashboard” scenario for a dashboard that gets delivered monthly:

  1. Pull data (can’t happen until the end of the previous month)
  2. Consolidate data to get it into the dashboard
  3. Review the data — look at KPIs that missed targets and supporting metrics that moved unexpectedly
  4. Dig in to do analysis to try to figure out why those anomalies appeared
  5. IF the root cause is determined, assess whether this is something that needs “fixing” and posit ways that it might be fixable
  6. Summarize the results — the explanation for why those anomalies appeared and what might be done to remedy them going forward (if the root cause was something that requires a near-term change)
  7. Add the results to the dashboard
  8. Deliver the dashboard
  9. [Recipient] Review the dashboard and the results
  10. [Recipient] Decide whether to take action
  11. [Recipient] If action will be taken, then take the action

Seems like a long list, right? I didn’t pad it by splitting out extra steps to make it needlessly long. What’s interesting is that steps 1 and 2 can (and should!) be shortened through automation. Aside from systems that are delayed in making their data available, there is no reason that steps 1 and 2 can’t be done within hours (or a day) of the end of the reporting period.
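
As a rough illustration of how little steps 1 and 2 need to cost once automated: a job scheduled for the first morning of the month only has to work out the period that just closed and kick off the pulls. A minimal sketch (the function name is mine, not from the post; scheduling via cron or similar is assumed):

```python
# Derive the reporting period that just closed so automated pulls can run
# within hours of month-end.
from datetime import date, timedelta

def last_full_month(today: date):
    """Return (first_day, last_day) of the month that just ended."""
    last_day = today.replace(day=1) - timedelta(days=1)
    return last_day.replace(day=1), last_day

start, end = last_full_month(date(2013, 9, 1))
print(start, end)  # 2013-08-01 2013-08-31: bounds for steps 1 and 2
```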

Steps 3 through 7, though, are time-consuming. And, often, they require conversations and discussion — not to mention time to actually conduct analysis. Despite vendor-perpetuated myths that “the tool” can generate recommendations… tools really suck at doing so (outside of highly operationalized processes).

Here’s the other kicker, though: steps 9 through 11 take time, too! So, realistically, let’s say that steps 1 and 2 take a day, steps 3 through 8 take a week, steps 9 and 10 take 3 days (because the recipient doesn’t drop everything to review the dashboard when it arrives), and then step 11 takes a week (because “action” actually requires marshalling resources and getting something done). That means, best case, we’re 2.5 weeks into the month before action gets taken.

So, what happens at the end of the month? The process repeats, but the change was actually in place for only 1.5 weeks… which could easily get dwarfed by the 2.5 weeks of the status quo!!!
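
For the record, the arithmetic behind that claim, using the durations assumed above:

```python
# Back-of-the-envelope for the "insights on the dashboard" timeline, using
# the post's assumed durations (in calendar days).
steps_1_2 = 1    # pull and consolidate (automated)
steps_3_8 = 7    # analyze, summarize, add to dashboard, deliver
steps_9_10 = 3   # recipient reviews and decides
step_11 = 7      # action actually gets taken

days_until_action = steps_1_2 + steps_3_8 + steps_9_10 + step_11  # 18 days
print(days_until_action / 7)  # ~2.6 weeks -- call it 2.5 -- into the month
```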

Let’s look at how a “dashboard without insights” process can work:

  1. Pull data (can’t happen until the end of the previous month)
  2. Consolidate data to get it into the dashboard
  3. Deliver the dashboard (possibly calling out any anomalies or missed targets)
  4. [Recipient] Review the dashboard and home in on anything that looks troubling that she cannot immediately explain (more on that in the next section)
  5. The analyst and the recipient identify what, if any, trouble spots require deeper analysis and jointly develop actionable hypotheses to dig into
  6. The analyst conducts a very focused analysis (or, in some cases, proposes an A/B test) and delivers the results
  7. [Recipient] If action is warranted, take action

Time doesn’t stop for this process, either. But, it gets the information into the business’s hands inside of 2 days. The analyst doesn’t waste time discovering root causes that the business owner already knows (see the next section). The analysis that gets conducted is focused and actionable, and the business owner is already primed to take action, because she participated in determining which analyses made the most sense.
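
Step 3’s “possibly calling out any anomalies or missed targets” can itself be largely automated without drifting into recommendation territory. A minimal sketch, where the KPI names, targets, and tolerance are all illustrative assumptions:

```python
# Flag KPIs that missed their targets (or overshot unexpectedly) so the
# dashboard delivery can call them out as conversation starters, not
# recommendations. All names and thresholds are made up.
def call_outs(kpis: dict, targets: dict, tolerance: float = 0.10) -> list:
    notes = []
    for name, actual in kpis.items():
        target = targets.get(name)
        if target is None:
            continue  # no target set for this metric; nothing to flag
        if actual < target:
            notes.append(f"{name}: {actual:,} vs. target {target:,} (missed)")
        elif actual > target * (1 + tolerance):
            notes.append(f"{name}: {actual:,} vs. target {target:,} (unexpectedly high)")
    return notes

print(call_outs({"visits": 118_450, "orders": 2_312},
                {"visits": 130_000, "orders": 2_200}))
# -> ['visits: 118,450 vs. target 130,000 (missed)']
```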

Reason No. 2: Analysts Aren’t Omniscient

I alluded to this twice in the prior section. Let’s look at a generic and simplistic (but based on oft-observed real-world experience) example:

  1. The analyst compiles the dashboard and sees that traffic is down
  2. The analyst digs into the traffic sources and sees that paid search traffic is down dramatically
  3. The analyst digs in further and sees that paid search traffic went to zero on the 14th of the month and stayed there
  4. The analyst fires off an urgent email to the business that paid search traffic went to zero mid-month and that something must be wrong with the site’s SEM!
  5. The business responds that SEM was halted mid-month due to budget adjustments, and they’ve been meaning to ask what impact that has had

What’s wrong with this picture? Steps 2 through 4 are largely wasted time and effort! There is very real analysis to be done… but it doesn’t come until step 5, when the business provides some context and is ready for a discussion.
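
To be clear, the cursory look in steps 2 and 3 is cheap; the waste comes from charging into root-cause analysis before asking the business. A sketch of the kind of ten-minute check that pins down when a series went dark, using pandas and fabricated data:

```python
# Find the date a daily metric dropped to zero and stayed there -- the fact
# to bring to the business before doing any deeper digging. Data is made up.
import pandas as pd

days = pd.date_range("2013-08-01", "2013-08-31", freq="D")
paid_search = pd.Series([900] * 13 + [0] * 18, index=days)  # zero from the 14th

nonzero = paid_search[paid_search > 0]
if paid_search.iloc[-1] == 0 and not nonzero.empty:
    went_dark = nonzero.index[-1] + pd.Timedelta(days=1)
    print(f"Paid search has been at zero since {went_dark.date()}. "
          "Worth asking the channel owner before digging further.")
```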

This happens all the time. It’s one of the reasons that it is imperative that analysts build strong relationships with their marketing stakeholders, and one of the reasons that a sign of a strong analytics organization is one where members of the team are embedded – literally or virtually – in the teams they support.

But, even with a strong relationship, co-location with the supported team, regular attendance at the team’s recurring meetings, and a spot on the team’s email distribution list, analysts are seldom aware of every activity that might result in an explainable anomaly in the results delivered in a dashboard.

This gets at a data source that is ignored all too often: the minds and memories of the marketing team. There is nothing at all wrong with an analyst making the statement: “Something unexpected happened here, and, after I did some cursory digging, I’m not sure why. Do you have any ideas as to what might have caused this?” There are three possible responses from the marketer who is asked this question:

  • “I know exactly what’s going on. It’s almost certainly the result of X.”
  • “I’m not sure what might have caused that, but it’s something that we should get to the bottom of. Can you do some more digging to see if you can figure it out?”
  • “I’m not sure what might have caused that, but I don’t really care, either. It’s not important.”

These are quick answers to an easy question that can direct the analyst’s next steps. And, two of the three possible answers lead to a next step of moving on to a value-adding analysis, not pursuing a root cause that will lead to no action! Powerful stuff!

Reason No. 3: Insights Don’t Have a Predictable and Consistent Length

I see it all the time: a standard dashboard format that, appropriately, has a consistent set of KPIs and supporting metrics carefully laid out in a very tightly designed structure. Somewhere in that design is a small box – at the top of the dashboard, at the bottom right of the dashboard, somewhere – that has room for a handful of bullet points or a short paragraph. This area of the dashboard often has an ambitious heading: “Insights,” “Recommendations,” “Executive Summary.”

The idea – conceived either on a whiteboard during the initial design of the dashboard, or, more likely, added the first time the dashboard was produced – is that this is where the analyst’s real value will be manifested. THIS is where the analyst will place the Golden Nuggets of Wisdom that have been gleaned from the data.

Here’s the problem: some of these nuggets are flakes of dust, and some are full-on gold bars. Expecting insights to fit into a consistent, finite space week in and week out or month in and month out is naïve. Sometimes, the analyst has half a tweet’s worth of prose-worthy material to include, which makes for a largely empty box, leaving the analyst and the recipient to wonder if the analyst is slacking. At other times, the analyst has a handful of useful nuggets to impart…but then has to figure out how to distill a WordPress-post-sized set of information into a few tweet-sized statements.

[Image: gold dust (source: martinak15)]

Now, if you buy into my first two reasons as to why recommendations shouldn’t be included with the dashboard in the first place, then this whole section becomes moot. But, if not (if you or your stakeholders still insist that performance measurement include recommendations), then don’t constrain that information to a fixed box on the dashboard.

Reason No. 4: Insights Can’t Be Scheduled

A scene from The Marketer and the Analyst (it’s a gripping — if entirely fictitious — play):

Marketer: “This monthly dashboard is good. It’s showing me how we’re doing. But, it doesn’t include any insights based on the performance for the month. I need insights to take action!”

Analyst: “Well, what did you do differently this month from previous months?”

Marketer: “What do you mean?”

Analyst: “Did you make any changes to the site?”

Marketer: “Not really.”

Analyst: “Did you change your SEM investment or strategy?”

Marketer: “No.”

Analyst: “Did you launch any new campaigns?”

Marketer: “No.”

Analyst: “Were there any specific questions you were trying to answer about the site this month?”

Marketer: “No.”

Analyst: ???!

Raise your hand if this approximates an exchange you’ve had. It’s symptomatic of a completely ass-backward perception of analytics: that the data is a vast reserve of dirt and rock with various veins of golden insights threaded throughout. And, that the analyst merely needs to find one or more of those veins, tap into it, and then produce a monthly basket of new and valuable ingots from the effort.

The fact is, insights come from analyses, and analyses come from hypotheses. Some analyses are small and quick. Some are large and require gathering data – through an A/B or multivariate test, for instance, or through a new custom question on a site survey. Confusing “regularly scheduled performance measurement” with “hypothesis-driven analysis” has become the norm, and that is a mistake.

While it is absolutely fine to measure the volume and value of analyses completed, it is a recipe for failure to expect a fixed number of insights to be driven from and delivered with a scheduled dashboard.

A Final Word: Dashboards vs. Reports

Throughout this post, I’ve discussed “dashboards.” I’ve steered clear of the word “report,” because it’s a word that has become pretty ambiguous. Should a report include insights? It depends on how you define a report:

  • If the report is the means by which, on a regularly scheduled basis, the performance of a [site/campaign/channel/initiative] is communicated, then my answer is: “No.” Reasons 1, 2, and 4 explain why.
  • If the report is the vehicle for delivering the results of a hypothesis-driven analysis (or set of hypothesis-driven analyses), then my answer is, “Perhaps.” But…why not call it “Analysis Results” to remove the ambiguity about what it is?
  • If the report is intended to be a combination of both of the above, then you will likely be delivering a rambling deck of 25+ slides that, despite your adoration for the content within, is going to struggle to hold your audience’s attention and is going to do a poor job both of measuring performance and of delivering clearly actionable analysis results.

We live in a real-time world. Consumers — all marketers have come to accept — have short attention spans and consume content in bite-sized chunks. An effective analyst delivers information that is super-timely and is easily digestible.

So. Please. Don’t spend 3 weeks developing insights and recommendations to include on a 20-page document labeled “dashboard.”


  • J. Neal Cornett

    I’m reading this and practically screaming, “Yes, preach on!” Stop the insanity of expecting actionable insights without hypothesis or testing.

    Still, it is our responsibility to educate our customers/executives, not just get annoyed at these ceaseless expectations for ‘gold’ actionable takeaways. This article went a long way toward helping me plan that journey with mine, and for that I am grateful.

    • http://www.gilliganondata.com Tim Wilson

      I see your “preach on!” and raise you a “damn straight!” I *absolutely* believe analysts need to own that education. And, the education can only be 3 parts preaching to 7 parts showing — analysts figuring out ways to tease out hypotheses, play them back as such (without using the word “hypothesis,” necessarily) and then delivering results where the business can take action… totally separate from the dashboard generation.

      Build up a few of those examples, and it starts to come into focus for everyone that the dashboard isn’t where the insights come from — it’s the ad hoc, fluid response to questions and assumptions (hypotheses).

  • Jonathon Frampton

    I like the shift from the existing line of thinking, but I do feel (in many cases) that a well-executed dashboard lends itself to actions.

    I agree that KPI dashboards with an array of moving parts would require a month’s (2.5 weeks, to be exact) worth of analysis to dig into, but what about breaking that dashboard up into smaller, more manageable chunks? So if, in your example, I have a dashboard that is focused on what is driving traffic to the site, then adding some nuggets and insights would seem more doable in terms of time-spent ROI.

    I could go on, but this is a comment, not my own post…

    Either way, great article!

    Jon

    • http://www.gilliganondata.com Tim Wilson

      My intention wasn’t to imply that dashboards are not inherently actionable. Rather, the action a dashboard drives can be one of two things: 1) the recipient sees a problem and immediately knows what to do (which is why it’s imperative to get them the information with minimal latency), or 2) the recipient sees a problem and does *not* know what to do…but willingly engages in a dialogue to generate hypotheses with the analyst.

      In your example, what if traffic to the site is…pretty much what is expected? Do you find yourself needing to include supporting data — traffic sources, new vs. returning, etc. — anyway? Even though things look fine?

      • Jonathon Frampton

        I had a large write-up of a comment keyed up but realized that I agree, those are the results / actions that it drives, and obviously the latter is to be encouraged. I apologize if my comment came off wrong; perhaps it should have been posed as more of a question.

        In my example I am trying to get the best of both worlds by creating inherent actions within the dashboards in an automated fashion because (as is often the case) I am a resource-strapped analyst. I was just curious whether creating more dashboards with this type of action is a better solution, as the business still needs the actions to be called out…

        And in my (little) experience it is very rare that “nothing” changes, either on the front or back end, so that was my miss.

        Thank you for the reply, it is much appreciated.

        • http://tim.webanalyticsdemystified.com/ Tim Wilson

          Hmmm… Tell me more about “creating inherent actions…in an automated fashion.” Google, I think, is trying to move towards this on a media mix front, and GA has long had automated “intelligence”…but those aren’t “actions” so much as “oddities that you might want to know about.”

          Can you give me an example of an action that would be automatically recommended (without an analyst involved)? I’m just trying to think that through.

          “Manageable chunks” is definitely key (as long as that’s not: “This spreadsheet has 25 tabs…each one is a manageable chunk of information.”). The analyst’s passion for breadth and depth of data is seldom matched by their stakeholders.

  • julian barnes

    My initial thoughts are this: I’d mostly detach the “analyst” from the dashboard. They need to be aware of business drivers and available to answer questions if something has changed that can’t be explained by the business, but leave the (most likely well paid) “analyst” to analyse: to get deep into the data and answer those business questions that a KPI dashboard hardly asks but the business wants answered.

    That deeper, richer analysis then has recommendations or insight.

    Someone who produces a report is not necessarily an analyst and a good analyst that is producing reports wishes he wasn’t.

    Completely agree with a lot of what you say though, having seen it first-hand.

    • http://tim.webanalyticsdemystified.com/ Tim Wilson

      Yup. I agree. I feel like my next post may need to be “why I avoid the word ‘report!’”

      “The business user asked a question, to which the analyst is providing a ‘report’ that answers it.” That’s analysis — digging in, answering the point-in-time question, and, hopefully, providing insight that can lead to action.

      “The business user wants to see the monthly report on how the web site is performing.” THAT, in my mind, totally falls under this post: expecting the analyst to find actionable nuggets by trying to figure out which data fluctuations matter and where the business actually cares about performance is pretty inefficient.

      Many times, a “richer analysis” may have a deeper insight: it may explain the *root cause* of something bad that is happening. But, I don’t think it can always have a 100%-from-the-analyst recommendation. “I’ve figured out that visitors are steadily gravitating to the lower-margin products” may be valuable. I’m wayyy more comfortable facilitating and contributing to a discussion as to how that might be rectified rather than just rattling off a recommendation; there are quite possibly external factors at play that the marcom, product marketer, or product manager may be aware of off the top of their heads that need to be factored into any recommendation.

  • Sergio Maldonado

    Very good post, Tim. It crowns you as one of the few bloggers in this industry who are actually trying to make analytics useful, and not just something people do for a living.

    As you know, I preach Digital Insight Management (i.e. making analytics useful is what I tell my wife I do for a living :) ). This basically means two things: a) I automate dashboards with an obsession for actionability; and b) I provide a framework for “Analysis Results” to be delivered or for hypotheses to be acted upon and tracked.

    Now, can these two coexist? I believe yes. Dashboards should indeed be automated and cannot wait for insights or recommendations. But they can very well become the starting point for a conversation (as opposed to an exchange of emails followed by a few meetings). And this could happen in both directions: the analyst decides after a while that some of the results call for an explanation/recommendation, or the business user decides to provide some context and request some insights.

    All in all a great discussion, anyhow.

    • http://tim.webanalyticsdemystified.com/ Tim Wilson

      Thanks, Sergio!

      Scenario that I’m cool with:
      1. Dashboard automatically delivered (still as a “push” rather than just passively waiting for the marketer to log in and view it) — to the business and to the analyst.

      2. The analyst looks at it and sees a couple of anomalies. She knows the business well enough to know that both anomalies might really be troubling.

      3. She spends 10 minutes digging in to: 1) confirm there isn’t a data issue, and 2) see if there is a quickly available explanation. Let’s say she confirms the first and doesn’t find that explanation.

      4. She pings the business (specific communication channel varies): “Hey… did you see these results? Any idea what might be causing them? I assume we want to get to the bottom of it.”

      5. Collaboratively, they develop and prioritize some hypotheses; it doesn’t take more than 5 minutes because they are well aligned and have a good working relationship. The analyst is sure to confirm, for each one, that they’d actually be able to *do* something if the hypothesis is confirmed.

      6. The analyst returns to her office and starts analyzing. She doesn’t wait to compile a detailed report with the results of every hypothesis. She gets what can be done fairly quickly and then pushes out the results to the marketer. The marketer weighs in — some of that analysis actually triggers some additional hypotheses.

      7. The marketer starts to act as each conclusive result comes in.

      It’s fluid and collaborative…and the dashboard is just one potential jumping off point for that collaboration.

      • julian barnes

        Collaboration between the relevant parties is the key here, something that technology can only help facilitate, never substitute for: internal stakeholder relationships and mutual respect for one another’s opinions. If the relationships are strong enough, then the desired end result, which is beneficial for the business, will work itself out.

        @disqus_ECvuRMSGnH:disqus: Hi, hope you’re well, small world :)

 

