What Marketing/Analytics Can Learn from Mythbusters

Published by Tim Wilson on November 25, 2013

Earlier this month, I gave a presentation at the Columbus Web Group meetup that I titled Mythbusters: Analytics Edition. The more I worked on the presentation — beating the same drums and mounting the same soapboxes I’ve mounted for years — the more I realized that the Discovery Channel show is actually a pretty useful analog for effective digital analytics. And, since I’m always on the lookout for new and better ways to talk to analysts and marketers about how to break out of the soul-sucking and money-wasting approaches that businesses have developed for barfing data and gnashing teeth about the dearth of “actionable insights,” this one seemed worth trying to write down.

Note: If you’re not familiar with the show…you can just bail on this post now. It’s written with the assumption that the reader actually knows the basic structure and format of the program.

Mythbusters - Analytics Edition

First, a Mythbusters Episode Produced by a Typical Business

When I do a thought experiment of putting an all-too-typical digital marketer and their analytics team in charge of producing a Mythbusters episode, here’s what happens:

The show’s opening credits roll. Jamie and Adam stand in their workshop and survey the tools they have: welding equipment, explosives, old cars, Buster, ruggedized laptops, high-speed cameras, heavy ropes and chain, sheet metal, plexiglass, remote control triggers, and so on. They chat about which ones seem like they would be the most fun to do stuff with, and then they head their separate ways to build something cool and interesting.

[Commercial break]

Jamie and Adam are now out in a big open space. They have a crane with an old car suspended from it. They have an explosive device constructed with dynamite, wire, and a bunch of welded metal. They have a pole near the apparatus with measurements marked on it. They have a makeshift bomb shelter. They have high-speed cameras pointed at the whole apparatus. They get behind the bomb shelter, trigger the crane to drop the car and, right as it lands on the explosive device, the device goes off and blows the car up into the air.

[Commercial break]

Jamie and Adam are now reviewing the footage of the whole exercise. They play and replay videos in slow motion from different angles. They freeze-frame the video at the peak of the old car’s trajectory and note how high it went. Then, the following dialogue ensues:

Adam: “That was soooooo cool.”

Jamie: “Yeah. It was. What did we learn?”

Adam: “Well, the car was raised 7’2″ into the air.”

Jamie: “Right. So, how are we going to judge this myth? Busted, plausible, or confirmed?”

Adam: “Um… what was the myth we were trying to bust?”

Jamie: “Oh. I guess we didn’t actually identify one. We just came up with something cool and did it.”

Adam: “And we measured it!”

Jamie: “That’s right! We measured it! So… busted, plausible, or confirmed?!”

Adam: “Hmmm. I don’t know. I don’t think how high the car went really tells us anything. How loud do you think the explosion was?”

Jamie: “It was pretty loud. Did we measure the sound?”

Adam: “No. We probably should have done that. But…man…that was a bright flash when it blew up! I had to shield my eyes!”

Jamie: “Aha! We have software that will measure the brightness of the flashes from the video footage! Let’s do that!”

[They measure the brightness.]

Adam: “Wow. That’s pretty bright.”

Jamie: “Yeah. So, have we now done enough analysis to call the myth busted, plausible, or confirmed?”

Adam: “Well…we still don’t know what ‘it’ is. What’s the myth?”

Jamie: “Oh, yeah. I forgot about that.” [turns to the camera] “Well, we’re about out of time. We’ll be back next week! You know the format, folks! We’ll do this again next week — although we’ll come up with something else we think is cool to build and blow up. Hopefully, we’ll be able to make a busted, plausible, or confirmed call on that episode!”

[Credits roll]

This is how we’ve somehow managed to train ourselves to treat digital analytics!!!

We produce weekly or monthly reports and expect them to include “analysis and insights.” Yet, like the wrongheaded Mythbusters thought experiment above, we don’t actually ask questions that we want answered.

Sure, We Can Find Stuff Just by Looking

Keeping with the Mythbusters theme and, actually, lifting a slide straight from the presentation I did, what happens — in reality — when we simply point a web analyst to the web analytics platform and tell them to do some analysis and provide some insights for a monthly report? Poking around, clicking into reports, correlating data, even automatically detecting anomalies, we can turn up all sorts of things that don’t help the marketer one whit.


To be clear, the marketer (Jamie) is complicit here. He is the one who expects the analyst to simply dig into the data and “find insights.” But, week in and week out, month in and month out, he gets a report that includes “analysis” of anomalies in the data and other scattershot true-but-not-immediately-relevant findings, yet no information that he can immediately and directly act on. (At which point we invoke Einstein’s definition of insanity: “doing the same thing over and over again and expecting different results.”)

“Insights” that are found this way, more often than not, have a perfectly logical and non-actionable explanation. This is what analysis becomes when the analyst is told to simply dig into the data and produce a monthly report with “analysis and insights.”

The Real Mythbusters Actually Gets It Right

Let’s look at how the real Mythbusters show runs:

  1. A well-known (or obscure) belief, urban legend, or myth is identified.
  2. The Mythbusters team develops a plan for testing that myth in a safe, yet scientifically valid, way.
  3. They experiment/construct/iterate as they implement the plan.
  4. They conclude with a one-word and unequivocal assessment of the result: “Confirmed,” “Plausible,” or “Busted.”

Granted, the myths they’re testing aren’t ones that lead to future action (just because they demonstrate that a lawn chair with a person on it can be lifted by balloons if you tie enough of them on doesn’t mean they’re going to start promoting a new form of air travel). But, aside from that, the structure of their approach is exactly where marketers could get the most value. It is nothing more and nothing less than a basic application of the scientific method.

Sadly, it’s not an approach that marketers intuitively follow (they’re conditioned not to by the legacy of bloated recurring reports). And, even worse, it’s not an approach that many analysts embrace and push themselves.

Outlining those same exact steps, but in marketing analytics terms:

  1. A marketer has an idea about some aspect of their site that, if they’re right, would lead them to make a change. (This is a hypothesis, but without the fancy label.)
  2. The analyst assesses the idea and figures out the best option for testing it, either through digging into historical web analytics or voice of the customer data or by conducting an A/B test.
  3. The analyst does the analysis or conducts the test.
  4. The analyst clearly and concisely communicates the results of the analysis back to the marketer, who then takes action (or doesn’t, as appropriate).
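To make the testing step concrete, here is a minimal sketch of what the analyst in step 3 might actually run after an A/B test. Everything here is hypothetical — the conversion counts, function names, and the 0.05 significance threshold are illustrative assumptions, not anything from the post — but it shows how a fuzzy “did the change work?” question becomes a Mythbusters-style verdict: a standard two-proportion z-test on conversion rates, using only the Python standard library.

```python
# Hypothetical A/B test readout: compare conversion rates between a control
# (A) and a variant (B) with a two-proportion z-test, then translate the
# p-value into a plain-English verdict for the marketer.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

def verdict(p_value, alpha=0.05):
    """Step 4: communicate clearly — confirmed, or not proven."""
    return "confirmed" if p_value < alpha else "plausible at best"

# Made-up numbers: 10,000 visitors per arm, 500 vs. 580 conversions
z, p = two_proportion_z(conv_a=500, n_a=10000, conv_b=580, n_b=10000)
print(f"z = {z:.2f}, p = {p:.4f} -> {verdict(p)}")
```

The point isn’t the statistics; it’s that the analysis starts from a specific question (“does variant B convert better?”) and ends in a one-line answer the marketer can act on — not a data puke.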

So clear. So obvious. Yet…so NOT the mainstream reality that I see. I have a lot of theories as to why this is, and it’s becoming a personal mission to change that reality. Are you on board to help? It will be the most mundane revolution, ever…but, who knows? Maybe we’ll at least come up with a cool T-shirt.

Jamie armor photo courtesy of TenSafeFrogs


  • Lea Synefakis-Pica

    Viva la revolution! Now…how do we kick off la revolution? At the
    outset of each month, rather than blindly digging in data, survey
    marketers/stakeholders via email or meeting for their burning questions
    or potential insights?

  • Tim Wilson (http://tim.webanalyticsdemystified.com/)

    I’ve got a case-study-in-the-making on that. :-)

    My best luck has been multi-pronged: 1) relationship/credibility/trust-building with stakeholders, 2) education-education-education of business users (but in bite-sized and engaging ways — like talking about Mythbusters), 3) shifting the structure of legacy recurring reports to promote a dialogue rather than a data-puke. Every organization is different, but it starts with recognizing that the current way isn’t working and being open to shaking things up and doing things differently. And, boy, getting to that starting point can be a BEAR — it’s against human nature to admit that what we’ve been doing month in and month out is actually not delivering value commensurate with the cost.