Thursday, February 25, 2010

Mixing analytics tools: My take

Another interesting post on the web analytics forum today: should you use multiple analytics tools?

Someone is referring to the Online Marketing Summit in San Diego, where a panel of experts addressed the dangers of using multiple tools. One of the panellists is even reported to have stated quite strongly that you should not use multiple tools (presumably meaning multiple web analytics tools).

The first question to ask is whether we're talking about multiple web analytics tools or online analytics in general. I want to make this subtle but important distinction: web analytics relates to behavioural clickstream data (your typical Google Analytics), while online analytics relates to the use of online data in a broader sense, including voice-of-customer, social media analytics, performance monitoring, competitive analysis and such. A while ago I proposed a definition of web analytics, but I would like to refine it further as "online analytics":
Online analytics is the extensive use of qualitative and quantitative data, statistical analysis, exploratory and predictive models, business process analysis and fact-based management to drive a continuous improvement of online activities resulting in improved ROI.
Based on the recently published WASP study, 33% of the top 500 US retail sites are using more than one web analytics tool.

Why use multiple tools anyway?

In my experience, and while working on the Web Analytics Maturity Model, I have seen many reasons why organizations end up with multiple web analytics tools:
  • large-scale, global organizations often try to standardize on a single tool (makes sense!), but smaller groups naturally find ways to empower themselves when the central authority is too rigid and inflexible
  • organizations working with agencies often end up with the agencies' tags included anyway, leading to a mishmash of unrelated data sources that obscures the full view of the user experience across multiple sites
  • I have seen more and more people who think the only way to measure Google PPC effectively is to use Google Analytics
  • In some cases, there is a strong belief that the "old" web analytics solution is broken or incorrect; people have lost trust and think the magic answer is to switch tools. Every time this happens, we eventually find out the issue is not the tool...

My take

Question: Should we use multiple web analytics tools?
Answer: If you have trouble mastering one tool, don't venture into using multiple ones. It's quite simple.

There are many reasons for this, and I'm sure the panel addressed these points:
  • increased costs of implementation
  • confusion and higher training requirements
  • but the most often cited argument is that each tool uses a different methodology

However...

If you are one of the increasing number of organizations using more than one tool (Google Analytics almost always being one of them):
  • audit your site to make sure every page is tagged (a minimal sketch of such an audit follows this list)
  • compare trends, not raw numbers (see the second sketch below)
  • consistently use one tool to report your insights (i.e. not visits from one and visitors from the other)
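
For the tagging audit, here is a minimal sketch of what I mean (Python, assuming the requests library is installed; the page list and tag markers are made up for illustration, so adjust them to your own site and tools):

    # Minimal tagging audit: fetch each page and check that it contains the
    # analytics snippets you expect. URLs and markers below are examples only.
    import requests

    PAGES = [
        "http://www.example.com/",
        "http://www.example.com/products",
        "http://www.example.com/contact",
    ]

    # Substrings that identify each tool's page tag (illustrative values).
    TAGS = {
        "Google Analytics": "google-analytics.com/ga.js",
        "Second tool": "stats.secondtool.example/tag.js",
    }

    for url in PAGES:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException as error:
            print(f"{url}: could not fetch ({error})")
            continue
        missing = [name for name, marker in TAGS.items() if marker not in html]
        if missing:
            print(f"{url}: MISSING {', '.join(missing)}")
        else:
            print(f"{url}: all tags present")

In practice you would feed it your sitemap or a crawl of the site rather than a hand-written list, but the principle is the same: untagged pages are the first thing to rule out before blaming the tools.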
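And on comparing trends rather than raw numbers: two tools will rarely agree on absolute visit counts, but their week-over-week movements should tell the same story. A quick sketch, with made-up figures:

    # Compare week-over-week trends from two tools instead of absolute counts.
    # The weekly visit figures are invented purely for illustration.
    tool_a = [12000, 13100, 12800, 14500]  # weekly visits reported by tool A
    tool_b = [10400, 11300, 11050, 12600]  # weekly visits reported by tool B

    def weekly_change(series):
        """Percent change from each week to the next."""
        return [(curr - prev) / prev * 100 for prev, curr in zip(series, series[1:])]

    changes = zip(weekly_change(tool_a), weekly_change(tool_b))
    for week, (a, b) in enumerate(changes, start=2):
        verdict = "same direction" if (a >= 0) == (b >= 0) else "DIVERGING"
        print(f"week {week}: tool A {a:+.1f}%, tool B {b:+.1f}% -> {verdict}")

If the two lines move together, the gap in raw numbers is just methodology; if they diverge, that is when an implementation problem is worth investigating.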
In a scientific approach, you should never study something using a single tool or set of observations. I always recommend using multiple tools of "different" types. This relates to "multiplicity": not a new concept, but one Avinash Kaushik presented brilliantly. The essential ones for onsite analytics are:
  • web analytics,
  • voice of customer and
  • performance monitoring.
See "iPerceptions deep dive in behavioral and attitudinal data" for an example of web analytics + voice of customer and "Performance analytics the Coradiant way" for an example of web analytics + performance monitoring.