Friday, February 26, 2010

Where in the world is S.Hamel?

Spring is the conference season and this year is going to be exceptional!

Last December I set myself two personal SMART objectives:
  • By December 2010 I will be speaking at least once in Europe.
  • By December 2010 I will be a keynote speaker at least once.
Within days those two objectives were scheduled.

Kicking off 2010

I kicked off the year with a half-day workshop based on the Web Analytics Maturity Model, in partnership with PublicInsite, in Ottawa. Then I was asked to do it again for the Réseau Action-TI in Québec City, an organization fostering education, networking and innovation in the field of IT. February 10th brought a third opportunity to present the Roadmap to Online Analytics Success in the morning, followed by a great Web Analytics Association event with Jim Sterne. And lastly, two weeks ago I spoke at the Internet Marketing Conference in Montreal.

Sadly, I have to skip the Omniture Summit this year. I had been invited to the invite-only MindMeld event, but I've been swamped with work and had to make difficult choices before leaving for Europe.

What's next?

As I always say, there's no better social media than meeting face to face! If you are anywhere around those cities, please register and let me know. If you feel neglected because I'm not visiting you, please email me so I can see where there's enough interest and which partner could help me get there. I already have New York on the radar; if you are interested or can help, let me know.

Thursday, February 25, 2010

Mixing analytics tools: My take

Another interesting post on the web analytics forum today: should you use multiple analytics tools?

Someone referred to the Online Marketing Summit in San Diego, where a panel of experts addressed the dangers of using multiple tools. One of the panellists is even reported as stating quite strongly not to use multiple tools (presumably meaning multiple web analytics tools).

The first question to ask is whether we're talking about multiple web analytics tools or online analytics. I want to make this subtle but important distinction: web analytics relates to behavioural clickstream data (your typical Google Analytics), while online analytics relates to the use of online data in a broader sense, including voice-of-customer, social media analytics, performance monitoring, competitive analysis and such. A while ago I proposed a definition of web analytics, but I would like to refine it further as "online analytics":
Online analytics is the extensive use of qualitative and quantitative data, statistical analysis, exploratory and predictive models, business process analysis and fact-based management to drive a continuous improvement of online activities resulting in improved ROI.
Based on the recently published WASP study, 33% of the top 500 US retail sites are using more than one web analytics tool.

Why use multiple tools anyway?

In my experience, and while working on the Web Analytics Maturity Model, I have seen many reasons why organizations end up with multiple web analytics tools:
  • large-scale, global organizations often try to standardize on a single tool (which makes sense!), but smaller groups naturally find ways to empower themselves when the central authority is too rigid
  • organizations working with agencies often end up with the agencies' tags included anyway, leading to a mishmash of unrelated data sources that obscures the full user-experience view across multiple sites
  • I have increasingly seen people who think the only way to measure Google PPC effectively is to use Google Analytics
  • in some cases, there is a strong belief that the "old" web analytics solution is broken or incorrect; people have lost trust and think the magic answer is to switch tools. Every time this happens, we eventually find out the issue is not the tool...

My take

Question: Should we use multiple web analytics tools?
Answer: If you have trouble mastering one tool, don't venture into using multiple tools. Quite simple.

There are many reasons for this, and I'm sure the panel addressed these points:
  • increased implementation costs
  • confusion and higher training requirements
  • and, most often stated, the argument that each tool uses a different methodology

However...

If you are one of the growing number of organizations using more than one tool (Google Analytics almost always being one of them):
  • audit your site to make sure every page is tagged (see the sketch after this list)
  • compare trends, not raw numbers
  • consistently use one tool to report your insights (i.e. not visits from one and visitors from the other)
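Below is a minimal sketch of such a tagging audit in Python, assuming the tags appear in the static HTML source; the URLs and signature patterns are illustrative, not definitive.

```python
# A minimal page-tagging audit: fetch each page and look for the tracking
# scripts in its source. The signatures below are illustrative examples of
# how Google Analytics and Omniture code typically appeared circa 2010.
import re
import requests

TAG_SIGNATURES = {
    "Google Analytics": re.compile(r"ga\.js|urchin\.js|_gaq|pageTracker"),
    "Omniture SiteCatalyst": re.compile(r"s_code\.js|/b/ss/"),
}

def audit(urls):
    """Report which tracking tags appear in each page's source."""
    for url in urls:
        html = requests.get(url, timeout=10).text
        missing = [name for name, sig in TAG_SIGNATURES.items()
                   if not sig.search(html)]
        print(f"{url}: {'OK' if not missing else 'MISSING: ' + ', '.join(missing)}")

if __name__ == "__main__":
    # Replace with your full page list (or the output of a crawler).
    audit(["http://www.example.com/", "http://www.example.com/products/"])
```

Keep in mind that tags injected dynamically won't show up in the raw source; a browser-based tool that inspects the live page is more reliable for those.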
In a scientific approach, you should never study something using a single tool or set of observations. I always recommend using multiple tools of "different" types. This relates to "multiplicity": not a new concept, but one Avinash Kaushik presented brilliantly. The essential ones for onsite analytics are:
  • web analytics,
  • voice of customer and
  • performance monitoring.
See "iPerceptions deep dive in behavioral and attitudinal data" for an example of web analytics + voice of customer and "Performance analytics the Coradiant way" for an example of web analytics + performance monitoring.

Wednesday, February 24, 2010

Testing web analytics implementation with WASP

I stumbled on this thread on the Web Analytics discussion forum: "testing of webanalytics"

Q) Can you please suggest best practices around testing web analytics implementations (Omniture or WebTrends)? Any test plan or strategy document, along with the test tools that can be used, would be of great help.
A) There is no better test than randomly clicking your site to verify page names.

Huh? Excuse me! Since when did quality assurance become a random act of faith?

What should I test to increase the quality of my web analytics data?

First, you should already have quality assurance test scenarios for your whole site. You can use the same ones to test your analytics implementation.

Here's how to proceed:
  • The home page is pretty unique (as are landing pages built for specific campaigns), so those should be tested individually.
  • Most sites use templates. Identify each template and at least three pages using each of them: for example, category pages, product pages and article pages.
  • Identify each process-driven conversion scenario: shopping cart, newsletter subscription, contact us, internal search, etc. Each of these scenarios should be tested with at least three sets of values, including extreme conditions like wrong data or out-of-range values. For example, when testing internal search, do you correctly track zero-result searches and their search terms?
  • Identify how campaigns are going to be tracked and which parameters are used to track them.
  • Especially with Omniture and WebTrends (and now Google Analytics), you can have several custom variables. Identify them and test the pages that are specifically assigned those values; a sketch of such template-driven checks follows this list.
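Since expected values are tied to templates, a simple script can check a sample of pages per template against an expected-value map. Here is a sketch, assuming Omniture-style variables (s.pageName, s.propN) set in inline JavaScript; the URLs and expected values are hypothetical.

```python
# Check a sample of template-driven pages against the tag values we expect.
# The EXPECTED map is hypothetical: a real audit would list at least three
# pages per template, each with its expected variable assignments.
import re
import requests

EXPECTED = {
    "http://www.example.com/products/widget": {"pageName": "product:widget", "prop1": "products"},
    "http://www.example.com/articles/hello":  {"pageName": "article:hello",  "prop1": "articles"},
}

# Matches inline assignments such as: s.pageName="product:widget"
ASSIGNMENT = re.compile(r's\.(\w+)\s*=\s*"([^"]*)"')

for url, expected in EXPECTED.items():
    actual = dict(ASSIGNMENT.findall(requests.get(url, timeout=10).text))
    for var, value in expected.items():
        if actual.get(var) != value:
            print(f"{url}: {var} is {actual.get(var)!r}, expected {value!r}")
```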

How do I test?

Don't do this by hand! The poor man's way would be to do a "View source", search for the tags, use some debugger and try to decipher the query string parameters. The chances of missing something are high, and you will lose an incredible amount of time doing it.
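To give a sense of what "deciphering the query string parameters" means, here is what decoding a single captured tracking request looks like; the beacon URL below is a made-up example, with parameter names following common SiteCatalyst conventions.

```python
# Decode one captured Omniture-style image request by hand. Doing this for
# every page of a site is exactly the tedium a dedicated tool saves you.
from urllib.parse import urlsplit, parse_qsl

beacon = ("http://metrics.example.com/b/ss/myrsid/1/H.20/12345"
          "?pageName=product%3Awidget&events=prodView&c1=products&ch=shop")

for key, value in parse_qsl(urlsplit(beacon).query):
    print(f"{key:8} = {value}")
# pageName = product:widget
# events   = prodView
# c1       = products
# ch       = shop
```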

WASP, the Web Analytics Solution Profiler, was built specifically to ease quality assurance of web analytics implementations. When used manually, it shows an easy-to-read, detailed breakdown of your tags in the browser sidebar as you browse from page to page. There's a free version of WASP you can try, and if you plan on doing more serious quality audits on a frequent basis, the more advanced Market Research version will let you crawl your site or feed it a list of URLs from a text file (i.e. those using the templates defined above!).

Disclaimer: I'm the creator of WASP, which was sold to iPerceptions a couple of months ago.

What should I watch for?

WASP for Omniture tags quality assurance

Check the following:
  • Are the tags firing at all? Does the WASP sidebar show values?
    WASP shows data as it is being sent. If nothing shows, either the tag isn't there at all or JavaScript errors are preventing it from firing.
  • Check for campaign parameters.
  • Are the tag values the expected ones (check the title and all custom variables)?
  • Especially for processes, are the tags the right ones?
  • Are there multiple calls? If so, this might be normal, but you will want to double-check that you don't have duplicate tags or unexpected calls; see the sketch below for one way to spot them.
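One way to double-check for duplicates is to count tracking calls per page from a captured traffic log. A sketch, assuming you exported the requests seen while browsing (e.g. from an HTTP debugger) as tab-separated "page, beacon URL" lines; the file name and format are assumptions.

```python
# Flag pages that fire more than one Omniture image request. More than one
# call can be legitimate (e.g. link tracking), so treat hits as leads to
# verify, not as errors.
from collections import Counter

def flag_duplicates(log_lines):
    counts = Counter()
    for line in log_lines:
        page, beacon = line.strip().split("\t")  # assumed tab-separated export
        if "/b/ss/" in beacon:  # Omniture image requests share this path
            counts[page] += 1
    for page, n in counts.items():
        if n > 1:
            print(f"{page}: {n} tracking calls; verify this is expected")

if __name__ == "__main__":
    with open("capture.txt") as f:  # hypothetical export file
        flag_duplicates(f)
```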

More information

You can get more information about WASP on the official site at WebAnalyticsSolutionProfiler.com, as well as in earlier blog posts.

Monday, February 15, 2010

Internet Marketing Conference - Montreal, Feb 18th

The Internet Marketing Conference is back in Montreal this Thursday, February 18th, at the Holiday Inn Select in downtown Montreal. Lennart Svanberg, the mind behind IMC, asked me to chair the Montreal conference, where a host of speakers will address the theme of "creativity".

The Internet continues to be an immense playground for those who are creative. Whole businesses have emerged out of what often looked like crazy ideas. As in any evolution process, most failed, but some survived and became immensely successful. What’s the role of creativity in today’s marketing and business strategies? Is creativity just play and business just about profits?

Find the answer at IMC!

Internet Marketing Conference - Montreal
Thursday, February 18th
Holiday Inn Select Downtown Montreal
Register Now!