Friday, June 19, 2009

A tale of a Web Analytics near miss

Wikipedia defines a "near miss" as an "unplanned event that did not result in injury, illness, or damage - but had the potential to do so". I was reading Jim Novo's latest post, Analyze, Not Justify, relating a conversation with a prospect that didn't become a client... I had wanted to write such a post for a while, and I figured "what the heck, I'll share one of my experiences". Of course, in doing so I will try to "protect the innocent". So here goes.

The agency that wanted to be

I was contacted by a traditional marketing agency (call it "ABC") that also does websites and online campaigns. They wanted me to help out with one of their clients (let's call them "XYZ"). Since I offer to "coach and empower web agencies", this sounded like a perfect opportunity. Furthermore, I had been referred by one of my very best clients.

The agency wanted me to jump right in: fix that darn web analytics tool; make magic happen. At the same time, they were talking about very ambitious web initiatives for XYZ and how they would measure all of it. I smelled "risk ahead" and was able to convince them to start with a Web Analytics Maturity Assessment, which you can view below (click for larger view):


The outcomes of the assessment were pretty clear, both from an agency and a client perspective... Also, the goal was to go from virtually nothing and jump more than one level on the maturity scale, which is itself a significant risk.

Six Sigma DMAIC to the rescue

Knowing the risks is already a pretty good start. To mitigate those risks, I approached the project from a Six Sigma problem-resolution perspective (which I always do anyway): start with a Definition of clear objectives, then Measure the current state (followed by Analysis, Improvement and Control). We went on to define and clarify business objectives, identify the stakeholders and the required metrics, determine which information would need to be communicated in dashboards and how often, etc.
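To make the approach a bit more tangible, here is a rough sketch (in TypeScript, for illustration only) of how the DMAIC phases could map to web analytics deliverables. The items listed are generic examples of mine, not XYZ's actual plan:

    // Illustrative mapping of DMAIC phases to generic web analytics deliverables.
    // The deliverables listed are hypothetical examples, not XYZ's actual plan.
    interface DmaicPhase {
      phase: "Define" | "Measure" | "Analyze" | "Improve" | "Control";
      deliverables: string[];
    }

    const webAnalyticsDmaic: DmaicPhase[] = [
      { phase: "Define", deliverables: ["Business objectives", "Stakeholders", "Key metrics"] },
      { phase: "Measure", deliverables: ["Maturity assessment", "Tagging audit", "Baseline dashboards"] },
      { phase: "Analyze", deliverables: ["Gaps between objectives and current data"] },
      { phase: "Improve", deliverables: ["Implementation fixes", "Reporting cadence"] },
      { phase: "Control", deliverables: ["Governance model", "Periodic dashboard reviews"] },
    ];

    // Print the plan, phase by phase
    for (const { phase, deliverables } of webAnalyticsDmaic) {
      console.log(`${phase}: ${deliverables.join(", ")}`);
    }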

The wall

Measuring the current state is where we found the "politiwall" - the political brick wall. XYZ is the Canadian branch of a much larger organization; a typical multi-national, multi-product global corporation. They are obviously accountable for success in Canada and, as such, they maintain the site content and define the marketing strategy. However, consider this scenario:
  • The CMS tool is under the responsibility of head office, but the content, and accountability for its success, are XYZ's responsibility
  • XYZ tools are developed by local partners, one of them being the ABC agency. Accountability for the success of those process-driven tools also lies with XYZ
  • Head office standardized on Omniture and did a fantastic implementation. They obviously want to offer a global perspective across all XYZ websites, which is a perfectly valid approach.
As I noted in a previous post, multi-national head offices tend to use "best of breed" tools and "impose" a level of standardization. On the other hand, local branches of those larger organizations tend to use Google Analytics more, for a number of reasons: locally perceived control, costs, and maturity.

In the actual context:
  • XYZ is not allowed to add Google Analytics to the CMS framework
  • XYZ is not allowed to add Omniture tags to locally developed applications
  • Yet, we want to have a complete view of the user's interaction with the whole XYZ ecosystem (i.e. all components of the online presence, including the CMS, tools and promotional sites)
We are clearly facing a governance and ownership issue here. So tell me, what would you do?

My recommendations

Considering XYZ's accountability for success, here are the next steps I envision:
  1. Demonstrate the direct and logical correlation between "accountability for success", "means to measure" and "power to take action". Escalate to the appropriate level (beyond the current local director level).
  2. Since head office has an implementation and best practices document, it could be reviewed to accommodate more flexibility for local sites (something I've seen with another client who standardized the core implementation but allowed a level of flexibility across over 200 websites)
  3. Follow the implementation guidelines and best practices when instrumenting locally developed tools, in collaboration with head office, so they are comfortable with the implementation (see the sketch below)
Point #1 is critical. If you don't solve it now, it will always come back to haunt you!
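To make point #3 a little more concrete, here is a minimal sketch of what tagging a locally developed tool against head office's standard Omniture (SiteCatalyst) implementation might look like. The page-naming convention and the trackToolPage() helper are assumptions of mine; the real rules would come from head office's implementation and best practices document:

    // Minimal sketch only. Assumes head office's shared s_code.js is already
    // loaded on the page, exposing the global SiteCatalyst "s" object.
    // The page-naming scheme and trackToolPage() helper are hypothetical.
    declare const s: {
      pageName: string;
      channel: string;
      prop1: string;
      t: () => void; // sends a page view to the shared report suite
    };

    function trackToolPage(toolName: string, step: string): void {
      // Follow head office's naming convention so locally developed tools
      // roll up cleanly into the global reports (convention is made up here).
      s.pageName = `ca:tools:${toolName}:${step}`;
      s.channel = "ca:tools";
      s.prop1 = toolName;
      s.t();
    }

    // Example: track step 2 of a locally developed tool
    trackToolPage("quote-request", "step-2");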

Help me help you

Latest status was "the business case was presented to head office" and I'm waiting for feedback. At one point I had the impression the agency was putting the blame on me for failing to deliver. Had I not insisted on starting with the right approach, they would certainly be right (in fact, I refuse to work with a client if we don't start with a Maturity Assessment). As an independent consultant, my role is to tell things as they are and guide you in the right direction. But you are ultimately in the driver's seat, and the decision to follow my recommendations or not is yours.

I think this would make a great case study for the "Creating and Managing the Web Analytics Culture" course I'm tutoring!

Any thoughts and feedback welcome!