Monday, August 31, 2009

Components of the Web Analytics Maturity Model

This post is part of the series on the Web Analytics Maturity Model, a research project for my MBA. See the first two posts on this topic: "Overview of the Web Analytics Maturity Model" and "Definition of Web Analytics".

What is a Maturity Model?

A capability maturity model (CMM) contains the essential elements of effective processes for one or more disciplines. It also describes an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness.

All reviewed models share some similarities in their attempt to define a "framework and objective criteria to determine the sophistication of an organization’s measurement and analysis skills".

In the next couple of posts I will review some existing models (listed in the first post) and how they apply to the objective of defining a web analytics maturity model.

Defining a Maturity Model

Anyone can make the simple complicated. Creativity is making the complicated simple.
Charles Mingus, American jazzman
Some prominent voices in the web analytics industry claim that "web analytics is hard". Rather than spreading fear, uncertainty and doubt, I belong to the school of thought that web analytics can be easier if approached in the right way and given enough time.

Complex projects can only be achieved when goals and expectations are realistic, resources are allocated and execution is sound. Web analytics, despite its own challenges, isn't so different from, or any harder than, other challenges faced by organizations competing in today's environment.

Attributes of a Maturity Model

Maturity models are not strict paradigms and they often prompt criticism whenever a specific item doesn't match one's own view of the world. While it is acknowledged that maturity levels and their features contain gray areas and are subject to interpretation, what matters is not so much the specifics of each level as the structure of the maturity model.

The Web Analytics Maturity Model relies on the following attributes (see the sketch after this list):

  • Maturity levels: Defined evolutionary plateaus on the path toward a mature process.
  • Process capability: The range of expected results that can be achieved by following a process. The process capability provides one means of predicting the most likely outcomes.
  • Key process areas: Clusters of related activities that, when performed collectively, achieve a set of goals considered important.
  • Goals: The goals define the scope, boundaries, and intent of each key process area.
  • Common features: Common features are attributes that indicate whether the implementation and institutionalization of a key process area are effective, repeatable, and lasting.
  • Key practices: Key practices describe the infrastructure and activities that contribute most to the effective implementation and institutionalization of the key process area.
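
As a purely illustrative sketch (the class and field names below are my own invention, not taken from any formal CMM specification), these attributes fit together roughly like this:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class KeyPractice:
    """An activity or piece of infrastructure supporting a key process area."""
    description: str


@dataclass
class KeyProcessArea:
    """A cluster of related activities that, performed together, achieve a set of goals."""
    name: str
    goals: List[str]                  # scope, boundaries and intent of the area
    common_features: List[str]        # signs the area is effective, repeatable and lasting
    key_practices: List[KeyPractice] = field(default_factory=list)


@dataclass
class MaturityLevel:
    """An evolutionary plateau on the path toward a mature process."""
    rank: int                         # e.g. 1 (lowest) through 5 (highest)
    name: str
    key_process_areas: List[KeyProcessArea] = field(default_factory=list)


# Hypothetical example: one level holding a single key process area
level_2 = MaturityLevel(
    rank=2,
    name="Analytically initiated",
    key_process_areas=[
        KeyProcessArea(
            name="Objectives definition",
            goals=["Tie web analytics metrics to explicit business objectives"],
            common_features=["Objectives are documented and reviewed periodically"],
            key_practices=[KeyPractice("Maintain a measurement plan signed off by stakeholders")],
        )
    ],
)
```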

Pitfalls of Maturity Models

In an essay published in a 1994 issue of American Programmer magazine, James Bach pointed out several problems with the CMM that can readily be applied to the various maturity models reviewed:
  • Lack of formal theoretical basis: Models are based on the experience of "very knowledgeable people".
  • Vague empirical support: Without a comparison of alternative process models under controlled conditions, the empirical case can never be closed and most accounts remain anecdotal.
  • Reveres process but ignores people: Process can never make up for mediocrity, and thus, any maturity model should be viewed as an element of a larger whole encompassing employee skills and aptitudes as well as the corporate cultural environment.
  • Reveres institutionalization of processes for its own sake: Too much focus on the “ability to commit” can easily blur the “ability to execute”.
  • Encourages displacement of goals from the true mission of improving process to the artificial mission of achieving a higher maturity level: This has been a frequent criticism of other similar concepts such as ISO and Six Sigma, where the “culture of x” becomes the mission.
  • Little information about process dynamics: Why each element is defined at the level it is remains largely subjective, as noted in the first item.
Despite those concerns, a maturity model brings value where there is no better or more reasonable alternative: as a means to assess the current and desired state, and as a communication and change management tool. At the same time, those caveats are strong indications of opportunities for further research. As Bill Gassman commented on the first post in the series: "The first benefit of a maturity model is the conversation it sparks. It puts the team in a mindframe to imagine what could be and to measure where they are."

    Want more?

    Coming up:

Critiques of six current models, the WAMM model itself and several case studies: Toyota Motor Europe (with the help of Michael Notté), Quebecor Media - Canoe (Simon Rivard), SaveTheChildren.org (thanks to Adam Laughlin) and a failed project (name withheld for obvious reasons!).

    View all posts on Maturity Model topic.

    For further information regarding the WAMM and its future evolution, including speaking, consulting and training, visit the Web Analytics Maturity Model area on immeria.net.

    Friday, August 28, 2009

    Definition of Web Analytics

In the previous post I gave a quick background on the Web Analytics Maturity Model. This time, I'm taking a stab at the definition of "web analytics". Now that we know what a maturity model is, we need to agree on what "web analytics" is.

Although some elements of performance measurement were technically available at the inception of the World Wide Web in the early '90s, interest in measuring web business performance has been on the rise over the past couple of years. Three disruptive circumstances might explain this level of attention to web analytics:
• Google's democratization of web analytics with the launch of its free Google Analytics service in 2005
• Marketers uncovered clear benefits of web performance data for optimizing online marketing activities
• The economic downturn of 2008-2009 is forcing e-business initiatives to be measured and held accountable for success.

A marketing-centric definition of web analytics

The Web Analytics Association defines web analytics as "the measurement, collection, analysis and reporting of Internet data for the purposes of understanding and optimizing Web usage". This definition centers on Internet data and website optimization from an online marketing point of view, whilst the broader scope of analytics, process and business optimization isn't specifically addressed.

    Definition of "analytics"

    Davenport and Harris, in "Competing on Analytics: The New Science of Winning" are among the few providing a conceptual background to analytics and business optimization. In their book, the authors define analytics as being "the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions". This definition, much broader than that of "web analytics", is considered to be a subset of business intelligence: "a set of technologies and processes that use data to understand and analyze business performance".

    Proposed definition of "web analytics"

    I'm defining web analytics as:
    The extensive use of qualitative and quantitative data (primarily, but not limited to online data), statistical analysis, exploratory (multivariate testing) and predictive models (behavioral targeting), business process analysis and fact-based management to drive a continuous improvement of online activities and improved ROI.
As we will see in the Web Analytics Maturity Model, the earlier stages naturally focus on online marketing, while the higher levels match the lower realm of Davenport's definition of "competing on analytics".

    Want more?

This post is part of a series extracted from my MBA thesis paper. Comments and critiques welcome!

    Coming up:

What are the components of a maturity model? Critiques of six current models, the WAMM model itself and several case studies.

    View all posts on Maturity Model topic.

    For further information regarding the WAMM and its future evolution, including speaking, consulting and training, visit the Web Analytics Maturity Model area on immeria.net.

    Wednesday, August 26, 2009

    Overview of the Web Analytics Maturity Model

I've been talking about a Web Analytics Maturity Model for a while now, both on this blog and at previous eMetrics conferences. I'm pursuing my research as part of my MBA thesis, and in the coming days and weeks I will share some elements of an upcoming paper (or book?) on this topic. Of course, I'd love to hear from you! Any feedback is welcome, good or bad :)

    Overview

Jim Sterne, dubbed the “godfather of web analytics”, was pushing for online marketing as early as 1994. In his “E-Metrics: Business metrics for the new economy” paper, published in 2000, he mentioned that “while all e-business managers clearly recognize the tremendous value of e-customer analytics, most lack the staff, technical resources, and expertise to harness and put to effective use the flood of raw data produced by their Web systems”. A decade later, we can only admit this statement remains true.

The Web Analytics Maturity Model (WAMM) is adapted and derived from proven models in fields such as business intelligence and process optimization, or inspired by models proposed by industry analysts and leaders. The critical success factors contributing to the “use of analytics to make better decisions and extract maximum value from business processes” are applied to a five-level, multi-dimensional capability maturity model.

    The proposed model presents five maturity states:
    1. Analytically impaired
    2. Analytically initiated
    3. Analytically operational
    4. Analytically integrated
    5. Analytical competitor
    The six key process areas, or success factor dimensions, are:
    1. Management, Governance and Adoption
    2. Objectives definition
    3. Scoping
    4. The Analytics Team and Expertise
    5. The Continuous Improvement Process and Analysis Methodology
    6. Technology and Data Integration
Those maturity levels and key process areas define common features and attributes, as well as key practices, that will significantly increase the likelihood of success and positive return of a web analytics program.
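
To make this tangible, here is a minimal self-assessment sketch in Python. The weakest-link scoring rule and the sample scores are assumptions for illustration only, not the formal WAMM assessment method:

```python
# Hypothetical self-assessment: score each key process area from 1 to 5.
KEY_PROCESS_AREAS = [
    "Management, Governance and Adoption",
    "Objectives definition",
    "Scoping",
    "The Analytics Team and Expertise",
    "The Continuous Improvement Process and Analysis Methodology",
    "Technology and Data Integration",
]

MATURITY_LEVELS = {
    1: "Analytically impaired",
    2: "Analytically initiated",
    3: "Analytically operational",
    4: "Analytically integrated",
    5: "Analytical competitor",
}


def overall_maturity(scores: dict) -> str:
    """Illustrative rule: an organization is only as mature as its weakest dimension."""
    missing = [kpa for kpa in KEY_PROCESS_AREAS if kpa not in scores]
    if missing:
        raise ValueError(f"Missing scores for: {missing}")
    return MATURITY_LEVELS[min(scores.values())]


# Example with made-up scores
scores = {
    "Management, Governance and Adoption": 2,
    "Objectives definition": 3,
    "Scoping": 2,
    "The Analytics Team and Expertise": 1,
    "The Continuous Improvement Process and Analysis Methodology": 2,
    "Technology and Data Integration": 3,
}
print(overall_maturity(scores))  # -> Analytically impaired
```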

    What is a Maturity Model anyway?

    A capability maturity model (CMM) contains the essential elements of effective processes for one or more disciplines. It also describes an evolutionary improvement path from ad hoc, immature processes to disciplined, mature processes with improved quality and effectiveness.

    All proposed models share some similarities in their attempt to define a “framework and objective criteria to determine the sophistication of an organization’s measurement and analysis skills”.

    I have reviewed a number of existing models and how they apply to the objective of defining a web analytics maturity model. The following models were evaluated:
    1. Capability Maturity Model Integration (CMMI) from the Software Engineering Institute at Carnegie-Mellon University
    2. The Data Warehousing Institute Business Intelligence Maturity Model
    3. Gartner’s Maturity Model for Web Analytics
4. WebTrends Digital Marketing Maturity Model (DM3)
    5. Competing on Analytics maturity by stage, by Thomas Davenport in "Competing on Analytics: The New Science of Winning"

    Coming up

    In the next post: my take on a definition of web analytics and later on, a critique of other maturity models. Also, I'm putting the model to the test with some organizations and those cases promise to be very interesting! View all posts on Maturity Model topic.

    For further information regarding the WAMM and its future evolution, including speaking, consulting and training, visit the Web Analytics Maturity Model area on immeria.net.

    Friday, August 21, 2009

    WASP v1.26 released

WASP v1.26 is now approved at addons.mozilla.org and has been available from WebAnalyticsSolutionProfiler.com for a couple of days.

    Another round of bug fixes and minor enhancements while I work on WASP v1.50 (or what could turn out to be WASP v2.0).

    What's new

    • WASP for Analyst:

      • Breakdown of tags now shows an "(uncategorized)" branch for tools where help is available but specific tag usage is unknown (applies to Google Analytics, Omniture SiteCatalyst, WebTrends and AT Internet Xiti)
      • Fixed detection failing when some JavaScript variables were defined with a null value
    • New/updated tools:

    Please take 2 minutes to submit a review at addons.mozilla.org and visit the UserVoice page to suggest improvements and cast your vote.

    Get WASP v1.26 now!

    Sunday, August 9, 2009

    Speaking about Web Analytics Maturity at Internet Marketing Conference

    The Internet Marketing Conference

    I've been invited to speak at the Internet Marketing Conference in Vancouver, September 16-18.

I have spoken several times at the eMetrics Marketing Optimization Summit and at local events in Montreal, but IMC will be my first time speaking at a conference not specifically targeted at a web analytics crowd. Other speakers include my friends Avinash Kaushik and John Hossack, chair of IMC, along with a host of other speakers from Canada and abroad.

    Presenting the Web Analytics Maturity Model

The theme for IMC Vancouver 2009 is "Quality Traffic". Traffic not only needs to be "relevant" to your business, it also needs to be proven successful. This fits very well with the topic I will be presenting: the Web Analytics Maturity Model (WAMM). As I did at eMetrics Toronto and San Jose in the spring, this highly interactive session will guide you through a SWOT analysis of your web analytics maturity. Review the six Critical Success Factors of successful web analytics programs and see where you stand and what you should do next. Don't let people simply tell you "web analytics is hard": at the end of the session, you will be armed with clear arguments and indications of what to do next to bring you, and your organization, to the next level.

This presentation received a score of 94% at eMetrics San Jose, making me the 4th best speaker of the conference! I have pursued my research since then and will present an even better version of the WAMM.

    Panel: Measuring Online Marketing in a Real-Time World

    I will also participate in a panel with Kevin Hillstrom, Manoj Jasra, Braden Hoeppner, Amanda Rose and Eric Hansen, moderated by Ean Jackson, where we will jump in to answer the crowd questions and share our views and opinions about online marketing measurement.

    Other opportunities to hear about the WAMM

    Tuesday, August 4, 2009

    "Unique Visitors" doesn't have much to do with people

I was having my morning coffee, reading Brian Clifton's post "Should you focus on website visitors as individuals?", his take on a recent e-Consultancy interview with Coremetrics Chief Strategy Officer John Squire. I commented on Brian's blog, but as is often the case, my thoughts turned into something too long for a comment. I strive to participate in conversations I know won't end up with the now classic "let's agree to disagree".

    Tracking individual customers, really?

    Coremetrics is differentiating its technology by focusing on individual customer data in a multichannel environment. They look at “complete historical online behavior and brand interactions on websites, across multiple ad networks and via email, video, affiliate sites, social media, and more”.
(emphasis mine) I salute Coremetrics' (and others') objective of tracking "unique people" instead of "unique cookies" in the hope of getting more precise and detailed information. But this is a lost cause... the inherent technologies of the web don't allow it, unless you can authenticate every single visitor coming through every channel.

As soon as "multichannel" and "web analytics" appear in the same phrase, I get suspicious. "Multichannel" when everything is online is something, but how is data from the call center, back-end core systems and other touch points integrated to give a "single view of the customer"? We're heading into the realm of business intelligence (BI) and customer relationship management (CRM), and despite the marketing fluff, I don't see any of the current web analytics vendors really playing in this space. In the above statement, all of the mentioned "channels" are "online", which, I agree, gives you a very good view to start with, but a view where your sight is concentrated and everything else in the periphery is blurred.

I like Brian's argument that aggregating & sampling is the way to go. After all, web analytics is closer to statistics, with a population, segments, sampling/aggregation, and a margin of error, confidence and significance, than it is to core systems where each piece of data must be 100% accurate and traceable.
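
As a back-of-the-envelope illustration of that statistical mindset (the traffic numbers are made up, and the normal approximation is a simplifying assumption):

```python
import math


def conversion_rate_interval(conversions: int, visits: int, z: float = 1.96):
    """Approximate 95% confidence interval for a conversion rate
    (normal approximation to the binomial; reasonable for large samples)."""
    rate = conversions / visits
    margin = z * math.sqrt(rate * (1 - rate) / visits)
    return rate, margin


# Made-up numbers: 480 conversions observed in a sample of 20,000 visits
rate, margin = conversion_rate_interval(480, 20_000)
print(f"Conversion rate: {rate:.2%} ± {margin:.2%}")  # roughly 2.40% ± 0.21%
```

In other words, a sampled report that says "2.4%, give or take a fifth of a point" is often all the precision a decision actually needs.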

    Trust me, I'm accurate (and honest)

One of the things working against Google Analytics is the fact that experienced analysts (those who have a statistical background, come from the offline world, or have lots of experience in a real multichannel, multi-system environment) want to audit, validate and diagnose the data they are relying on. The inability to audit the data and have a peek at the methodologies employed by Google to provide the data is a big showstopper for some organizations. It's basically asking for blind trust where your provider (AdWords) is also giving you your results (analytics conversions)... or your employer is also managing your bank account... For that matter, you also have to trust other vendors, because you can't access the algorithms they use for aggregating or sampling data, but at least they do not play a dominant role in your online marketing (PPC) spend.

From time to time, access to the raw data, or the lowest level possible, is justifiable and necessary. The latest Forrester Wave: Web Analytics Q3 2009 reveals that 49% of surveyed web analytics clients said "accuracy of information" is one of their three most important factors when selecting a vendor. This comes ahead of reporting options, and far ahead of TCO! Yet, despite vendors' secret formulas for calculating accurate visits & visitors, research I conducted using WASP shows the number one source of distrust and inaccurate data is inappropriate and incorrect instrumentation.
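
As a trivial illustration of what auditing instrumentation means (a minimal sketch only, nothing like WASP's actual detection logic; the URL is hypothetical and the pattern covers only the classic ga.js/urchin.js snippet):

```python
import re
import urllib.request

# Looks for the classic Google Analytics inclusion or tracker calls circa 2009.
# A real audit (as WASP does) inspects the live page and the requests it actually
# sends, not just the raw HTML -- this only catches pages where the tag is plainly missing.
GA_PATTERN = re.compile(
    r"google-analytics\.com/(ga|urchin)\.js|_gat\._getTracker|urchinTracker"
)


def has_ga_tag(url: str) -> bool:
    """Fetch a page and report whether a Google Analytics snippet appears in its HTML."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    return bool(GA_PATTERN.search(html))


if __name__ == "__main__":
    print(has_ga_tag("http://www.example.com/"))  # hypothetical URL
```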

    My take

Detailed or aggregate? Really multichannel or just "online channels"? I'll put on my old record again: it depends on your Web Analytics Maturity Level. For most organizations, even if they have access to detailed data, they simply won't be able to make good use of it. If you have to wonder, it's probably because you don't need detailed data and Google Analytics will be just fine. If you do your homework, you'll look at offerings from various vendors and put aside frivolous and inflated claims boasting buzzwords such as "real time", "multichannel" (when they really mean online channels), "single view", "complete" and the worst of all: "leading".