Friday, September 4, 2009

Web Analytics Maturity Model

Now that I have offered a definition of Web Analytics, explained what a Maturity Model is, and reviewed six existing models, it's time to look at the Web Analytics Maturity Model itself!

Previous posts in the WAMM series:
  1. Overview of the Web Analytics Maturity Model 
  2. Definition of Web Analytics 
  3. Components of the Web Analytics Maturity Model
  4. Review of Maturity Models
"Experience is what you get when you did not get what you wanted."
Randy Pausch, known for "The Last Lecture" (a must-see video) and a professor at Carnegie Mellon University

From "impaired" to "competitive"

Remember that a "maturity level" defines an evolutionary plateau toward achieving a mature process. In the models studied, the number of levels ranges from four (DM3) to six (TDWI), with qualitative labels such as “Initial”, “Chaotic”, “Ad hoc”, “Pre-natal” and “Aware” at the low end, up to “Optimizing”, “Sage” or “Pervasive” at the high end.

For the proposed model, each Key Process Area can be graded on a scale from “1 – Analytically impaired” through “5 – Analytical competitor”, as summarized below:
  1. Analytically impaired: Characterized by the use of out-of-the-box tools & reports and limited resources lacking formal training (hands-on skills) and education (knowledge). Web Analytics is used on an ad hoc basis and is of limited value and scope. Some tactical objectives are defined, but results are not well communicated and there are multiple versions of the truth (side note: just think of the challenge so many analysts face when they have to define what a "visitor" is, and why the click-through metrics provided by their ad network don't match their Google Analytics campaign metrics).
  2. Analytically initiated: Working with metrics to optimize specific areas of the business (such as marketing). Resources are still limited but the process is getting streamlined. Results are communicated to various business stakeholders (often at the director level). However, web analytics might be supporting obsolete business processes and thus be limited in its ability to push for optimization beyond the online channel. Success is mostly anecdotal.
  3. Analytically operational: Key Performance Indicators and dashboards are defined and aligned with strategic business objectives. A multidisciplinary team is in place and uses various sources of information such as competitive data, voice of customer, and data from social media or mobile analytics. Metrics are exploited and explored through segmentation and multivariate testing (a minimal segmentation sketch follows this list). The Internet channel is being optimized and personas are being defined. Results start to appear and to be considered at the executive level. Results are centrally driven, but broadly distributed.
  4. Analytically integrated: Analysts can now correlate online and offline data from various sources to provide a near 360° view of the whole value chain (see the “Limits of web analytics” note). Optimization encompasses complete processes, including back-end and front-end. Online activities are designed from the user's perspective and persuasion scenarios are defined. A continuous improvement process and problem-solving methodologies are prevalent. Insights and recommendations reach the CXO level.
  5. Analytical competitor: This level is characterized by several attributes of companies with a strong analytics culture:
    1. One or more senior executives strongly advocate fact-based decision making and analytics
    2. Widespread use not just of descriptive statistics, but also of predictive modeling and complex optimization techniques
    3. Substantial use of analytics across multiple business functions or processes
    4. Movement toward an enterprise-level approach to managing analytical tools, data, and organizational skills and capabilities.
The last two levels gradually leave the realm of “web analytics” and enter that of “business analytics”, as defined by Davenport in "Competing on Analytics".
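
As a minimal sketch of the kind of segmented analysis typical of level 3 (the file name, column layout and traffic sources are hypothetical, not part of the model), a conversion-rate KPI could be broken down by traffic source:

```python
# Minimal sketch: conversion rate segmented by traffic source.
# The CSV layout (columns: source, visits, conversions) is hypothetical.
import csv
from collections import defaultdict

visits = defaultdict(int)
conversions = defaultdict(int)

with open("sessions.csv", newline="") as f:  # assumed export from a web analytics tool
    for row in csv.DictReader(f):
        source = row["source"]               # e.g. "organic", "email", "paid"
        visits[source] += int(row["visits"])
        conversions[source] += int(row["conversions"])

for source in sorted(visits):
    rate = conversions[source] / visits[source] if visits[source] else 0.0
    print(f"{source:>10}: {rate:6.2%} conversion rate over {visits[source]} visits")
```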

Limits of web analytics

A distinction must be made between “web analytics” and other disciplines such as “analytics” and Customer Relationship Management (CRM). Too often, management and practitioners expect web analytics to work like CRM or other core business systems. For example, online sales figures provided by web analytics solutions should not be used as a valid representation of financial figures for accounting purposes: they are not designed to account for cancellations, errors and returns.
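
As a minimal sketch of why the two sets of figures legitimately diverge (all numbers and labels below are hypothetical), analytics-reported revenue can be reconciled against finance figures by backing out cancellations, returns and corrections:

```python
# Hypothetical monthly figures: web analytics counts orders at the moment of
# purchase, while finance nets out cancellations, returns and corrections.
analytics_revenue = 125_400.00   # as reported by the web analytics tool
cancellations     =   3_150.00   # orders cancelled before shipping
returns           =   6_720.00   # refunded after delivery
corrections       =     430.00   # pricing/tax adjustments

finance_revenue = analytics_revenue - cancellations - returns - corrections
gap_pct = (analytics_revenue - finance_revenue) / finance_revenue * 100

print(f"Analytics revenue: {analytics_revenue:>12,.2f}")
print(f"Finance revenue:   {finance_revenue:>12,.2f}")
print(f"Gap:               {gap_pct:>11.1f}%")
```

The gap is expected by design; the problem only arises when the analytics figure is treated as the accounting truth.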

The table below contrasts some elements of web analytics and customer relationship management:

|                     | Web Analytics                                                  | Customer Relationship Management     |
|---------------------|----------------------------------------------------------------|--------------------------------------|
| Realm               | Online                                                         | Multichannel                         |
| Individual accuracy | Based on cookies and other techniques, sometimes authenticated | 1:1 – authenticated most of the time |
| System type         | Analytical                                                     | Operational and analytical           |
| Accuracy            | Sampling and margin of error                                   | Close to 100% accurate               |

Can you think of other contrasting elements between web analytics and other disciplines such as BI or CRM?


Key Process Areas

Key Process Areas are similar to “critical success factors” (CSF), a term initially used in the world of data and business analysis; CSFs identify the elements that are vital for a strategy to be successful. Most organizations have perceived Web Analytics as a technological tool to solve problems in individual areas such as online marketing, but those initiatives are largely characterized by a lack of coordination and structured methodology. Unsurprisingly, Web Analytics critical success factors are not that different from those of any other strategy involving strong commitment and cultural business change, be it a Customer Relationship Management (CRM), Business Intelligence (BI) or Process Optimization (e.g. Six Sigma) program: the human factor, processes and technology all need to be addressed.

Changing the corporate culture, employee behaviour and business processes is certainly the most difficult and risky part of any major organizational change. By its nature, developing an analytical culture is an iterative and continuous learning process. While data analysis can contribute to business improvement by answering pending questions and validating hypotheses, it also leads to further questioning and new hypotheses to explore.

Because web analytics touches on aspects of marketing, BI and process optimization, among others, it is easy to cross-reference commonly identified CSFs and derive the following items:
  1. Management, Governance and Adoption
  2. Objectives Definition
  3. Scoping
  4. The Analytics Team and Expertise
  5. The Continuous Improvement Process and Analysis Methodology
  6. Tools, Technology and Data Integration
The first element is unambiguously and unanimously the most critical factor in most of the reviewed models. Case studies (to be posted later) offer several examples of the importance of Management, Governance and Adoption. To be successful, executives must recognize that web analytics is more than a reporting system and represents an effective way to identify weak points and improvement opportunities. They must perceive analytics (not just web analytics) as a mission-critical and competitive resource that can empower each department. Sophisticated use of analytics can contribute to three key elements of a successful organization:
  • Efficient and effective execution.
  • Smart decision making.
  • Optimized business processes.
However, we must stress that web analytics, as commonly thought of, is nothing more than a tool providing indicators and metrics. Making sense of them requires knowledge and expertise, as highlighted in the next section.
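
As a minimal sketch of how a Key Process Area assessment might be recorded (the scores below are hypothetical and the simple average is an assumption for illustration, not part of the model), each area can be graded on the 1–5 scale described earlier:

```python
# Hypothetical assessment: each Key Process Area graded on the 1-5 scale
# described above (1 = analytically impaired, 5 = analytical competitor).
kpa_scores = {
    "Management, Governance and Adoption": 2,
    "Objectives Definition": 3,
    "Scoping": 3,
    "The Analytics Team and Expertise": 2,
    "Continuous Improvement Process and Analysis Methodology": 1,
    "Tools, Technology and Data Integration": 4,
}

overall = sum(kpa_scores.values()) / len(kpa_scores)

for kpa, score in kpa_scores.items():
    print(f"{kpa:<58} {'#' * score} ({score}/5)")
print(f"\nOverall maturity (simple average, assumed for illustration): {overall:.1f}/5")
```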

Common Features

Common features are attributes that indicate whether the implementation and institutionalization of Key Process Areas are effective, repeatable and lasting. The following common features are identified:
  • Commitment: Is the level of commitment from the organization appropriate and defined by organizational policies, structure and management sponsorship?
  • Resources: Are resources to accomplish the task readily available: tools, training & education, organizational structure and people?
  • Process: Are the roles and procedures to perform each activity defined, tracked and adapted if necessary? Are recommendations implemented and reviewed to contribute to the overall learning process?
  • Reporting and Analysis: Are only out-of-the-box reports used, or are complex multi-source regression analyses conducted to provide customized Key Performance Indicators and dashboards?
  • Tools: What tools are in place, what are their features and capabilities, and is their use optimal and effective?
  • Quality: Are mechanisms in place to ensure the data being collected is adequate and of good quality? Are reports, insights and recommendations audited to ensure their quality over time?
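
As a minimal sketch of the kind of automated quality mechanism the last point refers to (the data sources, figures and 5% threshold are assumptions), daily pageview counts from two collection methods could be compared and flagged when they drift apart:

```python
# Hypothetical quality check: compare a metric collected by two methods
# (e.g. page tags vs. server log files) and flag days that drift too far.
THRESHOLD = 0.05  # assumed acceptable relative difference (5%)

tag_pageviews = {"2009-09-01": 10_240, "2009-09-02": 11_030, "2009-09-03": 9_870}
log_pageviews = {"2009-09-01": 10_410, "2009-09-02": 12_950, "2009-09-03": 9_900}

for day in sorted(tag_pageviews):
    tags, logs = tag_pageviews[day], log_pageviews[day]
    diff = abs(tags - logs) / max(tags, logs)
    status = "OK" if diff <= THRESHOLD else "INVESTIGATE"
    print(f"{day}: tags={tags:>6} logs={logs:>6} diff={diff:5.1%} -> {status}")
```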

Key Practices

Each Key Process Area is described in terms of key practices that contribute to satisfying the goals. The key practices describe the means and activities that contribute most to the effective implementation and institutionalization of the key process area. They describe “what” is to be done.

Key practices of the Web Analytics Maturity Model include:
  • Data collection methodologies (log files, tags, network probes, etc.) and data modeling
  • Reporting & Analysis
  • Problem resolution techniques
  • Defining Key Performance Indicators and Dashboards
  • Communication
  • Exploration and visualization tools and methods
  • A/B and Multivariate testing (a minimal sketch follows this list)
  • Personalization and behavioral targeting
  • Predictive analytics
  • Process analysis and modeling
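
As a minimal sketch of the A/B testing practice listed above (the visit and conversion counts are hypothetical, and the two-proportion z-test is just one common way to read such an experiment), two page variants can be compared like this:

```python
# Hypothetical A/B test: compare conversion rates of two page variants
# with a two-proportion z-test (one simple, common reading of the result).
from math import sqrt, erf

visits_a, conv_a = 4_800, 192   # variant A: 4.0% conversion
visits_b, conv_b = 4_750, 238   # variant B: ~5.0% conversion

p_a, p_b = conv_a / visits_a, conv_b / visits_b
p_pool = (conv_a + conv_b) / (visits_a + visits_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p_value:.3f}")
print("Significant at 5%" if p_value < 0.05 else "Not significant at 5%")
```

A p-value below 0.05 would suggest the observed difference is unlikely to be due to chance alone.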

For further information regarding the WAMM and its future evolution, including speaking, consulting and training, visit the Web Analytics Maturity Model area on immeria.net.

Coming up next: conducting a Web Analytics Maturity Model assessment and several case studies!