In the wake of the announcement of the Google tool to debug your tracking code, I thought it would be worth sharing some of the experience I have gained through years of implementing various web analytics solutions, developing the Web Analytics Solution Profiler (WASP) and conducting website quality control.
The new Google tool is interesting and useful, but it might be a bit overly simplistic, and it still requires a lot of manual intervention. Obviously, this topic is close to my field of expertise, and I thought I could share some tips.
Quality assurance of tags: an old issue
As a bit of background, I made the first version of WASP available in October 2006; before that I had spent a lot of time doing implementations and was faced with the quality assurance process issue. Suffice to say I have spent a significant amount of time on this! For those who might have missed them, here are some of my previous posts on the topic of tag quality assurance:
- Web Analytics implementation Quality Assurance, a post dating back to August 2007 where I referenced several blog articles discussing common pitfalls of tagging
- Quality assurance using WASP: tag all pages, February 2008, where I highlighted the benefits of "in context" QA - running within the browser, with all the idiosyncrasies that involves, rather than crawling from an external context that can't perfectly reproduce the browser
- Quality assurance of web analytics tags implementation, from January 2009, where I described various methodologies/tools for conducting quality assurance of tags.
- Testing web analytics implementation with WASP, from February 2010, where I reacted to the statement "There is no better test than randomly clicking your site to verify page names." I replied "When has quality assurance become a random act of faith?" and proposed a list of the "things" that need to be checked.
How much can you test manually?
There are huge challenges in the QA process cycle. Manual testing is long and error prone (whether with Firebug, HTTPWatch, Fiddler, IEWatch or Charles), and external scans (like ObservePoint or SiteScanGA) are not always the best solution for sites that are under development, behind a firewall or otherwise secured.
Testing content areas driven off templates is fairly easy: identify a couple of pages using each of your templates and test only those, rather than the whole site. In WASP, you can start a crawl from a local text file that simply lists the specific links you want to test.
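If you don't have WASP handy, the same idea can be scripted: feed a text file of representative template URLs to a small checker that looks for the tracking snippet on each page. Here is a minimal sketch, assuming a classic Google Analytics (ga.js) setup - the file format, marker patterns and function names are my own illustration, not a WASP feature:

```python
# Hedged sketch: verify a seed list of template pages is tagged.
# Assumes classic GA (ga.js + _gaq.push); adapt the markers to your
# own tool's snippet.
import re
import urllib.request

GA_MARKERS = [
    re.compile(r"\.google-analytics\.com/ga\.js"),          # library include
    re.compile(r"_gaq\.push\(\s*\[\s*['\"]_trackPageview['\"]"),  # pageview call
]

def has_ga_snippet(html: str) -> bool:
    """True only if the page source contains every expected marker."""
    return all(p.search(html) for p in GA_MARKERS)

def check_urls(path: str) -> dict:
    """Fetch each URL listed (one per line) and report tagged/untagged."""
    results = {}
    with open(path) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue  # skip blank lines in the seed file
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
            results[url] = has_ga_snippet(html)
    return results
```

A source-level check like this only proves the snippet is present; it won't catch JavaScript errors that prevent the beacon from firing, which is exactly why "in context" testing still matters.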
Testing "processes" (checkout, registration, etc.) is particularly difficult, but if I may preach for WASP again, one of its huge benefits is being "in context" - it can record the data as you go. You can use a session recording and playback solution (I used DejaClick from AlertSite), and there are whole suites built specifically to facilitate the QA process in web development. Basically, you record that part of the session and simply play it back whenever you need to test... and WASP will happily record the data so you can export it to Excel and check it afterward.
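Whatever recorder you use, a replayed session ultimately produces a stream of HTTP requests you can inspect. As a sketch of that idea, the snippet below filters classic GA image beacons (`__utm.gif`) out of a captured request list and extracts the account (`utmac`) and page path (`utmp`) each one reports - the capture format is an assumption; a WASP or proxy export will differ:

```python
# Hedged sketch: pull GA classic beacons out of a recorded session's
# request URLs and report which account/page each one tracked.
from urllib.parse import urlparse, parse_qs

def extract_beacons(request_urls):
    """Return (utmac, utmp) pairs for every __utm.gif beacon seen."""
    hits = []
    for url in request_urls:
        parts = urlparse(url)
        if not parts.path.endswith("/__utm.gif"):
            continue  # not a GA beacon; ignore ordinary page assets
        qs = parse_qs(parts.query)
        hits.append((qs.get("utmac", [""])[0], qs.get("utmp", [""])[0]))
    return hits
```

After each playback, you can assert that the expected page names show up, in order, once per step - which turns "did the checkout get tracked?" into a repeatable check instead of an eyeball exercise.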
A simple approach
Simply put, make tag QA an integral part of the overall web development QA process - whenever a template is modified or a process is touched, it should be QA'ed again. There are usually "repro scripts" used to make sure the latest bug fix (related to tagging or not) doesn't break anything; use those exact same scripts for tagging QA. It certainly requires IT and marketing collaboration, but I don't think there is an easier alternative. Anyone who has been involved in web development knows no automation tool will magically pinpoint every error on your site... Make "quality" an integral part of your job!
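To make the "reuse the repro scripts" idea concrete, here is a hypothetical sketch of one repro check extended with a tag assertion - the element id, page name and tracking call are all illustrative, not from any real script:

```python
# Hedged sketch: the same repro step that verifies a bug fix also
# verifies the tag. Names here are hypothetical placeholders.
def repro_checkout_step1(html: str) -> None:
    # Original repro assertion: the bug-fixed element is present.
    assert 'id="promo-code"' in html, "bug fix regressed"
    # Tag QA piggybacked on the same step: the page is still tracked
    # and reports the right page name.
    assert "_gaq.push(['_trackPageview', '/checkout/step1'])" in html, \
        "tracking call missing or wrong page name"
```

The point is not the two lines of code; it's that the tag check runs every time the repro script runs, so a template change that silently drops the tag fails the build instead of quietly corrupting your data.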