Tuesday, February 27, 2007

Planning for March WaW in Montreal

The monthly meeting of Quebec's web analytics community is inspired by the Web Analytics Wednesday tradition: a friendly gathering of practitioners in the field of web analytics and other related fields of interest (SEO, design, or the Internet in general).

For the month of March, we will be back in Montreal, on Wednesday, March 14th.

Location: Please send your suggestions. A quiet place, with a private room, is ideal.

RSVP: simply send me an email or leave a comment below.

Sponsors: If you would like to sponsor this event, present a product or a service, pay for the first round, or simply present something on the subject of web analytics, please contact me. The only restriction, to stay in the true spirit of WaW, is that your presentation must be educational or otherwise help people grow in their web analytics practice, not simply promote your product or service. Swammer's presentation at February's WaW was exactly what we are looking for.

Friday, February 23, 2007

Let's call it extreme predictive analytics

At our last Web Analytics Wednesday in Quebec City, we had the chance to meet the cool guys working on Swammer, the result of years of scientific research. The leap from what we know today is so large, at least for me, that it's hard to really convey the power and the impact of what this could mean. So let's proceed with a few building-block comparisons.

Basics

  1. web analytics, as we know, is the measurement of the behavior of visitors to a website
  2. predictive analytics processes current and historical data in order to make "predictions" about future events
  3. KPIs are metrics used to quantify objectives and reflect the strategic performance of an organization. In the Swammer context, the KPIs are: Visibility, Actuality (who is more "vocal"), Celebrity and Popularity.
  4. reputation management is the process of tracking an entity's actions and others' opinions about it.

Advanced concepts

Still fine, isn't it? Now some new concepts:
  1. Key Genetic Indicators (KGI) map the DNA of an organization: the social values and representations it is associated with (ethics, empathy, flexibility, efficiency, etc.) and their counter-values. This is based on axiological and socio-psychological concepts.
  2. Key Tracers Indicators (KTI) are synthesis indicators derived from KPIs and KGIs to provide an interpretation of a social value (an image) across time. A KTI thus gives a representation of an object (a company, a product, a person, or anything else) in four areas:
    1. the inner self-image representation
    2. the projected image
    3. the perceived image
    4. and the imposed image
Each of these images is quantified to get indexes such as disparagement (who is saying bad things about a company or a product?), conscientization (who says what in positive terms?), redundancy (how many occurrences of a positive or negative value?), etc.

Playground

  • The data set for predictive analytics is the whole visible Web, but the approach could apply to other data sets as well
  • There's a way to establish trends; think of a Google Trends for social values
  • You can establish the relationships between the social values and counter-values associated with each element being analyzed. Something like AmazNode, at a much larger scale
  • It goes beyond business intelligence: although similar in some ways to what companies such as Resilient Corporation or Cymfony are doing, Swammer takes a different angle and goes far beyond them

Consequences

  • Marketing: you can pinpoint the values and counter-values of your competitors. This becomes a very strong tool for SWOT analysis: banks, pharmaceutical companies, oil companies, car manufacturers, or any other industry can be analyzed.
  • Political: it has been found that official election results are often surprisingly close to surveys conducted near the election date. Imagine getting those results weeks before what traditional methods allow today. Swammer is being tested on the current elections in France and in Quebec, and could certainly be used to see how Obama and Hillary Clinton are doing...
  • National: the values and counter-values found in our society leave "traces" and "artifacts" on the Internet. If you can monitor those, you can spot specific activities... such as terrorism. Some groups have a constant "buzz" on the Net, but sometimes a sudden "blip" on the radar from an obscure group could hide some emerging activity. Although they couldn't provide much detail (go figure!), the guys from Swammer revealed they have been approached by secret services to see how their concepts and technologies could be used.
  • Legal: in litigation, demonstrate when and how a company has been the victim of defamation or other socially identifiable counter-values.
  • Financial: one of the next research fields will be to use the concepts highlighted above to predict future activity on the stock market. I kindly volunteered to test this one :)

Full disclaimer

I met the guys from Swammer in person during our WaW event, where they presented their technology. This post is partially based on material provided by Swammer. I do not receive any financial, professional, or personal benefit.

Thursday, February 22, 2007

Web 2.0 events

Yesterday in Montreal there was a Web 2.0 event sponsored by Infopresse. This follows a recent Web 2.0 debate organized by the Association Marketing de Montréal, where my colleague Mohamed Kahlain debated Michel Leblanc. On both occasions, the views expressed seemed to get diverted by the simple (but oh so complex) question of what is, or is not, Web 2.0, and whether there are tangible benefits. Beyond any intellectual debate, there are clear benefits to "thinking Web 2.0" and "doing Web 2.0". That doesn't mean we should throw the baby out with the bathwater and turn up our noses at the "old" Web 1.0 economy...

One of yesterday's highlights was a presentation from Hue Agence Média about Customer Generated Content: Measure/Action, Science and Experience. You can guess that web analytics plays a very important role in it.

The next noticeable event will be WebCom Toronto 2007, with topics such as:
  • Web 2.0
  • Web Analytics
  • Knowledge Sharing / Innovation
  • Intranet / Infrastructure
  • Content / Community
  • Usability
  • Collaboration Tools
WebCom will be back in Montreal during the spring and later on, in the fall.

Warning: Hot Jobs in Web Analytics

If anyone still had doubts about the job market for web analytics... Linda Burtch, from Smith Hanley LLC, reveals some interesting details about it. Here are some characteristics, with some of my own opinions:
  • Hot skill set: analytical, web-savvy, marketing, IT.
  • Fast moving: dozens of positions available, many at renowned companies; candidates can get several job offers at once, with searches closing in about a month!
  • Relocation challenges: the Internet is worldwide, and so is this job market; relocation is often a challenge (compensation, insecurity).
  • Bonus! Getting a signing bonus is becoming more frequent.
  • Cross-industry (horizontal skills): web analytics is universal, regardless of the employer's actual field of expertise. Of course, knowing the industry helps, but changing industries is frequent.
  • Compensation: from $50,000 at entry up to over $125,000 for directors/VPs. The most frequently sought positions, such as senior analyst, range from $60k to $85k.
Now, how is it in Canada? I don't know about my colleagues, either practitioners or consultants, but I think the trend is very similar here. However, the overall scale of the job market is smaller, so directors or VPs dedicated to web analytics will be relatively rare; the lower end of the scale, entry-level positions (2-3 years of experience) and senior analysts, is the most sought after. The salary range looks similar... although in Canadian dollars (if a senior analyst is paid $75k in the US, the salary here will be CAD $75k...).

Over the past few weeks I've been receiving emails from people seeking consultants or employees to fill web analytics positions, but I guess it's the same here as anywhere: it's hard to find experienced analysts! If you are in Canada and involved in web analytics, I would appreciate your input about the job market.

If you're looking for a job in this field, check out the Web Analytics Association job board, or the job postings of the Association of Internet Marketing.

Thursday, February 15, 2007

Web Analytics Wednesday Feb. 20th, Quebec

The February 20th Web Analytics Wednesday (on a Tuesday this time!) is coming up!

As tradition dictates, these gatherings are informal and there is usually no set agenda: discussions start freely and follow everyone's interests. This time, we will have a presentation by Mr. Stéphane Muller, who will demonstrate some of the concepts behind the Swammer product.
SWAMMER monitors public perceptions and corporate values using KEY PERFORMANCE INDICATORS, providing valid scientific insight and trends about an image and the perceptions associated with it, how it is shaped and how it evolves.*
There will also be a brief presentation of what the Web Analytics Association is.

The whole thing is meant to be warm and simple: an ideal opportunity to meet people who share common interests around web analytics, site development, design, usability, SEO, and even statistics in general, ethical and legal issues, etc.

Interested?
Location:
Restaurant Bugatti
3355, rue de la Pérade
Sainte-Foy
418-651-4747
We will have a private section starting at 6:00 PM.
Please RSVP by sending me an email at shamel67@gmail.com, or by leaving a comment.

Note: If you would like to present a service or a product, be a sponsor by paying for a round or offering a door prize, or simply give a short presentation on the subject of analytics, please contact me.

Wednesday, February 14, 2007

WASP detection model

At the heart of WASP, the Web Analytics Solution Profiler, is some JavaScript code and a set of configuration parameters used to detect the presence of a specific web analytics solution. This post explains the technique used to provide as much information as possible while avoiding false-negative or false-positive results.

What is WASP?

WASP is the Web Analytics Solution Profiler, a Firefox extension aimed at web analytics implementation specialists, web analysts and savvy web surfers who want to understand how their behavior is being analyzed.

Why is it important?

Renowned authors and analysts recognize that the web analytics JavaScript tagging process can be error-prone and sometimes complex. Note: full text edited; please refer to each reference for the complete article.

"Some web analytics tools use one standard tag to collect data. Other vendors have a custom tag all over the place... it is important that you validate in QA and production that your ... tags are each capturing exactly what they are supposed to.

I know that Omniture has a nifty utility that you can use to validate and review that data is being collected ... as it should be. This is really nice and helpful and I do like it very much. Please ask your vendor if they have something like this (and they probably do)."
Avinash Kaushik's Web Analytics Technical Implementation Best Practices

Eric Enge: I have heard that one of the largest sources of error in analytics is the accuracy and problems with implementing the Javascript. Does that make sense?
Jim Sterne: First of all, web analytics numbers are not precise... So, the question about precision is a long process. The day you implement a tool you may well get bad data. So, verify everything that you possibly can. But eventually, you are going to reach the point of diminishing returns...
Eric Enge interview of Jim Sterne
WASP aims to ease the quality assurance Avinash is talking about and to make it easier to reach the point of diminishing returns Jim Sterne refers to, regardless of the web analytics solution being used. Furthermore, WASP is significantly more intuitive and easier to use than the vendor-specific tools available today. Wait! There's still more! (I like it, it sounds like a shopping-channel infomercial!) Most sites now include tagging for several different but complementary purposes (see the screen capture of Avinash's site below).

How does it work?

  1. The WASP Firefox extension sidebar is triggered whenever a new page is loaded, a new browser tab comes into view, or when the sidebar itself is first shown.
  2. WASP watches all HTTP GET requests sent by your browser, regardless of their type (images, scripts, stylesheets, frames).
  3. Once the page is fully loaded (we don't want to slow down page loading and processing!), for each web analytics solution found in the configuration file, we check for the presence of a very specific object that only that particular solution should be setting. This could be a variable, a function, or any unique object defined in JavaScript.
  4. If that object is found, we then look for an HTTP GET request that matches a specific regular expression. Again, this pattern should be unique to this web analytics solution.
  5. When a match is found, we can check for any particular query string parameters being passed and cookies being sent or retrieved, and display that information in the sidebar.
Using both an object check and a request check reduces the risk of wrongly identifying a product. At the same time, if a product tab is shown in the WASP sidebar but no query string or cookies are detected, this might indicate that no data is actually being sent, even if the tagging itself seems to be present in the page.
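The object-plus-request idea can be sketched in plain JavaScript. To be clear, this is a simplified sketch, not WASP's actual configuration format: the entries, object names and request patterns below are illustrative examples of the kind of signatures involved.

```javascript
// Illustrative sketch of WASP's two-step detection idea.
// The solution entries are hypothetical examples, not WASP's real config.
const solutions = [
  // Each entry pairs a JavaScript object unique to a vendor with a
  // request pattern unique to that vendor's tracking call.
  { name: "Google Analytics", objectName: "urchinTracker", requestPattern: /__utm\.gif/ },
  { name: "Omniture SiteCatalyst", objectName: "s_account", requestPattern: /\/b\/ss\// },
];

// pageGlobals: the page's global scope; requests: URLs of the HTTP GET
// requests observed while the page was loading.
function detect(pageGlobals, requests) {
  const found = [];
  for (const s of solutions) {
    // Step 1: is the vendor-specific JavaScript object defined on the page?
    if (typeof pageGlobals[s.objectName] === "undefined") continue;
    // Step 2: did the browser actually send a matching tracking request?
    const hit = requests.find((url) => s.requestPattern.test(url));
    // Object present but no matching request may mean the tag is there
    // but no data is actually being sent.
    found.push({ name: s.name, dataSent: Boolean(hit), request: hit || null });
  }
  return found;
}
```

In the real extension, the page's global scope and the observed requests come from Firefox's own hooks; here they are passed in explicitly to keep the sketch self-contained.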

Quite simple, isn't it!

Yes and no. Early prototypes built with Greasemonkey quickly revealed that some solutions are rather complex to detect, while others are very simple. Some are even able to "hide" themselves... or at least, they try!
Another aspect that turned out to be more complex than initially thought is the Firefox extension itself. Using XUL (the XML User Interface Language) and building the right "hooks" into Firefox's page-loading events is not trivial.

Some examples

Google Analytics: the unique JavaScript object is a function named urchinTracker(), and we look for an image whose path includes the value of the JavaScript variable _ugifpath. In the example below (click to enlarge), we can see several different solutions in use.

Omniture SiteCatalyst: the unique JavaScript object is s_account, and we look for a request that contains that specific variable's value.

Disclaimer: I'm not affiliated with, and do not receive any monetary incentive from, the companies providing the solutions WASP analyzes. Neither Mr. Kaushik nor Omniture endorses WASP, and the screen captures are shown for demonstration purposes only.

Get WASP!

WASP installation instructions are available here.

WASP is licensed under a Creative Commons Attribution NonCommercial NoDerivs 2.5 License.

If you use this tool for professional purposes, please think about a donation (look for the "Make a Donation" button in the right sidebar).

A techie's path to web analytics

While others are celebrating thousands of comments and feed subscribers (bravo Avinash!), I can only celebrate 20 years of career and my 100th blog post! So here's my story :)

I can't tell exactly when I first got hooked on web analytics.

In 20 years, I've touched the keyboards of many computers... in many companies, small and large, startups and buyouts, and worked with more bad managers than good ones. It all started way back in 1987.

I remember, what now seems a very long time ago, when I was a freelancer at Hydro-Québec (1991-1994). I would fetch loads of data stored in Oracle databases, analyze it, and present statistical results about water levels and temperature as trends and histograms. I even worked for a while on real-time visualization of dam instrumentation controls. You know, huge dams are full of instruments: humidity, inclination, temperature, pressure, etc. We would call in "live" and fetch real-time data. As an Oracle DBA, a Sun system administrator, and a C developer, this research project brought me early to a new universe: the Internet (1992).
Side note #1: This was before Microsoft's historic shift to embrace the Internet (remember Blackbird, the proprietary MSN BBS editor?). At the time, being a BBS moderator and web developer myself, I wondered why the hell MS didn't use it as a web page editor and shift MSN from a proprietary BBS to a full-fledged website... a couple of weeks later... the rest is history. Too bad I didn't write that famous email to Bill!
The discovery of SMTP (mail), IRC (chat), NNTP (news), and soon after, HTTP (the web!) got me into the world of hacking (in the purest sense of the term). This raised some eyebrows at Softimage, which had just been bought by Microsoft, and I was hired to implement the anti-piracy mechanism in their high-end animation software. At night I started building their first website, and I had to convince management there were benefits! I used log-file analyzers written in Perl to count the number of hits on our Netscape web server. Then WebTrends came along. We could tell cool stuff about "impressions", "visits" and "visitors"! We could easily see what was popular and where people came from!
Side note #2: I remember when Pizza Hut offered one of the first ecommerce sites: pizza delivery! People were skeptical; it was cool, but reserved for an elite few living near Santa Cruz. Or when we had to send an email to Jerry or David and kindly ask them to add our site to their Yahoo! index.
In 1999 I moved on to Bombardier, where I loved building the monthly reports and seeing whether Ski-Doo or Sea-Doo was performing better than the previous month. I was still the tech guy... but I could use those numbers, explain them to the marketing and communications people, and make the links between the technology, the Web, and the business objectives. I could even use LiveStats (which would soon become Microsoft Gatineau) to see the results almost in real time, and ipMonitor to make sure the whole infrastructure was working correctly. I was now involved in serious B2B and B2C experiments right at the tipping point of the Internet bubble and, eventually, in the consolidation and revamp of a dozen websites.

Those years were the "Internet World" years. Each conference would bring a new buzz: managing email, commerce on the Web, VRML, push technology (remember PointCast?), cool new stuff such as streaming... That's where I met Jim Sterne for the first time; he was already talking about marketing strategies unique to the Internet.

In late 2004, as a web architect, I had the opportunity to develop the web analytics practice at Nurun, a web agency with clients such as EuroDisney, Club Med, L'Oréal, the Royal Canadian Mint, and many more, ranging from publishing to high-value brands and ecommerce. Only two years ago, few companies took the field of web analytics as seriously as they do today. Most were still talking about reports, thinking in page views and visits. It almost seemed as if I were back in the early days of the Web, having to convince everyone it was so important.

I'm now much more involved in web analytics. It's a small part of my job as a senior architect at Desjardins General Insurance, it's the focus of several of my ebusiness MBA courses, and I've been organizing local Web Analytics Wednesdays for a few months. I'm also grateful that my employer, named one of the best employers to work for in Canada and #1 in Québec, allows me to share my interest and recognizes the value of web analytics as an essential element of a serious Internet presence. This field of expertise also opens up a world of interesting topics, ranging from marketing strategies to site optimization, SEO, A/B and multivariate testing, Web 2.0, ethical and legal issues, and many more.

But ultimately, my passion for web analytics, as strange as it may seem, stems from my interest in understanding how human behavior is being transformed by the Internet. And one of the most interesting side effects of my involvement in this community is the links being made with very bright people all around the world.

Monday, February 12, 2007

Do we all have to do it on our own?

I don't want to spoil the party and I know everyone is entitled to his/her way of seeing the world... but as a group, wouldn't it be a better idea to consolidate around the Web Analytics Association?

I read a post on the Yahoo! Web Analytics discussion group regarding WebAnalyics360. I don't know who is behind it (I'm deliberately not linking to their site, as I don't even want to divert traffic there...) or how long it has been published, but there is obviously some room for improvement:
  • no contact info other than an email...
  • no specified goal other than numerous Google Ads
  • very limited blog list (I might be biased as I've been maintaining and screening a list for a couple of months - over 50 qualified blogs)
  • the events section lacks... some events visibility...
  • vendors... is too narrow; just my Web Analytics Solution Profiler page lists many more solutions (which, btw, also includes some ad networks and affiliate networks)
  • some books... some jobs...
I would be glad to hand over my blog list, the Web Analytics Google Co-op Search with over 150 sources I've been building, and a more complete list of WA solutions to the WAA, and see them augmented and contributed to by the community. I would volunteer to maintain these community resources with the help of fellow WAA members.
What do you think?

P.S. See my community services in the right sidebar

Cookies will get you confused

Update 07/02/13: After receiving a comment from Stephane, I realized I had missed the subtle case where one gets a 1st-party cookie from Google.com, and later that same cookie is used in a 3rd-party context, such as when you embed a Google ad on your site. This obviously leaves room for litigation and abuse. In my opinion, as soon as you "abuse" your relationship and trust privilege, you're asking for trouble...
A post on Lies, damned lies left me wondering about my own experience with cookies.

Unless I missed something in my years of experience using and developing on the Web, a cookie can only be set AND read either by the site, or the domain on which it was set (depending on the value of an argument when the cookie is set). So even the mighty Microsoft/Yahoo!/Google can't do anything about it.

Most ad network models are based on the fact that a banner is served by their own web server (the 3rd party) but included within another's website (the 1st party). DoubleClick raised an uproar a couple of years ago because, as part of their ad network, they were using their own cookies, collected through being included in others' websites, to analyze specific users' behavior across hundreds of websites. Self-regulation and lawsuits caused by obvious abuse led to a point where, in theory, ad networks use aggregated models and are not supposed to track that Mr. X is reading times.com during the day and something of another nature at night...

A trick someone could use to pass values from one site to another "partner" would be to use the URL query string as a relay. In most cases this would certainly be viewed as an invasion of privacy. Some even use Flash's ability to store data on the local drive, without user intervention or knowledge, as a replacement for cookies, which is, in my opinion, even less ethical.

My simple definitions:
  • 1st party cookie: set and read by the same server (for server-specific cookies such as host.domain.com), or on the same domain (for domain specific cookies such as *.domain.com)
  • 3rd party cookie: set and read by a server on a different domain, or on a different host (for server-specific cookies - host.domain.com embedding an image from some.ad.net)
  • friendly 3rd party cookie (or 2nd party): what would be a 3rd party cookie, but set through a host on the first-party domain which is a DNS alias of a 3rd-party host. For example, stats.mydomain.com is a DNS CNAME of mydomain_com.2o7.net (2o7.net is the domain used by Omniture tracking). This technique is often used to avoid being identified as a 3rd party cookie.
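These definitions can be sketched as a small classifier. This is a simplified sketch only: baseDomain naively keeps the last two labels of a hostname, which mishandles public suffixes like .co.uk, and real browsers resolve CNAMEs at the DNS level rather than receiving them as a parameter.

```javascript
// Simplified sketch of the 1st/3rd-party distinction described above.
// baseDomain naively takes the last two labels (example.com from
// www.example.com); real public-suffix handling is out of scope here.
function baseDomain(host) {
  return host.split(".").slice(-2).join(".");
}

// pageHost: host of the page the user is visiting.
// cookieHost: host that sets (or reads) the cookie.
// cnameTarget: optional real host behind a DNS alias of cookieHost.
function classifyCookie(pageHost, cookieHost, cnameTarget) {
  if (baseDomain(pageHost) === baseDomain(cookieHost)) {
    // Same registrable domain: a 1st party cookie... unless the host is
    // really a DNS alias pointing at a 3rd-party tracker (the "friendly
    // 3rd party" case, e.g. stats.mydomain.com -> mydomain_com.2o7.net).
    if (cnameTarget && baseDomain(cnameTarget) !== baseDomain(pageHost)) {
      return "friendly 3rd party";
    }
    return "1st party";
  }
  return "3rd party";
}
```

For instance, classifyCookie("www.mydomain.com", "stats.mydomain.com", "mydomain_com.2o7.net") returns "friendly 3rd party", the CNAME case described in the last bullet.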
In my humble opinion, any trick used to bypass those definitions, be it passing data on the query string or using permanent storage techniques without the user's consent, is a blatant demonstration of privacy abuse. Do I have too strong an opinion? Am I missing something?

So I'm not sure I understand Ian's post, but if my reading is right, and if a company like Microsoft would even think of using their power to share information about my relationship with them without my consent, regardless of their honest intent, I would strongly argue against it.

Tuesday, February 6, 2007

Everything is about context

Excellent post on SemAngel entitled "All the world's a stage". Put very simply, a KPI becomes actionable only if it is put in context. The actionability (sic!) of KPIs is important: they should either drive action or provide a warm, comforting feeling to the reader; they should never be met with a blank stare.

Gary comments:
A report becomes actionable by using KPI’s to provide the business context within which an action can be identified or deemed worth trying. The more relevant context a report provides, the more likely it is to be actionable. KPI’s are the context builders that make up our view of what’s important and what isn’t.
This reminds me of two circumstances where context saved face.

The brick and mortar retail store

Goal: a retail store without online ecommerce wanted to increase brand awareness and, ultimately, offline sales.
Strategy: Conduct an online contest.
Result: the visits KPI increased. Contest participation was high. Good!
Context: the #1 referrer was an online site tracking all kinds of contests... when we subtracted what was deemed unqualified traffic, the results were far from great... We determined the pattern was "arrive from the contest-tracking site / go straight to the contest page / fill the form / submit / leave".
Simple recommendation: review the contest rules and use a different kind of incentive: coupons redeemable in store.
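The adjustment in this example can be sketched with made-up numbers; the referrer names and figures below are invented for illustration, only the shape of the problem comes from the story.

```javascript
// Hypothetical illustration of putting a raw "visits" KPI in context:
// subtract traffic from a contest-aggregator referrer before judging
// the campaign. All names and numbers are invented.
const visitsByReferrer = {
  "contest-tracker.example": 8200, // contest hunters: unqualified traffic
  "search": 1100,
  "direct": 700,
};

function totalVisits(byReferrer) {
  return Object.values(byReferrer).reduce((sum, v) => sum + v, 0);
}

function qualifiedVisits(byReferrer, unqualified) {
  return totalVisits(byReferrer) -
    unqualified.reduce((sum, ref) => sum + (byReferrer[ref] || 0), 0);
}

// The raw KPI looks great (10,000 visits)...
console.log("Raw visits:", totalVisits(visitsByReferrer));
// ...but the qualified picture is far less impressive (1,800 visits).
console.log("Qualified:", qualifiedVisits(visitsByReferrer, ["contest-tracker.example"]));
```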

The online music store

Goal: increase online sales, promote the revamped downloadable music section.
Strategy: use the massive opt-in list to communicate the best sellers and offer rebates on some downloadable music.
Result: a very good clickthrough ratio, even an increase in the conversion rate. Everything is great!
Context: people were buying only the discounted songs, never reaching the profit margin, which was achievable only when more than two or three songs were purchased. We couldn't identify any increase in repeat-purchase ratios compared to non-solicited customers. The more emails they sent, the more money they actually lost.
Simple recommendation: change the rebates to the "buy two, get one free" type.
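The margin problem in this example can be illustrated with hypothetical numbers; the prices, costs, and rebate below are invented, only the "profitable above two or three songs" shape comes from the story.

```javascript
// Hypothetical economics of the rebate problem described above.
// All figures are invented for illustration.
const pricePerSong = 0.99;
const rebate = 0.30;       // rebate applied to each promoted song
const costPerOrder = 0.45; // fixed transaction cost per order
const costPerSong = 0.60;  // variable cost per song sold

function orderProfit(songs, rebatedSongs) {
  const revenue = songs * pricePerSong - rebatedSongs * rebate;
  const cost = costPerOrder + songs * costPerSong;
  return +(revenue - cost).toFixed(2);
}

// One rebated song per order: a loss on every email-driven sale.
console.log(orderProfit(1, 1)); // negative
// Three songs, one rebated: back above break-even.
console.log(orderProfit(3, 1)); // positive
```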

The lesson

Are these two examples too simplistic? Looked at individually, each KPI had a very good meaning, and the WA tool provided nice graphics showing a clear, bright, sunny day. Only when those meaningful KPIs were put in context was the truth revealed. And to take up the story analogy from SemAngel: trust your WA tool to give you the pen and paper; trust your analyst to tell you a good story :)

Friday, February 2, 2007

WASP v0.2 released

Update 07/02/13: To avoid any confusion about the "High/Medium/Low/Hard to detect" confidence levels: this refers to WASP's own avoidance of false-negative and false-positive detection, and is really not related to the features or security aspects of any of the listed products.
WASP is the Web Analytics Solution Profiler, a Firefox extension aimed at web analytics implementation specialists, web analysts and savvy web surfers who want to understand how their behavior is being analyzed.

Installation

A couple of easy steps to get started:
  1. Uninstall the previous version (under Tools/Add-ons) if you have it.
  2. Download and install Firefox if you don't already have it.
  3. Install the WASP extension.
  4. (If you are not prompted to install) Save the .XPI file to a temporary folder, then drag it onto your Firefox window. You should be prompted to install.
  5. Restart Firefox.
  6. In Firefox, use View/Sidebar/WASP to display the WASP sidebar.

Note

The first version has been downloaded nearly 200 times and I received great feedback. This version of WASP is the second beta release. It is much more stable than the previous version and now includes information about Query String and Cookies. It has been tested more extensively, but as with any beta software, you might find minor glitches here and there. Your feedback is highly valuable and appreciated!

Current features

Enhancements and bug fixes

  • Allow switching between URL Encode/Decode views
  • Extended list of supported tools and more extensive Query String analysis
  • Now show Cookies information
  • Added an "Options" tab
  • Enhanced the "About" tab
  • Added licensing information
  • Added a button for voluntary donations
  • Removed the accelerator (Ctrl-W) to toggle the sidebar display... Ctrl-W is used for closing windows!
  • Stop processing when the WASP tab is not shown
  • Fixed: values displayed as "undefined" when null
  • Now processes when switching tabs or on initial open
  • Fixed: if a page loaded slowly, WASP seemed to hang until "onload" completion (should no longer happen, or happen less frequently)
  • Fixed: replacing the WASP sidebar with another one and then showing it again left WASP broken
  • Some XUL display adjustments (scroll/resize/focus)

Upcoming features

  • Display information about 1st and 3rd party cookie status
  • Display HTTP headers
  • Display P3P status
  • Put the supported tools in a configuration file editable through the preferences
  • Allow sorting of value pairs, and copying (or exporting) of values
  • Add knowledge-base links to get additional information on any piece of data being sent
  • Document the detection mechanism for easier contributions by the web analytics community
  • Handle frames and iframes
  • Add a tab to show Alexa and Google PageRank information about the page currently being displayed
Please post a comment for new feature requests.

Known bugs

Things happen!
  • None for now...
Please post a comment if you discover new bugs.

Tip of the hat

Credit goes to the following people for their inspiration, or simply for helping me out with some bugs!
  • WAVcheck, from Webbanalys, extends the same idea with an executable version that can detect up to 27 different vendors.
  • Rahul Revo posted on his blog a request for a Greasemonkey extension that would detect Google Analytics. Mohnsish Rao proposed a simple solution.
  • Mike Keyes, on his blog "On the trail", created a simple bookmarklet (a JavaScript one-liner you can put in your favorites) that detects a bunch of different vendors. Cool, simple, and it works in both Firefox and Internet Explorer.
  • Vendors sometimes provide their own debugging aids, usually in the form of a bookmarklet that displays the parameters being passed to their data warehouse.
  • Some have proposed developing Unix-based scripts with grep and wget or Perl, but that looks to me like a pretty complex endeavor with its own limitations.
  • Or you could get help from your vendor, or hire an independent consultant to help you out. Maxamine is one of them.
  • Other "complementary" solutions that might help: Watchfire WebQA is particularly good at crawling a site and looking for specific code.
  • Fiddler or ieWatch are two useful tools for Internet Explorer.
  • Charles is a proxy that records every communication between your browser and the Internet and will work with both MSIE and Firefox.
  • I use a bunch of Firefox extensions to help me in my day-to-day Web development activities, and some of WASP's features are inspired by those tools. Look for FireBug, Live HTTP Headers, View Dependencies, Web Developer and View Cookies.
  • AlertSite recently posted a Firefox extension named DejaClick. Not only is this extension a wonderfully powerful macro recorder, it is also one of the best-packaged extensions I have seen so far.
  • I found out that the spirit of the early days of the Web is still alive in the specialized groups I relied upon for help with some development issues. Thanks to Neil & Nickolay from mozilla.dev.extensions.
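For the curious, the detection idea behind bookmarklets like Mike Keyes' can be sketched in a few lines of JavaScript: map known vendor hostnames to product names, then scan the page's script sources for matches. The vendor list and function name below are illustrative examples of the technique, not taken from any of the tools above:

```javascript
// Illustrative vendor-detection sketch: match known tracking hostnames
// against the page's <script> src attributes. The list is a sample only.
var VENDORS = {
  "google-analytics.com": "Google Analytics",
  "2o7.net": "Omniture SiteCatalyst",
  "statse.webtrendslive.com": "WebTrends"
};

// Pure helper so the logic is easy to follow (and test) outside the
// browser: takes an array of script URLs, returns the vendor names found.
function detectVendors(scriptSrcs) {
  var found = [];
  for (var i = 0; i < scriptSrcs.length; i++) {
    for (var host in VENDORS) {
      if (scriptSrcs[i].indexOf(host) !== -1 &&
          found.indexOf(VENDORS[host]) === -1) {
        found.push(VENDORS[host]);
      }
    }
  }
  return found;
}
```

In a browser you would feed it the page's scripts, e.g. `detectVendors([].map.call(document.scripts, function (s) { return s.src || ""; }))`, and wrap the whole thing behind a `javascript:` URL to turn it into a bookmarklet.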

Revision history

  • 2007/02/03 - 0.2 - Second beta release to the WA community
  • 2006/11/30 - 0.1 - Early beta release to the WA community

Thursday, February 1, 2007

The lonely life of bloggers

One of my interests in the Internet, beyond technology, is the impact the Web has on human behavior and on society. I started blogging in October 2002, so I guess that makes me an early adopter of this medium.
"Bloggers are living in a world where emotions may be real but everything else is make-believe, says a University of Calgary professor in a new book." Globe and Mail
I read with interest a post on One Degree about an article in the Globe and Mail... The article depicts bloggers as asocial, lonely souls. Although I haven't read the new book by professor Keren of the University of Calgary, just looking at the cover gives you a pretty good idea of the author's opinion...

I think there is something fundamentally wrong with this article, and probably with the author's mindset:
"they are not real", "who cares if they're not real people?"
I digress... blogs are made by human beings, and are read by human beings... how can this be unreal? Only the medium and reach have evolved.

Then the article goes on about the fact that few bloggers achieve fame and most remain in the dark. Blogging to get famous is akin to founding a company to get rich... that's the wrong reason to do it. You blog, or you start a company, because you believe you can offer something unique, something of interest to others, or simply because you are "rich" enough to do what you like. And "rich" doesn't necessarily mean money; it can simply be the luxury of having the time to blog.

I blog and read others' blogs out of interest: to share and discover new ideas, open my mind, and learn about people without being constrained by the limits of my physical surroundings.

And through my blog and others', I'm able to meet people, real human beings, and in some cases meet them in person. That's how I came to organize the monthly Web Analytics Wednesday and got to know new people: this is real.