You know the scene. The hero needs to track the bad guys, so under the cover of darkness he crawls under their car — preferably a late-70s Plymouth — and sticks a little tracking device under the bumper, a gizmo with a blinking red light and a mini antenna. Then, as the Plymouth winds and wends its way through the streets of the city, our hero and his partner keep one eye on the road and the other on the dashboard display, tracking a flashing dot as it moves down Maple and hangs a louie onto Elm.
So it goes with the Sisyphean and semi-creepy job of tracking users through cookies and “tracking beacons.” The cookie debate has been with us since cookies were introduced in the mid-90s, and many a paranoid user would express outrage that some hapless publisher was sticking a bug under their bumper. I would write explanatory FAQs, debate the ethics with the corporate legal department, and answer angry emails from those paranoid users, all in an effort to explain that cookies were a convenience, didn’t disclose their bank account numbers, and weren’t being sold to spammers.
As the web moved from the wild west to a highly buttoned-up vision of metric nirvana, the nature of the bug under the bumper changed: from establishing state between the user’s browser and the destination site — “remember me” functionality — to following the user’s clickstream, establishing repeat-visit behavior patterns, and painting a mostly anonymous picture of what John Battelle eloquently calls the “Database of Intentions.”
The old model of web metrics was slow and rear-facing, looking at log files generated by the servers to determine the kind of gross-tonnage traffic figures that dot-bomb CEOs loved to throw around like golf scores: “My site gets a million hits a day.” They didn’t understand the difference between a hit, a page view, a visit, a unique visit, a weekly unique visit, or pages per visit — the high-level numbers that make some people happy but aren’t very useful to a web team buying advertising, optimizing pages for conversions, and studying clickstreams to see where users are coming from and what search terms are getting them there.
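The distinctions between those numbers are easy to blur, so here is a minimal sketch of how they fall out of a raw server log. Everything here is hypothetical — a simplified request record of `{ visitor, path, time }`, a 30-minute inactivity cutoff for ending a visit — not any vendor’s actual schema:

```javascript
// Distinguish hits, page views, uniques, and visits from a simplified
// server log. Hypothetical record shape: { visitor, path, time } with
// time in milliseconds. A visit ends after 30 minutes of inactivity
// (a common convention, assumed here rather than taken from any spec).

const SESSION_GAP_MS = 30 * 60 * 1000;

function summarize(requests) {
  // A "hit" is every request the server answered: pages, images, scripts.
  const hits = requests.length;

  // A "page view" counts only actual pages, not supporting assets.
  const pages = requests.filter(r => r.path.endsWith(".html"));
  const pageViews = pages.length;

  // "Uniques": distinct visitors who viewed at least one page.
  const uniques = new Set(pages.map(r => r.visitor)).size;

  // "Visits": a visitor's page views grouped into sessions, with a new
  // session starting after a gap longer than SESSION_GAP_MS.
  let visits = 0;
  const lastSeen = new Map();
  for (const r of [...pages].sort((a, b) => a.time - b.time)) {
    const prev = lastSeen.get(r.visitor);
    if (prev === undefined || r.time - prev > SESSION_GAP_MS) visits++;
    lastSeen.set(r.visitor, r.time);
  }

  return { hits, pageViews, uniques, visits };
}
```

Run against a handful of requests, one visitor who loads a page plus its logo graphic generates two hits but only one page view — which is exactly the gap the “million hits a day” CEOs fell into.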
The present state of metrics is far advanced from where things stood five years ago. Omniture’s SiteCatalyst and WebSideStory’s HitBox are examples of the ASP model of web metrics (with Google Analytics coming on as a free alternative for SMBs and less sophisticated users), delivering great insights into user paths, fall-off points, A/B testing, and promotional tracking — the lifeblood insights that spell the difference between pissing away ad dollars on a commerce site or losing users to bad nav on a media site.
All ASP systems depend on a tracking beacon — a script embedded in the HTML of every page that tracks the user from one page to the next. Spyware detectors don’t like these things, and sophisticated or paranoid users are growing more and more accustomed to purging them as nuisances and potential spam threats. Omniture found itself in hot water last summer when the Wall Street Journal reported on its “2o7” cookie, and how spyware and privacy watchdogs were calling into question Omniture’s lack of transparency in divulging the source or purpose of the cookie.
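The mechanics of a beacon are simpler than the controversy suggests. A rough sketch, assuming a hypothetical collector endpoint at `stats.example.com` — the real vendor scripts (Omniture’s page code, Google Analytics’ tag) gather far more detail, but the core trick is the same:

```javascript
// Minimal sketch of a page-tracking beacon. The collector endpoint
// (stats.example.com) and the field names are hypothetical, not any
// vendor's actual API.

// Build the beacon URL from whatever the page can observe about itself.
function buildBeaconUrl(page) {
  const params = new URLSearchParams({
    url: page.url,         // the page being viewed
    ref: page.referrer,    // where the visitor came from
    vid: page.visitorId,   // cookie-based visitor ID, ties views together
    t: String(page.time),  // timestamp, doubles as a cache-buster
  });
  return "https://stats.example.com/b?" + params.toString();
}

// In the browser the URL is typically loaded as an invisible 1x1 image:
//
//   new Image().src = buildBeaconUrl({ ... });
//
// The HTTP request itself is the "dot on the dashboard" -- the collector
// logs it, reads the cookie, and adds one more point to the clickstream.
```

Because the “image” request carries the visitor’s cookie and referrer, the collector can reconstruct the path from page to page without the site’s own servers doing any of the counting — which is both why the ASP model works and why the spyware detectors flag it.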
That didn’t stop us at IDG from continuing to live inside Omniture — the producers and editors in the clickmap, a graphical overlay that maps click activity onto the pages themselves, and the marketing types in custom dashboards, which at the very least spared me — the chief analyst and bottle-washer — from having to wrestle with ad hoc queries. Or at least not as many.
The point of this digression is to report that the science of web analytics and metrics has moved far from the days at Forbes.com in the mid-90s, when everything was a best guess informed by blunt WebTrends analysis of the log files. Further integration of metrics with other quantitative measures — customer satisfaction, cart revenue, SEM CPCs — could yield a new model of content management in which sites are tuned on a minute-by-minute basis for maximum yields. When that happens — and it is happening on the most sophisticated sites — the repercussions for interactive agencies, paid search, and insertion engines such as DoubleClick will be massive.
Learning how to measure and optimize in as close to real time as possible will prove to be the most valuable skill in web management. The old paradigm — webmasters tweaking content with designers, firing it into the ether through an FTP client, and seeing the results a week later, like driving with an eye on the rear-view mirror — is doomed and done.