FTC “Do Not Track?” Bring it on …
Published by Eric T. Peterson on December 2, 2010.
As the hubbub around consumer privacy continues, I was gently prodded by a friend to pipe up in the conversation. While my feelings about how we ended up in this position are pretty clear, and while my partner John and I have proposed what we believe is a step in the right direction regarding online privacy and the digital measurement community, it seems that some type of ban or limitation on online tracking is becoming inevitable.
Without getting political or debating the reality of what we can and cannot know about online visitors, I have a single-word response to the FTC:
Before you accuse me of changing my stripes or going completely nuts consider this: If the FTC is able to somehow pull off the creation of a universal opt-out mechanism, and if the browser developers support this mechanism despite clear and compelling reasons not to, and if consumers actually widely adopt the mechanism — all pretty big “ifs” in my humble opinion — then I believe the digital measurement industry will do what I have already described as inevitable:
Since my tenure at JupiterResearch back in 2005 I have been telling anyone who would listen to stop worrying about counting every visitor, visit, and page view and instead start thinking about statistically relevant samples, confidence intervals, and the algorithmic use of data to conduct analysis. Yes, you need to work to ensure data quality — of course you do — but you don’t have to do it at the expense of your sanity, your reputation, or your job …
See, it turns out in our community it doesn’t really matter whether we are able to measure 100% of the population, 90% of the population, or even 80% of the population — what matters is that we are able to analyze our visitor populations and that we are able to draw reasonable conclusions from that analysis. Oh, we have to be empowered to conduct analysis as well, but that’s a whole other problem …
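To make that point concrete, here is a minimal sketch of why coverage matters less than people fear: a 95% confidence interval for a conversion rate computed from a sampled visitor population. The numbers are hypothetical, purely for illustration.

```python
import math

def conversion_ci(conversions, visitors, z=1.96):
    """Point estimate and 95% confidence interval for a conversion
    rate estimated from a sampled visitor population."""
    p = conversions / visitors
    # Standard error of a proportion, normal approximation.
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical: suppose we only capture 40,000 of 50,000 visits (80%).
# The interval around a ~3% conversion rate is still very narrow.
rate, low, high = conversion_ci(conversions=1200, visitors=40000)
print(f"rate = {rate:.4f}, 95% CI = ({low:.4f}, {high:.4f})")
```

Even at 80% measurement, the estimate lands within a few tenths of a percentage point of the true rate — more than enough precision to make a business decision.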
Statistical analysis of the data … trust me, it’s going to be all the rage in a few years. I’m not saying this simply because I have a white paper describing the third generation of digital measurement tools that will empower this type of analysis … although I would encourage you to download and read “The Coming Revolution in Web Analytics” (freely available thanks to the generous folks at SAS!)
I’m saying this because every day I see the writing on the wall. Data volumes are increasing, data sources are increasing, and demands for insights are increasing, all while professional journalists, politicians, and political appointees are supposedly protecting our “God-given right to surf the Internet in peace” without any regard to the businesses, employees, and investors who depend to a greater or lesser degree on web-collected data to provide a service, pay their bills, and make a profit …
Okay, sorry, that was editorializing. My bad.
Still, rather than wring our hands and gripe about how much the credit card companies know (a silly argument, given that credit card companies provide tangible value in exchange for the data they collect … it’s called “money”), I believe it is time to do three things:
- Suck it up.
- Hold yourself to a higher standard.
- Buy “Statistics in Plain English” and start reading.
The good news is that we have access to lots and lots of great statistical analysis of sampled data today — we just might not realize it. Consider:
- Google Analytics has its “Analytics Intelligence” report, which was brilliant to begin with and recently got a big-time upgrade with the “Major Contributors” functionality;
- Smart vendors like Scout Analytics are building point solutions for problems such as revenue optimization for digital products;
- The Voice of Customer vendors, including our good friends at ForeSee Results and OpinionLab, have long made incredibly valuable insights available to business owners working from sampled populations and statistics;
- The Customer Experience Management vendors, including the fine folks at Tealeaf and Clicktale, have long leveraged both sampled populations and, more recently, the algorithmic discovery of frustration and struggle on the part of site visitors;
- The Testing and Optimization vendors, including Google (Website Optimizer), Omniture (Test & Target), and SiteSpect, more or less make their entire living off of statistical analysis of the behavior of sampled populations;
- The Adaptation and Recommendation engines, including Baynote, Certona, and RichRelevance, all depend more or less completely on algorithms to determine how to adapt content to suit a particular visitor’s likely needs.
Have I mentioned Excel, Tableau, and R? Hopefully by now you get the gist … statistics is already all around us all the time, perhaps just not exactly where we expect it or, in the context of lower rates of data collection, where we will ultimately need it to be.
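The testing-and-optimization case above shows just how little machinery this takes. Here is a minimal two-proportion z-test — the kind of statistic an A/B testing tool computes over sampled traffic — with hypothetical numbers for illustration.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 3.0% vs. 3.6% conversion on 10,000 visits each.
z, p = ab_test_z(conv_a=300, n_a=10000, conv_b=360, n_b=10000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that nothing here requires measuring every visitor — only a sample large enough to drive the standard error down, which is exactly the trade these vendors make every day.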
Perhaps the most encouraging evidence that we will be able to make this shift is the increasing attention the digital world is getting from traditional business intelligence market leaders like Teradata, FICO, IBM, and SAS. I, for one, am more or less convinced that the gap between “web analytics” and “Analytics” is about to be closed even further … and here’s one guy who seems to agree with me.
We don’t need to thumb our noses at the privacy people — quite the opposite, and to this end John and I will be sitting down with a representative from the Center for Democracy and Technology and Adobe’s Chief Privacy Officer MeMe Rasmussen at the next eMetrics in San Francisco! We also don’t need to stick our heads back in the sand and hope this issue will simply go away — it won’t, trust me.
We need to prepare.
Prepare by committing yourself to not being that scary data miner that consumers are supposedly so afraid of; prepare by improving your data quality to the extent that you are able; and prepare by starting to communicate to leadership that it really doesn’t matter if you can count every visitor, every visit, and every page view — what matters is your ability to analyze data using the tools at your disposal to deliver value back to the business.
If you’re not sure how to do that, call us.
Viva la revolution!
DISCLOSURE: I mentioned and linked to lots of vendors in this post, which I normally do not do. Some are clients of Web Analytics Demystified; others are not. If you have concerns about why we linked to one company and not another, please don’t hesitate to email me directly.
About Eric T. Peterson
Eric T. Peterson is the founder of Web Analytics Demystified, Inc. and the author of Web Analytics Demystified, Web Site Measurement Hacks, and The Big Book of Key Performance Indicators. Mr. Peterson frequently presents on web analytics, is often cited in articles about digital measurement, and has been blogging on the subject since 2004.
Want to speak with Eric? Contact Web Analytics Demystified