Sad to say, I partially agree with Brandt Dainow

Published by Eric T. Peterson on January 12, 2009

Readers who are enthusiastic members of the web analytics community are by now familiar with Brandt Dainow and his sometimes antagonistic missives published at iMediaConnection. While I try pretty hard to follow the old “if you can’t say something nice” rule, I occasionally fail in my efforts. Perhaps the best evidence of my failing was my calling Brandt Dainow insane when he suggested that Google Analytics version 2.0 was “simply a quantum leap above any other analytics product on the planet.”

While I firmly believe that Google Analytics is a great, valuable, and appropriate application for a wide range of needs, I think that Dainow’s “quantum leap” claim and statements like “What Google has done is simply take every feature in every product on the market and put them all into one system, and then make it available for free” are so obviously hyperbolic that they invite criticism (which Mr. Dainow got in spades from many within the analytics community).

Dainow has since turned on Google Analytics, more recently pointing out what he describes as “disturbing inaccuracies behind Google Analytics” and again getting our attention with irresponsible statements like “Google Analytics is different from other products in that it has been intentionally designed by Google to be inaccurate over and above the normal inaccuracies that are inevitable.”

Oddly enough, his rant about Google Analytics included some statements that rubbed members of the Web Analytics Association the wrong way.  When folks like Jodi McDermott commented on the article and questioned some of Dainow’s assertions, Brandt did what any normal person would do …

… he wrote a nasty follow-up piece critical of the Web Analytics Association and the WAA Standards Committee!

I will let you read his piece yourself, but the short summary of Dainow’s opinion is that “the work of the WAA standards committee is a disaster for the web analytics community. It will take years to undo the damage and create proper precise standards that can be implemented in software. The WAA ‘standard’ is not a standard, it’s just second-rate muttering.”

Clearly Dainow is not worried about making friends in the web analytics industry.

I personally am a big fan of the Web Analytics Association.  I am pretty loyal to some of the current Board of Directors, I’ve done a bunch in the past to support the WAA and am about to announce more of the same, and I’ve even gone out of my way to help promote the work of the Web Analytics Association Standards Committee.  So it was with great trepidation that I wrote this article’s title … but I find myself agreeing with one small part of Dainow’s otherwise unnecessary rant.

Towards the end of his article, right before he declares that some pretty nice people’s work has been little more than time wasted, he says this:

“The WAA should be setting the agenda, not following the crowd. The task of the WAA standards committee should be to determine how web analytics metrics should be calculated in order to achieve the highest degree of precision possible. The WAA should be laying out the roadmap for the way things should be. It then falls to the vendors to bring their software into line.”

I more or less made this same comment, although I like to believe I used a great deal more tact, when I commented on the original Web Analytics Association Standards published under the direction of former Director Avinash Kaushik back in August 2007.

At the time I preferred to focus on the reality of the situation–the fact that the WAA had proposed a set of standard definitions that, for good or ill, were better than anything else out there.  Instead of being openly critical of the definitions as written, I preferred to ask the question, “Now that we have these definitions, what are we going to do with them?”

While my call for a web analytics standards compliance matrix has since been answered by all of the major vendors except for Omniture, I personally don’t believe that the Standards process is serving the needs of our community as well as it could.  We all continue to be vexed by a lack of standard definitions, a situation that will likely get worse with the decline of the web analytics economy.

Not having participated in the process of drafting the WAA Standards I can only express gratitude towards those members of the community who have volunteered  their valuable time for this work.  In my humble opinion, people like Jason Burby and Angie Brown are to be congratulated for their efforts, not denigrated and accused of having set our industry back into the dark ages.

But, in the spirit of having an open mind and building consensus, I would be interested in hearing my readers’ collective thoughts on Dainow’s point that the WAA should be setting standards without regard to their practicality today. Put another way, should the Association have written definitions that would be robust and useful in an analytics context and then presented that guidance to the entire community–vendor, consultant, and practitioner alike–saying “this is the result we should all be working towards”?

For example, should the WAA have been more explicit in their definition of a “visit” and proclaimed that a visit is terminated after 30 minutes of inactivity, instead of saying “if an individual has not taken another action (typically additional page views) on the site within a specified time period, the visit will terminate by timing out”?  Being explicit about the timeout duration would make a clear statement about our collective expectation for the definition of a visit, and any technology or analysis that chooses to use a timeout other than 30 minutes would also need to justify their decision to eschew the WAA Standard for another value.
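To make the trade-off concrete, here is a minimal sketch of visit sessionization with a configurable inactivity timeout. It is purely illustrative (the function and the sample timestamps are my own invention, not anything from the WAA document or any vendor’s code), but it shows how directly the chosen timeout value changes the visit counts we all report:

    from datetime import datetime, timedelta

    DEFAULT_TIMEOUT = timedelta(minutes=30)  # the value an explicit standard might mandate

    def sessionize(hit_times, timeout=DEFAULT_TIMEOUT):
        """Group one visitor's hit timestamps into visits.

        A new visit starts whenever the gap since the previous hit
        exceeds the inactivity timeout.
        """
        visits = []
        for ts in sorted(hit_times):
            if visits and ts - visits[-1][-1] <= timeout:
                visits[-1].append(ts)   # still inside the current visit
            else:
                visits.append([ts])     # timeout exceeded: a new visit begins
        return visits

    # Three hits from one visitor; the last arrives 45 minutes after the second.
    hits = [datetime(2009, 1, 12, 9, 0),
            datetime(2009, 1, 12, 9, 10),
            datetime(2009, 1, 12, 9, 55)]
    print(len(sessionize(hits)))                                 # 2 visits with a 30-minute timeout
    print(len(sessionize(hits, timeout=timedelta(minutes=60))))  # 1 visit with a 60-minute timeout

The same clickstream yields two visits or one depending on nothing more than the timeout value, which is exactly why an explicit number in the standard would matter.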

I know that the WAA is doing the best they can, and I am enthusiastic about the work Angie, Judith and their fellow volunteers have all been doing.  But I do think Dainow’s assertion that standards should be set based on overall value to the community in the long run, not necessarily the near-term practicality, is worth exploring.  Taking this approach would definitely penalize some vendors and reduce their self-generated “compliance score” but it does kind of make sense to be working collectively towards a more precise set of definitions we can all work from.

Doesn’t it?

These are the kinds of conversations that aren’t just magically resolved, so I’m sure we’ll have to add this to the list of issues worthy of discussion the next time we all meet.  I’m sure it will come up at some of the upcoming vendor events, in San Jose at eMetrics, and likely at our own web analytics conference, the X Change (where last year Forrester analyst John Lovett led a conversation on the topic).

As always I welcome your thoughts, feedback, open disagreement, pointing out flaws in my logic, etc.  I consider myself fortunate to have such thoughtful and experienced analytics practitioners among my most loyal readers and sincerely hope Dainow’s otherwise disturbing rant will lead to something of value for our community.


Categorized under General Web Analytics, Web Analytics Association, Web Analytics People

  • http://www.whencanistop.com Alec Cochrane

    I’m not sure the definitions weren’t arrived at the wrong way around. That is to say, the WAA standards committee took how the vendors described the key metrics (which they more or less all do) and then amalgamated them into a set of metrics with given names and given descriptions from this list.

    What maybe should have happened is that we decide on the name for each metric first and then work out what it was meant to be measuring and how. I’m sure someone from the WAA standards committee is going to disagree with me here. Maybe they didn’t do it that way, but that is the way it seems to have happened.

    The result of the first way of doing it was that the vendors could put up their compatibility matrix and claim they were compatible with a few different naming conventions without having to do anything about it (not that they couldn’t have done that using the second option either, but it would have made it more difficult), whilst what we really wanted them to do was change their tools’ nomenclature and processing to be compliant.

    The other thing we need to think about is how we get the vendors to take up the standards. At the moment it doesn’t seem like the WAA can do anything about those that don’t (cf Omniture and Google to an extent) so are going for the carrot method rather than the stick.

  • http://www.openroad.ca Bryan Robertson

    Hi Eric,

    I agree that the best scenario would be for the WAA to be in a leadership role on definitions and the quality of vendors be judged on compliance.

    An interesting parallel may be with the W3C and the transition from HTML 1.0 to XHTML over the years. With a vocal developer community and passionate advocates such as Jeffrey Zeldman behind the standards, industry judgment of competing browsers came to focus on the degree of compliance with the standards rather than on the “tags as features” war that preceded it (insert your favourite blink tag joke here).

    Let’s look at the pieces that came together there in turn:
    1. A powerful standards body
    2. A vocal community
    3. Passionate thought leaders

    1. A powerful standards body: HTML had the W3C, we have the WAA. Since, like Eric, I think the quality of people on the committee is unimpeachable, I would ask questions in other areas: Is the WAA powerful enough at this point in time, or do we need to continue to build momentum before the standards can be more bold? For example, is the WAA hand wringing too much over the polite “we’ll share with you if you share with us” arrangement with the IAB over standards definitions? Is the WAA in a tough position in trying to bring practitioners and vendors together at the same table?

    2. A vocal community: As a community I think we need to evaluate whether the WA community is vocal enough on this matter if we deem it important enough. Is it just me or is the average developer more of a squeaky wheel than the average WA practitioner?

    3. Passionate thought leaders: Clearly Eric is trying to step up here. Will Eric or one of the other go-to WA thought leaders be the one that creates critical mass here, or is that person yet to emerge?

    It is admittedly easy for me to throw questions out there without answers, but I am curious as to what others think about in particular the role and clout of the WAA within the industry at this point.

    Bryan

  • http://www.webanalyticsdemystified.com eric

    Alec: Agreed. As I said back in August 2007 publishing standards without any mechanism for enforcement is not going to have the effect we are collectively looking for. And while I was pretty disappointed in how long it took IndexTools, Google Analytics (via EpikOne, nobody from Google has actually said anything about standards … Avinash!?!?), Unica, WebTrends, and Coremetrics to publish their own standards compliance matrix, I am certainly encouraged that these documents now exist. More importantly, in my interview with Angie Brown she alluded to the Standards Committee doing more with these matrices in the future.

    Now if only Omniture would get on board and publish their compatibility … their lack of initiative here is disconcerting and certainly lends credibility to their competitors’ claims that their HBX and SiteCatalyst products will garner the ** lowest ** compatibility scores among major vendors. Anyone from Omniture care to comment here?

    If I had a little more time I would personally vet the matrices that have been provided and start to publish an uber-matrix with scores and updates. Alas, time is a killer here. Maybe Phil Kemelor, Bill Gassman or John Lovett will pick up the slack in their writing?

    Thanks for your comment!

  • http://www.webanalyticsdemystified.com eric

    Bryan: Excellent points, all. I am working frantically on a follow-up post exploring the proposed IAB Guidelines for Measurement (watch this space!) that will explore some of your points.

    Until that time I am happy to continue to bang the drum loudly. Thanks for listening!

  • Pingback: Web Analytics Demystified » Blog Archive » Thoughts on the proposed IAB Guidelines

  • http://blackbeak.com/ Steve Jackson

    Hi Eric;

    I did post a response on Monday but it vanished in the ether it seems. My full response is on my site.
    http://www.blackbeak.com/2009/01/13/web-analytics-association-standards-document/

    In essence I agree with you (and on 2 points Dainow). I also share your reluctance to agree with Mr Dainow.

    I think we could use the point he raises about precise technical terms with the current standards. I thought it was a controversial decision to change the time spent on site definition to accommodate Google Analytics’ way of calculating things. While I understand why that decision was made, I think that the vendors are having it too much their own way at the moment.

    That said I can see a situation where Google’s omnipresence would be problematic if we forced them to change. Everyone using GA would be up in arms about not being able to rely on GA numbers if we forced them to change the calculation to the way it was before.

    The fact is unless it’s in the vendor interests we can’t really force them to do anything. Otherwise “one tag to rule them all” would also be a viable solution as we’ve discussed before.

    Cheers
    Steve

  • http://www.showmeanalytics.com angie

    Hi all. Angie here from the Standards Committee. I appreciate all your comments and assure you we’ll discuss in our Committee and improve where we can. There seems to be a few misconceptions about how our committee works, and I’d like a chance to clear them up.

    First, we didn’t start with the vendors’ terms or their definitions. Several years ago, there was an action item for our members to come up with a list of terms that we think are critical to the practice of our craft. We ended up with a whole bunch of “top 10” lists, colored by our experiences with e-commerce, advertising, pure content, lead-generation and other sites. We merged the lists together, and decided as a group which metrics were most important. It was only then that we looked at our tools to see how much the definitions varied, so we could see where we had some no-brainers (if everybody already calculates a metric the same way and that way is satisfactory, then no need to change), and where we needed to discuss and work out the differences in how we approach different metrics.

    There also seems to be a misconception about how much influence the vendors have in this process. The honest answer is, not much. The people on our committee who work for vendors haven’t pushed anything, and in fact are usually pretty quiet. That’s not to say they don’t contribute: they do. They’re also some of the nicest people you’ll ever meet. Most of the conversation is from their individual experience rather than an “official position” from their employer, and indeed when we occasionally ask an “As a vendor, how do you approach…” question, they often go back and ask someone on their product team. I definitely encourage anyone who worries about vendor influence to sit in on one of our meetings to see for yourself that Akin, Boaz, Katie, and Michelle don’t have anyone tied up in a corner while they take over the committee.

    Our real audience is the practitioner, and much of the wiggle room you see in our current version of the definitions is because of the practitioners’ needs. For example, we didn’t proclaim 30 minutes as the required visit timeout because 30 minutes is not always right. A shorter timeout is appropriate for a kiosk or other shared site. Different timeouts can also be used for financial services, insurance, and medical sites where security and privacy dictate that the application itself times out a visit, and that is not always at 30 minutes. It makes sense in that case to match the WA timeout to the application’s timeout.

    I disagree that our document shouldn’t have been based on what is practical today. In fact, that was our guiding principle on this version. But it’s not from a “tool X can only do Y” standpoint, but using our collective experience to determine what is the “best” way to do something, and whether it is sufficient. If not, what needs to improve? It’s not that the precision argument has no merit – it does – but it was discussed and we made a conscious decision to approach the standards in a practical way right now, and tighten them up as we get more feedback from the analytics community about what works and what doesn’t. The last thing we wanted was to create a document that’s so far from reality, and so expensive and impractical to eventually comply with, that everyone immediately proclaims “No one will ever bother.”

    We also disagree on the need for the “highest degree of precision possible” and would prefer the “highest degree of precision practical”. There’s always going to be some imprecision in what we measure unless we take draconian steps to reduce privacy among our customers and prospects, and that’s not good business.

    Standards will most definitely be tightened up on the next go-round, but we are working on the best way to do it. For example, specifying 30 minutes as the default visit timeout, then outlining specific cases where it is acceptable to use an alternate value. And then, do we require that the timeout is configurable in order for a tool to be standards-compliant? Or is it appropriate to have stepped levels of compliance so that practitioners who don’t deal with those types of sites have a wider choice of tools?

    For our next steps, we’ve finalized a number of scenarios and are going to be sending them out to WAA member vendors this week, along with a compliance matrix and additional questions. While we are very happy that several vendors stepped up and publicly compared their tools with the definitions, there are areas where we (the Standards Committee) don’t agree with their assessment of “compliance”, and it was never our intent for vendors to self-rate anyway. Replies have been requested to the committee by the end of February, after which we will publish our own “uber-matrix” (nice term, Eric, I’m stealing it :)).

    We received a lot of constructive feedback in private (and believe it or not, a lot was positive), and are compiling it right now so we can discuss. After we published the draft last fall, we had a number of volunteers approach us offering to translate the final document into Spanish, French, Japanese, and several other languages. We will be working with the WAA’s International Committee in the coming year to see that effort through.

    Thanks for keeping the conversation going.

    angie

  • http://www.showmeanalytics.com angie

    Steve, can you elaborate on your statement: “I thought it was a controversial decision to change the time spent on site definition to accommodate Google Analytics way of calculating things.”

    I wasn’t aware we did that. Because no one from Google participates on the committee, we don’t really know how GA calculates things. Does anybody? It’s hard to accommodate them when we aren’t sure what they do.

  • http://www.webanalyticsdemystified.com eric

    Angie: I’m so glad you piped up and provided some additional clarity regarding the Standards Committee and how you’ve come to the definitions you have today. As I said when we talked in D.C., and again when I interviewed you, I definitely appreciate the work you and your committee are doing. But yeah, I do think more work needs to be done; that said, A) your comments above about “next steps” and B) the recent re-structuring of the WAA around Research, Standards, Education, and Networking are both very encouraging in terms of the Association’s ability to get these standards out in a form and format that the average practitioner can truly benefit from.

    Thanks again for all your hard work!

  • http://UnicaWebAnalytics.com Akin Arikan / Unica

    Well said Angie! And Eric too.

    Hey, unless I am mistaken, I remember that the first round of web analytics standards had been criticized as “having been created in a vacuum without paying attention to what is possible with real existing web analytics solutions.” I think the criticism back then referred to the definition of a generic unique visitors metric, whereas most solutions only offered things such as “daily uniques, weekly uniques, etc.”

    And now Brandt’s criticism seems to push in the exact opposite direction, if I understand correctly. I.e. he suggests that the standards should ignore what is possible today and create some desirable standard in a vacuum after all.

    Obviously, the standards can’t satisfy both at the same time.

    Given that, I think they strike a darn good balance in approaching each of the terms from a top-down perspective.

    Having had the chance to participate in some of the committee meetings on behalf of Unica, I am so impressed with how much honest thought Angie, Jodie, Judith, and all the volunteers are pouring into the definitions. The simplicity of the text hides all the thinking that goes into them.

    Feels like observing a poem being written at times. Write something. Adjust it. Erase it. Write something new.

    My point is that all this thought and discussion that the volunteers have invested so far probably had to take place before the standards could possibly be narrowed down any further.

    If the volunteers hadn’t spent all this thought and allowed for broader perspectives, they’d be shooting from the hip.

    Just my humble, personal opinion.

    Akin Arikan, Unica

  • http://blackbeak.com Steve Jackson

    @Angie;

    My reasoning is just that the second set of standards caters for the way Google calculates Time On Site (TOS) whereas the first set released didn’t take that into account.

    I also understand fully why the decision was taken to include a method of calculation similar to that of Google’s as it’s also possible to use the same method of TOS calculation in other tools.

    I don’t think however that it was a coincidence that the decision was made to change this specific standard after Google became the most popular web analytics application on the planet.

    *The last thing we wanted was to create a document that’s so far from reality, and so expensive and impractical to eventually comply with, that everyone immediately proclaims “No one will ever bother.”*

    I’m not saying that the decision you made was the wrong one. What I’m saying is that it was controversial. It opens the WAA up for this kind of debate simply because it is a matter of opinion on which is the best method to use.

    Practical is good, but I think technically specific is better, especially for the big three, plus events and time spent, because these are the key foundations that in my view should not be open to debate (eventually).

    What I’d like to see is a kind of compromise, a sliding scale that vendors can work towards in their roll-outs.

    We all know that no vendor will implement everything immediately, especially if it cuts into their development cycles. However if the standards were available showing something where 5 stars was perfect and zero was non-compliant then we could see vendors using their standard achievements as a pitch.

    This is how, for instance, the ISO standards are so effective. Companies all over the world adopt them because of the prestige of being ISO compliant. That’s where I’d like to see the WAA headed: being a body whose rating vendors fight to keep, staying *a 5 star vendor according to the WAA*.
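    Just to illustrate what I mean by a sliding scale (these criteria are entirely hypothetical, not anything the WAA has proposed), a star rating could be as simple as the share of standard definitions a tool actually implements:

        def star_rating(compliance_checklist, max_stars=5):
            """Map a {definition_name: complies?} checklist to a 0..max_stars score."""
            if not compliance_checklist:
                return 0
            share = sum(compliance_checklist.values()) / len(compliance_checklist)
            return round(share * max_stars)

        # Entirely made-up checklist for an unnamed vendor.
        checklist = {"visit": True, "unique visitor": True, "page view": True,
                     "event": False, "visit duration": True}
        print(star_rating(checklist))   # 4 stars: 4 of 5 definitions implemented

    A vendor could then pitch its 4-star or 5-star status, and the incentive to close the remaining gaps would come from the market rather than from the WAA having to police anyone.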

    I should re-iterate, in no way am I trying to say that the WAA work is not of value. Unlike Dainow I think the standards are a very good piece of work, and I know from experience that working within the WAA (which I have done since 2006, and still do) is a thankless and voluntary task. I applaud the Standards Committee’s efforts.

    I simply believe that this is a good topic to get into, especially if it raises some good points of discussion within the WAA.

    I’m pleased that you’ve jumped on this Angie and I hope you continue to do so. I’d also like to hear vendors official thoughts on being rated in such a manner by industry professionals.

  • http://www.showmeanalytics.com angie

    Steve, here’s the rub: I looked back through both the old and the new documents, and we didn’t change the calculation of “Visit Duration” at all. In both documents, it’s “The length of time in a session. Calculation is typically the timestamp of the last activity in the session minus the timestamp of the first activity of the session.”

    What we *did* do is add a caution that there are differences in the way different tools calculate this metric. We also recommend that the analyst ask their vendor which way the metric is calculated if they are unsure.

    The reason we added that statement is because one of our goals last year was to make a concerted effort to understand what is out there, so that people will know about some of the “gotchas” in the current landscape. I went back through some of my old bookmarks, and remembered a controversy in 2007 when GA tried to drop the zeros from their TOS calculation. There was a HUGE uproar when they made that change. So much of an uproar, in fact, that they changed it back.
    ( http://analytics.blogspot.com/2007/09/reverting-back-to-original-average-time.html )

    So right now, yes Google does include all visits in their “Average Time On Site” metric. So does Unica, in their “Visit Duration” metric. And my recollection is that the old SurfAid tool (formerly IBM, now Coremetrics) did so as well. So why is it controversial to caution people that some tools (including the “most popular web analytics application on the planet”) calculate this metric in a manner that they might not expect? That’s not “catering,” it’s a fact. And it’s a fact that web analysts need to know.
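    For anyone trying to follow the arithmetic, here is a purely illustrative sketch (my own toy example, not any vendor’s actual implementation) of how including or excluding zero-duration visits changes an “average time on site” figure computed from the very same data:

        def average_time_on_site(visit_durations_sec, include_zero_duration=True):
            """Average visit duration in seconds.

            Single-page visits have no "last activity minus first activity" span,
            so their duration is recorded as zero; whether they count toward the
            average is exactly the difference being discussed here.
            """
            durations = (visit_durations_sec if include_zero_duration
                         else [d for d in visit_durations_sec if d > 0])
            return sum(durations) / len(durations) if durations else 0.0

        # Five visits: three single-page visits (0 s) and two engaged visits.
        visits = [0, 0, 0, 120, 300]
        print(average_time_on_site(visits))                               # 84.0 seconds
        print(average_time_on_site(visits, include_zero_duration=False))  # 210.0 seconds

    Same site, same visits, and the reported average differs by a factor of 2.5, which is exactly why analysts need to know which approach their tool takes.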

  • http://blackbeak.com/ Steve Jackson

    @Angie;@Anyone else reading;

    I am starting to feel my point has been totally misconstrued.

    I never said that Google forced the issue with secret meetings (or anything like it). I said the standards changed because of the most popular tool on the planet’s TOS calculation. That was a mistake and I apologize for it. Being honest, I simply assumed that was why the standard has the statement added. In retrospect mentioning Google was a bad idea and has detracted from the main point.

    I’m not even saying it was wrong to change the standards to add the statement about different tools calculating time on site. I even said repeatedly on my own and Eric’s forum that I totally respect the current standards.

    I am not into pointless patter and never have been. I have tried to enter into a debate which I believe has value, that the current standards while good could be better by being more technically precise. I have nothing against Google and never have. I used Eric’s site (and my own) to discuss the point because it was the first reputable source where it was raised.

    I don’t read Dainow but I do read Eric.

    The point is that the TOS statement was added to the current standard document and I meant to use it as an example of making the standards open to debate as to what the correct way to measure TOS actually is.

    I don’t want to stop there. I also think that other terms need stronger definition. What is a page, an event, a visit or a visitor in computer machine code? Do we need to go that far? If not why not? These questions and more are what I’m trying to ask.

    Jim Sterne asked for comments when he asked us all to take a copy at the last eMetrics summit in Stockholm. My opinion when I read the standards was that I felt they could be stronger but I didn’t really know how. I kept quiet because I didn’t know how to improve the current iteration of the document. At the time saying anything would have achieved nothing.

    Dainow raised two issues, which I also blogged on my site that I believe are valid. That’s all I wanted to get out there.

    I have nothing but respect for you and anyone else involved with the standards committee and the current iteration of the standards. If I have insulted or misled anyone then please believe that it was never my intention. I apologize for any confusion caused.

    Best
    Steve

  • Jay Tkachuk

    I believe that the reason behind the thorny issue of creating a set of standards lies in the wrong approach. In my opinion, the standards should be set by an entity that has an undisputed weight in the industry, with an ability to basically dictate it to the rest — otherwise the standards will encounter a wave of resistance they won’t be capable of overcoming. The process of creating a successful set of web analytical standards in its essence is not that different from the process of establishing technological ones — Apple’s IEEE 1394, Sun’s J2SE and Sony’s Blu-ray are great examples. A process not democratic by any stretch of the imagination, but proven to work. WAA should identify such a bellwether(s), cooperate with them, look into their R&D roadmap, and then issue the standards backed by the heavyweight(s). Do I personally like the concept of having a set of standards being forced on me rather than have it created organically — not one bit. But it sure would solve the problem.

  • http://www.ansi.org Bob Russotti

    Reading the posts here, enforcement seems to be a concern as does the relative credibility and acceptance of the standards. I asked a few colleagues at ANSI to address those concerns and to explain the US standardization system:

    At the heart of the U.S. standardization system are documents that arise from a formal, coordinated, consensus-based, open process. These are commonly called voluntary consensus standards and are written by industry professionals from both the public and private sectors. The voluntary process requires full cooperation by all parties. It depends upon data gathering and compromises among a diverse range of stakeholders. When due process is followed, the resulting standards provide economic benefit to the many rather than the few.

    The American National Standards Institute (ANSI) coordinates, facilitates and promotes the development of voluntary consensus standards that are relied upon by industry, government agencies and consumers across the United States and around the world. In its role as coordinator and voice of the U.S. standards and conformity assessment system, ANSI helps reduce the chance of mutually contradictory standards when two groups are working in the same area.

    ANSI does not develop standards; rather, the Institute fosters the U.S. standardization system by accrediting the procedures of standard-setting organizations and subsequently approving their documents as American National Standards (ANS). More than 200 ANSI-accredited bodies are now engaged in the creation and maintenance of voluntary consensus standards that are being used in virtually every sector.

    American National Standards are voluntary consensus standards as defined by the Federal government in OMB A-119. ANSI-accredited standards developing organizations – and the experts that populate the consensus bodies of these groups – serve an important public interest function in developing American National Standards. The ANS process seeks to ensure that there is an opportunity for all those who are interested in and affected by a standard to participate in its development. Due process is key to ensuring that ANS are developed in an environment that is equitable, accessible and responsive to the requirements of various stakeholders.

    Participation by a standards developer in the ANS process signifies to the public an interest in developing high-quality, market-driven and responsive standards in a manner that is open to public scrutiny and that ensures that an opportunity exists for the voices of all stakeholders to be heard.

    The designation as an American National Standard indicates that a standard reflects current technology, is responsive to market demand, contributes to the effective and efficient operation of commerce within a free market environment, and was developed using a process that respects all viewpoints, thus making the resulting standard more broadly accepted.

    For more information about how ANS are developed and their value to both the public and private sectors, see http://tinyurl.com/7jtmob

  • Pingback: Thoughts on the proposed IAB Guidelines | KlickIntelligence

  • Pingback: Unique Visitors ONLY Come in One Size … | KlickIntelligence

  • Pingback: Unique Visitors ONLY Come in One Size | Web Analytics Demystified

  • Pingback: Knallhard kritikk av standard fra Web Analytics Association | Webanalyse

 

