Nick Arnett challenges my visitor engagement calculation

Published by Eric T. Peterson on October 25, 2007

Nick Arnett from MCC Media (and one of the creators of BuzzMetrics) posted a very well thought-out and moderately critical assessment of the visitor engagement calculation I wrote about earlier this week. Nick makes some great points, and I thought it was worth addressing them while I prepare the follow-up post that shows off some of what the metric can do. My comments are preceded by ETP and Nick’s statements are in italics.

Definitely thought-provoking, Eric… I’m deep into this issue, although focused specifically on community sites. Overall, your approach doesn’t work for me on two main counts — it is too complicated (and thus unlikely to become any sort of standard) and doesn’t generate a metric that allows different sites to be compared. The latter is arguable, since standardized weightings could yield comparable numbers, but I think that’s excess complication also.

ETP: I’m sorry the calculation doesn’t work for you but I do appreciate your thoughts on the subject. Regarding it being too complicated, compared to what? Compared to “simple” metrics like bounce rate and average page views per session? Or compared to the technology you built to power Buzzmetrics? I guess I separate the complexity of making the calculation from one’s ability to actually explain the calculation.

ETP: Regarding using this metric to compare different sites … as I mentioned in the post, I don’t think there is “one” measure of visitor engagement, and thus trying to compare sites is probably a futile effort at best. I suppose you could remove the Brand, Feedback, Subscription and Interaction indices and come up with a standard set of threshold values for specific vertical markets, but I’m not sure that is really the best use of this calculation.
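
For concreteness, here is a minimal sketch of the composite idea: each sub-index is scored in [0, 1] and the overall score simply averages whichever indices a site chooses to keep. The index names come from the discussion above, but the equal weighting and the averaging itself are illustrative assumptions, not the exact published formula.

```python
# Minimal sketch: average whichever engagement sub-indices a site keeps.
# Equal weighting is an assumption, not the exact published formula.
def engagement_score(indices):
    """indices: dict of sub-index name -> score in [0, 1]."""
    if not indices:
        return 0.0
    return sum(indices.values()) / len(indices)

# A site dropping the Brand, Feedback, Subscription and Interaction
# indices simply passes fewer keys:
score = engagement_score({"click_depth": 0.6, "duration": 0.4, "recency": 1.0})
```

The point of the framework is exactly this flexibility: removing an index changes the inputs, not the shape of the calculation.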

Is there any ground truth behind this? In case that isn’t clear, do you have any sort of primary market data for engagement that correlates with the output of your engagement metric?

ETP: Hmmm, here I’m not sure what you mean. What kind of primary market data is actually able to identify “engaged” visitors? Because I am able to see individuals interacting with my web site, I did talk to a handful of people based on their engagement scores when I was doing the original work on this metric, and some of their feedback was critical to tweaking the metric and inputs to its current state. But other than that I’d love to see the primary data you’re talking about if you’re able to share it!

As I’ve dug into the issues and our data (about five dozen communities ranging from very large to very small), I keep coming back to two main indicators of engagement — return rates and proactive behavior. If visitors don’t visit regularly and do something other than passive page viewing, I have a tough time including them in any measurement of community engagement.

ETP: Exactly why the Recency Index and Interaction Index are included in the calculation, but I disagree with your assessment that these are the only measures of engagement. I’m not sure exactly how I would determine that someone was only “passively” viewing pages, and again this metric is not designed to be a measure of “community engagement” but rather visitor engagement more broadly considered.

Some point-by-point thoughts…

Click-depth index — this is a place where ground truth really matters, I think. I’m not comfortable with the assumption that more clicks per session means greater engagement. Do we know enough about browsing behavior to know that this is true? And of course there’s the old problem of bad design resulting in more clicks… but when I consider that issue, I tend to think that if people show willingness to click through a bad design, maybe that means they really are engaged! Perhaps we should all include some known bad design… ;-)

ETP: I haven’t seen anything that says that more clicks means less engagement but I agree that confused people might generate more clicks. But I think it’s unlikely that confused and frustrated people would return, complete defined events, subscribe to blogs, etc. so despite your assertion that the metric is complex, multiple inputs are designed to mitigate those that may be confusing.

ETP: You do, however, make an excellent argument for not using something as simple as “click-depth” or “average depth of visit” as your sole measure of engagement.

I have pretty much the same questions about duration. Is there good, objective evidence that session duration correlates to engagement? There are visitors with long-duration visits who don’t visit regularly and don’t do anything proactive… I can’t see including them in any measurement of engagement.

ETP: It sorta depends on your definition of engagement, doesn’t it? But see my comment above about why a single measure like duration (as in Nielsen’s Time Spent ranking system) is perhaps inappropriate on its own to determine engagement.

Recency makes perfect sense to me — the fact that engaged visitors return often is practically a tautology. I would be very skeptical of calling anybody engaged if they aren’t returning regularly.

ETP: What about first-time visitors? Are you saying you can’t be engaged on your first visit to a site? I agree that regular returns are a good indicator of engagement, but in my analysis the metric I’ve defined is able to resolve first-time visitors into several engagement segments, which I personally have found quite useful.

Your Brand Index is a great piece of data, but I don’t believe it works in a metric intended to compare sites. Language is too subtle and ambiguous to infer engagement from search terms. I spent years in the search engine and related markets, which gave me a great appreciation for the fact that what sometimes seems obvious about language isn’t. When people search on brand-related terms, it indicates *reach* to me, not engagement. I’m unwilling to assume anything more than brand awareness. People search on things they dislike, but that doesn’t mean they’re engaged with the subject they’re searching. And my data shows that visitors who show many other indications of engagement actually search *less* often.

ETP: Same comment about this metric not being specifically designed for comparing sites. I know that is the uber-goal for lots of folks in the world, it’s just not necessarily my goal or the best use for my engagement calculation.

ETP: Doesn’t “reach” plus “action” equal engagement? I haven’t spent years in search and related markets, but I struggle to believe that people searching on brands they dislike are not somehow engaged. Again, maybe this is a semantic issue arising from conflicting definitions of engagement.

ETP: Because the calculation is designed to be made over the lifetime of visitor sessions, searching less often is not a problem. I guess I more-or-less expect that the “direct” component of the Brand Index will be more important over time with truly engaged visitors (who wouldn’t be as likely to go back to Google and search on a branded term.)
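A hypothetical sketch of how the two Brand Index components — branded search referrals and direct visits — might be counted over a visitor’s lifetime of sessions. The field names and the term list here are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch: fraction of a visitor's lifetime sessions that arrived
# either directly or via a branded search phrase. Field names and the term
# list are illustrative, not the actual implementation.
BRANDED_TERMS = {"web analytics demystified", "web site measurement hacks"}

def brand_index(sessions):
    """sessions: list of dicts with 'referrer' and optional 'search_term'."""
    if not sessions:
        return 0.0
    branded = 0
    for s in sessions:
        if s["referrer"] == "direct":
            branded += 1  # the "direct" component: visitor knew the brand/URL
        elif (s.get("search_term") or "").lower() in BRANDED_TERMS:
            branded += 1  # the branded-search component
    return branded / len(sessions)
```

Because the denominator is the visitor’s whole session history, a visitor who searches less over time but returns directly keeps (or grows) their score.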

Counting brand-related searches makes sense if we’re measuring *brand* engagement.
Counting direct (non-referred) visits makes sense if we’re measuring *site* engagement.

Counting both in the same metric doesn’t make sense to me. I don’t think we should even be talking here about ways to measure brand engagement… because I believe that’s well beyond the scope of site analytics. It requires massive monitoring systems along the lines of BuzzMetrics. (I’m the primary inventor of one of their systems.)

ETP: I’m not differentiating *brand* and *site* engagement since I’m trying to calculate an operational measure of ongoing *visitor* engagement. Brand is just a component, and the site is the measurement point. I think I understand your desire to differentiate the two given your background with Nielsen but I’m not trying to do the same thing.

One more problem with the Brand Index — people will argue all day long about what terms are appropriate to include… and there’s a strong incentive for site owners to err on the side of too many terms if their success is being measured by this metric. For example, you included “web site measurement hacks” in your list… but that could be a generic term. Is “Web Analytics Wednesday” really your brand? Or is it the WAA’s? I don’t want to argue which it is, just point out the kind of ambiguity that is inevitable.

ETP: Here I agree with you, coming up with a reasonable list is not easy, but web analytics is hard so at some point you have to make some tough decisions. “Web Site Measurement Hacks” is a book title and a branded term but could be a generic phrase. “Web Analytics Wednesday” is a branded Web Analytics Demystified term and has nothing to do with the Web Analytics Association. I don’t think there is that much ambiguity at the site level, at least in my experience.

Your Feedback Index is a specific instance of what I think of as the general principle of tracking proactive behaviors — what you seem to be getting at in your Interaction Index. In communities, visitors have many such opportunities — posting, editing, tagging, voting and so forth. I decided very early in this work to just give people one point for each such proactive action, despite the temptation to weight them (which would violate the need to keep things simple). These are the behaviors that make a community work; sites that aren’t based on user-generated content can exist without them.
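Nick’s unweighted scheme, as described, reduces to a one-line count; the set of action names below is illustrative, not his actual taxonomy.

```python
# One point per proactive action, unweighted, as Nick describes.
# The set of action names is illustrative.
PROACTIVE_ACTIONS = {"post", "edit", "tag", "vote"}

def proactivity_score(events):
    """events: iterable of action-name strings from a visitor's history."""
    return sum(1 for e in events if e in PROACTIVE_ACTIONS)
```

Passive page views score zero, which is exactly the simplicity-over-weighting trade-off he argues for.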

ETP: Same comment about this calculation perhaps not being what you’re looking for vis-a-vis communities.

Your session focus really got me thinking. Does it make more sense to count the number of sessions in which visitors signal engagement or the number of actual such signals? I think it’s close to a toss-up, but so far, our ground truth suggests the latter — the number of proactive actions correlates better to our subjective estimates of engagement… but among our future tasks is to establish better ground truth. So far, I’m just using our community manager’s collective subjective scoring… but it correlates quite well to all but our smallest communities.

ETP: I agree, it’s probably a toss-up, but if you think about the calculation, all it does is count the number of signals. Long sessions are a signal, deep sessions are a signal, frequent sessions are a signal, etc. I know you don’t like anything but recency and interaction but we can agree to disagree on this point. I’d love to hear about your “ground truthing” efforts and I’ll try and keep you apprised of mine.

The subscriber index doesn’t work for me because we want to be able to compare communities regardless of whether or not visitors are able to subscribe, join, become members or what-have-you. Some of our clients — e.g., a large professional sports organization — allow full participation without any need to sign up. Also, as I’ll explain below, I’ve found a strong negative correlation between highly active visitors and RSS subscribers.

ETP: Again, not designed for comparison (and at this point no wonder you don’t like my calculation!) I’d love to see the negative correlation data for RSS and yes, if you don’t have a subscription it doesn’t make sense to assign a negative penalty.

Finally, I guess I’ll toss out one of the ideas that I’m working with — segmenting visitors by proactivity.

In several ways, communities (and most web sites, I suspect) have a bimodal distribution of users. There’s typically a relatively large “Core” group that visits often, looks at lots of pages and does a lot of proactive stuff. There’s a middle ground, which I’m calling “Lingerers,” of people who fall into the 10th to 80th percentiles of such activities. Third and last, there’s a large contingent in the 0th percentile, people who might have one or two activities in a given time period, which I call the “Drive-bys.” In our communities, the Drive-bys are the largest group, but the Core usually is a bigger group than the Lingerers. What this says to me is that people tend to engage a lot or hardly at all — there isn’t much middle ground. I’ve been focusing on the Core’s relationship to the whole community for my engagement measurements. That’s what seems to correlate best to what little ground truth we have.
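
The three-segment scheme can be sketched as follows. Here a visitor’s percentile is the share of visitors with strictly fewer proactive actions — a mechanical assumption about what Nick describes, not his actual implementation; the 10th and 80th cutoffs follow the text.

```python
import bisect

# Sketch of the percentile segmentation described above: rank visitors by
# proactive-action count and bucket them at the 10th and 80th percentiles.
def segment_visitors(activity_counts):
    """activity_counts: dict of visitor_id -> number of proactive actions."""
    values = sorted(activity_counts.values())
    n = len(values)
    segments = {}
    for visitor, count in activity_counts.items():
        # percentile = share of visitors with strictly fewer actions
        pct = 100 * bisect.bisect_left(values, count) / n
        if pct < 10:
            segments[visitor] = "Drive-by"
        elif pct < 80:
            segments[visitor] = "Lingerer"
        else:
            segments[visitor] = "Core"
    return segments
```

Because ties share a percentile, a large mass of visitors with one or two actions all land at the bottom together — which reproduces the bimodal pattern reported here, with Drive-bys as the largest group.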

ETP: I am seeing a more normal distribution, especially as visitors return a third time, but it is definitely skewed toward lower levels of engagement. I’ll try to highlight this when I show some data illustrating the calculation in action. And since I’m not working on a community proper, I’ve found myself focusing on my middle group (“Moderately Engaged”) and trying to determine what I might be able to do to shift them up to “Highly Engaged”.

Overall, I’ve found that the Drive-bys and Lingerers exhibit fairly similar behavior, but the Core is different. The Core visitors post more, search less and use RSS far less (so much for “subscribing” to RSS as a positive indicator of engagement!)

ETP: Your assessment of RSS being a poor indicator of engagement runs contrary to popular opinion (why would you subscribe to an RSS feed or email newsletter if you weren’t engaged??!) Perhaps this result is uncovering a flaw in your engagement calculation?

This post is getting long… so I’ll wrap it up (but ready to discuss further, of course) by repeating myself. I think any sort of engagement metric has to be backed up by demonstrating correlation to some kind of ground truth. Otherwise, it’s a mental exercise that runs the risk of having little relevance to the marketplace.

ETP: You keep coming back to the notion of “ground truth” but surely you recognize that this is A) extraordinarily difficult to come by and B) if we had it easily available we wouldn’t need a measure of engagement. I would love to see your “ground truth” data and talk about how you’re generating that, but unless I’m missing something it sounds a little impractical for widespread use. Still, I appreciate your feedback and very thoughtful comments and will endeavor to demonstrate the correlation between my calculation and “truly engaged” visitors.

Man, talk about a long post! What do you think? Is Nick more right than wrong? Are you focusing on communities and have the same concerns? Do you have similar concerns about your site? The conversation is almost as interesting as the metric and resulting analysis in my opinion so please, comment away!

  • Clint

    Interesting discussion.
    I guess I would say that Mr. Arnett comes from a background of providing comparative data (Nielsen & Buzzmetrics) so a KPI that is truly site/business centric just doesn’t make sense to him.

    Ground Truth appears to be a geology term referring to empirical review. So maybe he is talking about primary research (field-testing, usability, etc).

    Interesting that he equates web analytics data to secondary research…

The test of the engagement index is its ability to predict some desired behavior (moving visitors up the engagement chain by x% increases their propensity to buy by y%).

    Based on some comments you’ve made in your posts on the subject I imagine that you are busy trying to prove out at least the correlation aspects of engagement in a statistically meaningful way so we’ll just have to wait and see…

  • Tim Wilson

You both make good points, but I’m going to have to come down on your side, Eric, as being “more right.” What I like is that the formula is really more of a framework — it gives the components, but then leaves it up to the specific analyst/site to determine how to apply them.

    While Nick points out this means “no standardization,” this is absolutely the sort of metric that has one and only one golden standard: the site being measured. The type of site — its audience and its goals — are really going to dictate what massaging of the formula makes the most sense. Then, if an initiative is under way to increase engagement, a qualitative assessment of “how much do we expect this initiative to increase it?” can be asked at the outset, and the engagement score can be used to quantify whether that sort of impact is achieved.

    A final thought: “Complicated” and “Multidimensional” are not the same thing. In this case…your proposal is both. BUT, that also means that, to really impact the metric, you have to take well thought-out actions that may, themselves, be difficult. I’d much rather have that than a simplistic metric that can be improved semi-artificially.

    I agree with Nick that the engagement score might not be something that gets broadly shared across an entire organization. But, on the other hand, it might. 4 or 5 years ago, I worked with a lady who developed an internal “Google Index” that looked at natural search rankings, referrals, and page rank data to develop a single score for how well the site at the company was “doing” SEO-wise. Under the hood, she was looking at each of the components. But, to all of the people she was relying on for content support, technology support, and design/development support, this index was something they could rally around. And…they bought into it. It was a very effective way to drive action. And, like your engagement formula, it was not easily manipulated — so, when things flatlined or dipped, there wasn’t any hiding of the issue.

  • eric

    Clint: Yeah, I was confused by the use of ground truth and his relegation of web analytics to secondary research but Nick’s work history is probably the reason. I like his perspective.

I want to challenge your assertion that “the test of the engagement index is its ability to predict some desired behavior” … what metrics in web analytics are predictive, in your opinion? Is conversion rate predictive, or merely suggestive? I think that customer satisfaction (per my thread with Larry Freed) can be predictive, but the ACSI is a more complex input than my engagement index by a long shot.

    I’d love to hear your thoughts on this since I’ve always had concerns about the use of the term “predictive” in web analytics.

  • Nick Arnett

    I should clear up some confusion about my background… My current job is Director of Business Intelligence Services at Liveworld Inc. We host and manage communities for eBay, HBO, QVC, the NBA and many others. MCC Media is my own company that started life as Multimedia Computing Corp. back in 1988 when I was publishing an industry newsletter and consulting. My connection to BuzzMetrics is that they bought the stuff I invented at Opion, a company I co-founded. I don’t know how much of what they currently use is what they invented or what I did. I reinvented some similar technology to create Senti-Metrics Partners a few years ago, which Liveworld acquired.

As for “ground truth,” in this situation I think it needs to be a survey of site visitors, across multiple sites (to uncover a variety of engagement levels), with questions designed to determine how engaged they are — and a bit more subtle than just “On a scale of one to ten, how engaged are you?”

    I also should say that I don’t object to the existence of metrics that don’t support comparisons… but my present goal is to develop one that does.

  • eric

    Tim: EXACTLY! The calculation is fully intended to be a framework, not a dictum, since I do think that every site will have slightly different needs regarding their measure of visitor engagement. I know people are looking for some “magic number” that will replace page views to compare sites, but I just don’t think it works that way.

Great insight that actually “using” this metric may be difficult. Have you heard that I think web analytics is hard? I guess I’m just not sold on simple metrics like bounce rate being “immediately useful” (to quote another consultant). Making web analytics work often requires the well thought-out actions that you cite, and “not easily manipulated” is the key.

I hope to use the next series of posts on the subject to demonstrate that using this metric is not nearly as complicated as you and Nick believe. But, I admit that I haven’t given folks much to work with (outside of the presentation I gave in D.C. and the slides in my Web Analytics 2.0 presentation), so hopefully I will be able to convince you.

    Thanks a ton for your comment!

  • eric

    Nick: Thanks for clarifying your background! Sounds like you have access to some amazing data on communities to play with, and damn I would LOVE to see my engagement calculation run against the data you have.

    Regarding “ground truth” … thanks for clarifying that. I was going to ask Larry Freed at ForeSee Results if he’d help me collect some of that data (Larry’s a great guy!) What, in your opinion, would be the ** best ** questions to ask people to gauge their level of engagement? I agree, asking “are you engaged?” is probably not the best since few people have a good working definition of that term.

    Given your day job and background I don’t doubt that a comparative metric would be helpful. What are you using today to compare engagement across the communities you manage?

    Thanks again for challenging my thoughts and engaging in the conversation!

  • Nick Arnett

(It would be great if you could edit the top of this page so that it doesn’t say I’m a creator of BuzzMetrics… that gives the wrong impression.)

I haven’t started thinking too hard about the right survey questions. Avinash pushed me on this issue a couple of weeks ago and I’m thinking we might hire somebody to develop the survey, partly for the credibility that could bring.

    My current ground truth is a rating from 1 to 10, created by our community managers for about 20 of our English-language communities.

    As for what I’m using now… that’s what I’m working on — we don’t have an engagement metric yet.

    One more thought for now — I think you need to clarify what you’re measuring — site engagement and brand engagement are different. Site engagement implies some degree of brand engagement, but the brand is always bigger (unless the brand owner is totally unsuccessful!).

    And to be fair, I will add that I’d probably be okay with including click depth and duration IF ground truth shows that they really do imply engagement. However, my preference for simple over complex is not due to my own laziness, it’s the battle for industry adoption. Industry laziness, let’s say.

  • Clint

I’m far too lazy to go back and re-read all your posts on the subject. But if I remember correctly, one of the things you were looking at is engagement correlation to propensity to buy a book. Presumably now you’d be looking at other actions too – like propensity to hire you or buy a job listing etc.

Knowing where you came from is great but only if it helps you to get where you’re going.

All science strives to be predictive. A good theory is based on real-world observations of the past that makes testable predictions about something – in this case future action. A theory that stands the test of time is one where the predictions it makes are borne out in experimentation or observation.

    Is conversion rate predictive? Yes (IMHO) – in a small way. I know that if I increase the conversion rate by some percentage that revenue will increase by a related amount.

    What I’m looking to see as web analytics matures is growth in its ability to be predictive. Isn’t that what all the testing is for? Shouldn’t we be using all the test data we have to build predictive models (theories) of how the user-website interaction works?

  • eric

    Clint: I actually revisited some of my original thinking around the relationship between engagement and conversion. Yes, engagement ideally leads to a conversion event, but in some ways this metric is designed to help site operators better understand their audience even if conversion is not likely to happen online (or at all … what is the conversion event at Facebook?)

Good points about prediction, but I guess I’m not sure that we’re there yet. I think that most people are using metrics and testing today not to be predictive but rather as proof points. The emergence of multivariate as a “must do” activity (remember the RAMP!) supports optimization but not necessarily understanding, at least not always.

    And yeah, as the sector matures, the technology’s ability to be predictive should improve … but we don’t even have the basic stats baked into the applications today. We’ve come a long way, and still have a long way to go …

    Thanks again for your comments and I hope the Omniture acquisition treats you well.

  • Rishi

    This entire series has been very exciting and I hope you keep at it. I think both Nick and you make some exceptional points. However, on the subject of ‘Recency’ playing a role in engagement calculation I disagree with both (more with Nick because he believes it completely and you because you kinda agree with him). I come from the world of multichannel retail and believe engagement should be measured between the time an eCommerce site is found to when the person who found it makes a purchase. There is no need to extend engagement to future sessions. I tend to buy car tires once every 3 years but could still have a very engaged one time experience at

The good news is that this is bleeding edge stuff, and while we might not agree on nuances, the fact of the matter is online metrics are making every other medium obsolete.

  • eric

    Rishi: Thanks for your enthusiasm! I disagree that engagement stops when someone makes a purchase in a multi-channel retail environment and I think you’re talking about acquisition, not engagement.

    Imagine that you’re Best Buy or Sears … a well-engaged customer is someone who will not only make the initial purchase, but return again and again to your online and off-line properties, maybe not even making a purchase some of the time, but using your site/store as a resource/place to dream about future purchases/etc. This is what my engagement metric is designed to capture — the time someone is spending on a web site ** when they’re not making a purchase **

    Or, using your car example, I agree that you could have a very engaged one time experience at, but don’t you think the site owners there are looking for ways to stay “top of mind” so that customers remain engaged enough to make the next purchase there, or to recommend discounttire to a friend or loved one? I think their “Email Specials” ( application is designed entirely to keep engagement levels high, despite long purchase cycles.

    Agreed: This is bleeding edge stuff (although again I don’t think that online metrics are making the rest of the world obsolete, but we can agree to disagree!)

  • Nick Arnett

    Initial purchases v. repeat… One of our clients is a car manufacturer. How often do people buy cars?! Yet they have one of our highly engaged communities.

Brands live in communities (not just online communities) and this car company knows that it is very important to try to keep their customers engaged, even though they surely won’t make another (major) purchase for years. They’re building loyalty, but in the short run, they’re also nurturing a group that will attract other customers with their enthusiasm.


  • Rishi

    Hi Nick and Eric:

    (Eric) are you saying you can measure engagement for repeat purchases irrespective of product type (or price-point)? Are you suggesting the purchase of a 20 dollar Web Cam on should be blended with plasma TV purchase when calculating engagement? And does (should) have one number of engagement?

    I believe engagement measurement (for a retailer) should start from the moment someone enters a site to make a specific purchase to the moment the purchase has been made, and should not be applied across SKUs. But, engagement measurement SHOULD include interaction with any touchpoint that plays a role in the eventual purchase. So if a prospective buyer repeatedly visits the highly engaged communities Nick was talking about, that should definitely be factored in when making engagement calculations for that car model.

    I believe, for a retailer like root level engagement measurement is misleading.


Recent Blog Posts

Slack Demystified
Adam Greco, Senior Partner

Those of you who follow my blog have come to know that when I learn a product (like Adobe SiteCatalyst), I really get to know it and evangelize it. Back in the 90′s I learned the Lotus Notes enterprise collaboration software and soon became one of the most proficient Lotus Notes developers in the world, building most of Arthur Andersen’s global internal Lotus Notes apps. In the 2000′s, I came across Omniture SiteCatalyst, and after a while had published hundreds of blog posts on Omniture’s (Adobe’s) website and my own and eventually a book! One of my favorite pastimes is finding creative ways to apply a technology to solve everyday problems or to make life easier.

Continue reading this article ... ... more from Adam Greco

Profile Website Visitors via Campaign Codes and More
Adam Greco, Senior Partner

One of the things customers ask me about is the ability to profile website visitors. Unfortunately, most visitors to websites are anonymous, so you don't know if they are young, old, rich, poor, etc. If you are lucky enough to have authentication or a login on your website, you may have some of this information, but for most of my clients the "known" percentage is relatively low. In this post, I'll share some things you can do to increase your visitor profiling by using advertising campaigns and other tools.

Continue reading this article ... ... more from Adam Greco

A Primer on Cookies in Web Analytics
Josh West, Partner

Some of you may have noticed that I don't blog as much as some of my colleagues (not to mention any names, but this one, this one, or this one). The main reason is that I'm a total nerd (just ask my wife), but in a way that is different from most analytics professionals. I don't spend all day in the data - I spend all data writing code. And it's often hard to translate code into entertaining blog posts, especially for the folks that tend to spend a lot of time reading what my partners have to say.

Continue reading this article ... ... more from Josh West

Excel Dropdowns Done Right
Tim Wilson, Partner

Do you used in-cell dropdowns in your spreadsheets? I used them all the time. It's both an ease-of-use and a data quality maneuver: clicking a dropdown is faster than typing a value, and it's really hard to mis-type a value when you're not actually typing!

Continue reading this article ... ... more from Tim Wilson

The Downfall of Tesco and the Omniscience of Analytics
Michele Kiss, Partner

Yesterday, an article in the Harvard Business Review provided food for thought for the analytics industry. In Tesco's Downfall Is a Warning to Data-Driven Retailers, author Michael Schrage ponders how a darling of the "analytics as a competitive advantage" stories, British retailer Tesco, failed so spectacularly - despite a wealth of data and customer insight.

Continue reading this article ... ... more from Michele Kiss

Creating Conversion Funnels via Segmentation
Adam Greco, Senior Partner

Regardless of what type of website you manage, it is bound to have some sort of conversion funnel. If you are an online retailer, your funnel may consist of people looking at products, selecting products, and then buying products. If you are a B2B company, your funnel may be higher-level like acquisition, research, trial and then form completion.

Continue reading this article ... ... more from Adam Greco

10 Tips for Building a Dashboard in Excel
Tim Wilson, Partner

This post has an unintentionally link bait-y post title, I realize. But, I did a quick thought experiment a few weeks ago after walking a client through the structure of a dashboard I'd built for them to see if I could come up with ten discrete tips that I'd put to use when I built it. Turns out…I can!

Continue reading this article ... ... more from Tim Wilson

Exploring Optimal Post Timing ... Redux
Tim Wilson, Partner

Back in 2012, I developed an Excel worksheet that would take post-level data exported from Facebook Insights and do a little pivot tabling on it to generate some simple heat maps that would provide a visual way to explore when, for a given page, the optimal times of day and days of the week are for posting.
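The pivot logic that worksheet performs can be sketched in plain Python: group posts by day of week and hour, then average engagement per cell. The sample posts and engagement counts below are invented for illustration, not real Facebook Insights data.

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical export rows: (post timestamp, engagement count)
posts = [
    ("2014-03-03 09:15", 120),
    ("2014-03-03 14:30", 80),
    ("2014-03-10 09:45", 150),
    ("2014-03-11 20:00", 60),
]

# Bucket engagements by (day of week, hour of day)
cells = defaultdict(list)
for ts, engagements in posts:
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    cells[(dt.strftime("%A"), dt.hour)].append(engagements)

# Average each bucket; the result is the heat map's cell values
heatmap = {cell: mean(vals) for cell, vals in cells.items()}
# The two Monday 9am posts average to (120 + 150) / 2 = 135
```

In practice the same pivot is one line with pandas (`pivot_table` on weekday and hour), but the grouping idea is identical.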

Continue reading this article ... ... more from Tim Wilson

What I Love: Adobe and Google Analytics*
Tim Wilson, Partner

While in Atlanta last week for ACCELERATE, I got into the age-old discussion of "Adobe Analytics vs. Google Analytics." I'm up to my elbows in both of them, and they're both gunning for each other, so this list is a lot shorter than it would have been a couple of years ago.

Continue reading this article ... ... more from Tim Wilson

Top 5 Metrics You're Measuring Incorrectly ... or Not
Eric T. Peterson, Senior Partner

Last night as I was casually perusing the day's digital analytics news - yes, yes I really do that - I came across a headline and article that got my attention. While the article's title ("Top 5 Metrics You're Measuring Incorrectly") is the sort I am used to seeing in our Buzzfeed-ified world of pithy "made you click" headlines, it was the article's author that got my attention.

Continue reading this article ... ... more from Eric T. Peterson

Bulletproof Business Requirements
John Lovett, Senior Partner

As a digital analytics professional, you've probably been tasked with collecting business requirements for measuring a new website/app/feature/etc. This seems like a task that's easy enough, but all too often people get wrapped around the axle and fail to capture what's truly important from a business user's perspective. The result is typically a great deal of wasted time, frustrated business users, and a deep-seated distrust for analytics data.

Continue reading this article ... ... more from John Lovett

Welcome to Team Demystified: Nancy Koons and Elizabeth Eckels!
Eric T. Peterson, Senior Partner

I am delighted to announce that our Team Demystified business unit is continuing to expand with the addition of Nancy Koons and Elizabeth "Smalls" Eckels. Our Team Demystified efforts are exceeding all expectations and are allowing Web Analytics Demystified to provide truly world-class services to our Enterprise-class clients at an entirely new scale.

Continue reading this article ... ... more from Eric T. Peterson

When to Use Variables vs SAINT in Adobe Analytics
Adam Greco, Senior Partner

In one of my recent Adobe SiteCatalyst (Analytics) "Top Gun" training classes, a student asked me the following question: When should you use a variable (i.e. eVar or sProp) vs. using SAINT Classifications? This is an interesting question that comes up often, so I thought I would share my thoughts on this and my rules of thumb on the topic.

Continue reading this article ... ... more from Adam Greco

5 Tips for #ACCELERATE Exceptionalism
Tim Wilson, Partner

Next month's ACCELERATE conference in Atlanta on September 18th will be the fifth - FIFTH!!! - one. I wish I could say I'd attended every one, but, sadly, I missed Boston due to a recent job change at the time. I was there in San Francisco in 2010, I made a day trip to Chicago in 2011, and I personally scheduled fantastic weather for Columbus in 2013.

Continue reading this article ... ... more from Tim Wilson

I've Become Aware that Awareness Is a #measure Bugaboo
Tim Wilson, Partner

A Big Question that social and digital media marketers grapple with constantly, whether they realize it or not: Is "awareness" a valid objective for marketing activity?

I've gotten into more than a few heated debates that, at their core, center around this question. Some of those debates have been with myself (those are the ones where I most need a skilled moderator!).

Continue reading this article ... ... more from Tim Wilson

Advanced Conversion Syntax Merchandising
Adam Greco, Senior Partner

As I have mentioned in the past, one of the Adobe SiteCatalyst (Analytics) topics I loathe talking about is Product Merchandising. Product Merchandising is complicated and often leaves people scratching their heads in my "Top Gun" training classes. However, many people have mentioned to me that my previous post on Product Merchandising eVars helped them a lot so I am going to continue sharing information on this topic.

Continue reading this article ... ... more from Adam Greco

Team Demystified Update from Wendy Greco
Eric T. Peterson, Senior Partner

When Eric Peterson asked me to lead Team Demystified a year ago, I couldn't say no! Having seen how hard all of the Web Analytics Demystified partners work and that they are still not able to keep up with the demand of clients for their services, it made sense for Web Analytics Demystified to find another way to scale their services. Since the Demystified team knows all of the best people in our industry and has tons of great clients, it is not surprising that our new Team Demystified venture has taken off as quickly as it has.

Continue reading this article ... ... more from Eric T. Peterson

SiteCatalyst Unannounced Features
Adam Greco, Senior Partner

Lately, Adobe has been sneaking in some cool new features into the SiteCatalyst product and doing it without much fanfare. While I am sure these are buried somewhere in release notes, I thought I'd call out two of them that I really like, so you know that they are there.

Continue reading this article ... ... more from Adam Greco

Hello. I'm a Radical Analytics Pragmatist
Tim Wilson, Partner

I was reading a post last week by one of the Big Names in web analytics…and it royally pissed me off. I started to comment and then thought, "Why pick a fight?" We've had more than enough of those for our little industry over the past few years. So I let it go.

Except I didn't let it go.

Continue reading this article ... ... more from Tim Wilson

Competitor Pricing Analysis
Adam Greco, Senior Partner

One of my newest clients is in a highly competitive business in which they sell similar products as other retailers. These days, many online retailers have a hunch that they are being "Amazon-ed," which they define as visitors finding products on their website and then going to see if they can get it cheaper/faster on Amazon. This client was attempting to use time spent on page as a way to tell if/when visitors were leaving their site to go price shopping.

Continue reading this article ... ... more from Adam Greco

How to Deliver Better Recommendations: Forecast the Impact!
Michele Kiss, Partner

One of the most valuable ways to be sure your recommendations are heard is to forecast the impact of your proposal. Consider what is more likely to be heard: "I think we should do X ..." vs "I think we should do X, and with a 2% increase in conversion, that would drive a $1MM increase in revenue ..."

Continue reading this article ... ... more from Michele Kiss

ACCELERATE 2014 "Advanced Analytics Education" Classes Posted
Eric T. Peterson, Senior Partner

I am delighted to share the news that our 2014 "Advanced Analytics Education" classes have been posted and are available for registration. We expanded our offering this year and will be offering four concurrent analytics and optimization training sessions from all of the Web Analytics Demystified Partners and Senior Partners on September 16th and 17th at the Cobb Galleria in Atlanta, Georgia.

Continue reading this article ... ... more from Eric T. Peterson

Product Cart Addition Sequence
Adam Greco, Senior Partner

In working with a client recently, an interesting question arose around cart additions. This client wanted to know the order in which visitors were adding products to the shopping cart. Which products tended to be added first, second, third, etc.? They also wanted to know which products were added after a specific product was added to the cart (i.e. if a visitor adds product A, what is the next product they tend to add?). Finally, they wondered which cart add product combinations most often led to orders.
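All three questions can be answered from a time-ordered cart-addition log. A minimal sketch, with a hypothetical event list standing in for the real analytics data:

```python
from collections import defaultdict, Counter

# Hypothetical cart-addition log, already ordered by time: (visitor_id, product)
events = [
    ("v1", "A"), ("v1", "B"), ("v2", "A"), ("v1", "C"), ("v2", "C"),
]

position = defaultdict(int)   # how many items each visitor has added so far
add_order = Counter()         # (product, position) -> count: "added first, second, ..."
follows = Counter()           # (previous product, next product) -> count
last_added = {}

for visitor, product in events:
    position[visitor] += 1
    add_order[(product, position[visitor])] += 1
    if visitor in last_added:
        follows[(last_added[visitor], product)] += 1
    last_added[visitor] = product

# add_order[("A", 1)] == 2: product A was the first addition for both visitors
# follows ranks which product tends to come next after a given product
```

Joining `follows` back to order data would answer the third question, which combinations most often lead to a purchase.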

Continue reading this article ... ... more from Adam Greco

7 Tips For Delivering Better Analytics Recommendations
Michele Kiss, Partner

As an analyst, your value is not just in the data you deliver, but in the insight and recommendations you can provide. But what is an analyst to do when those recommendations seem to fall on deaf ears?

Continue reading this article ... ... more from Michele Kiss

Overcoming The Analyst Curse: DON'T Show Your Math!
Michele Kiss, Partner

If I could give one piece of advice to an aspiring analyst, it would be this: Stop showing your "math". A tendency towards "TMI deliverables" is common, especially in newer analysts. However, while analysts typically do this in an attempt to demonstrate credibility ("See? I used all the right data and methods!") they do so at the expense of actually being heard.

Continue reading this article ... ... more from Michele Kiss

Making Tables of Numbers Comprehensible
Tim Wilson, Partner

I'm always amazed (read: dismayed) when I see the results of an analysis presented with a key set of the results delivered as a raw table of numbers. It is impossible to instantly comprehend a data table that has more than 3 or 4 rows and 3 or 4 columns. And, "instant comprehension" should be the goal of any presentation of information - it's the hook that gets your audience's brain wrapped around the material and ready to ponder it more deeply.

Continue reading this article ... ... more from Tim Wilson

Automating the Cleanup of Facebook Insights Exports
Tim Wilson, Partner

This post (the download, really - it's not much of a post) is about dealing with exports from Facebook Insights. If that's not something you do, skip it. Go back to Facebook and watch some cat videos. If you are in a situation where you get data about your Facebook page by exporting .csv or .xls files from the Facebook Insights web interface, then you probably sometimes think you need a 52" monitor to manage the horizontal scrolling.
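The sort of cleanup the download automates in Excel can be approximated in a few lines of Python: drop the columns that are empty in every row so the file stops demanding that 52" monitor. The sample export below is a made-up stand-in for a real Insights file, not the actual export format.

```python
import csv
import io

# A tiny stand-in for a Facebook Insights export: very wide, with empty columns
raw = (
    "Post ID,Permalink,Lifetime Post Total Reach,Unused Metric\n"
    "1,https://example.com/1,1000,\n"
    "2,https://example.com/2,2500,\n"
)

rows = list(csv.DictReader(io.StringIO(raw)))

# Keep only columns that contain data in at least one row
keep = [col for col in rows[0] if any(r[col].strip() for r in rows)]
cleaned = [{col: r[col] for col in keep} for r in rows]
# "Unused Metric" disappears; the three populated columns survive
```

Pointing `csv.DictReader` at the downloaded file instead of the inline string is the only change needed to run this against a real export.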

Continue reading this article ... ... more from Tim Wilson

The Recent Forrester Wave on Web Analytics ... is Wrong
Eric T. Peterson, Senior Partner

Having worked as an industry analyst back in the day I still find myself interested in what the analyst community has to say about web analytics, especially when it comes to vendor evaluation. The evaluations are interesting because of the sheer amount of work that goes into them in an attempt to distill entire companies down into simple infographics, tables, and single paragraph summaries.

Continue reading this article ... ... more from Eric T. Peterson

Funnel Visualizations That Make Sense
Tim Wilson, Partner

Funnels, as a concept, make some sense (although someone once made a good argument that they make no sense, since, when the concept is applied by marketers, the funnel is really more a "very, very leaky funnel," which would be a worthless funnel - real-world funnels get all of a liquid from a wide opening through a smaller spout; but, let's not quibble).

Continue reading this article ... ... more from Tim Wilson

Reenergizing Your Web Analytics Program & Implementation
Adam Greco, Senior Partner

Those of you who have read my blog posts (and book) over the years, know that I have lots of opinions when it comes to web analytics, web analytics implementations and especially those using Adobe Analytics. Whenever possible, I try to impart lessons I have learned during my web analytics career so you can improve things at your organization.

Continue reading this article ... ... more from Adam Greco

Registration for ACCELERATE 2014 is now open
Eric T. Peterson, Senior Partner

I am excited to announce that registration for ACCELERATE 2014 on September 18th in Atlanta, Georgia is now open. You can learn more about the event and our unique "Ten Tips in Twenty Minutes" format on our ACCELERATE mini-site, and we plan to have registration open for our Advanced Analytics Education pre-ACCELERATE training sessions in the coming weeks.

Continue reading this article ... ... more from Eric T. Peterson

Current Order Value
Adam Greco, Senior Partner

I recently had a client pose an interesting question related to their shopping cart. They wanted to know the distribution of money its visitors were bringing with them to each step of the shopping cart funnel.
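One way to frame that analysis: record the cart value each session carries into each funnel step, then summarize the distribution per step. A sketch with invented session data, not the client's actual implementation:

```python
from collections import defaultdict
from statistics import median

# Hypothetical sessions: cart value (dollars) observed at each checkout step reached
sessions = [
    {"cart": 40.0, "shipping": 40.0},
    {"cart": 120.0, "shipping": 120.0, "payment": 120.0},
    {"cart": 75.0},
]

values_by_step = defaultdict(list)
for session in sessions:
    for step, value in session.items():
        values_by_step[step].append(value)

# Median cart value carried into each step of the funnel
medians = {step: median(vals) for step, vals in values_by_step.items()}
```

Swapping `median` for percentile buckets would give the full distribution rather than a single summary number per step.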

Continue reading this article ... ... more from Adam Greco

A Guide to Segment Sharing in Adobe Analytics
Tim Wilson, Partner

Over the past year, I've run into situations multiple times where I wanted an Adobe Analytics segment to be available in multiple Adobe Analytics platforms. It turns out…that's not as easy as it sounds. I actually went multiple rounds with Client Care once trying to get it figured out. And, I've found "the answer" on more than one occasion, only to later realize that that answer was a bit misguided.

Continue reading this article ... ... more from Tim Wilson

Currencies & Exchange Rates
Adam Greco, Senior Partner

If your web analytics work covers websites or apps that span different countries, there are some important aspects of Adobe SiteCatalyst (Analytics) that you must know. In this post, I will share some of the things I have learned over the years related to currencies and exchange rates in SiteCatalyst.

Continue reading this article ... ... more from Adam Greco

Linking Authenticated Visitors Across Devices
Adam Greco, Senior Partner

In the last few years, people have become accustomed to using multiple digital devices simultaneously. While watching the recent winter Olympics, consumers might be on the Olympics website, while also using native mobile or tablet apps. As a result, some of my clients have asked me whether it is possible to link visits and paths across these devices so they can see cross-device paths and other behaviors.

Continue reading this article ... ... more from Adam Greco

The 80/20 Rule for Analytics Teams
Eric T. Peterson, Senior Partner

I had the pleasure last week of visiting with one of Web Analytics Demystified's longest-standing and, at least from a digital analytical perspective, most successful clients. The team has grown tremendously over the years in terms of size and, more importantly, stature within the broader multi-channel business and has become one of the most productive and mature digital analytics groups that I personally am aware of across the industry.

Continue reading this article ... ... more from Eric T. Peterson

Ten Things You Should ALWAYS Do (or Not Do) in Excel
Tim Wilson, Partner

Last week I was surprised by the Twitter conversation that a fairly innocuous vent-via-Twitter tweet started, with several people noting that they had no idea you could simply turn off the gridlines.

Continue reading this article ... ... more from Tim Wilson

Omni Man (and Team Demystified) Needs You!
Adam Greco, Senior Partner

As someone in the web analytics field, you probably hear how lucky you are due to the fact that there are always web analytics jobs available. When the rest of the country is looking for work and you get daily calls from recruiters, it isn't a bad position to be in! At Web Analytics Demystified, we have more than doubled in the past year and still cannot keep up with the demand, so I am reaching out to you ...

Continue reading this article ... ... more from Adam Greco

A Useful Framework for Social Media "Engagements"
Tim Wilson, Partner

Whether you have a single toe dipped in the waters of social media analytics or are fully submerged and drowning, you've almost certainly grappled with "engagement." This post isn't going to answer the question "Is engagement ROI?" ...

Continue reading this article ... ... more from Tim Wilson

It's not about "Big Data", it's about the "RIGHT data"
Michele Kiss, Partner

Unless you've been living under a rock, you have heard (and perhaps grown tired) of the buzzword "big data." But in attempts to chase the "next shiny thing", companies may focus too much on "big data" rather than the "right data."

Continue reading this article ... ... more from Michele Kiss

Contact Us

You can contact Web Analytics Demystified day or night via email or by reaching out to one of our Partners directly.

» Contact Information

Web Analytics Demystified, Inc.
P.O. Box 13303
Portland, OR 97213
(503) 282-2601

Useful Links