

Nick Arnett challenges my visitor engagement calculation

Published by Eric T. Peterson on October 25, 2007

Nick Arnett from MCC Media (and one of the creators of BuzzMetrics) posted a very well thought-out and moderately critical assessment of the visitor engagement calculation I wrote about earlier this week. Nick makes some great points and I thought it was worth addressing them while I prepare the follow-up post that shows off some of what the metric can do. My comments are preceded by ETP and Nick’s statements are in italics.

Definitely thought-provoking, Eric… I’m deep into this issue, although focused specifically on community sites. Overall, your approach doesn’t work for me on two main counts — it is too complicated (and thus unlikely to become any sort of standard) and doesn’t generate a metric that allows different sites to be compared. The latter is arguable, since standardized weightings could yield comparable numbers, but I think that’s excess complication also.

ETP: I’m sorry the calculation doesn’t work for you but I do appreciate your thoughts on the subject. Regarding it being too complicated, compared to what? Compared to “simple” metrics like bounce rate and average page views per session? Or compared to the technology you built to power Buzzmetrics? I guess I separate the complexity of making the calculation from one’s ability to actually explain the calculation.

ETP: Regarding using this metric to compare different sites … as I mentioned in the post, I don’t think there is “one” measure of visitor engagement, and thus trying to compare sites is probably a futile effort at best. I suppose you could remove the Brand, Feedback, Subscription and Interaction indices and come up with a standard set of threshold values for specific vertical markets, but I’m not sure that is really the best use of this calculation.
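For concreteness, here is a minimal sketch of how the component indices might roll up into a single score, assuming each index has already been normalized to a 0–1 range. The index names and the equal weighting are illustrative placeholders, not the published formula:

```python
# Hypothetical sketch: rolling per-visitor component indices up into one
# engagement score. Names and equal weighting are illustrative assumptions.

def engagement_score(indices: dict[str, float]) -> float:
    """Average a set of per-visitor indices, each normalized to 0..1."""
    return sum(indices.values()) / len(indices)

visitor = {
    "click_depth": 0.6,   # share of sessions exceeding a page-view threshold
    "duration": 0.4,      # share of sessions exceeding a time threshold
    "recency": 1.0,       # visited within the expected return window
    "brand": 0.5,         # branded-search or direct arrivals
    "feedback": 0.0,      # left a comment, rating, etc.
    "interaction": 0.5,   # completed a defined event
    "subscription": 1.0,  # subscribed to the feed/newsletter
}

print(f"Engagement: {engagement_score(visitor):.2f}")  # Engagement: 0.57
```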

Is there any ground truth behind this? In case that isn’t clear, do you have any sort of primary market data for engagement that correlates with the output of your engagement metric?

ETP: Hmmm, here I’m not sure what you mean. What kind of primary market data is actually able to identify “engaged” visitors? Because I am able to see individuals interacting with my web site, I did talk to a handful of people based on their engagement scores when I was doing the original work on this metric, and some of their feedback was critical to tweaking the metric and inputs to its current state. But other than that I’d love to see the primary data you’re talking about if you’re able to share it!

As I’ve dug into the issues and our data (about five dozen communities ranging from very large to very small), I keep coming back to two main indicators of engagement — return rates and proactive behavior. If visitors don’t visit regularly and do something other than passive page viewing, I have a tough time including them in any measurement of community engagement.

ETP: Exactly why the Recency Index and Interaction Index are included in the calculation, but I disagree with your assessment that these are the only measures of engagement. I’m not sure exactly how I would determine that someone was only “passively” viewing pages, and again this metric is not designed to be a measure of “community engagement” but rather visitor engagement more broadly considered.
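As a hedged illustration of the recency component, here is one way a Recency Index might be computed; the 30-day window and the linear decay are arbitrary assumptions a site would tune for itself:

```python
from datetime import date

def recency_index(last_visit: date, today: date, window_days: int = 30) -> float:
    """Map days-since-last-visit onto 0..1: full credit inside the window,
    decaying credit beyond it. Half credit at 2x the window, zero at 3x."""
    days = (today - last_visit).days
    if days <= window_days:
        return 1.0
    return max(0.0, 1.0 - (days - window_days) / (2 * window_days))

print(recency_index(date(2007, 10, 1), date(2007, 10, 25)))   # 1.0
print(recency_index(date(2007, 9, 10), date(2007, 10, 25)))   # 0.75
```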

Some point-by-point thoughts…

Click-depth index — this is a place where ground truth really matters, I think. I’m not comfortable with the assumption that more clicks per session means greater engagement. Do we know enough about browsing behavior to know that this is true? And of course there’s the old problem of bad design resulting in more clicks… but when I consider that issue, I tend to think that if people show willingness to click through a bad design, maybe that means they really are engaged! Perhaps we should all include some known bad design… ;-)

ETP: I haven’t seen anything that says more clicks means less engagement, but I agree that confused people might generate more clicks. I think it’s unlikely, though, that confused and frustrated people would return, complete defined events, subscribe to blogs, etc., so despite your assertion that the metric is complex, the multiple inputs are designed to offset any single input that might be misleading.

ETP: You do, however, make an excellent argument for not using something as simple as “click-depth” or “average depth of visit” as your sole measure of engagement.

I have pretty much the same questions about duration. Is there good, objective evidence that session duration correlates to engagement? There are visitors with long-duration visits who don’t visit regularly and don’t do anything proactive… I can’t see including them in any measurement of engagement.

ETP: It sorta depends on your definition of engagement, doesn’t it? But see my comment above about why a single measure like duration (as in Nielsen’s Time Spent ranking system) is perhaps inappropriate on its own to determine engagement.
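For concreteness, both of these session-level inputs can be expressed as threshold indices — the share of a visitor’s sessions clearing a site-specific bar. Here is a minimal sketch, where the five-page and five-minute thresholds are placeholder assumptions rather than recommended values:

```python
# Illustrative only: the 5-page and 300-second thresholds are placeholder
# assumptions a site would tune for itself.

def threshold_index(values: list[float], threshold: float) -> float:
    """Share of sessions at or above a threshold, as a 0..1 index."""
    if not values:
        return 0.0
    return sum(v >= threshold for v in values) / len(values)

page_views_per_session = [2, 7, 12, 3, 9]
seconds_per_session = [45, 610, 980, 120, 415]

click_depth = threshold_index(page_views_per_session, 5)   # 0.6
duration = threshold_index(seconds_per_session, 300)       # 0.6
print(click_depth, duration)
```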

Recency makes perfect sense to me — the fact that engaged visitors return often is practically a tautology. I would be very skeptical of calling anybody engaged if they aren’t returning regularly.

ETP: What about first-time visitors? Are you saying you can’t be engaged on the first visit to a site? I agree that regular returns are a good indicator of engagement, but in my analysis the metric I’ve defined is able to resolve first-time visitors into several engagement segments, which I personally have found quite useful.

Your Brand Index is a great piece of data, but I don’t believe it works in a metric intended to compare sites. Language is too subtle and ambiguous to infer engagement from search terms. I spent years in the search engine and related markets, which gave me a great appreciation for the fact that what sometimes seems obvious about language isn’t. When people search on brand-related terms, it indicates *reach* to me, not engagement. I’m unwilling to assume anything more than brand awareness. People search on things they dislike, but that doesn’t mean they’re engaged with the subject they’re searching. And my data shows that visitors who show many other indications of engagement actually search *less* often.

ETP: Same comment about this metric not being specifically designed for comparing sites. I know that is the uber-goal for lots of folks in the world, it’s just not necessarily my goal or the best use for my engagement calculation.

ETP: Doesn’t “reach” plus “action” equal engagement? I haven’t spent years in search and related markets, but I struggle to believe that people searching on brands they dislike are not somehow engaged. Again, maybe this is a semantic issue arising from conflicting definitions of engagement.

ETP: Because the calculation is designed to be made over the lifetime of visitor sessions, searching less often is not a problem. I guess I more-or-less expect that the “direct” component of the Brand Index will become more important over time with truly engaged visitors (who wouldn’t be as likely to go back to Google and search on a branded term).
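For illustration, the two Brand Index signals being discussed — branded search terms and direct visits — might be combined along these lines. The term list and the equal treatment of the two signals are assumptions made for the sketch, not part of any published definition:

```python
# Hypothetical sketch of a brand index: the branded-term list and the equal
# treatment of direct visits are illustrative assumptions.

BRANDED_TERMS = {"web analytics demystified", "eric t. peterson"}

def is_branded(session: dict) -> bool:
    """A session counts toward the brand index if it arrived direct
    (non-referred) or via a search on a branded term."""
    if session["referrer"] is None:
        return True
    term = session.get("search_term", "")
    return any(t in term.lower() for t in BRANDED_TERMS)

sessions = [
    {"referrer": None},
    {"referrer": "google.com", "search_term": "web analytics demystified blog"},
    {"referrer": "google.com", "search_term": "bounce rate definition"},
]

brand_index = sum(is_branded(s) for s in sessions) / len(sessions)
print(f"{brand_index:.2f}")  # 0.67
```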

Counting brand-related searches makes sense if we’re measuring *brand* engagement.
Counting direct (non-referred) visits makes sense if we’re measuring *site* engagement.

Counting both in the same metric doesn’t make sense to me. I don’t think we should even be talking here about ways to measure brand engagement… because I believe that’s well beyond the scope of site analytics. It requires massive monitoring systems along the lines of BuzzMetrics. (I’m the primary inventor of one of their systems.)

ETP: I’m not differentiating *brand* and *site* engagement since I’m trying to calculate an operational measure of ongoing *visitor* engagement. Brand is just a component, and the site is the measurement point. I think I understand your desire to differentiate the two given your background with Nielsen but I’m not trying to do the same thing.

One more problem with the Brand Index — people will argue all day long about what terms are appropriate to include… and there’s a strong incentive for site owners to err on the side of too many terms if their success is being measured by this metric. For example, you included “web site measurement hacks” in your list… but that could be a generic term. Is “Web Analytics Wednesday” really your brand? Or is it the WAA’s? I don’t want to argue which it is, just point out the kind of ambiguity that is inevitable.

ETP: Here I agree with you, coming up with a reasonable list is not easy, but web analytics is hard so at some point you have to make some tough decisions. “Web Site Measurement Hacks” is a book title and a branded term but could be a generic phrase. “Web Analytics Wednesday” is a branded Web Analytics Demystified term and has nothing to do with the Web Analytics Association. I don’t think there is that much ambiguity at the site level, at least in my experience.

Your Feedback Index is a specific instance of what I think of as the general principle of tracking proactive behaviors — what you seem to be getting at in your Interaction Index. In communities, visitors have many such opportunities — posting, editing, tagging, voting and so forth. I decided very early in this work to just give people one point for each such proactive action, despite the temptation to weight them (which would violate the need to keep things simple). These are the behaviors that make a community work; sites that aren’t based on user-generated content can exist without them.

ETP: Same comment about this calculation perhaps not being what you’re looking for vis-a-vis communities.
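That said, Nick’s one-point-per-proactive-action scoring is easy to sketch; the action names below are hypothetical stand-ins for the posting, editing, tagging and voting behaviors he lists:

```python
from collections import Counter

# Hypothetical event log of (visitor_id, action) pairs. One point per
# proactive action, unweighted, per Nick's stated preference for simplicity.
PROACTIVE = {"post", "edit", "tag", "vote"}

events = [
    ("v1", "page_view"), ("v1", "post"), ("v1", "vote"),
    ("v2", "page_view"), ("v2", "page_view"),
    ("v3", "post"), ("v3", "post"), ("v3", "tag"),
]

scores = Counter(v for v, action in events if action in PROACTIVE)
print(scores)  # Counter({'v3': 3, 'v1': 2}) -- v2 scores zero (passive only)
```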

Your session focus really got me thinking. Does it make more sense to count the number of sessions in which visitors signal engagement or the number of actual such signals? I think it’s close to a toss-up, but so far, our ground truth suggests the latter — the number of proactive actions correlates better to our subjective estimates of engagement… but among our future tasks is to establish better ground truth. So far, I’m just using our community managers’ collective subjective scoring… but it correlates quite well to all but our smallest communities.

ETP: I agree, it’s probably a toss-up, but if you think about the calculation, all it does is count the number of signals. Long sessions are a signal, deep sessions are a signal, frequent sessions are a signal, etc. I know you don’t like anything but recency and interaction but we can agree to disagree on this point. I’d love to hear about your “ground truthing” efforts and I’ll try to keep you apprised of mine.

The subscriber index doesn’t work for me because we want to be able to compare communities regardless of whether or not visitors are able to subscribe, join, become members or what-have-you. Some of our clients — e.g., a large professional sports organization — allow full participation without any need to sign up. Also, as I’ll explain below, I’ve found a strong negative correlation between highly active visitors and RSS subscribers.

ETP: Again, not designed for comparison (and at this point, no wonder you don’t like my calculation!). I’d love to see the negative correlation data for RSS, and yes, if a site doesn’t offer subscriptions it doesn’t make sense to assign a negative penalty.

Finally, I guess I’ll toss out one of the ideas that I’m working with — segmenting visitors by proactivity.

In several ways, communities (and most web sites, I suspect) have a bimodal distribution of users. There’s typically a relatively large “Core” group that visits often, looks at lots of pages and does a lot of proactive stuff. There’s a middle ground, which I’m calling “Lingerers,” of people who fall into the 10th to 80th percentiles of such activities. Third and last, there’s a large contingent in the 0th percentile, people who might have one or two activities in a given time period, which I call the “Drive-bys.” In our communities, the Drive-bys are the largest group, but the Core usually is a bigger group than the Lingerers. What this says to me is that people tend to engage a lot or hardly at all — there isn’t much middle ground. I’ve been focusing on the Core’s relationship to the whole community for my engagement measurements. That’s what seems to correlate best to what little ground truth we have.

ETP: I am seeing a more normal distribution, especially as visitors return a third time, but it is definitely skewed towards lower levels of engagement. I’ll try to highlight this when I show some data demonstrating the calculation in action. And since I’m not working on a community proper, I’ve found myself focusing on my middle group (“Moderately Engaged”) and trying to determine what I might be able to do to shift them up to “Highly Engaged”.
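For readers who want to experiment with Nick’s segmentation, a rough sketch of the Core / Lingerer / Drive-by split might look like the following. The cut points (two or fewer actions for Drive-bys, the 80th percentile for Core) approximate his description and are assumptions, not his actual method:

```python
# Sketch of the Core / Lingerer / Drive-by split. Cut points are
# approximations that would need tuning against real data.

def segment(activity_counts: dict[str, int]) -> dict[str, str]:
    counts = sorted(activity_counts.values())
    # 80th-percentile cut, nearest-rank method.
    core_cut = counts[max(0, int(0.8 * len(counts)) - 1)]
    labels = {}
    for visitor, n in activity_counts.items():
        if n <= 2:
            labels[visitor] = "Drive-by"
        elif n >= core_cut:
            labels[visitor] = "Core"
        else:
            labels[visitor] = "Lingerer"
    return labels

activity = {"a": 1, "b": 2, "c": 5, "d": 9, "e": 40, "f": 1, "g": 55, "h": 2}
print(segment(activity))  # four Drive-bys, one Lingerer, three Core
```

Note how even this toy data reproduces the bimodal shape Nick describes: the Core outnumbers the Lingerers.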

Overall, I’ve found that the Drive-bys and Lingerers exhibit fairly similar behavior, but the Core is different. The Core visitors post more, search less and use RSS far less (so much for “subscribing” to RSS as a positive indicator of engagement!)

ETP: Your assessment of RSS being a poor indicator of engagement runs contrary to popular opinion (why would you subscribe to an RSS feed or email newsletter if you weren’t engaged??!) Perhaps this result is uncovering a flaw in your engagement calculation?

This post is getting long… so I’ll wrap it up (but ready to discuss further, of course) by repeating myself. I think any sort of engagement metric has to be backed up by demonstrating correlation to some kind of ground truth. Otherwise, it’s a mental exercise that runs the risk of having little relevance to the marketplace.

ETP: You keep coming back to the notion of “ground truth” but surely you recognize that this is A) extraordinarily difficult to come by and B) if we had it easily available we wouldn’t need a measure of engagement. I would love to see your “ground truth” data and talk about how you’re generating that, but unless I’m missing something it sounds a little impractical for widespread use. Still, I appreciate your feedback and very thoughtful comments and will endeavor to demonstrate the correlation between my calculation and “truly engaged” visitors.
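For what that demonstration might look like mechanically, here is a toy sketch: compute the Pearson correlation between calculated engagement scores and an independent signal of engagement (survey ratings, purchases, or the community-manager scores Nick mentions). All numbers below are invented:

```python
# Toy illustration of the correlation test: do engagement scores track a
# downstream outcome (e.g., purchases)? All numbers are made up.

def pearson(xs: list[float], ys: list[float]) -> float:
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

engagement = [0.1, 0.2, 0.35, 0.5, 0.7, 0.9]
purchases = [0, 0, 1, 1, 2, 3]

print(f"r = {pearson(engagement, purchases):.2f}")  # r = 0.98
```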


Man, talk about a long post! What do you think? Is Nick more right than wrong? Are you focusing on communities and have the same concerns? Do you have similar concerns about your site? The conversation is almost as interesting as the metric and resulting analysis in my opinion so please, comment away!


Categorized under Engagement, Web Analytics 2.0, Web Analytics People

  • http://blog.instantcognition.com/ Clint

    Interesting discussion.
    I guess I would say that Mr. Arnett comes from a background of providing comparative data (Nielsen & Buzzmetrics) so a KPI that is truly site/business centric just doesn’t make sense to him.

    Ground Truth appears to be a geology term referring to empirical review. So maybe he is talking about primary research (field-testing, usability, etc).

    Interesting that he equates web analytics data to secondary research…

The test of the engagement index is its ability to predict some desired behavior (moving visitors up the engagement chain by x% increases their propensity to buy by y%).

    Based on some comments you’ve made in your posts on the subject I imagine that you are busy trying to prove out at least the correlation aspects of engagement in a statistically meaningful way so we’ll just have to wait and see…

  • http://www.secondtree.com/data Tim Wilson

You both make good points, but I’m going to have to come down on your side, Eric, as being “more right.” What I like is that the formula is really more of a framework — it gives the components, but then leaves it up to the specific analyst/site to determine how to apply them.

    While Nick points out this means “no standardization,” this is absolutely the sort of metric that has one and only one golden standard: the site being measured. The type of site — its audience and its goals — are really going to dictate what massaging of the formula makes the most sense. Then, if an initiative is under way to increase engagement, a qualitative assessment of “how much do we expect this initiative to increase it?” can be asked at the outset, and the engagement score can be used to quantify whether that sort of impact is achieved.

    A final thought: “Complicated” and “Multidimensional” are not the same thing. In this case…your proposal is both. BUT, that also means that, to really impact the metric, you have to take well thought-out actions that may, themselves, be difficult. I’d much rather have that than a simplistic metric that can be improved semi-artificially.

    I agree with Nick that the engagement score might not be something that gets broadly shared across an entire organization. But, on the other hand, it might. 4 or 5 years ago, I worked with a lady who developed an internal “Google Index” that looked at natural search rankings, referrals, and page rank data to develop a single score for how well the site at the company was “doing” SEO-wise. Under the hood, she was looking at each of the components. But, to all of the people she was relying on for content support, technology support, and design/development support, this index was something they could rally around. And…they bought into it. It was a very effective way to drive action. And, like your engagement formula, it was not easily manipulated — so, when things flatlined or dipped, there wasn’t any hiding of the issue.

  • http://www.webanalyticsdemystified.com eric

    Clint: Yeah, I was confused by the use of ground truth and his relegation of web analytics to secondary research but Nick’s work history is probably the reason. I like his perspective.

I want to challenge your assertion that “the test of the engagement index is its ability to predict some desired behavior” … what metrics in web analytics are predictive, in your opinion? Is conversion rate predictive, or merely suggestive? I think that customer satisfaction (per my thread with Larry Freed) can be predictive, but the ACSI is a more complex input than my engagement index by a long shot.

    I’d love to hear your thoughts on this since I’ve always had concerns about the use of the term “predictive” in web analytics.

  • http://www.liveworld.com Nick Arnett

    I should clear up some confusion about my background… My current job is Director of Business Intelligence Services at Liveworld Inc. We host and manage communities for eBay, HBO, QVC, the NBA and many others. MCC Media is my own company that started life as Multimedia Computing Corp. back in 1988 when I was publishing an industry newsletter and consulting. My connection to BuzzMetrics is that they bought the stuff I invented at Opion, a company I co-founded. I don’t know how much of what they currently use is what they invented or what I did. I reinvented some similar technology to create Senti-Metrics Partners a few years ago, which Liveworld acquired.

As for “ground truth,” in this situation I think it needs to be a survey of site visitors, across multiple sites (to uncover a variety of engagement levels), with questions designed to determine how engaged they are — and a bit more subtle than just “On a scale of one to ten, how engaged are you?”

    I also should say that I don’t object to the existence of metrics that don’t support comparisons… but my present goal is to develop one that does.

  • http://www.webanalyticsdemystified.com eric

    Tim: EXACTLY! The calculation is fully intended to be a framework, not a dictum, since I do think that every site will have slightly different needs regarding their measure of visitor engagement. I know people are looking for some “magic number” that will replace page views to compare sites, but I just don’t think it works that way.

Great insight that actually “using” this metric may be difficult. Have you heard that I think web analytics is hard? I guess I’m just not sold on simple metrics like bounce rate being “immediately useful” (to quote another consultant). Making web analytics work often requires the well thought-out actions that you cite, and “not easily manipulated” is the key.

I hope to use the next series of posts on the subject to demonstrate that using this metric is not nearly as complicated as you and Nick believe. But, I admit that I haven’t given folks much to work with (outside of the presentation I gave in D.C. and the slides in my Web Analytics 2.0 presentation), so hopefully I will be able to convince you.

    Thanks a ton for your comment!

  • http://www.webanalyticsdemystified.com eric

    Nick: Thanks for clarifying your background! Sounds like you have access to some amazing data on communities to play with, and damn I would LOVE to see my engagement calculation run against the data you have.

    Regarding “ground truth” … thanks for clarifying that. I was going to ask Larry Freed at ForeSee Results if he’d help me collect some of that data (Larry’s a great guy!) What, in your opinion, would be the ** best ** questions to ask people to gauge their level of engagement? I agree, asking “are you engaged?” is probably not the best since few people have a good working definition of that term.

    Given your day job and background I don’t doubt that a comparative metric would be helpful. What are you using today to compare engagement across the communities you manage?

    Thanks again for challenging my thoughts and engaging in the conversation!

  • http://www.liveworld.com Nick Arnett

(It would be great if you could edit the top of this page so that it doesn’t say I’m a creator of BuzzMetrics… that gives the wrong impression.)

I haven’t started thinking too hard about the right survey questions. Avinash pushed me on this issue a couple of weeks ago and I’m thinking we might hire somebody to develop the survey, partly for the credibility that could bring.

    My current ground truth is a rating from 1 to 10, created by our community managers for about 20 of our English-language communities.

    As for what I’m using now… that’s what I’m working on — we don’t have an engagement metric yet.

    One more thought for now — I think you need to clarify what you’re measuring — site engagement and brand engagement are different. Site engagement implies some degree of brand engagement, but the brand is always bigger (unless the brand owner is totally unsuccessful!).

    And to be fair, I will add that I’d probably be okay with including click depth and duration IF ground truth shows that they really do imply engagement. However, my preference for simple over complex is not due to my own laziness, it’s the battle for industry adoption. Industry laziness, let’s say.

  • http://blog.instantcognition.com/ Clint

I’m far too lazy to go back and re-read all your posts on the subject. But if I remember correctly, one of the things you were looking at is engagement’s correlation to propensity to buy a book. Presumably now you’d be looking at other actions too – like propensity to hire you or buy a job listing, etc.

Knowing where you came from is great but only if it helps you to get where you’re going.

All science strives to be predictive. A good theory is based on real-world observations of the past that make testable predictions about something – in this case, future action. A theory that stands the test of time is one whose predictions are borne out in experimentation or observation.

    Is conversion rate predictive? Yes (IMHO) – in a small way. I know that if I increase the conversion rate by some percentage that revenue will increase by a related amount.

    What I’m looking to see as web analytics matures is growth in its ability to be predictive. Isn’t that what all the testing is for? Shouldn’t we be using all the test data we have to build predictive models (theories) of how the user-website interaction works?

  • http://www.webanalyticsdemystified.com eric

    Clint: I actually revisited some of my original thinking around the relationship between engagement and conversion. Yes, engagement ideally leads to a conversion event, but in some ways this metric is designed to help site operators better understand their audience even if conversion is not likely to happen online (or at all … what is the conversion event at Facebook?)

Good points about prediction, but I guess I’m not sure that we’re there yet. I think that most people are using metrics and testing today not to be predictive but rather as proof points. The emergence of multivariate as a “must do” activity (remember the RAMP!) supports optimization but not necessarily understanding, at least not always.

    And yeah, as the sector matures, the technology’s ability to be predictive should improve … but we don’t even have the basic stats baked into the applications today. We’ve come a long way, and still have a long way to go …

    Thanks again for your comments and I hope the Omniture acquisition treats you well.

  • http://betterretail.wordpress.com/ Rishi

This entire series has been very exciting and I hope you keep at it. I think both Nick and you make some exceptional points. However, on the subject of ‘Recency’ playing a role in the engagement calculation, I disagree with both (more with Nick because he believes it completely, and you because you kinda agree with him). I come from the world of multichannel retail and believe engagement should be measured between the time an eCommerce site is found and when the person who found it makes a purchase. There is no need to extend engagement to future sessions. I tend to buy car tires once every 3 years but could still have a very engaged one-time experience at discounttire.com.

The good news is that this is bleeding edge stuff and while we might not agree on nuances, the fact of the matter is online metrics are making every other medium obsolete.

  • http://www.webanalyticsdemystified.com eric

    Rishi: Thanks for your enthusiasm! I disagree that engagement stops when someone makes a purchase in a multi-channel retail environment and I think you’re talking about acquisition, not engagement.

    Imagine that you’re Best Buy or Sears … a well-engaged customer is someone who will not only make the initial purchase, but return again and again to your online and off-line properties, maybe not even making a purchase some of the time, but using your site/store as a resource/place to dream about future purchases/etc. This is what my engagement metric is designed to capture — the time someone is spending on a web site ** when they’re not making a purchase **

    Or, using your car example, I agree that you could have a very engaged one time experience at discounttire.com, but don’t you think the site owners there are looking for ways to stay “top of mind” so that customers remain engaged enough to make the next purchase there, or to recommend discounttire to a friend or loved one? I think their “Email Specials” (http://www.dtcspecials.com/) application is designed entirely to keep engagement levels high, despite long purchase cycles.

    Agreed: This is bleeding edge stuff (although again I don’t think that online metrics are making the rest of the world obsolete, but we can agree to disagree!)

  • http://www.liveworld.com Nick Arnett

    Initial purchases v. repeat… One of our clients is a car manufacturer. How often do people buy cars?! Yet they have one of our highly engaged communities.

Brands live in communities (not just online communities) and this car company knows that it is very important to try to keep their customers engaged, even though they surely won’t make another (major) purchase for years. They’re building loyalty, but in the short run, they’re also nurturing a group that will attract other customers with their enthusiasm.


  • http://betterretail.wordpress.com/ Rishi

    Hi Nick and Eric:

(Eric) are you saying you can measure engagement for repeat purchases irrespective of product type (or price-point)? Are you suggesting the purchase of a 20 dollar Web Cam on bestbuy.com should be blended with a plasma TV purchase when calculating engagement? And does (should) bestbuy.com have one number for engagement?

I believe engagement measurement (for a retailer) should start from the moment someone enters a site to make a specific purchase to the moment the purchase has been made, and should not be applied across SKUs. But engagement measurement SHOULD include interaction with any touchpoint that plays a role in the eventual purchase. So if a prospective buyer repeatedly visits the highly engaged communities Nick was talking about, that should definitely be factored in when making engagement calculations for that car model.

I believe, for a retailer like JoAnn.com, root-level engagement measurement is misleading.

 

