
Key Issues in Web Analytics Implementation and Rollout

I spent most of the last two weeks on the road – teaching down in San Diego at the WebSideStory DMU and manning a booth at the Omniture Summit. Going out and talking to so many people is always interesting (if a bit daunting for an essentially shy guy) – and if you take out the travel parts it was all pretty enjoyable.

As I think back on the conversations, there were a couple of themes that seemed to come up quite a bit. One thing I heard over and over was how many companies struggle in the tagging and implementation phase of web analytics. What I heard convinced Paul Legutko (our East Coast VP of Analytics) and me that we should develop more formal implementation checklists for both Omniture and WebSideStory rollouts. That’s something we’re going to be working on, but it also reinforced the direction for my next blog.

Last time, I put forth some pretty tentative (at least for me) views on placing a web measurement department in an organization. Today, I wanted to discuss some of the major problems and mistakes I often see when companies roll out web analytics.

1. The Plain-Vanilla Tag

Tool vendors often bring this problem on themselves and their clients by overselling the ease of putting a tag on a page. Yes, you can have measurement in an hour. Will it meet your real needs? Probably not. I see lots of companies commit to the plain-vanilla tag knowing that they will have to come back and fix it but wanting to get a deployment out as quickly as possible. Usually, I think that’s a mistake. The pressure to release numbers is always overwhelming – and whatever gets rolled out is immediately in play. That means the organization starts to use and react to the numbers – almost always before they’ve been adequately tested.
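
To make the contrast concrete, here’s a rough sketch of plain-vanilla versus enriched tagging. None of this is any vendor’s actual API: trackPageView and the variable names are invented for illustration. The point is that every extra variable should exist because a real business question demands it.

```typescript
// Hypothetical tag interface; not any vendor's real API.
interface PageViewData {
  pageName: string;
  section?: string;     // site area, for roll-up reporting
  visitorType?: string; // e.g. "prospect" or "customer"
}

function trackPageView(data: PageViewData): void {
  // A real tag would build an image request to the vendor's collection
  // servers; logging the payload stands in for that here.
  console.log("beacon:", JSON.stringify(data));
}

// Plain-vanilla: counts the view, answers almost no real questions.
trackPageView({ pageName: "Checkout: Shipping" });

// Enriched: each added variable is there because someone asked a
// business question that requires it.
trackPageView({
  pageName: "Checkout: Shipping",
  section: "Checkout",
  visitorType: "customer",
});
```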

2. The Tag as Software-Development Project

There aren’t two sides to every web measurement coin – but it’s surprisingly easy to either under- or over-do your tagging effort. At the opposite end of the Plain-Vanilla tag spectrum is the tendency to treat the tag as if it must be a fully-engineered software development project. It’s this tendency that sometimes causes business managers to just throw their hands in the air and scream – ‘Let’s just roll the damn thing out!’ A tag is simply not as complicated as even a very basic software development effort. It has no GUI, the number of options is paltry, and the amount of code is about 1/1000 that of even the smallest software projects. IT organizations that haven’t ever implemented tags and don’t really understand the technology often give Business Units wildly inflated estimates of the time and effort involved. If you’re seeing big-ticket numbers around tagging, your best solution is to work with your vendor to train and hand-hold IT (we do this too – but for this particular service the vendor will be just as good). A little bit of training will almost always bring on the aha moment where the IT guy says – “Is that all there is to this?”
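
For a sense of scale, here’s roughly the entire footprint of a typical tag deployment, again sketched with invented names rather than any vendor’s real code. A shared library lives in one place, and each page contributes a handful of variable assignments:

```typescript
// Sketch of how small a tag deployment really is. All names are invented;
// real vendors differ in the details but not in the scale.

// Shared library, maintained once for the whole site:
const config = { account: "mycompany-prod", server: "stats.example.com" };

function sendBeacon(vars: Record<string, string>): void {
  // Real tags encode the variables into an image-request URL.
  const query = new URLSearchParams({ account: config.account, ...vars });
  console.log(`https://${config.server}/b/ss?${query.toString()}`);
}

// The entire per-page "development effort" is a few assignments:
sendBeacon({ pageName: "Product: Widget 2000", channel: "Products" });
```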

3. Rolling out Analytics to High-Level Managers

There are several related issues around rollout, training and reporting that cause no end of implementation problems. Many organizations have a strong desire to train everyone in the company who might need information on using the tool. Don’t do it! Most managers – particularly senior ones – will not be effective users of tools like SiteCatalyst and HBX. And when they do use the tool, they are highly likely to have questions and issues that send shock waves through your organization, suck down ridiculous amounts of time, and often enough damage the whole measurement effort. You need to grow usage of the tool in your organization organically – starting with the analysts and managers who absolutely must have the information. You can grow out from there – but cautiously. And with tools today providing excellent integration to Excel, you need never expose many of your managers to a web analytics tool even while driving home the value the data provides.
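
The Excel point is worth a concrete illustration. Here’s a sketch of the idea, with made-up report rows standing in for whatever your tool’s export actually returns: pull the numbers once, hand managers a flat file that Excel opens directly, and nobody needs a tool login.

```typescript
// Sketch of a manager-facing digest. The rows below are invented examples
// standing in for a real tool's export or API output.
interface ReportRow {
  page: string;
  visits: number;
  conversionRate: number;
}

function toCsv(rows: ReportRow[]): string {
  const header = "Page,Visits,Conversion Rate";
  const lines = rows.map(
    (r) => `${r.page},${r.visits},${(r.conversionRate * 100).toFixed(1)}%`
  );
  return [header, ...lines].join("\n");
}

const weekly: ReportRow[] = [
  { page: "Home", visits: 48210, conversionRate: 0.021 },
  { page: "Product: Widget 2000", visits: 9340, conversionRate: 0.054 },
];

// Import into Excel; no analytics-tool login required.
console.log(toCsv(weekly));
```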

4. Confusing Reporting with Analytics

This is a close corollary to #3 and is also a big part of #5 – thinking that analysis doesn’t require analysts. Fast, reliable reporting on the web channel is one of the biggest value-adds of web analytics tools. Managers at every level need this to do their jobs well. But don’t think that just because you give somebody a report it will answer all their questions. Good reports raise more questions than they answer. And no report set will ever substitute for real analysis if you are trying to use measurement to drive site change.

5. Thinking Analysis Doesn’t Require Analysts

Tools in web analytics have improved dramatically in the last few years. But they haven’t gotten good enough to replace analysts, and they never will. Useful analysis is a time-consuming activity (we usually spend 3-6 weeks on an analysis) invariably requiring decisions about how and what data to use, how to interpret the numbers, and how to apply the results to meaningful decisions. If your Managers have four solid weeks to devote to web analytics, then they aren’t Managers, they’re analysts. You pay your Managers to manage – you have to pay analysts to analyze. Avinash famously addressed this with his 90/10 rule (you should spend 90% of your analytics budget on people, not tools). I’ve never thought the rule itself was good guidance, but the underlying point is dead-on. If you don’t dedicate resources to analysis, you won’t get any worth having.

6. Not Tying Change to Measurement

This is a cultural and process issue – but it’s frankly staggering how many organizations with perfectly good measurement virtually ignore it when deciding what and how to change their site. Hey – this is what measurement is for! If you find your company making changes that aren’t measurement-driven, then you really need to assess whether your measurement is what it should be. And if the problem isn’t there, then you need to think about how your measurement people relate to everyone else. It is in this arena, by the way, that I see particular value in our Functional approach to measurement. It’s a great way to get every stakeholder in an organization understanding how measurement fits in with what they are trying to do.

7. Not Pre-Committing to Measurements

Here’s one of my least favorite tasks in the world – a client rolls out a site change and then asks us to show that it worked well. We always do, of course. But that doesn’t make me think that every site change we’ve ever measured was positive. The simple fact about measurement is that if you can look for anything as evidence of success, you’ll always be able to find something. By forcing everyone to pre-commit (before a change) to the expected measurement test and direction, you can put a lid on this sort of nonsense. If I change a page to improve its routing performance, then its routing performance had darn well better improve. And the fact that its page time increased isn’t going to convince me that the change was effective if that wasn’t what I was trying to achieve.
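
One way to enforce pre-commitment is almost embarrassingly simple: write the commitment down in a form you can check after the fact. A sketch, with invented names and numbers:

```typescript
// Sketch of a pre-commitment record: written down before the change ships,
// evaluated against afterward. All names and figures are hypothetical.
interface Commitment {
  change: string;
  metric: string;                      // the ONE measure named in advance
  direction: "increase" | "decrease";
  baseline: number;                    // value before the change
}

function evaluate(c: Commitment, observed: number): string {
  const moved = c.direction === "increase"
    ? observed > c.baseline
    : observed < c.baseline;
  return moved
    ? `SUCCESS: ${c.metric} moved as committed (${c.baseline} to ${observed})`
    : `FAILURE: ${c.metric} did not improve; no hunting for other "wins"`;
}

// Committed before launch: the redesign is supposed to improve routing.
const redesign: Commitment = {
  change: "Landing page redesign",
  metric: "routing rate to product pages",
  direction: "increase",
  baseline: 0.31,
};

console.log(evaluate(redesign, 0.28)); // page time going up doesn't count
```

The discipline is in the record, not the code; a shared spreadsheet filled out before launch would serve just as well.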

8. Not Putting a Method around Measurement

Most of us who practice web analytics have come to one not-so-great conclusion. Web Analytics is hard. Harder than we all thought when we got started. Harder than you probably think if you haven’t actually tried to do it. As someone who comes from a background in credit card database marketing, I definitely believe that it is more challenging to squeeze behavioral insights from web data than from the incredibly rich vein of information in card usage and purchase data. Not that credit card database marketing wasn’t pretty challenging too. Doing any analytics well takes a considerable amount of skill, effort and organizational attention. So if you expect to get much out of your analytic effort, it’s really important that you put a structure around it that prevents everyone involved from wheel-spinning. What makes for good structure? I think there are (at least) two answers: a good methodology and a strategic road-map. Having a methodology (like Functionalism) that you commit to provides a built-in analytic focus that makes it much easier for an analyst to be productive. It also provides a ready-made way for you to get into the test/measure cycle that is so critical to analytic success.

9. Not Having a Road-Map

Probably even more important than a good method for getting where you want to go with web analytics is having a clear analytic road-map. I think the biggest challenge for most organizations comes after the honeymoon (post-implementation) – when everyone has gotten over the joy of just “having data” and actually wants to do something with it. How do you address this dangerous crossroads? I think the best way is to commit your organization to a specific road-map of measurement projects. You’re going to change these as you go forward, but if you start with an analytic road-map that takes you through the kinds of analysis you want to achieve in the next year, then you’ll never have that horrible awkward stretch where everyone looks around and says “What now?” Since most organizations are also struggling to build measurement into their culture, the road-map is a great way to generate buy-in and push the whole organization toward that test/measure cycle I mentioned earlier.
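
What does a road-map actually look like? It can be as simple as a dated queue of analysis projects, each tied to the decision it is supposed to inform. The entries below are invented examples, not a recommended plan:

```typescript
// Sketch of an analytic road-map: a committed queue of measurement
// projects for the year. The entries are invented examples.
interface RoadmapItem {
  quarter: "Q1" | "Q2" | "Q3" | "Q4";
  project: string;
  decisionItInforms: string; // every project should name the decision it feeds
}

const roadmap: RoadmapItem[] = [
  { quarter: "Q1", project: "Baseline functional analysis of key pages",
    decisionItInforms: "Which templates to redesign first" },
  { quarter: "Q2", project: "Campaign landing-page routing analysis",
    decisionItInforms: "Where to spend incremental media budget" },
  { quarter: "Q3", project: "Internal search effectiveness study",
    decisionItInforms: "Whether to replace the search engine" },
];

// When someone asks "What now?", the answer is already written down.
roadmap.forEach((item) => console.log(`${item.quarter}: ${item.project}`));
```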

10. Believing You Are Doing Well Enough

Out at these events I talked to quite a few Digital Agencies – almost all of whom assured me that they had web measurement well in hand. What do they know that the rest of us – and their clients – don’t? Maybe it’s all self-interest, but I just don’t believe it. What I see when we share clients doesn’t make me think so. And while it’s reasonable to expect that the really big Agencies are at least on their way (and trying hard) to having measurement expertise, I’m not buying that most of the smaller and mid-size Agencies have the faintest idea how to do web measurement. This attitude is actually rarer in the corporate world – but I see it there often enough – with companies where the measurement is obviously raw and unused still convinced that they have it covered. I certainly don’t think my company, Semphonic, is doing well enough. And if you are living through the current web analytics environment and you aren’t at least worrying about how to do better, then you just don’t get it.

I’m sure this list is anything but exhaustive – but ten is such a convenient stopping place for a list! I doubt I’ve said enough about any of these issues to provide a lot of practical guidance. But it’s useful to know what land-mines are out there – and I think each of these ten is common and serious enough to deserve real attention if you are in the process of implementing or rolling out a web analytics solution.
