Days in the Life of an Analytics Implementation



The honeymoon was terrific!
All the hard work you put in prior to the launch of your analytics system started to pay off much sooner than expected. For weeks before the launch, you diligently collected business requirements, turned them into specifications, tagged your site(s), QA'd the data, and even found time for documentation. It wasn't easy, but you pulled it off! Your organization now has the keys to improved reporting capabilities. Data is at the fingertips of business stakeholders, optimization opportunities abound, and executives are giddy with excitement.

Fast-forward a couple of months and the beauty of your analytics solution is now old news. By and large, the excitement behind all the newly gained capabilities has given way to the ho-hum of business as usual, and there is now an expectation that [insert analytics platform of choice] should be able to answer most reasonable data questions. You are now in the phase where the platform serves its intended purpose: its data is used to critically evaluate your digital properties and inform business decisions. As your digital properties evolve, your analytics solution (hopefully) evolves with them through a standardized, iterative process of requirements discovery, specification writing, tagging, and report delivery.

This was the intended use of your analytics solution all along, so perhaps congratulations are in order? 

Don’t break out the bubbly just yet… If anything, it is likely that your implementation has started down the path of its inevitable decay. 

Twelve to twenty-four months is all it takes for an implementation to become sub-par, says Tim Wilson of Analytics Demystified. We spoke with him recently about the warning signs and the forces behind this decay. It comes down, he told us, to five main factors:

“An Adobe expert who stays on top of Adobe's updates can drop into any implementation that is more than a year or two old and immediately see things that could be done better or differently, just because of the advancements of the platform,” says Wilson. 

Using Adobe Analytics as an example, some of the more recent platform changes that may call for a completely re-architected implementation include virtual report suites (which might eliminate the need for multi-suite tagging, as sketched below), the larger allotment of eVars, events, and props, and more flexibility in the cardinality limits of various dimensions. Taking a longer-term view of the evolution of Adobe Analytics, similar considerations apply to a number of other features (Marketing Channels, Processing Rules, enabling Ad Hoc Analysis/Discover for clients, not to mention changes as large-scale as the introduction of SiteCatalyst version 15).
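To make the multi-suite point concrete, below is a minimal sketch of legacy-style Adobe AppMeasurement tagging in which every hit is sent to both a global and a regional report suite; the report suite IDs and variable assignments are hypothetical. A virtual report suite defined on the single global suite can often provide the same regional view without the second ID and its extra server calls.

```typescript
// Legacy multi-suite tagging sketch (hypothetical report suite IDs and variables).
// s_gi() is provided by AppMeasurement.js; passing two comma-separated report
// suite IDs sends each beacon to both suites. A virtual report suite scoped to
// the regional traffic could make the second ID unnecessary.
declare function s_gi(reportSuiteIds: string): any;

const s = s_gi("companyglobalprod,companyemeaprod"); // hypothetical suite IDs
s.pageName = "home page";
s.eVar1 = "logged in"; // example conversion variable
s.events = "event1";   // example success event
s.t();                 // send the page view beacon to both suites
```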

Wilson recounts that Google Analytics implementations face the same pressure with the shift to Universal Analytics, which introduced features such as Custom Dimensions and Metrics, Enhanced Ecommerce, Roll-Up Reporting, and User ID for improved visitor stitching.
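As a rough illustration of two of those features, the sketch below sets the User ID field and a custom dimension through the analytics.js command queue; the property ID, user identifier, and dimension index are placeholders rather than values from any real implementation.

```typescript
// Universal Analytics sketch (placeholder property ID, user ID, and dimension index).
// The ga() command queue is provided by analytics.js once the standard snippet has loaded.
declare function ga(...args: unknown[]): void;

// Create a tracker with the User ID field set, enabling cross-device visitor stitching.
ga("create", "UA-XXXXX-Y", "auto", { userId: "crm-user-123" });

// Populate a custom dimension (index 1 assumed) before sending the hit.
ga("set", "dimension1", "loyalty-member");

ga("send", "pageview");
```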

Your company's business model was revamped. Perhaps it was acquired, or it merged with another entity. Wholesale changes in the structure of your organization or your digital properties will likely necessitate a refactoring of your analytics implementation. In such cases, re-implementation of the tracking platform usually follows a redefinition of KPIs that better reflect the new business or digital property. You are starting from square one, renegotiating the business questions that drive how your analytics solution is implemented.

The more robust and sophisticated your implementation, the sooner you are likely to need to refactor it. Especially if documentation is sparse, a well thought-out and well-architected implementation may be fully understood by the person who built it but lost on others who are less savvy. And with the average tenure of a digital analytics professional at a company lasting about two years, chances are that the really smart individual who architected your implementation will not be available several months down the road. So if a particular implementation technique is not fully understood, it can easily be forgotten, or the tags that enable it may get overwritten (because nobody understands why they were there in the first place). As a result, the platform's ability to answer business questions diminishes. Soon, the need to answer the same or similar business questions may arise again, but by then the solution may have evolved to a point where such questions are answered through a more standardized method.

The fragility of tagging implementations, along with factors such as insufficient training, misunderstanding or misinterpretation of the data, or setting the wrong expectations about the platform's capabilities, can lead to dissatisfaction with the clickstream solution. In such cases, it is not uncommon for the platform in question to take the blame and be swapped out for another vendor.

As with data quality, the quality of the partnership with the vendor is instrumental in ensuring long-term satisfaction with the platform of choice. A correct understanding of the customer's requirements, consulting support during the initial implementation, ongoing training, and the quality and availability of the vendor's support are all factors in this equation. The less support the customer receives, the more likely they are to deviate from implementation best practices, eventually making a reimplementation necessary.


What are some of the signs that your implementation is in need of a refresher?

It is not so much about the implementation of one or two particular features or technologies, says Wilson. The lack of a tag management system or a data layer, for example, may be something worth flagging, but in judging whether an implementation is due for a refactoring, the key consideration is whether the solution can readily answer clearly defined, meaningful questions. Are the data points for such questions not implemented at all, or implemented in a sub-par way? An overview of the digital property can yield a list of features that should be tracked and are usually considered best practices. Such functional reviews should be performed hand in hand with understanding the pain points analysts encounter when working in the solution, and in parallel with an audit of the tool itself to verify whether site/app features are actually tracked.
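One small, automatable piece of such an audit is simply confirming that key pages still load an analytics library at all. The sketch below uses hypothetical URLs and a crude pattern match; a real audit would also inspect the data layer and the beacons actually fired, for example with a headless browser.

```typescript
// Crude tagging-presence check (hypothetical URLs; the pattern matches common
// analytics libraries in the page source). Runs on Node 18+, where fetch is built in.
const pagesToAudit = [
  "https://www.example.com/",
  "https://www.example.com/products",
  "https://www.example.com/checkout",
];
const tagPattern = /AppMeasurement|analytics\.js|gtag\/js/i;

async function auditTagPresence(): Promise<void> {
  for (const url of pagesToAudit) {
    const html = await (await fetch(url)).text();
    const tagged = tagPattern.test(html);
    console.log(`${url} -> ${tagged ? "analytics tag found" : "MISSING analytics tag"}`);
  }
}

auditTagPresence().catch((err) => console.error("Audit failed:", err));
```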

In summary

One of the important realizations about digital analytics is that there is no finish line. Work can be packaged into stand-alone projects with a clearly defined start and finish, but the nature and promise of analytics is that it is an iterative process aimed at the continuous improvement of your digital properties. It is only logical that this continuous improvement should also be applied to the analytics solution itself. There are always incremental improvements that can be made to an implementation. Ensuring consistent settings, updating documentation, frequent tagging audits, curation of data, and ongoing training are just some of the daily activities that go into the upkeep of a healthy analytics implementation. However, there are also times when larger-scale changes are in order.
