Auditing Time-Based Tracking Events

Time spent on page was never a perfect metric.

When calculating time spent on page, most out-of-the-box analytics implementations still rely on some flavor of this formula (a rough code sketch follows the list):
  1. Take the delta between the current page load and the next page load in the visit/session.
  2. Average those deltas across all sessions in which the page in question was followed by at least one more page.
  3. Exclude sessions that end on this particular page from the average (there is no next page, so there is no delta to consider).
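For illustration, here is a minimal sketch of that calculation in JavaScript. The data shape (one ordered array of { page, timestamp } pageview records per session) and the function name are assumptions made for this example, not part of any particular analytics product:

  // Minimal sketch of the legacy average-time-on-page calculation.
  // `sessions` is assumed to be an array of sessions, each an ordered
  // array of { page, timestamp } pageview records (timestamps in ms).
  function legacyTimeOnPage(sessions, page) {
    const deltas = [];
    for (const pageviews of sessions) {
      // The last pageview of a session has no "next page", so sessions
      // that end on the page in question contribute no delta at all.
      for (let i = 0; i < pageviews.length - 1; i++) {
        if (pageviews[i].page === page) {
          deltas.push(pageviews[i + 1].timestamp - pageviews[i].timestamp);
        }
      }
    }
    if (deltas.length === 0) return 0;
    return deltas.reduce((sum, d) => sum + d, 0) / deltas.length;
  }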
This formula may have been good enough for the past decade. But the way people consume content online has evolved significantly, and a much finer gauge is required to measure the duration of content engagement effectively.

The advent of Single Page Applications (SPAs) and the proliferation of web-optimized video content are dictating the need for a better set of metrics to describe the time visitors spend on a given content item. On highly dynamic, media-rich sites, almost all sessions are soft bounces: visitors load one page, consume content for a while, and then leave. Relying solely on the legacy formula would produce highly inaccurate numbers, yet on such sites the time visitors spend engaging with content is often among the most important metrics.

As a result, many analytics implementations have evolved to an enhanced formula:
  1. When a visitor lands on a page, begins playing a video, or engages with content in some other significant way (often via elements referred to as "calls to action"), the tracking JavaScript is told to start sending out pings at specific time intervals.
  2. Using the information provided by these pings, the analytics framework can measure view times much more accurately.
This has become a common enough solution that tag management systems now provide out-of-the-box timer-based triggers for this purpose. There are also many custom tagging implementations in which timer-based events are fired.
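As a rough illustration of the ping-based approach, here is a minimal heartbeat sketch in plain JavaScript. The /collect endpoint, the event payload, and the 10-second interval are assumptions made for the example rather than any specific vendor's implementation:

  // Hypothetical heartbeat: starts pinging once the visitor engages,
  // stops when the tab is hidden.
  let heartbeat = null;

  function startEngagementPings(intervalMs = 10000) {
    if (heartbeat) return; // avoid double-starting the timer
    heartbeat = setInterval(() => {
      // sendBeacon is more likely than fetch/XHR to survive page unloads
      navigator.sendBeacon('/collect', JSON.stringify({
        event: 'engagement_ping',
        page: location.pathname,
        ts: Date.now()
      }));
    }, intervalMs);
  }

  function stopEngagementPings() {
    clearInterval(heartbeat);
    heartbeat = null;
  }

  // Example triggers: start on video playback, stop while the tab is hidden.
  document.querySelector('video')?.addEventListener('play', () => startEngagementPings());
  document.addEventListener('visibilitychange', () => {
    if (document.hidden) stopEngagementPings();
  });

An auditing tool then has to confirm that these pings fire at the expected intervals once the triggering interaction has taken place.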

Automated audits of timer-based events are as tricky as they are necessary.

Many tag auditing solutions have evolved in turn to QA timer-based beacons, whose manual validation would otherwise be a major drag. But because of the technical challenges involved, the ease of automation varies greatly.

Here's a quick look at how easy it is to automate the detection of timer pixels using QA2L's point-and-click web interface:

It gets even trickier...

The above is actually the simplest test scenario. Often, specific tracking timers are set off by video playback or by scrolling down to a certain portion of the page. And frequently there is no good way to know when an asynchronous event (such as an API request) has completed, so that you can start or stop listening for timer-based events at the right moment.
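As one concrete example of why this is tricky, a scroll-triggered timer might look like the sketch below (the 50% threshold and the reuse of the startEngagementPings helper from the earlier sketch are assumptions for illustration). An automated audit has to reproduce the scroll first, and then wait out the interval, before any ping can be expected to appear:

  // Hypothetical scroll-depth trigger: start the engagement timer only
  // once half of the page has been scrolled into view.
  let timerStarted = false;

  window.addEventListener('scroll', () => {
    const scrolled = window.scrollY + window.innerHeight;
    const threshold = document.documentElement.scrollHeight * 0.5;
    if (!timerStarted && scrolled >= threshold) {
      timerStarted = true;
      startEngagementPings(); // from the heartbeat sketch above
    }
  }, { passive: true });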

Luckily, QA2L excels at automating all of these and many other advanced test cases.

Suggested further reading: User Acceptance Testing Automated!


Request a Demo