Quick tip: track sprint reference

After publishing my previous post, where I recommended tracking the s_code version, I realised it could have been enhanced with another type of version to track. A version number that is even more useful is the sprint reference, or whichever value you use to identify each release of the website. This idea came from one of my customers (thanks Glenn), who is in the process of redeveloping the whole website and analytics implementation. If you have a development team that is continuously adding new features to the website, or you are profoundly redeveloping it, you want to know the ROI of this investment.

By copying the sprint reference into an eVar, you can create reports that show the revenue, or any other important conversion metric, for each release. Of course, this requires an extended period of time in the report and equal periods for each release, and not all periods are directly comparable. However, with this data, you can show the increase in revenue derived from each new release.
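
As an illustration, here is a minimal sketch of how this could be set in an AppMeasurement or DTM custom code block. The eVar number (eVar10), the variable name and the value format are hypothetical; use whatever your solution design reserves for this purpose.

```typescript
// Minimal sketch: copy the current sprint reference into an eVar.
// Assumes AppMeasurement is loaded and exposed as the global `s` object;
// eVar10 and the value format are hypothetical examples.
declare const s: { eVar10?: string };

// Updated by the development team with every release,
// e.g. injected at build time or read from a data layer variable.
const SPRINT_REFERENCE = "sprint-42"; // hypothetical release identifier

s.eVar10 = SPRINT_REFERENCE;
```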

Quick tip: track code version

One of the suggestions I usually made when I worked with a new client was to track the s_code version. Now that we are moving to DTM, we no longer have the concept of the s_code, but we do have the concept of publishing new rules, which is similar to an s_code version. The idea is to keep a value in a prop that is changed every time a new s_code is pushed live or a new set of rules is published. My typical suggestion is to include both a date and a version number in the string.
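
As a sketch of the idea, the value could be set in the s_code or in a DTM custom code block as shown below. The prop number (prop15), the variable name and the version string are hypothetical examples; adapt them to your own implementation.

```typescript
// Minimal sketch: report the current code version in a prop.
// Assumes AppMeasurement is loaded and exposed as the global `s` object;
// prop15 and the value format are hypothetical examples.
declare const s: { prop15?: string };

// Updated manually every time a new s_code is pushed live
// or a new set of DTM rules is published: date plus version label.
const CODE_VERSION = "2015-01-19 v3.2"; // hypothetical value

s.prop15 = CODE_VERSION;
```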

This is what a typical report looks like:

[Figure: report of visits broken down by s_code version]

It is very clear when the different versions of the s_code went live. Not only that: it can also be seen that, after the release of a new version, it takes a couple of days for it to propagate. For example, on 21st January, two versions are still being reported by browsers.

There is not much business value in this report, and I cannot imagine any web analyst pulling a report with this dimension. However, the idea behind this data is to be able to easily track errors that are introduced with each release. Think about these two complementary scenarios:

  • A new error is reported in the analytics code. Checking when this error first occurred, you notice that the date matches when a new s_code/DTM version was published. As a consequence, you can initially concentrate on the changes introduced with the latest release.
  • The previous error is fixed and published. However, there are still a small number of cases where the error occurs. By checking the visits for each version of the code, you can see whether some users are still running the old, buggy version. As these users refresh their caches, you can see the incidence of the problem fade away.