How to organize a web analytics team

A few months ago, I was working on site with a customer and the manager asked me a tricky question: how should the analytics team be organised? That company was undergoing significant changes on the analytics front, as a few key members of the team were leaving. As with most questions in life, there is no clear and definitive answer to this one. However, I can share what I have been seeing among my customers.

When companies are small, they inevitably go for cheap or free web analytics solutions. In fact, this is a no-brainer, given the cost of the licenses of proprietary solutions. In these companies, there is only one person in charge of analysing visitors' behaviour and, usually, this is only one of the many tasks they have to do. As the importance of web analytics grows, this person ends up working full time on web analytics. Sometimes, a second person joins this mini-team, but within a broader team that manages all internal analytics and reporting.

The next breakpoint happens when the company as a whole realises the benefits of having a very good understanding of customer behaviour. In this situation, more and more people want access to the statistics gathered by the analytics tool. Not only the web analysts, but others in the marketing department (or even other departments) want first-hand data. These other roles do not need full access, nor are they going to spend their full working day in the web analytics tool, but they want to be able to self-serve. As a consequence, more people demand more features and capabilities.

Inevitably, this approach leads to larger analytics teams, with a manager and a team of two or three people working full time on all aspects of the analytics: reporting, ad-hoc analysis, implementation, supporting other users… Another consequence is that free solutions become a problem rather than a solution, and more powerful tools are needed, like Adobe Analytics. As an example, I remember one company switching to Adobe Analytics just because their previous solution did not scale well.

The last big change comes when the websites and apps are in continuous development, constantly adding new features, with releases every few weeks. It becomes impossible to keep up with the changes in development and, at the same time, satisfy the reporting needs. At this point, there is a need to split the web analytics team in two: one team dedicated exclusively to implementation and another to reporting. These two teams have different line managers, although they are closely connected.

Even in this situation, there is still room for improvement, especially in big corporations. You can see further separations in both teams for apps/web, different websites, desktop/mobile web…

In parallel to all of this, there is another tip I would like to share. The web analytics job market is fairly small, but very dynamic. There are very few good web analysts, and recruiters are always luring them away to fill vacancies in other companies. It is therefore very important to have a healthy balance between junior and senior analysts. Both are needed, and junior team members need to receive good training to be able to fill the gap when more senior members, or even the manager, eventually leave the company.

As I said, this is only my experience, but after having worked with dozens of companies, where this pattern tends to repeat, I am inclined to believe that this is the best approach as of today.

Quick tip: track sprint reference

After publishing my previous post, where I recommended tracking the s_code version, I realised it could have been enhanced with a new type of version to track. Another version number that is even more useful is the sprint reference, or whichever value you use to identify each of the releases of the website. This idea came from one of my customers (thanks Glenn), who is in the process of redeveloping the whole website and analytics implementation. If you either have a development team that is continuously adding new features to the web or are profoundly redeveloping the website, you want to know the ROI of this investment.

By copying the sprint reference into an eVar, you can create reports that show the revenue, or any other important conversion metric, for each of the releases. Of course, this will require an extended reporting period, and the periods covered by each release will not be equal, so comparisons are not perfect. However, with this data, you can show the increase in revenue derived from a new release.
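As a minimal sketch (the eVar number and the way the release reference reaches the page are my own assumptions; adapt them to your implementation), the capture itself is a one-liner:

```javascript
// Minimal sketch: stand-in for the AppMeasurement "s" object,
// so the example is self-contained and runnable.
var s = {};

// Hypothetical: the build or release process exposes the current
// sprint reference to the page, e.g. as a global variable.
var releaseRef = "sprint-42";

// Copy the sprint reference into an eVar on every hit
// (eVar10 is an arbitrary choice for illustration).
s.eVar10 = releaseRef;
```

The important part is that the value is injected automatically at release time, not maintained by hand, so it always matches what is actually live.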

Quick tip: track code version

One of the suggestions I usually made when I worked with a new client was to track the s_code version. Now that we are moving to DTM, we no longer have the concept of the s_code, but we have the concept of publishing new rules, which is similar to an s_code version. The idea is to keep a value in a prop, which is changed every time a new s_code is pushed live or a new set of rules is published. My typical suggestion is to include both a date and a version in the string.
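A minimal sketch of that convention (the prop number and the string format are my own choices, not a standard):

```javascript
// Stand-in for the AppMeasurement "s" object, so the sketch runs standalone.
var s = {};

// Bump this constant every time a new s_code version (or DTM rule set)
// is pushed live: version number plus publication date.
var CODE_VERSION = "v2.3|2015-01-19";

// Keep the value in a prop on every hit
// (prop5 is an arbitrary choice for illustration).
s.prop5 = CODE_VERSION;
```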

This is what a typical report looks like:


It is very clear when the different versions of the s_code went live. But not only that, it can be seen that, after the release of a new version, it takes a couple of days to propagate. For example, on 21st January, two versions are still reported by browsers.

There is not much business value in this report, and I cannot imagine any web analyst pulling a report with this dimension. However, the idea behind this data is to be able to easily track errors that are introduced with each release. Think about these two complementary scenarios:

  • A new error is reported in the analytics code. Checking when this error first occurred, you notice that the date matches when a new s_code/DTM version was published. As a consequence, you can initially concentrate on the changes introduced with the latest release.
  • The previous error is fixed and published. However, there is still a small number of cases where this error occurs. By checking the visits for each version of the code, you can see whether some users are still running the old, buggy version. As these users' caches update, you can see the incidence of the problem fading away.

Use a tag manager

Back in the old days, the only way to add web analytics code to a website was through manual coding. If you were using Adobe Analytics, you needed to add two pieces of code to the website: the s_code and the on-page code. The s_code is a JavaScript file with common code for Adobe Analytics (SiteCatalyst), and the on-page code contains the page-specific data. I am sure many of you are familiar with these lines of code:
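For readers who want a refresher, here is a minimal sketch of that classic pattern (page names and variable values are illustrative; the real s_code also defines the report suite, tracking server and much more):

```javascript
// The s_code is included first, via a <script src=".../s_code.js"> tag,
// and it defines the global "s" object. It is stubbed here so the
// sketch is self-contained and runnable.
var s = {
  t: function () { return "page view sent"; }  // fires the image request
};

// The on-page code then sets the page-specific data and sends the hit:
s.pageName = "home";       // illustrative page name
s.channel = "homepage";    // illustrative site section
s.prop1 = "en-gb";         // e.g. site language
var result = s.t();        // sends the page-view request to Adobe
```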

While this does not seem to be a great problem, my experience with many customers shows that this traditional solution is far from ideal. Typical issues that I have found are:

  • Web developers usually have little or no knowledge of Adobe Analytics code. The moment you mention “eVar” or “prop”, they completely disconnect from the conversation until they clearly understand what these words mean. Do not get me wrong: I have been a developer myself and have nothing against developers, but I know that it is very difficult to find a web developer who understands Adobe Analytics.
  • Changes take a very long time to be published. Even the smallest code change (just adding an eVar, for example) can take weeks, if not months, before it is live. The main reason is that any new feature must be added to a scrum backlog, a change request…
  • Disconnect between IT and marketing. These two departments tend to have very different goals. As a consequence, what is of great importance for a web analyst, might be considered low priority by the scrum master.

If you search for more reasons, you will find many more.

So, what is the solution? Use a tag management solution, like Adobe Dynamic Tag Management. This is not a silver bullet that will solve all your problems, but it will help you move forward more easily. Do not even think of developing your own solution: it will take you years before you have something that matches the worst commercial solution.
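The reason a tag manager eases the IT/marketing disconnect is that developers only need to publish a generic, tool-agnostic data layer, while the mapping to eVars and props lives in DTM, where the analytics team controls it. A hypothetical sketch of that split (object names and values are my own; in DTM the mapping is configured in the UI via data elements and rules, shown here as plain JavaScript only for illustration):

```javascript
// Developers publish a tool-agnostic data layer on each page.
// They never need to know what an eVar or a prop is.
var digitalData = {
  page: { name: "products:shoes", section: "products" }
};

// The analytics team maps the data layer to Analytics variables in DTM
// (sketched here as plain JS; "s" is a stand-in for AppMeasurement).
var s = {};
s.pageName = digitalData.page.name;
s.channel = digitalData.page.section;
```

With this split, renaming a page or adding a new dimension becomes a DTM publish, not an IT release.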