Before getting into the details, it must be noted that this metric is not 100% precise. Visitors who delete their cookies or use multiple browsers will produce fragmented data. However, I believe it still has value, as it provides additional information about your visitors. In fact, the visitor retention reports have exactly the same limitation; in other words, you should apply the same considerations to this new metric as you do to the visitor retention reports.
The first thing we need is to dedicate an eVar and configure it as a counter eVar with no expiration.
The next step is to add this piece of code to the doPlugins section:
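As a rough sketch of what such a snippet could look like (eVar10 is a placeholder for whichever counter eVar you dedicated): incrementing on every hit makes the eVar hold each visitor's lifetime page view count.

```javascript
// Minimal sketch; eVar10 stands in for your dedicated counter eVar.
function s_doPlugins(s) {
  // Counter eVars accept "+N"; Adobe Analytics accumulates the running
  // total per visitor, so a simple "+1" on every hit counts page views.
  s.eVar10 = "+1";
}
// In the s_code, register the callback: s.doPlugins = s_doPlugins;
```
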
Once you have this code live for a few days, you should see something like:
The data you get will probably be too granular to be useful, so my suggestion is to classify the values into ranges. For example:
0 – 50: Very low value
51 – 100: Low value
101 – 200: Medium value
201 – 500: High value
501+: Very high value
Of course, the thresholds will be different for each business. Also, remember that the classification file needs to contain every individual value; you cannot specify ranges. If you are a regular expression ninja, you might want to try the classification rule builder to achieve the same result.
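Since the classification file must list every individual value, a small script can generate the rows for you. A sketch, using the example thresholds above and the usual two-column key/label layout (the upper bound of 1,000 is arbitrary; pick one that comfortably covers your data):

```javascript
// Map each raw counter value to its range label (thresholds from the example).
function rangeLabel(n) {
  if (n <= 50) return "Very low value";
  if (n <= 100) return "Low value";
  if (n <= 200) return "Medium value";
  if (n <= 500) return "High value";
  return "Very high value";
}

// Generate tab-separated rows for values 0..1000; everything above 500
// falls into "Very high value".
var rows = [];
for (var i = 0; i <= 1000; i++) {
  rows.push(i + "\t" + rangeLabel(i));
}
```

The resulting rows can be pasted into the classification (SAINT) template before uploading.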
If you have been involved in an Adobe Analytics implementation, it is highly probable that, at one point or another, you have heard the expression “VISTA rules”. However, many of you might still wonder what those little beasts are. First of all, let’s start with the name. Unless you dig around in Google or in the help section, you would never guess that VISTA stands for “Visitor Identification, Segmentation & Transformation Architecture”. Do not be too impressed: it is just an imaginative way of arriving at a fancy acronym.
With the introduction of SiteCatalyst 15, segmentation became much more powerful. You could have one single report suite and use segments in SiteCatalyst to analyse the data. The segmentation interface received a massive improvement with the May 2014 release of Adobe Analytics.
However, there are still many valid reasons why you would want separate report suites. My friend Jan Exner gave his point of view some time ago: one or two report suites. I would go one step further and talk about more than just two, and about other reasons why you might want many.
Mobile apps. You will probably want to put all mobile apps in a separate report suite, with the Mobile UI enabled. Having a single report suite will probably create some headaches, as some features are mobile app specific and others, web specific. If you need totals, you can use Report Builder and create the totals in Microsoft Excel.
Multiple currencies. If you sell in different currencies and need accurate reporting for each currency, it might be better to have one report suite per country or region. However, you can also stick to a single report suite: track the revenue both in the standard location of the s.products string and in a numeric event, and copy the currency code to an eVar. With this approach, you can report both in the report suite's default currency and in the local currency in which the transaction occurred.
Multiple time zones. As with currencies, if you sell across very different time zones and need accurate intra-day reporting, you might have to create multiple report suites, depending on the time zones. Generally speaking, though, reports tend to span more than one day, and the differences between time zones are then less noticeable.
Different teams. Some large organisations prefer to have the data separated in different report suites, so that it is possible to give permissions to access the data in a more granular way. It is then possible to give the least amount of privileges to the web analysts, so that they only have access to the data they need. For example, I was working with a customer that had completely different teams analysing Android and iOS data and these teams did not even talk to each other.
Legal requirements. This might not be very common, but if certain information should only be accessed by a limited number of people for legal reasons, then you need multiple report suites and must grant access to each depending on the needs, just as in the previous case. As an example, I was working with a supermarket that sold both its own private-label products and other brands; the analysts of its own brand, for obvious reasons, were not allowed to see the information about the other brands. This solution required a VISTA rule.
Multi-suite tagging. If your budget allows for it, the best solution is to get the best of both worlds: one global report suite and multiple local report suites. For lack of a better word, I use “local” here in a non-geographical sense.
Different SDRs. Well, this is a sin you should avoid at all costs, but if you have inherited implementations that use different SDRs, then you need different report suites unless you are willing to redesign all Adobe Analytics implementations.
IP address segmentation. If you need to segment by IP address, with a granularity finer than what the geolocation reports can provide, then you need a VISTA rule and multiple report suites. For example, if you have a call centre that actually uses the website, you do not want to “pollute” the main report suite with call centre data; instead, you want the call centre to be reported in a specific report suite.
Human vs non-human interactions. In a previous job, we had a Web services API that offered very similar information to the website. In fact, the information from the API was presented on third party websites, but we were not allowed to add any tagging to these websites. The solution was to track server-side the API usage, obviously, using a separate report suite.
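To illustrate the single-suite option from the multiple currencies point above, a purchase hit could look like this sketch (event10 and eVar20 are placeholders, and the stub assignment to s is only there to keep the snippet self-contained):

```javascript
// Stub for the AppMeasurement tracker when not running on a tagged page.
var s = typeof s !== "undefined" ? s : {};

// Standard revenue in s.products is converted to the report suite's default
// currency; the numeric event keeps the untouched local amount, and the eVar
// carries the currency code for breakdowns. Variable numbers are assumptions.
s.currencyCode = "THB";                    // currency of this transaction
s.products = ";SKU1;1;1200;event10=1200";  // price + local-amount numeric event
s.events = "purchase,event10";
s.eVar20 = "THB";                          // local currency code
```

With this in place, a report broken down by the currency eVar, using the event10 metric, shows revenue in each local currency.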
I would like to hear your ideas on this topic or situations that you have found, which have led you to one or multiple report suites.
In my last post, I described a simple solution to track out-of-stock products using Adobe Analytics. As its name implies, this is a rather simple approach: you just get a count of the number of times an out-of-stock product is shown. For many, that might be enough, but requirements differ too much for there to be a one-size-fits-all solution.
Another of my customers wanted a more detailed view of the stock level for all products, not just the fact that a product is out of stock. For this solution, we are going to need three events:
event1: stock level
event2: stock check
event3: out of stock
The implementation, in theory, should be very simple. For example, let’s consider a page with three products:
SKU1: more than 10 products in stock
SKU2: 7 products in stock
SKU3: out of stock
The code would look like:
s.products = ";SKU1;;;event1=10|event2=1,;SKU2;;;event1=7|event2=1,;SKU3;;;event3=1";
s.events = "event1,event2,event3";
In this example, any number above 10 products in stock is not relevant.
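If you prefer to build the string programmatically, a small helper along these lines (a sketch; the function name is illustrative, and the cap of 10 follows the example) produces exactly the string above:

```javascript
// Build the s.products string from on-page stock data, capping the reported
// stock level at 10 and flagging out-of-stock items with event3.
function buildProducts(items) {
  return items.map(function (p) {
    return p.stock > 0
      ? ";" + p.sku + ";;;event1=" + Math.min(p.stock, 10) + "|event2=1"
      : ";" + p.sku + ";;;event3=1";
  }).join(",");
}
```

Calling it with the three products from the example, `buildProducts([{sku: "SKU1", stock: 15}, {sku: "SKU2", stock: 7}, {sku: "SKU3", stock: 0}])`, returns the s.products value shown above.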
Now, when it comes to reporting, you need to create a calculated metric: event1/event2. This calculated metric will show the average of items in stock for each product. Using event3 in the reports, you will get the number of times each product was shown and it was out of stock.
The wealthiest man in Spain (my home country) is the owner of Zara. There are Zara shops everywhere in the world. Just as an example, I was in Bangkok two months ago and I found a Zara store in one of the most popular shopping centres. The success of this company has been widely studied, and one of its key success factors is stock management. If you are interested in a detailed explanation, here is a video that I found very interesting:
In physical stores, the only way to determine whether a product is popular is by the number of units sold. I am not saying that this is not useful, but the mathematical models used could benefit from additional metrics. In the online world, we can go one step further and include other metrics in the algorithm, like product views, add to carts and the number of times a product is out of stock.
With Adobe Analytics, product views, add to cart, remove from cart and orders are standard metrics that will be included in any typical retail implementation. On the other hand, there is no standard out-of-stock report. I am sure different people will have slightly different views on what “out of stock” is. For me, it is the number of times per visit a product has been shown to a visitor and it was out of stock.
Let me summarise why I chose this way of measuring. While a product is in stock, you can measure the popularity of a product using metrics like add to basket or units sold. However, the moment it is out of stock, you do not have any way to measure how popular it is: you just know it cannot be sold. It could well be that the product is not popular any more and you can just remove it from the inventory. Or, it might be the most popular product, with thousands of page views and frustrated visitors that cannot purchase it. With my solution, you can tell how popular an out-of-stock product is.
After this long introduction, let’s go to the implementation with Adobe Analytics. This is probably the simplest part of it. My suggestion is to use a cookie and a list prop:
In the list prop, you set a delimited list of the product IDs that are shown and are out of stock. You need a list prop because a single page may contain several out-of-stock products.
In the cookie, you should store the list of product IDs that have already been reported during that visit.
I would like to show you some code, but since it entirely depends on each implementation, I will just show you the results. Surprisingly, the best example is a bra web page, as it has many different sizes:
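That said, here is a rough, implementation-agnostic sketch of the logic (the cookie name, prop number and helper function are all my own placeholders):

```javascript
// Report each out-of-stock SKU at most once per visit: the list prop gets
// the new SKUs (pipe-delimited), and a session cookie remembers which SKUs
// have already been reported.
function trackOutOfStock(s, skusOnPage) {
  var seen = (document.cookie.match(/(?:^|; )oosSeen=([^;]*)/) || [])[1] || "";
  var seenList = seen ? decodeURIComponent(seen).split("|") : [];
  var newOnes = skusOnPage.filter(function (sku) {
    return seenList.indexOf(sku) === -1;
  });
  if (newOnes.length) {
    s.prop10 = newOnes.join("|"); // list prop, pipe-delimited
    document.cookie = "oosSeen=" +
      encodeURIComponent(seenList.concat(newOnes).join("|")) + "; path=/";
  }
}
```

On each product page, you would call this helper with the out-of-stock SKUs before the s.t() call, adapting the cookie handling to whatever utilities your implementation already has.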
In this example, there are four sizes out of stock, so the list prop will get four values (I used the pipe as the separator):
Finally, the report looks like:
In this case, I am only interested in instances, but visits and visitors are other valid metrics that can be useful. An alternative would be to remove the cookie and always report the products. In the end, it will depend on how you want to use those values.
After publishing my previous post, where I recommended tracking the s_code version, I realised it could be enhanced with a new type of version to track. Another version number that is even more useful is the sprint reference, or whichever value you use to identify each release of the website. This idea came from one of my customers (thanks Glenn), who is in the process of redeveloping the whole website and analytics implementation. If you have a development team that is continuously adding new features to the web, or you are profoundly redeveloping the website, you want to know the ROI of this investment.
By copying the sprint reference into an eVar, you can create reports that show the revenue, or any other important conversion metric, for each of the releases. Of course, this requires reporting over an extended period of time, and since releases rarely cover equal periods, the comparison will never be perfect. However, with this data, you can show the increase in revenue derived from each new release.
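As a sketch (eVar30 and the way the reference reaches the page are assumptions), the release reference just needs to be copied into an eVar on every hit:

```javascript
// RELEASE_REF would typically be injected by the build or release process;
// "sprint-42" is a placeholder value, and eVar30 is an assumption.
var RELEASE_REF = "sprint-42";
function setReleaseRef(s) {
  s.eVar30 = RELEASE_REF;
}
```

The helper would be called from doPlugins, so the value is attached to every hit.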
One of the suggestions I usually made when working with a new client was to track the s_code version. Now that we are moving to DTM, we no longer have the concept of the s_code, but we do have the concept of publishing new rules, which is similar to an s_code version. The idea is to keep a value in a prop that is changed every time a new s_code is pushed live or a new set of rules is published. My typical suggestion is to include both a date and a version in the string.
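For example (prop20 and the exact format are assumptions), the s_code could carry a constant that gets bumped with every release:

```javascript
// Update this constant every time a new s_code or DTM rule set is published.
var CODE_VERSION = "2015-01-21 | v1.8"; // date + version, as suggested above
function setCodeVersion(s) {
  s.prop20 = CODE_VERSION; // prop20 is a placeholder
}
```

Calling this from doPlugins sends the version string on every hit.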
This is what a typical report looks like:
It is very clear when the different versions of the s_code went live. But not only that, it can be seen that, after the release of a new version, it takes a couple of days to propagate. For example, on 21st January, two versions are still reported by browsers.
There is not much business value in this report, and I cannot imagine any web analyst pulling a report with this dimension. However, the idea behind this data is to make it easy to track errors that are introduced with each release. Think about these two complementary scenarios:
A new error is reported in the analytics code. Checking when this error first occurred, you notice that the date matches the publication of a new s_code/DTM version. As a consequence, you can initially concentrate on the changes introduced with the latest release.
The previous error is fixed and published. However, there is still a small number of cases where the error occurs. By checking the visits for each version of the code, you can see whether some users are still running the old, buggy version. As these users' caches update, you will see the incidence of the problem fade away.