- Optimizely Web Experimentation
- Optimizely Performance Edge
Using custom analytics integrations adds code execution to an experiment. When building your own custom analytics integration, you should consider how the product you are integrating with wants to receive the data and how Optimizely Web Experimentation and Optimizely Performance Edge send it.
In general, when working with custom analytics, there are four factors you should consider:
- Overwrite experiment and variation data
- isHoldback value for experiments running at less than 100% traffic allocation
- Redirect experiments and referrer value
- Race conditions
Overwrite experiment and variation name
When a custom analytics integration is enabled for all experiments, it runs once for every active experiment the user is bucketed into on the page. This means the same code executes multiple times, each run potentially overwriting experiment information that a previous run intended to send to the integrated platform.
If you were to build your own Google Analytics integration, you would most likely use a custom dimension and send the experiment name and variation to it. However, if you were to hardcode the custom dimension, each experiment would set that dimension's value to its own experiment and variation.
If you had three experiments running on the same page, each one would execute and overwrite the custom dimension value. Whichever one is set last is sent to Google Analytics, while the other two are lost.
For example, your custom analytics integration might append the experiment ID and the name of the variation the user was bucketed into. The custom dimension value sent to Google Analytics would look like this:
Custom Dimension 2 : “Experiment 123 - Variation #1”
If there were three experiments on the page and each one activated immediately, you would see three different custom dimension values get set, each one overwriting the previous, so only one experiment gets associated with the custom dimension despite three being active. The order in which experiments activate is not always the same, so there is no way to guarantee which one comes first or last.
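One way to avoid the overwrite is to collect every active decision and send a single combined value, rather than letting each experiment set the dimension separately. The sketch below assumes you can gather the decisions into an array (for example, from `window.optimizely.get('state')`); the `decisions` shape and field names are illustrative, not an Optimizely API:

```javascript
// Sketch: combine every active experiment/variation pair into one
// custom dimension value instead of letting each overwrite the last.
// The shape of `decisions` is an assumption, not an Optimizely API.
function buildDimensionValue(decisions) {
  return decisions
    .map(function (d) {
      return 'Experiment ' + d.experimentId + ' - ' + d.variationName;
    })
    .join(' | ');
}
```

Sending a combined value such as `"Experiment 123 - Variation #1 | Experiment 456 - Variation #2"` preserves all active experiments in one dimension; alternatively, you could dedicate a separate custom dimension to each experiment.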
Consider the isHoldback value
If an experiment's total traffic allocation is set below 100%, anytime a user falls into the portion of traffic that should not see the experiment, Optimizely Web Experimentation and Optimizely Performance Edge still make a bucketing decision and set the isHoldback value to true. If this value is true, the user is technically not in the experiment and sees the original page. But because a bucketing decision was made, the custom analytics code still runs, reporting that the user is in the experiment when in fact they are not. To avoid this, only send data to your analytics platform when isHoldback is set to false.
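A minimal guard might look like the following sketch. The `isHoldback` field name matches the value described above, but the overall shape of the `decision` object and the `send` callback are assumptions for illustration:

```javascript
// Sketch: only forward a decision to the analytics platform when the
// user actually sees the experiment (isHoldback === false).
function trackDecision(decision, send) {
  if (decision.isHoldback === false) {
    send('Experiment ' + decision.experimentId + ' - ' + decision.variationName);
    return true; // data was sent
  }
  return false; // held-back user; nothing sent
}
```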
Redirect experiments and referrer value
The custom analytics integration framework already accounts for redirect experiments. In a redirect variation, the custom code executes on the redirected page (not on the original, pre-redirect page). You can get the experiment ID and variation ID via the getRedirectInfo() function.
You can determine the referrer value using the referrer property of the object returned by getRedirectInfo():

```javascript
var redirectInfo = window.optimizely.get('state').getRedirectInfo();
var referrer = redirectInfo.referrer;
```
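In practice you would fall back to document.referrer on pageviews that were not redirected. The helper below sketches that choice, under the assumption that getRedirectInfo() returns a falsy value when no redirect occurred:

```javascript
// Sketch: prefer the pre-redirect referrer when one exists, otherwise
// use the normal document referrer. Assumes getRedirectInfo() returns
// a falsy value on pageviews that were not redirected.
function effectiveReferrer(redirectInfo, documentReferrer) {
  return redirectInfo && redirectInfo.referrer
    ? redirectInfo.referrer
    : documentReferrer;
}
```

On the page, you would call it as `effectiveReferrer(window.optimizely.get('state').getRedirectInfo(), document.referrer)`.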
Race conditions
Because you are now sending additional data to another system, you should ensure the third-party script or object has loaded and is available. Third-party analytics scripts often load asynchronously, and by default, Optimizely Web Experimentation and Optimizely Performance Edge do not wait for them. If the custom analytics code runs before that script has initialized, the integration fails.
You can write an interval or some other method to wait for the third-party script to be present on the page before executing the custom analytics integration code.
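Such a wait could be sketched with a polling interval like the one below. The 100 ms interval and the give-up cutoff are arbitrary choices, and the `window.ga` check in the usage example is just an assumed global for illustration:

```javascript
// Sketch: run `run` once `isReady()` reports the third-party script is
// present, polling every 100 ms and giving up after `giveUpAfterMs`.
function whenAvailable(isReady, run, giveUpAfterMs) {
  if (isReady()) {
    run(); // already loaded; no need to poll
    return;
  }
  var waited = 0;
  var timer = setInterval(function () {
    waited += 100;
    if (isReady()) {
      clearInterval(timer);
      run();
    } else if (waited >= giveUpAfterMs) {
      clearInterval(timer); // stop polling; the integration is skipped
    }
  }, 100);
}
```

For example: `whenAvailable(function () { return typeof window.ga === 'function'; }, sendIntegrationData, 5000);`.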