This topic describes what to do if you find discrepancies between Optimizely Experimentation and Google Analytics or Universal Analytics.
You can integrate Optimizely Web Experimentation and Optimizely Feature Experimentation with a variety of commonly used analytics platforms. After you set up an integration and implement it correctly, you can expect to see similar counts of unique users in Optimizely Web Experimentation and Optimizely Feature Experimentation and your chosen analytics platform.
For more information, see our implementation guides:
When working with integrations, you may notice issues or discrepancies, such as:
The data you see in Optimizely Web Experimentation and Optimizely Feature Experimentation doesn’t match the data you see in your analytics platform
You do not see data from Optimizely Web Experimentation and Optimizely Feature Experimentation being passed into your analytics platform
Small variations in unique user counts are normal, even if your integration is set up properly. This is because different platforms often define “unique visitors” differently.
However, larger discrepancies are usually due to an implementation issue or a data issue: either the implementation is not set up correctly, or you are comparing disparate sets of data.
To rule out implementation issues, check our support articles for your chosen analytics platform. If you’ve ruled out implementation issues, you may be encountering a data issue. For help with data issues, please read on.
Compare reports in Optimizely Web Experimentation and Optimizely Feature Experimentation and your analytics platform
You should integrate Optimizely Web Experimentation and Optimizely Feature Experimentation with your analytics platform. This way, your analytics reports are filtered to visitors who are bucketed into an Optimizely Web Experimentation and Optimizely Feature Experimentation experiment, and both reports track the same group of visitors.
Optimizely Web Experimentation and Optimizely Feature Experimentation and your analytics platform are each optimized for different metrics. When you integrate the two, you ensure that only visitors from an Optimizely Web Experimentation and Optimizely Feature Experimentation experiment appear in a particular custom report. However, the metrics you see in your analytics platform may look different from Optimizely Web Experimentation and Optimizely Feature Experimentation's Results page. In particular, Optimizely Web Experimentation and Optimizely Feature Experimentation looks only at unique visitors, while your analytics platform can count sessions, pageviews, unique pageviews, entrances, and other metrics.
For example, a single visitor may visit your Optimizely Web Experimentation and Optimizely Feature Experimentation experiment several times. Each time she does, she will trigger another unique pageview and another session. In Optimizely Web Experimentation and Optimizely Feature Experimentation, you’ll still see this as one visitor. In other words, Optimizely Web Experimentation and Optimizely Feature Experimentation deduplicates visitors and conversions. Your analytics platform probably does not deduplicate visitors and conversions.
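To make the difference concrete, here is a minimal sketch (not either platform's actual implementation; the visit log is hypothetical) of how the same visits produce different numbers depending on whether you deduplicate by visitor:

```javascript
// Hypothetical visit log: visitor "v1" returns three times, "v2" visits once.
const visits = [
  { visitorId: "v1", session: 1 },
  { visitorId: "v1", session: 2 },
  { visitorId: "v1", session: 3 },
  { visitorId: "v2", session: 1 },
];

// Optimizely-style counting: deduplicate by visitor ID.
const uniqueVisitors = new Set(visits.map((v) => v.visitorId)).size;

// Analytics-style counting: every session counts separately.
const sessions = visits.length;

console.log(uniqueVisitors); // 2
console.log(sessions);       // 4
```

The same four rows yield "2 visitors" on one platform and "4 sessions" on the other, with neither platform being wrong.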
In many cases, this difference in behavior between platforms causes the discrepancies you may see in unique visitor counts. If not, consider these other common sources of discrepancies:
Is your Optimizely Experimentation experiment running? Analytics integrations collect data only when experiments are running on the page.
Are you tracking visitors to the same set of URLs in both Optimizely Experimentation and your analytics platform? For example, if your analytics platform tracks visitors site-wide but your Optimizely Experimentation experiment is only running on your landing page, you’ll see higher visitor counts in your analytics platform.
Are you filtering the types of visitors shown in Optimizely Experimentation and your chosen analytics platform in the same way? For example, if you have an audience that excludes mobile visitors to your Optimizely Experimentation experiment, but you are not using an equivalent filter on your analytics platform, you will almost certainly see significant visitor count differences between the two.
Are you filtering using a custom date range? Certain platforms display visitors across custom date ranges in different ways. Optimizely Experimentation displays only new visitors to your experiment on the dates selected, whereas other platforms may show you both new and returning visitors on the dates selected.
Are you looking at session-based or user-based metrics in your analytics platforms? By default, Optimizely Experimentation visitors and conversions are counted at the user level. This means that all visitors and conversions are counted throughout the lifetime of your experiment rather than on a per-session basis. Some analytics platforms count visitors or conversions on a per-session basis instead.
Does your traffic allocation match in Optimizely Experimentation and your analytics platform? Allocating traffic differently between platforms is a common source of data discrepancies.
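The session-versus-user distinction in the checklist above can be sketched as follows (hypothetical event data, not either platform's real pipeline):

```javascript
// Hypothetical conversion events for one visitor across two sessions.
const events = [
  { visitorId: "v1", sessionId: "s1", event: "purchase" },
  { visitorId: "v1", sessionId: "s2", event: "purchase" },
];

// User-scoped counting (Optimizely-style): one converting visitor.
const userScoped = new Set(events.map((e) => e.visitorId)).size;

// Session-scoped counting: one conversion per converting session.
const sessionScoped = new Set(events.map((e) => e.sessionId)).size;

console.log(userScoped);    // 1
console.log(sessionScoped); // 2
```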
These definitions may help you understand how your data aligns between Optimizely Experimentation and Google Analytics:
Unique visitors (Optimizely Experimentation) – The number of unique visitors who were exposed to your experiment (this is based on visitors who received the optimizelyEndUserID cookie).
User (Universal Analytics) – The metric that is the closest match to Optimizely Experimentation's unique visitors metric. If you believe you have repeat visitors to your site, be sure to add the "user" metric to your report in Google Analytics so you can align your analytics data with your Optimizely Experimentation data as closely as possible.
Pageviews (Universal Analytics) – A count of each time a visitor loads and views a particular page. Each time a page loads, the Google Analytics code embedded within the page counts the instance. These instances are then summed to describe the total traffic for any specific page. The generic pageview count will include every time a page is loaded, including reloads by the same visitor.
Sessions (Universal Analytics) – A session includes visits to a site and the pages loaded in a specific period. Google Analytics sessions expire after 30 minutes of inactivity by default, although a Google Analytics administrator can set the expiration to any value between 1 minute and 4 hours.
Unique pageviews (Universal Analytics) – A pageview count that only includes new pageviews during a single session. Page reloads are not counted in the "unique pageview" metric, so it will always be equal to or less than the number of overall pageviews. Unique pageviews are different from the general "pageviews" metric in Google Analytics (described above).
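As a toy illustration of the pageview definitions above (the page paths are hypothetical):

```javascript
// One session in which the visitor loads /home twice and /pricing once.
const hits = ["/home", "/home", "/pricing"];

const pageviews = hits.length;              // every load counts, including reloads
const uniquePageviews = new Set(hits).size; // deduplicated within the session

console.log(pageviews);       // 3
console.log(uniquePageviews); // 2
```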
Google Analytics Classic (GA) and Universal Analytics (UA)
In GA, custom variable slots are used to pass information with a tracking call to specify additional information about the visitor. The Optimizely Experimentation-GA integration uses one of these custom variable slots to pass along the experiment name and variation name the visitor is currently bucketed into, if any. For example, a custom variable slot might contain an experiment name such as Optimizely_HomePage paired with a variation name such as Variation #1.
When Optimizely Experimentation runs at the top of your page, it immediately adds this custom value to the _gaq object (which is the default GA object on your page). When the GA code runs later on the page, it picks up this custom value and sends it with the tracking call to GA's servers.
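A sketch of what this looks like in code (`_setCustomVar` is the real GA Classic command; the slot number, names, and the `window` stand-in below are illustrative):

```javascript
// Stand-in for the browser's window object so the sketch runs anywhere.
const window = {};

// Optimizely runs first and queues the custom variable on _gaq.
window._gaq = window._gaq || [];
window._gaq.push(["_setCustomVar", 5, "Optimizely_HomePage", "Variation #1", 2]);

// When ga.js loads later, it drains _gaq and sends the queued custom
// variable along with the tracking call.
window._gaq.push(["_trackPageview"]);

console.log(window._gaq.length); // 2
```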
This type of integration allows you to:
Use all your existing reports in GA
Separate each report by the contents of the custom variable slot, showing you a different row of data for each variation used in your experiment
UA takes a similar approach, except that custom dimensions are used to pass this information. A UA integration also requires you to add a tracking call to your page:
// Optimizely Experimentation Universal Analytics Integration
window.optimizely = window.optimizely || [];
window.optimizely.push("activateUniversalAnalytics");
Initial troubleshooting for data issues
When troubleshooting data issues in Google Analytics integrations, start by checking the following:
Make sure that the slot number you use for an Optimizely Experimentation experiment is not used by another experiment or other tracking activities. Otherwise, the experiment and variation names may overwrite each other.
Make sure that you are using the _gaq object, not the older _gat object, to make the tracking call. Only pages that use _gaq are currently supported.
Further troubleshooting for data issues
If the initial troubleshooting items appear correct, the next sections will help you troubleshoot your GA or UA integration, whether you are using a standard or customized implementation.
Issue: You see more unique visitors in Google Analytics (GA) than in Optimizely Web Experimentation
Affects: Google Analytics Classic
Description: Your data in GA does not match Optimizely Web Experimentation: you see more unique visitors in GA than in Optimizely Web Experimentation, and the problem corrects itself when the Optimizely Web Experimentation experiment is paused.
Root cause: The experiment runs on a subdomain, and you are calling _setDomainName after the snippet is loaded. Thus, when setting the customVar, the cookies are created on the wrong domain.
To make this work, call _setDomainName before Optimizely Web Experimentation sets the customVar. Here's how to update the code:
<script>
window._gaq = window._gaq || [];
window._gaq.push(['_setAccount', 'UA-29010521-1']);
window._gaq.push(['_setDomainName', 'exampledomain.com']);
</script>
<script src="//cdn.optimizely.com/js/12345678.js"></script>
The section below, You only see visits to a limited number of pages, describes a related issue that also deals with _setDomainName.
Issue: You see more unique visitors in Optimizely Web Experimentation than Google Analytics (GA)
Affects: Google Analytics Classic
Root cause: The Optimizely Web Experimentation snippet is positioned after the GA snippet in the page code. Because the GA snippet loads asynchronously, Optimizely Web Experimentation sometimes manages to load prior to GA and set the customVar, but most of the time it does not. The GA data is then fired off before Optimizely Web Experimentation can set the customVar.
Solution: Place the Optimizely Web Experimentation snippet before the GA snippet in the page code.
Issue: A custom GA implementation overwrites the integration value
Affects: Google Analytics Classic
Root cause: The Optimizely Web Experimentation snippet creates _gaq, which is an empty array when it's declared. Because GA always checks to see if _gaq exists before it runs, Optimizely Web Experimentation can safely add a custom variable to it, which GA can find and send with the tracking call.
However, if your site has any type of custom implementation of GA, it's possible for that custom code to overwrite the value that Optimizely Web Experimentation puts in _gaq, preventing the Optimizely Web Experimentation integration value from being passed to GA.
Solution: Your custom implementation of GA should check for the existence of _gaq before it runs. If _gaq exists (because Optimizely Web Experimentation created it and added a custom variable), that value must be included when the custom implementation runs.
This can be accomplished with a single line of code:
var _gaq = _gaq || [];
This code effectively says, "if _gaq has already been declared on the page, use that array. Otherwise, create _gaq as an empty array and proceed."
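A minimal sketch of a custom GA bootstrap that respects an existing `_gaq` (the `window` stand-in and the account ID are illustrative):

```javascript
// Stand-in for the browser's window object.
const window = {};

// 1. Optimizely loads first and queues its custom variable.
window._gaq = window._gaq || [];
window._gaq.push(["_setCustomVar", 5, "Optimizely_HomePage", "Variation #1", 2]);

// 2. A custom GA implementation that reuses the existing array
//    instead of clobbering it with a fresh one.
window._gaq = window._gaq || []; // safe: keeps Optimizely's entry
window._gaq.push(["_setAccount", "UA-XXXXXXX-1"]);

// The Optimizely entry survives at the front of the queue.
console.log(window._gaq[0][0]); // "_setCustomVar"
```

Had step 2 instead assigned `window._gaq = [];`, Optimizely's entry would have been discarded before GA ever read it.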
Issue: You only see visits to a limited number of pages (GA and UA)
Affects: Google Analytics Classic (_gaq) and Universal Analytics
Symptoms: You see visits only to a limited set of pages on your site, and visitor counts between Optimizely Web Experimentation and Google Analytics don’t match up.
Root cause: You are setting a leading period or www. when calling _setDomainName. This creates an additional set of cookies. Make sure that if you have the _setDomainName function call on your page, it's not using a leading period.
For example, use _setDomainName("mydomain.com") instead of _setDomainName(".mydomain.com").
Google Analytics uses something called a domain hash (a hash of the domain, or of the domain you set in _setDomainName) to prevent conflicts between cookies.
When the Optimizely Web Experimentation integration is turned on, it uses the _setCustomVar() function. This function sets a cookie on the domain without the leading period. Calling _setDomainName() with the leading period will overwrite this cookie, which could have a negative impact on your referrers and your bounce rate in Google Analytics.
If the domain hash of the domain you've configured does not match the Google Analytics cookies it finds, Google Analytics interprets this as a brand new visit (and a brand new visitor).
Here's an example: If you don't add a leading period or subdomain, your root www domain gets a domain hash that is equivalent to http://mysite.com. However, if you do add the leading period, the domain hash changes to one that will not match the domain hash of any of your returning visitors. When returning visitors enter your site, Google Analytics doesn't see a matching set of cookies with the correct domain hash, so it creates a new visitor ID. As a result, all your traffic since the hash change is separated from the traffic before the hash change.
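The widely documented classic ga.js domain hash can be reproduced as follows; treat this as a sketch used only to show that "mysite.com" and ".mysite.com" produce different hashes (and therefore mismatched cookies):

```javascript
// Sketch of the classic ga.js/urchin.js domain hash.
function domainHash(domain) {
  let hash = 1;
  let carry = 0;
  if (domain) {
    hash = 0;
    // Walk the string from the last character to the first.
    for (let i = domain.length - 1; i >= 0; i--) {
      const code = domain.charCodeAt(i);
      hash = ((hash << 6) & 0xfffffff) + code + (code << 14);
      carry = hash & 0xfe00000;
      if (carry !== 0) hash = hash ^ (carry >> 21);
    }
  }
  return hash;
}

// The leading period changes the input, so the hash changes too,
// and GA no longer recognizes the existing cookies.
console.log(domainHash("mysite.com") !== domainHash(".mysite.com")); // true
```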
You can recover the old cookies by switching back to no leading period. However, if you do this, you'll lose the cookies that were set since you made the change.
Solution: The easiest way to fix this issue is to remove the leading period or www. from your _setDomainName call.
If that isn't feasible, the workaround is to call _setDomainName before Optimizely Web Experimentation sets the customVar:
/* _optimizely_evaluate=force */
window._gaq = window._gaq || [];
window._gaq.push(['_setDomainName', 'www.mimco.com.au']);
window._gaq.push(["_setCustomVar", 4, "MI 033 Sale Page", "Variation 1", 1]);
/* _optimizely_evaluate=safe */
Finally, make sure that the Optimizely Web Experimentation snippet comes after your original _setDomainName() call but before the _trackEvent() call. This will help prevent your session information from being overwritten.
To learn more about tracking across domains and subdomains with Google Analytics, see Google's reference documentation on cross-domain and subdomain tracking.
Issue: You see visitors on the Optimizely Web Experimentation Results page but no data in Google Analytics
Affects: Universal Analytics
Symptoms: You see visitors and conversions on the Optimizely Web Experimentation Results page, but they don’t seem to be getting passed through to Google Analytics.
Root cause: There are two possible root causes for this issue:
The integration may not be enabled in the experiment
The Optimizely Web Experimentation snippet may not be positioned correctly on the page
Make sure that the appropriate integration (Universal Analytics) is enabled on your Optimizely Web Experimentation home page.
Within each experiment where you want to track data, navigate to Options > Integrations > Google Analytics or Universal Analytics. Make sure that Track Experiment is checked and that you’ve selected a custom variable or dimension number. Don’t worry about Custom Tracker unless you’re using a custom event tracker other than the default.
Make sure that the Optimizely Web Experimentation snippet is in the <head> tag of your page, before the Google Analytics tracking call.
Issue: Experiment or variation names are truncated in Google Analytics
Affects: Google Analytics Classic and Universal Analytics
Symptoms: The experiment name is truncated or cut off, or multivariate experiment names are either cut off or separated with a tilde (~).
Root cause: Experiment titles will appear in your Google Analytics reports as Optimizely_[experiment title], truncated to 28 characters. Variation titles will be truncated to the first 24 characters. Experiment and variation titles that contain non-Latin characters may not be reported correctly in Google Analytics.
For multivariate experiments, up to four variation titles will each be truncated and joined with a tilde (~).
Solution: If variations share the same first four characters, the identical string will be passed as the variation name to Google Analytics, so Google Analytics will interpret different combinations of variations as identical. To fix this issue, update the variations' names in Optimizely Web Experimentation so that the first four characters are unique.
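The truncation rules above can be sketched as follows (character limits taken from this article; the experiment title is hypothetical, and the sketch assumes the 28-character limit applies to the full prefixed label):

```javascript
// Experiment labels: "Optimizely_" prefix, cut to 28 characters total (assumed).
function gaExperimentLabel(title) {
  return ("Optimizely_" + title).slice(0, 28);
}

// Variation labels: cut to the first 24 characters.
function gaVariationLabel(title) {
  return title.slice(0, 24);
}

console.log(gaExperimentLabel("Homepage Hero Redesign Test Q3"));
// "Optimizely_Homepage Hero Red"
```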
Issue: Counts from a direct or referral traffic source are too high
Affects: Redirect experiments in Universal Analytics
Symptoms: In redirect experiments, counts for an organic (search) traffic source may be too low, and counts for direct or referral traffic are too high.
Root cause: When a redirect occurs in Optimizely Web Experimentation with the UA integration enabled, Optimizely Web Experimentation gets the document.referrer value from the original page and calls the ga('set', 'referrer', ...) function when the visitor lands on the redirect page to maintain the original referrer. This works well on landing pages, but if there is a redirect on any page deeper into your site, it strips the visitor's original session referrer and replaces it with the last page the visitor saw. This inflates the direct or referral traffic source counts in your GA, UA, and AdWords reports.
Solution: Enable the out-of-the-box Universal Analytics integration, or use the Google Tag Manager implementation guidance if your UA is implemented via Google Tag Manager.
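A sketch of what the integration does on a redirect destination (the `ga('set', ...)` command is real Universal Analytics syntax; the command queue and referrer value below are a simulation):

```javascript
// Simulated UA command queue standing in for the real analytics.js.
const gaCommands = [];
function ga(...args) {
  gaCommands.push(args);
}

// Before redirecting, the integration captures document.referrer on the
// original page (a hypothetical organic referrer here).
const originalReferrer = "https://www.google.com/";

// On the redirect destination, it restores that value so the session
// keeps its original (organic) traffic source instead of the redirect URL.
ga("set", "referrer", originalReferrer);
ga("send", "pageview");

console.log(gaCommands[0]); // ["set", "referrer", "https://www.google.com/"]
```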
Issue: Metrics in UA show fewer conversions than Optimizely Web Experimentation
Affects: All Experiment types
Symptom: The Optimizely Web Experimentation Results page shows more conversions, especially down-funnel conversions, than Universal Analytics.
Root cause: Optimizely Web Experimentation's Results page is user-scoped, so it continues to attribute events to a visitor even if they do not activate an experiment in a later session. The UA integration sends data via a custom dimension. Custom dimensions in UA can be hit, session, product, or user scoped; see Google's custom dimensions documentation for more information. Using a custom dimension scope other than "user" requires the experiment to activate in order for UA to attribute events.
Solution: Change the custom dimension you reserved for Optimizely Web Experimentation to "User" scope and ensure that no other running experiments are utilizing the same custom dimension.
Advanced troubleshooting for data issues
If you are still having an issue with your Optimizely Web Experimentation-Google Analytics integration, try these approaches.
Check the code
When the Optimizely Web Experimentation snippet loads on a page, it evaluates every experiment that's running to see if it has a Google Analytics integration enabled. If it does (and the visitor is bucketed into a variation), Optimizely Web Experimentation runs a single line of code to perform the integration, calling the _setCustomVar function in Google Analytics:
_gaq.push(["_setCustomVar", /*SLOT*/ 5, "Optimizely_HomePage", "Variation #1", /*SCOPE*/ 2]);
_setCustomVar – Sets the custom variable in Google Analytics
/*SLOT*/ 5 – Sets a specific custom slot in Google Analytics (in this case, slot 5)
"Optimizely_HomePage" – Sets the experiment name (in this case, "Optimizely_HomePage")
"Variation #1" – Sets the variation name (in this case, "Variation #1")
/*SCOPE*/ 2 – Sets the scope of the custom variable (in GA Classic, 2 is session scope)
This means that if a subsequent _setCustomVar call on your site also uses slot #5, the Optimizely Web Experimentation integration value will be overwritten.
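The overwrite can be simulated like this (GA keeps only the last value set per slot; the resolution loop below is a stand-in for what ga.js does internally, and the second variable's name is hypothetical):

```javascript
const _gaq = [];

// Optimizely's integration claims slot 5...
_gaq.push(["_setCustomVar", 5, "Optimizely_HomePage", "Variation #1", 2]);

// ...but a later call on the same page reuses slot 5.
_gaq.push(["_setCustomVar", 5, "MemberType", "Gold", 1]);

// Resolve as GA effectively does: the last write per slot wins.
const slots = {};
for (const [cmd, slot, name, value] of _gaq) {
  if (cmd === "_setCustomVar") slots[slot] = { name, value };
}

console.log(slots[5].name); // "MemberType" -- the Optimizely value is gone
```

Moving the Optimizely integration to an unused slot (or moving the conflicting call to another slot) avoids the collision.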
Check the timing
When the Optimizely Web Experimentation snippet loads on a page, it can make the _setCustomVar call immediately. However, if you are loading Optimizely Web Experimentation asynchronously, the _setCustomVar call could happen after the Google Analytics tracking call is sent, which would prevent the integration from working correctly.
In this case, it's best to delay the Google Analytics tracking call until after Optimizely Web Experimentation has finished loading. This is most commonly done with a callback function that is triggered by the completion of another script.
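One way to sketch that delay (the `optimizelyReady` callback name and the `window` stand-in are hypothetical, standing in for whatever completion hook your snippet loader exposes):

```javascript
// Stand-in for the browser's window object.
const window = {};
window._gaq = window._gaq || [];
window._gaq.push(["_setAccount", "UA-XXXXXXX-1"]);

// Instead of firing the pageview immediately, wrap it in a function...
function sendPageview() {
  window._gaq.push(["_trackPageview"]);
}

// ...and invoke it only once Optimizely has finished loading and has
// had a chance to queue its _setCustomVar entry.
function optimizelyReady() {
  window._gaq.push(["_setCustomVar", 5, "Optimizely_HomePage", "Variation #1", 2]);
  sendPageview();
}

optimizelyReady(); // simulated: in production this fires when the snippet completes

// The custom variable is queued before the pageview, so GA sends it along.
console.log(window._gaq.map((c) => c[0]));
// ["_setAccount", "_setCustomVar", "_trackPageview"]
```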
Check for redirect experiments
If _setCustomVar is not set on the tracking call for a redirect variation, make sure that the redirect variation has the _optimizely_redirect comment as the first line in the variation code to indicate a redirect:
/* _optimizely_redirect=http://custom */