The Experiment Results page

Relevant products:

  • Optimizely Web Experimentation
  • Optimizely Feature Experimentation

This topic describes how to:

  • Understand how Optimizely Web Experimentation and Optimizely Feature Experimentation calculate results to enable business decisions
  • Find the definitions and formulas for result metrics
  • Share the results of your experiments with others
  • Export campaign or experiment results in comma-separated values (CSV) format

The Results page at a glance

The experiment Results page in Optimizely Web Experimentation and Optimizely Feature Experimentation helps you measure success in an experiment. Dive into each metric and variation to see how visitors respond to changes you made to your site.

Key tips

What to watch out for

  • If you add more than five metrics to an experiment, the additional metrics may take longer to reach statistical significance.
  • Visitors are bucketed into variations by chance according to the traffic distribution percentage you set, so you may not see the same number of visitors in each variation.

Optimizely Web Experimentation and Optimizely Feature Experimentation's experiment Results page is powered by Stats Engine. It provides a data-rich picture of how your visitors interact with your site. Use it to measure success in an experiment and learn about visitors to your site. In Optimizely Web Experimentation and Optimizely Feature Experimentation, the experiment Results page includes confidence intervals and incorporates a Stats Engine improvement that reduces the false discovery rate.

This article walks through the experiment Results page for Optimizely Web Experimentation.

If you are using Optimizely Web Personalization, there is a slightly different Results page.

For Optimizely Feature Experimentation, most of this article is accurate, but the UI might display slightly differently. Refer to our developer documentation on Analyzing Results for more information on Optimizely Feature Experimentation results.

overall-results-page.png

Here is what you will see in the left-hand navigation: 

  • Options to pause, preview, or archive the experiment

  • The date when changes were last published

  • The number of days running. This is based on the exact times an experiment was started and stopped and represents the number of whole days the experiment has been running; fractional values are truncated rather than rounded up, so 17.8 becomes 17 (see the sketch after this list)

  • The audiences targeted in the experiment

  • The pages included in the experiment

  • The number of visitors

  • (Optional) A description of the experiment
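
A minimal sketch of the days-running truncation (plain Python; this illustrates the rounding behavior described above, not Optimizely's actual implementation):

    from datetime import datetime

    # Hypothetical start/stop times for an experiment.
    start = datetime(2024, 1, 1, 9, 0)
    stop = datetime(2024, 1, 19, 4, 12)  # 17.8 days later

    elapsed_days = (stop - start).total_seconds() / 86400  # 17.8
    whole_days = int(elapsed_days)  # int() truncates: 17, not 18

    print(f"{elapsed_days:.1f} elapsed days -> {whole_days} days running")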

The summary and metric modules provide an in-depth view of your visitors' behavior on your site. We will discuss those below. You can also learn how to interpret the results you see in Optimizely Web Experimentation and Optimizely Feature Experimentation.

All results in Optimizely Web Experimentation and Optimizely Feature Experimentation are in local time, according to the time zone set on your machine.

Results are typically available within five to ten minutes of Optimizely Web Experimentation and Optimizely Feature Experimentation receiving the data. Read our article about data freshness to learn more.

Find the experiment Results page

In Optimizely Feature Experimentation experiments (created after February 2021), the Results page is referred to as the Reports page.

Here are two ways to find the experiment Results page:

  • Experiments dashboard > Results
    experiments_with_results.png

  • Manage Experiment dashboard > View Results
    view-results-from-experiment.png

Modules

The experiment Results page provides a high-level summary and a module for each metric attached to your experiment. We will walk through the summary and modules below. You will use them to take action based on the results of an experiment.

Summary

The summary provides a high-level overview of the experiment. It allows you to compare how each variation is performing for the primary metric compared to the original.

Here is what you see once visitors enter your experiment:

results-summary-visitors.png

Optimizely Web Experimentation and Optimizely Feature Experimentation show the number of unique visitors bucketed into each variation.

In the previous screenshot, 423 visitors (or 32.59% of visitors in this experiment) have seen the original variation, 410 (31.59%) have seen Variation #1, and 465 (35.82%) have seen Variation #2.
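
Because assignment is random, the observed split drifts around the configured percentages rather than matching them exactly. A minimal simulation of this effect (plain Python, assuming an even three-way split; illustrative only, not Optimizely's bucketing code):

    import random

    random.seed(7)
    variations = ["original", "variation_1", "variation_2"]
    counts = {v: 0 for v in variations}

    # Bucket 1,298 visitors, each assigned uniformly at random.
    for _ in range(1298):
        counts[random.choice(variations)] += 1

    for name, n in counts.items():
        print(f"{name}: {n} visitors ({n / 1298:.2%})")

    # The counts cluster near one third each but are rarely identical.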

Metrics

Below the summary, you will see the results for each metric you added to your experiment. The primary metric is always at the top and expanded by default. Secondary metric results are collapsed by default. Select the arrow next to a secondary metric's name to expand its results, or select Expand all to view all secondary metric results.

results-page.gif

  • Unique Conversions or Total Conversions – When you add a metric to an experiment, you choose unique or total conversions. Unique conversions show deduplicated conversions, so a single visitor who triggers the same event multiple times is counted just once. Total conversions show a simple total of conversions for the event.

  • Conversion Rate or Conversions per Visitor – Under the Unique conversions view, Optimizely Web Experimentation and Optimizely Feature Experimentation show the conversion rate: the percentage of unique visitors in the variation who triggered the event. Under the Total conversions view, you will see Conversions per Visitor: the average conversions per visitor for visitors in the variation.

  • Improvement

    • Optimizely Web Experimentation and Optimizely Feature Experimentation display the relative improvement in conversion rate for the variation over the baseline as a percentage for most experiments. For example, if the baseline conversion rate is 5% and the variation conversion rate is 10%, the improvement for that variation is 100% (see the sketch after this list).

    • Optimizely Web Experimentation and Optimizely Feature Experimentation display the absolute improvement for any experiment where Stats Accelerator is enabled and for Multi-armed bandit (MAB) optimizations.

  • Confidence interval – The confidence interval measures uncertainty around improvement. Stats Engine provides a range of values in which the improvement for a particular variation is expected to lie. The interval starts wide, and as Stats Engine collects more data, it narrows to show that certainty is increasing.

    Once a variation reaches statistical significance, the confidence interval always lies entirely above or below 0.

    A confidence interval falls into one of three states: statistically significant and positive, statistically significant and negative, or not yet conclusive.

  • Statistical significance – Optimizely Web Experimentation and Optimizely Feature Experimentation show you the statistical likelihood that the improvement is due to changes you made on the page, not chance. Until Stats Engine has enough data to declare statistical significance, the Results page will state that more visitors are needed and show you an estimated wait time based on the current conversion rate.

    Multi-armed bandit optimizations do not calculate statistical significance.
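
Using the definitions above, here is a minimal sketch of how these headline numbers relate (plain Python with hypothetical counts; Stats Engine's significance and confidence-interval calculations are not reproduced here):

    # Hypothetical counts for one metric.
    baseline = {"visitors": 1000, "unique_conversions": 50, "total_conversions": 80}
    variation = {"visitors": 1000, "unique_conversions": 100, "total_conversions": 130}

    # Conversion rate (Unique conversions view): unique converters / visitors.
    base_rate = baseline["unique_conversions"] / baseline["visitors"]   # 0.05 -> 5%
    var_rate = variation["unique_conversions"] / variation["visitors"]  # 0.10 -> 10%

    # Conversions per visitor (Total conversions view).
    base_cpv = baseline["total_conversions"] / baseline["visitors"]     # 0.08
    var_cpv = variation["total_conversions"] / variation["visitors"]    # 0.13

    # Relative improvement over the baseline, as displayed for most experiments.
    improvement = (var_rate - base_rate) / base_rate                    # 1.00 -> 100%

    print(f"Baseline conversion rate: {base_rate:.2%}")
    print(f"Variation conversion rate: {var_rate:.2%}")
    print(f"Relative improvement: {improvement:.0%}")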

To learn how to capture more value from your experiments, either by reducing the time to statistical significance or by increasing the number of conversions collected, see our article on Stats Accelerator.

Filter experiment results

Use graphs, date ranges, attributes, and the baseline to analyze your results. We will show you how below.

Graphs

You can toggle between different graphs for each metric. To see or hide charts, click Hide graph or View graph.

view-graph-results-page.png

  • Improvement over Time (the default) – Improvement in this metric for each variation compared to the baseline. Improvement is relative, not cumulative. Refer to Confidence intervals and improvement intervals for more information.

  • Visitors over Time – Total visitors for the variations and the baseline

  • Conversions over Time – Conversions per day in this metric for each variation, including the original

  • Conversion Rate over Time – The cumulative conversion rate for each variation, including the original (see the sketch after this list)

  • Statistical Significance over Time – Cumulative statistical significance for the variation
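
The difference between the per-day and cumulative views is easiest to see with numbers. A minimal sketch (plain Python, hypothetical counts):

    # Hypothetical daily visitors and conversions for one variation.
    daily_visitors = [100, 120, 90, 150]
    daily_conversions = [5, 9, 3, 12]

    total_visitors = 0
    total_conversions = 0
    for day, (v, c) in enumerate(zip(daily_visitors, daily_conversions), start=1):
        total_visitors += v
        total_conversions += c
        daily_rate = c / v                                    # per-day view
        cumulative_rate = total_conversions / total_visitors  # cumulative view
        print(f"Day {day}: daily {daily_rate:.1%}, cumulative {cumulative_rate:.1%}")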

Filter by date range

Use the Date Range dropdown to select start and end dates for your Results page view. Then, click Apply. The results generated will be in your computer's time zone.

results-filter-by-date.png

Segment experiment results

By default, Optimizely Web Experimentation and Optimizely Feature Experimentation show results for all visitors who enter your experiment. However, not all visitors behave like your average visitors. Segmenting your results is a powerful way to gain deeper insights into your customers so you can design data-driven experiments and personalization campaigns.

Use the Segment dropdown to drill down into a segment of your visitors. You can segment by a single attribute value or a combination of attribute values (learn more about using combinations).

segment-results.gif

For Optimizely Web Experimentation, the default segments are:

  • Browser – Firefox, Google Chrome, Internet Explorer, Opera, Safari, Unknown

  • Source – Campaign, Direct, Referral, Search

  • Campaign

  • Referrer

  • Device

You can also segment by up to 100 custom attributes in Optimizely Web Experimentation or Optimizely Feature Experimentation.

Change the baseline

Sometimes, you may want to see how all your variations compare to one variation in particular—which may not be the original. Use the Baseline dropdown to select a different variation as the baseline.

change-baseline.png

Share experiment results

Use the Share feature to send your Results page to key stakeholders. Click Share and copy the URL provided.

share-link-resultspage.png

The Share link provides access to the Results page for that specific experiment. Users can segment data, view charts, filter by date range and more. However, they cannot navigate out of the specific experiment or campaign.

If you want to reset the link you shared, click Reset Link. Users with the previous link will no longer have access to the Results page.

Export the experiment results data

Use the Export CSV feature to download the results of your experiment in a comma-separated value (CSV) file. You can use this file to view your results data in your favorite spreadsheet program. You can also share the raw results with others, store the data on your machine or perform additional analysis.

Click Export CSV to download the CSV file of the results shown on the page (limited to the Date Range and Segment selected).

export-csv.png

Here is a reference list of columns in your exported CSV files and their meanings. You can also access your Optimizely Web Experimentation and Optimizely Feature Experimentation export data via our Amazon S3 bucket.
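
For example, you could recompute conversion rates per variation from the export. A minimal sketch (plain Python; the file name and column names below are assumptions for illustration, so check the column reference list for the exact names in your file):

    import csv
    from collections import defaultdict

    visitors = defaultdict(int)
    conversions = defaultdict(int)

    # "results.csv" and these column names are hypothetical.
    with open("results.csv", newline="") as f:
        for row in csv.DictReader(f):
            name = row["variation_name"]
            visitors[name] += int(row["visitors"])
            conversions[name] += int(row["conversions"])

    for name in visitors:
        rate = conversions[name] / visitors[name] if visitors[name] else 0.0
        print(f"{name}: {rate:.2%} conversion rate")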

Manage metrics

Click Manage Metrics to add or remove metrics or set a new primary metric. 

manage-metrics.png

Remember, if you add more than five metrics to an experiment, the additional metrics will take longer to reach statistical significance. This is because Optimizely Experimentation's Stats Engine controls the false discovery rate of your experiment, which describes the chance of making an incorrect business decision based on your results.

However, the additional metrics do not slow down your overall test. Stats Engine ensures that the primary metric (which signals whether a variation "wins" or "loses") always reaches significance as quickly as possible.
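
Stats Engine's sequential procedure is its own, but as a general illustration of what controlling the false discovery rate means, here is the classic Benjamini-Hochberg procedure applied to hypothetical p-values (plain Python; this is not Optimizely's implementation):

    # Benjamini-Hochberg: a classic FDR-controlling procedure.
    p_values = [0.001, 0.008, 0.039, 0.041, 0.27, 0.62]  # one per metric
    q = 0.05  # target false discovery rate

    m = len(p_values)
    ranked = sorted(enumerate(p_values), key=lambda kv: kv[1])

    # Find the largest rank k with p_(k) <= (k/m) * q; reject hypotheses 1..k.
    cutoff = 0
    for k, (_, p) in enumerate(ranked, start=1):
        if p <= (k / m) * q:
            cutoff = k

    significant = {idx for idx, _ in ranked[:cutoff]}
    for idx, p in enumerate(p_values):
        label = "significant" if idx in significant else "not significant"
        print(f"metric {idx + 1}: p={p} -> {label}")

    # Adding more metrics increases m, which tightens each per-rank threshold.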

Set currency symbols

You can change the currency symbol displayed for revenue metrics using the Currency Picker, located under the Advanced tab in Project Settings.

Changing the currency symbol does not convert revenue values based on an exchange rate. It only changes the symbol displayed. So a revenue value displayed as $100.00 would become ¥100 after picking "¥", regardless of the current exchange rate between USD and JPY. Learn more about tracking revenue in Optimizely Web Experimentation and Optimizely Feature Experimentation.

The currency symbol setting you select will be remembered for all sessions on your current browser. When logging in from a new device or browser, you must set the symbol again.

Edit experiment

Click Edit Experiment to make changes to your experiment. Use this option to pause a variation or adjust your traffic distribution.

edit-experiment.png

Reset results

The Reset Results button allows you to reset your Results page. This affects the data displayed on the Results page and what Stats Engine calculates for this experiment.

The raw data for your experiment will still be available after the reset. If you need to access it, use the Data Export feature. This will allow you to access your Optimizely Experimentation Enriched Export data via an Amazon S3 bucket.

reset-results.png

When you click the Reset Results button, a modal opens. The modal asks you to confirm that you understand resetting is a permanent action for the Results page, which also protects against accidental resets.

un-checked-reset-results.png

Once you check all three boxes, the modal's Reset Results button will become clickable. This is indicated by the button's color turning from gray to red. 

reset-results-all-checked.png

Clicking Reset Results closes the modal. You may need to refresh the page to see your change reflected. The Results page now appears as though the experiment has just started and there is no data yet: all visitor counts, the experiment duration, and metrics return to 0. A notation above the Reset Results button shows the date of the last reset, and the Date Range shows only the current date.

Resetting the results does not reset variation bucketing assignments; the reset is purely a front-end "wipe." As stated above, the underlying raw data events for the past results remain available through data exports. After a reset, visitors continue to see the variation they were previously bucketed into and are counted toward the correct variation.
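
Bucketing is deterministic: the same visitor ID maps to the same variation every time, which is why assignments survive a reset. A minimal sketch of the idea (plain Python using a generic hash; Optimizely's actual bucketing algorithm differs):

    import hashlib

    def bucket(visitor_id, experiment_id, variations):
        """Deterministically map a visitor to a variation (illustrative only)."""
        key = f"{visitor_id}:{experiment_id}".encode()
        point = int(hashlib.sha256(key).hexdigest(), 16) % 10000
        # Even split: each variation owns an equal slice of the 0-9999 range.
        slice_size = 10000 // len(variations)
        return variations[min(point // slice_size, len(variations) - 1)]

    # The same visitor always lands in the same variation.
    print(bucket("visitor-42", "exp-1", ["original", "variation_1"]))
    print(bucket("visitor-42", "exp-1", ["original", "variation_1"]))  # identical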

Troubleshoot the Results page

Optimizely Experimentation aims to present your results data as soon as possible, so it caches your results data to have it ready to display on the Results page. Optimizely fetches the latest data and refreshes the Results page automatically.

still-loading.png

In cases where Optimizely cannot fetch the cached data or the results have not been calculated yet, Optimizely will calculate the data in the backend and automatically reload the page when ready.

loading-no-cache.png

In case of a fatal error, you will see the following:

no-results.png

After a few minutes, try refreshing the page. If the problem persists, refer to the Optimizely Experimentation troubleshooting documentation or contact support.