Experiment results overview

In Optimizely, an experiment is a structured test that compares variations of a webpage, feature, or experience to determine which performs best. Experiments help you optimize user experiences, increase conversions, and make data-driven decisions. All experiments created within Optimizely are visible on the Experiments page in Optimizely Warehouse-Native Experimentation Analytics.

Optimizely Analytics supports A/B tests and multivariate tests.

Optimizely offers different types of experiments for various objectives. Some experiments focus on learning and long-term insights, such as A/B tests, multivariate tests, and Stats Accelerator, which incorporate statistical significance calculations. Others prioritize immediate impact, like multi-armed bandits (MABs) and contextual bandits, which optimize for short-term gains without statistical significance analysis. Learn about distribution modes and experiment types.
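
The practical difference lies in how traffic is allocated. An A/B test holds the split fixed and waits for statistical significance, while a bandit continually shifts traffic toward whichever variation is currently performing best. The following epsilon-greedy policy is a minimal conceptual sketch of the bandit approach only, not the algorithm Optimizely uses:

```python
import random

def epsilon_greedy(counts, rewards, epsilon=0.1):
    """Pick a variation index: explore uniformly at random with
    probability epsilon, otherwise exploit the variation with the
    best observed mean reward so far."""
    if random.random() < epsilon:
        return random.randrange(len(counts))
    means = [r / c if c else 0.0 for r, c in zip(rewards, counts)]
    return max(range(len(means)), key=means.__getitem__)
```

Because a bandit exploits early winners instead of preserving a fixed split, its results are not analyzed for statistical significance the way A/B and multivariate tests are.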

Access experiments

Go to Experiments to see your most recent experiments. Refine the list by adjusting filters (such as date range) or selecting one or more Types.

Select an experiment to open its scorecard.

opti-es-10.png

Experiment results

Each experiment scorecard has a Summary tab and an Explore tab. Learn how to access Experiments.

The Summary tab has key insights from the selected experiment to support decision-making.

The Explore tab lets you further analyze data within the experiment and its variations.

The following options are available on the Explore tab: 

explore-options-bottom.png

Summary tab

The Summary tab shows the selected experiment, its configured decision-making metrics, and the results in the visualization window. Use graphs, date ranges, attributes, and the baseline to determine results. You can modify the scorecard configuration within the tab. Learn more about creating scorecards.

summarytab.png

Edit experiment in Feature Experimentation

To change experiment settings, click the external-link icon to open the flags section in Feature Experimentation.

editexp.png

Change the baseline

The Baseline option lets you compare your variations against a chosen variation instead of the original. To do so, select your preferred variation from the Baseline drop-down list.

Manage metrics

You can add a primary or guardrail metric and remove or edit previously added metrics.

adding-metrics.png

Visualization options

On the Summary tab, the Visualization area provides segmentation options (Segment and Group By) and graph options (Improvement Over Time, Results Over Time, and Statistical Significance Over Time). You can toggle between the graphs for each metric.

exploreoptions.png

Sample ratio mismatch (SRM) detection

A sample ratio mismatch (SRM) occurs when users are unexpectedly imbalanced across your experiment's variations. An imbalance can signal issues with your experiment configuration or external factors, which may invalidate your results. Learn about sample ratio mismatch (SRM) detection.

Click Check SRM status in the Experiments section to see the latest health status of your experiment traffic distribution.

srm-checkhealth.png
In Web Experimentation and Feature Experimentation, the SRM status updates automatically on page load. In Analytics, refresh the page to update the status. Triggering this update does not run a real-time analysis. It retrieves and displays the most recent status from automated checks that run periodically in the background.
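
Conceptually, an SRM check compares the observed visitor counts per variation against the expected traffic split using a goodness-of-fit test. The sketch below, a chi-squared test for a two-variation experiment, is illustrative only and is not the exact procedure Optimizely runs in the background:

```python
import math

def srm_check(observed, expected_ratios, alpha=0.001):
    """Chi-squared goodness-of-fit test for a two-variation split.

    Returns (p_value, srm_detected). With two variations there is one
    degree of freedom, so the chi-squared survival function reduces to
    erfc(sqrt(x / 2)) and no stats library is needed.
    """
    total = sum(observed)
    expected = [total * r for r in expected_ratios]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return p_value, p_value < alpha

# A 50/50 split that drifted: 10,000 vs. 10,600 visitors is flagged,
# while 10,000 vs. 10,050 is within normal random variation.
p, srm = srm_check([10000, 10600], [0.5, 0.5])
```

A very low p-value (conventionally below 0.001 for SRM checks) means the observed imbalance is unlikely to be random chance, which signals a configuration or data-delivery problem worth investigating before trusting results.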

Health check overview

Check health verifies the integrity of the data used in experiments to ensure accurate results and reliable decision-making. Check health runs the following verifications:

  • Dataset primary key uniqueness – Runs a primary key check on the actor dataset to verify that each actor identifier is unique. Learn about the primary key health check.
  • Actor identifier alignment – Compares actor identifiers across the event, decision, and actor datasets. Significant misalignment usually means a wrong column was selected during experiment configuration, or there is a broader data-integrity issue to investigate.
  • Single variation per actor – Counts actors that were assigned conflicting variations and excludes them from the analysis. A high count usually indicates the experiment is misconfigured.
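
These verifications are conceptually simple set operations over the three datasets. The following sketch uses made-up in-memory rows and column names, not Optimizely's actual schema or warehouse queries, to show what each check looks for:

```python
# Hypothetical stand-ins for the actor, event, and decision datasets.
actors = ["u1", "u2", "u3"]
events = [{"actor_id": "u1"}, {"actor_id": "u2"}, {"actor_id": "u4"}]
decisions = [
    {"actor_id": "u1", "variation": "A"},
    {"actor_id": "u2", "variation": "B"},
    {"actor_id": "u2", "variation": "A"},  # conflicting assignment
]

def primary_key_unique(ids):
    """Dataset primary key uniqueness: each actor id appears once."""
    return len(ids) == len(set(ids))

def identifier_alignment(event_rows, actor_ids):
    """Actor identifier alignment: share of event actor ids that also
    appear in the actor dataset. A low share suggests the wrong column
    was selected during experiment configuration."""
    known = set(actor_ids)
    matched = sum(1 for row in event_rows if row["actor_id"] in known)
    return matched / len(event_rows)

def conflicting_actors(decision_rows):
    """Single variation per actor: actors assigned more than one
    variation, which the analysis excludes."""
    seen, conflicts = {}, set()
    for row in decision_rows:
        prev = seen.setdefault(row["actor_id"], row["variation"])
        if prev != row["variation"]:
            conflicts.add(row["actor_id"])
    return conflicts
```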

Each check returns one of four health statuses. The status determines which data configuration you need to adjust:

  • Healthy – The data passed the check.
  • Unhealthy – The check found a critical data-integrity issue.
  • Warning – The check detected a potential issue that could affect the accuracy of your results.
  • Skipped – The check did not run because the primary key configuration is invalid (the selected columns are incompatible or misconfigured).
exp-health-2.png

Share experiment results

You can share the Results page with stakeholders using one of the following methods:

  • Email – Click the share icon, enter an email address, and click Share.
  • Copy link – Click the link icon, then copy and send the provided URL.
shareexp.png

Graphs

Graphs provide a granular view of the data. Choose from the following graph types:

  • Improvement Over Time – Explore each variant's performance over time and track improvements and trends across versions.

    impovertime.png
  • Results Over Time – Track changes in experiment results over time.

    resultovertime.png
  • Statistical Significance Over Time – View changes in the statistical significance of different variants over time.

    statsigovertime.png

Explore tab

The Explore tab lets you compare segments, run funnel analyses, and perform other investigations.

experiments-2.png

Exploration summary with Optimizely Opal

Prerequisites

  • You must use Opti ID to access Opal.
  • Your Optimizely Analytics instance must be enabled for Opti ID.
  • You must have generative AI enabled in Optimizely.

If you use Opti ID, administrators can turn off generative AI in the Opti ID Admin Center. See Turn generative AI off across Optimizely applications.

Optimizely Opal interprets and summarizes the data in your explorations, so you do not have to scan visualizations and tables manually. Click the summarize icon in the visualization window to generate a summary of your exploration.

Summaries are not available for cohorts, metrics, dashboards, or other entities in Analytics.

exploration-summary-1.png

The chat displays the following information:

  • A brief summary of your exploration
  • Key takeaways
  • Next steps and suggestions

exploration-summary-2.png

Segment experiment results

You can segment your results by cohorts and attributes. 

  • Segment – Segment your results by a chosen cohort of actors. 

    segbycohort.png
  • Group By – Refine your results using one or more attributes.

    grpbyprop.png

Add tiles

Click + Add Tile to customize your visualization window.

  • New Visualization – Add an exploration to the Explore tab.
  • Existing Visualization – Select an existing exploration and add it directly to the Explore tab.
  • Filter – Add filters that you can use to narrow down data in a visualization.
  • Cohort Filter – Use cohorts to narrow down data in a visualization. 
  • Parameter – Modify the value of any placeholder parameters used in the queries of linked visualization tiles. 
  • Experiment – Add an experiment.
  • Text – Add text blocks anywhere in the Explore tab to provide context.
experiments-8.png

Adjust grid settings

Grid Settings let you alter grid configurations using the following options:

  • Grid Columns – Specify the number of columns in the grid.
  • Compact Vertically – Toggle on the compact grid view.
  • Back to default – Revert to the default grid settings. The option is enabled only if you have changed the defaults.

Click Apply to save the changes to the grid settings.

grid-icon.png

Comments

You can add comments about items in the visualization by clicking the Comment icon, entering your notes, and clicking Send.

opti-es-16.png

To edit a comment, click More (...) > Edit Comment. Make your changes, and click Confirm to save.

To delete a comment, click More (...) > Delete Comment. Click Confirm to delete.

opti-es-17.png