A/B testing (legacy)

A/B testing add-on for Optimizely Content Management System (CMS 11) is now open-source

For complete information, see The A/B testing addon for Optimizely CMS is now open-source by Kevin Shea.

This announcement applies to the add-on for Optimizely Content Management System (CMS) version 11 and does not pertain to the CMS version 12 package. A .NET 6 version of the add-on is currently available and still supported to ensure stability.

  • The source for the add-on is now open and available at episerver/ab-testing (github.com). Fork the repository to add, update, or replace capabilities as relevant to your projects.
  • Optimizely will continue to actively support the add-on for three months, ending on November 15, 2022.
  • After November 15, 2022, Optimizely’s official contributions to the project end. The source for the add-on will remain available to the community and free to use or modify as you see fit.

A/B testing lets you create variations for many page elements (blocks, images, content, buttons, form fields, and so on), and then compare which variation performs best. It measures the number of conversions obtained from the original (control) versus the variation (challenger), and the one that generates the most conversions during the testing period is typically promoted as the design for that page. Optimizely A/B testing has several predefined conversion goals you can use when setting up a test, and Optimizely developers can also create customized conversion goals.

See also Optimizely Web, a powerful front-end A/B and multi-page experimentation product.

The Optimizely Digital Experience Platform contains many features to support you in your daily work. Depending on how your solution is set up, some features described in this documentation may not be available.

Video tutorial: A/B testing (5:07 minutes)

How it works

Let's say you want to know whether a different advertisement can generate more interest from your site visitors. Using A/B testing, you create two page versions with two advertisements linking to a target page. You set the A/B test to use the conversion goal Landing Page, which measures how many visitors click the advertisements and reach the target page.

  1. When visitors view your A/B test page, each visitor sees either the original (A / Control) or the variation (B / Challenger) version. A/B testing logs which version each visitor sees, and a returning visitor sees the same version (A or B) throughout the test. However, visitors who clear their cookies and revisit the test page are treated as new visitors in the test. (A sketch of this assignment logic follows this list.)
  2. If a visitor clicks the advertisement, the target page displays and A/B testing logs the action as a conversion.
  3. When the test duration ends, the version that achieves the best results (the most clicks) is declared the test winner.
  4. Depending on your site configuration, you can manually pick a winner (usually the one with the most conversions), or the winner is published automatically when the test completes. Test winners are only published automatically if the test results are statistically significant. For information on the statistical significance of A/B tests, see the Statistical significance section below.
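The assignment behavior in step 1 can be sketched as follows. This is a minimal illustration in Python, not the add-on's actual implementation; the cookie name, the function, and the participation handling (an option described later under Start an A/B test) are hypothetical.

```python
import random

VARIANT_COOKIE = "abtest_variant"  # hypothetical cookie name, for illustration only

def assign_variant(cookies: dict, participation_pct: int = 100) -> str:
    """Assign a visitor to A (control) or B (challenger), or exclude them.

    The variant is stored in a cookie so a returning visitor sees the same
    version throughout the test; clearing cookies makes them a new visitor.
    """
    if VARIANT_COOKIE in cookies:
        return cookies[VARIANT_COOKIE]  # sticky: returning visitors keep their variant
    if random.uniform(0, 100) >= participation_pct:
        variant = "excluded"  # sees A (control) but is not counted in the statistics
    else:
        variant = random.choice(["A", "B"])  # included visitors are split 50/50
    cookies[VARIANT_COOKIE] = variant
    return variant
```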

How an A/B test works

Statistical significance

Statistical significance is a calculation that determines whether test results can be considered "significant". It is a function of the number of views and conversions of the variants. If one version is winning by a wide margin but has relatively few views, it can still be calculated as the statistically significant winner of the test, while a test with many more views, where the variants' conversion rates run much closer, might produce results that are not considered significant. In theory, statistical significance could be evaluated at any point during a test, but A/B testing waits until the test finishes before calculating whether the results are significant.

So, how many views are needed to ensure statistical significance? It depends on the margin by which the winning variant is ahead.
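As an illustration, the sketch below implements a standard two-proportion z-test, a common way to make this kind of significance calculation. It is offered as an approximation only; the add-on's exact formula is not documented here.

```python
import math

# Two-sided critical z-values for common confidence levels.
CRITICAL_Z = {0.90: 1.645, 0.95: 1.960, 0.98: 2.326, 0.99: 2.576}

def two_proportion_z_test(views_a, conv_a, views_b, conv_b, confidence=0.95):
    """Return (z, significant) comparing the conversion rates of A and B."""
    p_a, p_b = conv_a / views_a, conv_b / views_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p = (conv_a + conv_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    return z, abs(z) >= CRITICAL_Z[confidence]

# Few views but a wide margin can be significant...
print(two_proportion_z_test(200, 10, 200, 30))        # z ~ 3.33 -> significant
# ...while many views with a narrow margin may not be.
print(two_proportion_z_test(10000, 500, 10000, 520))  # z ~ 0.64 -> not significant
```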

Confidence level

The confidence level set in Advanced options is used in the significance calculation: it determines how much variance the results may contain before they are considered statistically significant. The higher the confidence level you select, the more certain the calculation must be that a variant is winning by a statistically significant margin. Typically, more data in a test means the standard deviation goes down, and thus the confidence in the results goes up.

When the test is completed and the results have been calculated using the selected confidence level, the outcome is reported at the top of the Pick a Winner screen.
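Using the hypothetical two_proportion_z_test sketch above, a higher confidence level raises the critical value the z-score must clear, so a borderline result can be significant at 90% confidence but not at 99%:

```python
# Borderline result: z ~ 1.97 clears the 90% critical value (1.645)
# but not the 99% one (2.576).
z, sig_90 = two_proportion_z_test(1000, 100, 1000, 128, confidence=0.90)
_, sig_99 = two_proportion_z_test(1000, 100, 1000, 128, confidence=0.99)
print(round(z, 2), sig_90, sig_99)  # 1.97 True False
```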

Start an A/B test

  1. Start with a published version of a page or block as the original (A / Control). For example, you have a site devoted to air travel tips and want to get visitors interested in exploring your site. Will a fancy graphic button get more click-through than a plain text button?
  2. Create a draft by changing the button or making some other change to the page:
Image on version A / Control: Original button
Image on version B / Challenger: New button
  3. Select Publish? > A/B Test changes. Do not publish the changed page. The A/B test view appears, showing A / Control and B / Challenger thumbnail images.

    If you are using Content approvals, set your draft to Ready for Review and let it be approved before you start the A/B test.

    A/B test screen

    A/B test screen part 2

    A/B test screen part 3

  4. Configure your A/B test by setting the following options:
    Option – Description
    Test Goal – Enter your hypothesis for the test. This is for your information only.
    Conversion goals – Select the conversion goal or goals that you want to measure. (Conversion goals are also known as key performance indicators, KPIs.) You can add up to five conversion goals for the A/B test, and under Advanced Options, you can decide whether some goals are more or less important than others.
    • Landing Page. Select a target page to which the visitor is taken when the visitor clicks through. Only a click-through is counted as a conversion.
    • Site Stickiness. Select a target page and a timeout period (1-60 minutes). The A/B test counts a conversion if a visitor goes from the target page to any other page on your website within the time period. If the visitor closes the browser and opens the target page again within the time period, no new page view is counted, but a conversion is still counted if the visitor goes from the target page to another page during that second visit.
    • Time on Page. Enter a time in seconds. The A/B test counts a conversion when a visitor stays the defined time on the test page.
    • Add to Cart. Select a product a site visitor can add to a cart. If a visitor adds that product to a cart, it is counted as a conversion.
    • Average Order. Select this conversion goal to track completed orders on each test page. The conversion goal totals the values of Optimizely Commerce Connect carts created by visitors included in the A/B test. The test determines which page variant creates the highest average value for those carts when picking a winner. If a visitor creates multiple carts, the (purchased) carts are included in the total, meaning the visitor can “convert” many times in the test duration. On Optimizely Commerce Connect websites using different currencies, the test converts the carts to the same currency.
    • Purchase Product. Select a product a site visitor can buy. If a visitor buys that product, it is counted as a conversion.
    You need Optimizely Commerce Connect to use Commerce-related conversion goals such as Add to Cart, Purchase Product, and Average Order.
    Participation percentage – Enter the percentage of total site traffic to include in your A/B test.

    If you set it to 100%, all website visitors participate in the test. Half of the test participants see version A, and half see version B.

    However, you may not want that many visitors to see version B if it includes something that might be unsuccessful. In that case, lower the percentage of visitors included in the test. Visitors not included in the test see version A, and only visitors included in the test count in the statistics.

    Test duration – Specify the number of days you want the test to run.
    Start test – Select one of the following options. You can stop the test before the specified number of days has passed.
    • Start test immediately. Select this option and click Start Test after you specify the test parameters.
    • Schedule for later. Select this option and a date picker appears. Select a date and time to start the test. Click the Schedule Test button after you specify the test parameters.
    Advanced Options
    • Balance the importance of test goals. Select whether a goal is more or less important than the others. If two conversion goals are both set to High (or both to Low), it is the same as leaving them at Medium: they have equal importance and do not weight the test result. Similarly, if you add only a single conversion goal, the selected weight does not affect the test result. (A sketch of this weighting follows these steps.)
    • Confidence level. Select the confidence level of statistical significance you want from the results you gather. The higher the confidence level you select, the more “sure” the calculation has to be to determine that the winning variant is winning by a statistically significant margin.
  5. Click Start Test if you set the test to start immediately, or Schedule Test if you scheduled the test for later.
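The goal-weighting behavior described under Advanced Options can be sketched as follows. The weight values here are hypothetical; the point is that equal weights cancel out, which is why two goals both set to High (or Low) behave the same as two goals left at Medium.

```python
# Hypothetical importance weights, for illustration only.
WEIGHTS = {"Low": 1, "Medium": 2, "High": 3}

def weighted_score(goal_results):
    """Combine per-goal conversion rates into one weighted score.

    goal_results: list of (conversion_rate, importance) tuples, one per KPI.
    """
    total = sum(WEIGHTS[importance] for _, importance in goal_results)
    return sum(rate * WEIGHTS[importance] for rate, importance in goal_results) / total

print(weighted_score([(0.10, "High"), (0.05, "Low")]))   # 0.0875, leans toward the High goal
print(weighted_score([(0.10, "High"), (0.05, "High")]))  # 0.075, same as both Medium
```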

View a running A/B test

  1. To view a running A/B test, open your test page, and click View Test on the notification bar.

    View Test on the notification bar

    The test results are displayed, and a flame graphic (the leader icon) shows which version is leading.

    A/B test results screen

    Beneath the two thumbnail images, you can view the currently collected test data, such as views, number of conversions, and conversion rates. If you are measuring towards multiple conversion goals, you see how each goal performs and what weight each is given. If you measure towards only one goal, you see the test data and a pie chart visualizing the conversion rate. The conversion rate is shown as a percentage, or as an amount if you use the Average Order KPI. (The pie chart is not displayed for the Average Order KPI.)

    A/B testing normally calculates the views as the number of times a page was displayed to a visitor. However, when testing a block, A/B testing counts the number of times CMS requested the block. If you have a condition set on your block so it is only displayed to certain visitor groups, a view may be counted even though it has not been displayed to a visitor.
    The statistical significance of the test is calculated when the test is finished. Before that, it is impossible to say whether the test results are significant.
  2. You can select the following actions from the Options menu:
    • Pick The Winner. If you see enough data before the test is completed, you can stop the test and pick a winner. For example, perhaps the changed page is a clear runaway winner, so another few days of testing may not significantly affect the result.

      If you select Pick The Winner, the Pick the Winner view displays. The leader is highlighted in green. Click Pick The Winner, and it is automatically published. After you select a winner, the loser is added to the Versions gadget as a historical artifact.

      Pick the Winner screen

    • Abort A/B test. Stop the test and discard the results.

Pick a winner

Depending on your site configuration, a test winner can be published automatically at the end of the test, or you can publish it manually during or after the A/B test.

Publish a test winner automatically

An administrator can set up your site to automatically publish A/B test winners at the end of a test if the test result is statistically significant. If this setting is enabled, it affects all tests on your site. When a test finishes, the test winner is published. However, if the test result is not statistically significant, you must manually publish one of the test versions.
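The end-of-test decision described above can be summarized in a short sketch; the names and types are hypothetical and mirror only the behavior this section describes.

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    winner: str        # "A" or "B"
    significant: bool  # whether the result is statistically significant

def resolve_test(result: TestResult, auto_publish_enabled: bool) -> str:
    """Auto-publish only when enabled site-wide and the result is significant."""
    if auto_publish_enabled and result.significant:
        return f"publish version {result.winner} automatically"
    return "an editor must manually publish one of the test versions"

print(resolve_test(TestResult("B", True), auto_publish_enabled=True))   # auto-publish
print(resolve_test(TestResult("B", False), auto_publish_enabled=True))  # manual pick
```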

Publish a test winner manually

If you have publishing rights, you can publish a test winner while the test runs or wait until it finishes.

  1. To view a finished A/B test, open your test page, and click Pick winner on the notification bar.

    Pick winner notification bar

    The test results are displayed.

    Pick The Winner screen

    At the top of the test result screen, you can see if the results are statistically significant.

    Test results not significant

    The test winner is highlighted with a green background and a trophy icon. The Pick The Winner button of the test winner is green, but you can publish either version.

    Beneath the two thumbnail images, you can view test data, such as views, number of conversions, and conversion rates. If you are measuring towards multiple conversion goals, you see how each goal has performed.

  2. Click Pick The Winner on the version you want to publish; it is published immediately.

    The loser is still available in the Versions gadget.

Manage A/B tests

You cannot edit a page's test settings or content while the test is running because you could invalidate the results. If you need to change the test settings or something on the test page, you must cancel the test, make your changes, and start the test over. You can cancel the test from the Options menu in the test view or from the test page. If you open a draft of the test page, the Options menu is called Publish.

Use the Tasks tab in the navigation panel to find A/B tests.

A/B tests in Tasks pane  
Scheduled Tests – Displays links to tests that are scheduled to run at a later time.

Active Tests – Displays links to active test pages that are collecting data. Click an item to display the test page, where you can click the View test link to display the snapshot of the result data.

Completed Tests – Displays links to completed tests. Data is no longer being collected. A winner has not yet been published.

Archived Tests – Displays links to completed tests where a winner has been published.

For other statuses in the Tasks bar, see Control the publishing process.

View completed and archived tests

In the Tasks panel, you can see A/B tested pages by selecting Active Tests, Completed Tests, or Archived Tests. (A completed test is finished, but a winner has not yet been published; an archived test is a completed test whose winner has been published.)

To view the individual tests run on a specific page, add the Archived Tests gadget to the navigation or assets panel and open a tested page. The gadget displays archived tests run on the current page. Click a test in the gadget to view the test details.

Archived Tests gadget