10 common experiments and how to build them in Optimizely Experimentation

When you start experimenting on your site, you open a new frontier where you test ideas, gather feedback, discard what does not work, and build on what does. Knowing what to test first is a challenge. How will you make an impact? Where should you start?

This article provides a few quick test ideas to help you jumpstart your program—with guidance on building each one. Each features a short story, then walks you through a basic test setup. Use these to learn about setting up experiments and explore ideas for your own site.

Game-changing, future-shaping innovation doesn’t burst forth fully-formed like Athena from the head of Zeus. Nor is it the result of meticulous planning ... it’s a product of experimentation.

 - Polly LaBarre, founding member of Management Lab and Fast Company

If you want to try building an experiment like the ones below, follow these six steps.

See how over 35 leading companies execute winning experiment ideas in our Big Book of Experimentation.

Idea 1: Optimize for geographical differences

Neha's market research shows the top concerns of her North American and European customers differ from each other. In the U.S., her product's biggest selling point is the flexible, developer-friendly experience it offers. In European markets, most sales conversations start out with questions about privacy and security. Neha wants to experiment with different messaging on her site for European customers versus U.S. customers. She thinks that security-focused branding may make her product more compelling for European consumers.

Hypothesis

The product's value proposition focuses on the developer experience, which is a huge selling point for North American customers. This misses the top concern for European companies: security. If we show a security-focused value proposition in the hero banner of a high-traffic landing page for European visitors, the number of leads from that region will increase.

Pages

The URL for this landing page.

Audiences

Visitors in Europe

Metrics

  • Clicks to “View pricing” (primary)

  • Clicks to the “Explore the docs” buttons (secondary)

  • Pageviews for the pricing and form pages (secondary)

  • Clicks to the form submission button (secondary)

Editor

Original: 

Variation: 

  1. Select the variation.

  2. Click to select the value proposition text.

  3. Under HTML, enter a new value proposition. Click Save.

  4. QA with the Preview tool.

  5. When you are ready, click Start Experiment.
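The steps above require no code, but if you prefer to work in code, the same change can be made with a few lines of custom JavaScript in the variation. A minimal sketch, assuming a hypothetical .hero-value-prop selector and example copy; adjust both to match your page.

  // Swap the hero value proposition for security-focused messaging.
  var hero = document.querySelector('.hero-value-prop'); // hypothetical selector
  if (hero) {
    hero.textContent = 'Enterprise-grade security and privacy, built in.'; // example copy
  }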

Technical skills

None

Interpret results

If the variation wins, Neha has confirmation that her hypothesis gets something right. Security-focused value messaging does increase conversions from visitors in Europe. She may want to build on this insight by expanding this branding across the site.

If the variation loses, the results do not support Neha's hypothesis. Her next step is to figure out why. Do visitors in Europe who land on this page still value developer tools over security? Maybe developers, not program managers, make decisions based on this page. She might dive into her data and research to learn more that she can apply in future tests.

An inconclusive test tells Neha that her hypothesis is off or the test is not bold enough. Visitors continued to behave in the same way. She could try a more radical change, like switching out all the copy and imagery on the page.

More resources

More test ideas for B2B and lead generation.

Idea 2: CTAs for new visitors versus subscribers

David is in charge of the landing page for a streaming video site. Currently, it features a call to action (CTA): "Join free for a month." All visitors see this page, but research shows that subscribed customers find the button confusing. They have already joined, so the CTA does not quite apply to them. David knows his company cares about logins, so he wants to test a separate CTA for subscribers that focuses on logins.

Hypothesis

When subscribed customers land on this page, the "Join free" CTA is confusing because they have already joined. If we change this button to match the returning visitor's mindset, we can increase logins.

Pages

The URL of the landing page.

Audiences

Returning visitors

Metrics

  • Clicks to the CTA button (primary)

Editor

Original: 

Variation:

  1. Select the variation.

  2. To edit the CTA, click the text to select it. Under HTML, edit or replace the text with your new CTA. To remove extra text, click to select it and, under Visibility, click Removed.

  3. Select Save.

  4. QA with the Preview tool.

  5. When you’re ready, click Start Experiment.

Technical skills

None

Interpret results

If the variation wins, the results support David's hypothesis: returning visitors respond well to a CTA about seeing what is next. Next, he might expand on this theme or test variations that evoke the returning visitor's mindset. If he has Personalization, he might feature a show that the visitor has watched or highlight popular shows for a certain audience segment.

If the variation loses, the results do not support David's hypothesis. He should investigate if any assumptions that he made might be wrong. Are returning visitors looking to do something different from what he thought? Is there something else on the page that influences their decision to log in? A loss here can give him insight into how subscribed customers react to this page.

An inconclusive test suggests that the hypothesis is off or the change is not bold enough—visitors continue to behave in the same way. He can try clearer, more distinct messaging or move on to a more urgent experiment.

More resources

More test ideas for media and publishing sites.

Idea 3: Remove distractions from the checkout funnel

Jessica’s team has a mandate to improve the purchase rate of her company’s retail site. They have dug into the data and noticed that a high percentage of customers leave the checkout flow through the breadcrumb navigation. Jessica believes that she can improve the purchase rate by removing distractions from the checkout funnel. She would like to validate this idea by testing a less cluttered navigation.

If you are using Optimizely Experimentation to test on a checkout page, you might need to configure your site for PCI compliance. See this article for details.

Hypothesis

Customers are distracted from completing their purchases by navigation elements. If we remove those distractions, we should see the purchase rate increase by 7%.

Pages

Every URL in the checkout flow.

Hint: You can use substring match if all your checkout pages follow a pattern like “yourdomain.com/checkout.”
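For example, a substring match on “yourdomain.com/checkout” targets any URL that contains that text, such as:

  https://www.yourdomain.com/checkout/cart
  https://www.yourdomain.com/checkout/shipping
  https://www.yourdomain.com/checkout/payment

The exact paths are hypothetical; the point is that a single substring rule can cover every step of the flow.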

Audiences

Everyone

Metrics

  • Clicks to the continue button on each step of the checkout (primary)

  • Pageviews for the purchase confirmation page (secondary)

  • Clicks to the Submit Order button (secondary)

  • Pageviews for each step in the checkout flow (secondary)

Editor

Original:

Variation:

  1. Select the variation.

  2. Next to Changes, click Create.

  3. Click Element Change > Select the element you would like to remove, such as the container that includes the breadcrumb navigation. Under Visibility, click Hidden to hide the element without removing it from the page, or Removed to take it out entirely. Then, click Save.

  4. QA with the Preview tool.

  5. When you are ready, click Start Experiment.

Hint: Having trouble loading your page in the Editor? Check out this article on loading session-specific pages.
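If you would rather make this change in code, a short snippet of variation custom JavaScript can hide the navigation on every checkout step. A minimal sketch, assuming a hypothetical .checkout-breadcrumbs selector; swap in whatever selector your checkout pages actually use.

  // Hide the breadcrumb navigation on every page targeted by the experiment.
  var style = document.createElement('style');
  style.textContent = '.checkout-breadcrumbs { display: none; }'; // hypothetical selector
  document.head.appendChild(style);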

Technical skills

None

Interpret results

If the variation wins, Jessica’s team learns that there is merit to the hypothesis. They know that visitors are leaving the funnel because they have been given the opportunity, not because they do not intend to complete their purchase. Jessica's next step might be to shorten the checkout flow or remove other distractions from the funnel.

If the variation loses, the team’s assumption about visitors dropping off is not correct; visitors may be leaving on purpose. They might investigate if there is a negative or confusing experience somewhere along the way.

An inconclusive test suggests that the hypothesis is off or the change is not bold enough—visitors keep leaving at approximately the same rate. Jessica may investigate where and how visitors are leaving the funnel, and why.

More resources

A few more test ideas for your checkout funnel from our blog.

Test ideas for E-commerce and retail businesses.

E-Commerce: Tips for good experiment design.

Idea 4: Rearrange subscription price-points

Emmanuel has noticed that the mid-priced “Plus” plan on his page receives a surprisingly high percentage of subscribers, compared to the free “Basic” and upgraded “Premium” plans. This distribution also matches the click-through rate to the plan details pages. Emmanuel suspects that the position of options on the plans page is affecting interest and purchases. He would like to test a different arrangement to encourage more customers to evaluate and subscribe to the “Premium” plan.

Hypothesis

A smaller-than-expected percentage of visitors click the Basic or Premium plans on the pricing page; this may be due to positioning. If we move the Premium plan to the center of the page, we will see a lift in interest and subscriptions.

Pages

The URL for the plans page.

Audiences

Everyone

Metrics

  • Clicks to the Go Premium button (primary)

  • Clicks to the Sign up and Get Plus buttons (secondary)

  • Plan details page pageviews (secondary)

  • Purchase confirmation pageviews (secondary)

  • Revenue (secondary)
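Click and pageview metrics are created in the interface, but a revenue metric needs your site to report the order value. One way to do that is to push an event to the Optimizely snippet from the purchase confirmation code. A sketch, assuming a custom event named purchase has already been created in Optimizely; the event name and orderTotal are placeholders, and revenue is reported in cents.

  // Report the completed purchase and its value to Optimizely.
  var orderTotal = 49.99; // example value; your checkout code supplies the real total in dollars
  window['optimizely'] = window['optimizely'] || [];
  window['optimizely'].push({
    type: 'event',
    eventName: 'purchase', // placeholder custom event name
    tags: {
      revenue: Math.round(orderTotal * 100) // revenue is tracked in cents
    }
  });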

Editor

Original: 

Variation: 

  1. Select the variation.

  2. Next to Changes, click Create.

  3. Click Element Change > Select the container for the element (it will be highlighted).

  4. Under Rearrange, click the selector icon. Then, click the element you would like to position the selected container in relation to (the “Plus” container, in this example).

  5. Use the dropdown menu below Rearrange to re-position the element. Then, click Save.

  6. QA with the Preview tool.

  7. When you are ready, click Start Experiment.
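The Rearrange option handles the reordering for you; in custom code, the same move is a single insertBefore call. A minimal sketch, assuming hypothetical .plan-premium and .plan-plus selectors for the two plan containers.

  // Move the Premium plan in front of the Plus plan so it takes the center slot.
  var premium = document.querySelector('.plan-premium'); // hypothetical selectors
  var plus = document.querySelector('.plan-plus');
  if (premium && plus && plus.parentNode) {
    plus.parentNode.insertBefore(premium, plus);
  }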

Technical skills

None

Interpret results

If more visitors click the Go Premium button in the variation, Emmanuel has validation that visitors are sensitive to the position of the packages on his pricing page. His result also supports the idea that visitors are more willing to consider the Premium package than he might have thought. He can keep experimenting with ways to make the Premium package more visually prominent.

If the variation loses, the results do not support Emmanuel's hypothesis. He may not have identified the right problem or the right solution. If his overall goal is to increase Premium subscriptions, he might investigate what customers like about this package and test ways to draw attention to those value propositions. 

If the experiment is not conclusive, the change may not be big enough, or customers may not be very sensitive to the position of plans. Emmanuel can leverage this insight in future designs.

More resources

More test ideas for B2B and lead generation.

Idea 5: Highlight key value propositions

Kate's company offers a number of pricing packages on the site, but research shows that customers find this list overwhelming and have trouble choosing. Based on this information, she would like to experiment with highlighting key value propositions to help customers choose between certain price points and packages.

Hypothesis

Customers are confused by the array of packages we offer. If we highlight a key value proposition, we will make it easier for customers to choose and purchases for that package will increase.

Pages

The URL for the plans page.

Audiences

Everyone

Metrics

  • Clicks to the highlighted package (primary)

  • Clicks to an Add-to-Cart button (secondary)

  • Pageviews for the package details pages (secondary)

  • Pageviews for a purchase confirmation page (secondary)

Editor

Original:

Variation:

  1. Select the variation.

  2. Next to Changes, click Create.

  3. Select Insert HTML to add a value proposition next to a package. You'll need to insert your text in relation to another element on the page—in this case, the package. Click the selector icon, then click to select the package you'll highlight.

  4. Under HTML, enter your text. Then, under HTML Placement, reposition the inserted text as you like. 

  5. Click Save. 

  6. QA with the Preview tool.

  7. When you’re ready, click Start Experiment.
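The same effect is available in custom code through insertAdjacentHTML, which places new markup relative to an existing element. A minimal sketch, assuming a hypothetical .package-plus selector and example copy.

  // Add a short value proposition directly above the highlighted package.
  var card = document.querySelector('.package-plus'); // hypothetical selector
  if (card) {
    card.insertAdjacentHTML(
      'beforebegin',
      '<p class="value-prop">Most popular: everything in Basic, plus priority support.</p>' // example copy
    );
  }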

Technical skills

None

Interpret results

If the variation is a winner, Kate has clear confirmation that highlighting value propositions helps customers choose and purchase a package. She should also check her secondary metric for overall purchases or overall Add-to-Cart clicks to ensure that this plan increases her company's revenue overall. If revenue decreases, she may want to investigate which plan brings in a larger percentage of the revenue and test compelling value propositions for that plan type. 

If the variation loses, the results do not support Kate's hypothesis. She has not identified either the right problem or the right solution. Maybe a different type of value proposition would work better: price per unit if her customers are especially price-conscious, or CTAs for buyer personas if she learns that certain personas prefer certain packages.

If the test is inconclusive, the change probably is not big enough. Kate might consider experimenting her way through a radical redesign of the pricing page.

More resources

Case study: Seeking the global maximum.

Idea 6: Symmetric messaging

Wesley’s company is advertising a holiday sale on fitness trackers with two different ad campaigns: one focusing on holiday gift-giving and the other on outdoor adventures. He has been charged with optimizing the paid search campaigns. Wes believes he can quickly increase conversions by customizing the landing page imagery to match each ad’s messaging (rather than using a generic page for all ads).

Hypothesis

The current landing page for Wesley's site promotes holiday gift-giving. But when visitors click through from ads that highlight using a fitness tracker for outdoor adventures, they should see symmetric imagery when they land on the site itself. If he switches the imagery to match, the site will align with the visitor’s mindset and goals, and conversions will increase.

Pages

The URL of the landing page.

Audiences

Visitors who match the “Ad campaign” audience condition.

Hint: The Ad campaign audience condition is “sticky,” meaning a visitor sees the same experience every time they come back to the site. Visitors who see the adventure imagery will keep seeing mountains and visitors who see gift-giving imagery will continue to see presents, even if they visit the page without the query parameter.
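For this targeting to work, each ad has to land visitors on a URL that identifies its campaign, typically through a tracking query parameter added by your ad platform. A hypothetical example for the outdoor-adventure ads (the parameter name and value depend on how your campaigns are tagged):

  https://www.example.com/fitness-tracker-sale?utm_campaign=outdoor-adventure

The gift-giving ads would carry their own campaign value, so each audience keeps seeing the imagery that matches the ad it clicked.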

Metrics

  • Clicks to the CTA (Shop the sale) button (primary)

  • Clicks to the product detail page buttons (secondary)

  • Clicks to top-level navigation (secondary)

Editor

Original:

Variation:

  1. Select the variation.

  2. Click the image to select the container. In the left navigation, click Background and upload your new image.

  3. QA with the Preview tool.

  4. When you are ready, click Start Experiment.

Technical skills

None

Interpret results

If the variation “wins,” Wes has a good signal that symmetric messaging works well for his visitors—and that the outdoor adventure messaging resonates. He can use this information to design an even more differentiated landing page, or create a Personalization campaign to deliver more granular targeted messaging.

If the variation loses, he learns that his outdoor adventure imagery does not resonate with visitors who clicked his ad. He may want to ask why—are they looking for a different type of adventure? Is the holiday messaging still more powerful for these visitors?

If the results are inconclusive, he might want to try a more radical change if he thinks there is an opportunity worth pursuing.

More resources

Case study: Testing personalized experiences at Secret Escapes

Blog: The impact of symmetry in online marketing

Idea 7: Personalize based on cookies

Sylvia knows that her travel site adds cookies that she can leverage to create a personalized experience. She hypothesizes that she can increase browsing behaviors and purchases if she customizes the site’s homepage based on visitors’ interests. As an example, Sylvia features a hero banner that highlights a destination in the geographical area that a visitor frequently browses.

Hypothesis

Sylvia thinks that visitors who browse locations in a certain geographical area are more likely to buy vacation packages in those areas. If she features destinations in that area in the hero banner for visitors who’ve browsed more than two pages, she’ll increase browsing behaviors and conversions.

Pages

The URL of the homepage.

Audiences

Visitors who meet the audience condition for a unique cookie that Sylvia can use for personalization. For example, a cookie that tracks the geographical area of vacation destinations that a visitor most recently browsed.
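The cookie itself must be set by the site (or a tag manager) before an audience condition can match on it. A minimal sketch of the kind of site-side code that might record this, using a hypothetical cookie named last_browsed_region; the audience condition then targets that name and value.

  // Record the region of the destination the visitor is browsing so the
  // cookie audience condition can target it later.
  function recordBrowsedRegion(region) {
    document.cookie = 'last_browsed_region=' + encodeURIComponent(region) +
      '; path=/; max-age=' + 60 * 60 * 24 * 30; // keep for 30 days
  }

  recordBrowsedRegion('alps'); // e.g., called from a destination page about the Alps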

Metrics

  • Clicks to the promoted hero banner (primary)

  • Pageviews for the promoted destination (secondary)

  • Clicks to the Search button (secondary)

  • Pageviews for other destinations in the area (secondary)

Editor

Original:

Variation:

  1. Select the variation.

  2. Click the image to select the container. In the left navigation, click Background and upload your new image.

  3. QA with the Preview tool.

  4. When you are ready, click Start Experiment.

Technical skills

None

Interpret results

If the variation wins, Sylvia has confirmation that more personalized content improves browsing and purchase behaviors for her site. She may continue testing this concept by promoting vacation types instead of geographical areas. Or, she might start to deliver more granular Personalization campaigns; for example, promoting amenities and services often purchased or browsed in certain geographical areas.

If the variation loses, the results do not support Sylvia's hypothesis. She should investigate why. Maybe her visitors prefer to start with a wide array of options instead of a targeted experience. Or, maybe many of her visitors are not primarily shopping by the geographical area—the price of the vacation is more important to them.

If the test is inconclusive, the experiment may not be bold enough. Sylvia might try a lightbox or overlay for a more radical change.

More resources

Blog: What we learned by A/B testing a personalized homepage.

Blog: The Opportunist versus The Planner: Personas for travel and entertainment websites.

Inspiration for website personalization.

Idea 8: Test promotion formats

Sarah’s company offers a promotional price on a one-year subscription every year on its anniversary. The offer generates a significant jump in new subscriptions, and research shows that even customers who see the offer without purchasing respond positively.

The offer is usually promoted through an email mailing list and a note in the header. This year, Sarah plans to experiment with making the anniversary promotion more visible. She decides to test whether a site-wide banner will increase subscriptions even more.

Hypothesis

Sarah knows that her company’s anniversary sale is a successful driver of one-year subscriptions. If she announces the offer with a site-wide banner instead of the usual note in the header, even more customers will see it and sign up for the discounted subscription package.

Pages

Every page on the site.

Hint: You can use substring match if all your pages follow a pattern like “yourdomain.com/.”

Audiences

Everyone

Metrics

  • Clicks to the banner link (primary)

  • Pageviews for subscription details (secondary)

  • Pageviews for the subscription form (secondary)

  • Clicks to the submit button for the form (secondary)

Editor

Original: 

Variation:

First, create (or ask your developer to help create) an Optimizely Experimentation extension: a template that helps you add custom features like carousels and banners without continuous developer support.

Check out Optimizely Experimentation's library of pre-built, reusable extensions.

  1. Navigate to Implementation > Extensions.

  2. Click Create New > Using JSON.

  3. Copy the JSON for the butterbar extension (a thin, site-wide banner).

  4. Click Create Extension. Then, click the Actions icon and enable the extension.

Next, add the extension to your variation.

  1. Navigate to your variation.

  2. In the Editor, go to Create. When you scroll down, your butterbar extension should appear under Create Options.

  3. Click to select the butterbar and edit it as you would like. Remove the other promotion.

  4. QA your experiment.

  5. When you are ready, click Start Experiment to publish it!
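You do not need to write any of this yourself when you use the pre-built extension, but conceptually a butterbar is just a thin banner injected at the top of every targeted page. A simplified, purely illustrative sketch of that effect (the class name, copy, and link are hypothetical):

  // Illustrative only: prepend a slim promotional banner to the top of the page.
  var bar = document.createElement('div');
  bar.className = 'anniversary-butterbar'; // hypothetical class name
  bar.innerHTML = 'Anniversary sale: save on a one-year subscription. <a href="/subscribe">See the offer</a>'; // example copy and link
  bar.style.cssText = 'padding: 8px 16px; text-align: center; background: #ffd600;';
  document.body.insertBefore(bar, document.body.firstChild);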

Technical skills

None (but you may need to know JSON to create other, similar extensions).

Interpret results

If the butterbar variation wins, Sarah knows that this type of element drives more conversions than the header promotion they have traditionally used. She might decide to use this for other promotional offers. Or, she might experiment with different messaging or offers for logged-in vs. logged-out users (with cookie-based audience targeting).

If the butterbar loses, it is less successful at driving conversions than the header promotion. This is not a win, but it is an opportunity for Sarah to dig into “why.” She might look into her data to see if there are differences between segments, investigate whether the butterbar does not display correctly for some visitors or browser types, or try a different design or message.

If the test is inconclusive, Sarah might try a bolder version of the butterbar. Or, she might conclude that for this type of promotion, visitors respond similarly either way. She can use this insight when she needs to make a decision about design constraints for header or banner options in the future.

More resources

Optimizely Experimentation Extensions: Template custom features like lightboxes, banners, and carousels, so you can easily add them with the Visual Editor.

Idea 9: Optimize a form

Greg’s company relies heavily on landing pages to generate leads for the sales team. The company uses the inputs in the form to validate high-quality leads and tailor marketing efforts. But Greg has noticed that completion rates have fallen as more fields are added. 

Greg believes that he can optimize the form by balancing the quality and quantity of information. He wants to experiment with the form to see if he can identify a sweet spot: asking for just enough information without adding extra friction to the funnel.

Hypothesis

Prospective customers experience the fields in our lead generation forms as friction. If we remove extra fields and streamline the form, we should increase form completions by 5-10%.

Pages

The URL for the page that the form is on.

Audiences

Everyone

Metrics

  • Clicks to the form submission button (primary)

  • Clicks to the navigation elements on the page (secondary)

  • Clicks to other buttons on the form page (secondary)

Editor

Original: 

Variation:

  1. Select the variation.

  2. Next to Changes, click Create.

  3. Click Element Change > Select the field you want to remove. Under Visibility, click Hidden to hide the field without removing it from the page, or Removed to take it out entirely. Then, click Save.

  4. Repeat Step 3 to hide or remove the label.

  5. QA with the Preview tool.

  6. When you are ready, click Start Experiment.

Technical skills

None

Interpret results

If the new variation wins, the results support Greg’s hypothesis that form fields are friction points for visitors. He might want to experiment further with this theme by removing more fields or by using placeholder text instead of field labels for a cleaner look.

If the new variation loses, the results do not support Greg’s hypothesis. This tells him he has not identified the right problem or solution. He might choose to do more research to dig in and understand what assumptions he has made that this data contradicts. Maybe customers in his industry expect to provide a large amount of information; or something else is negatively affecting the form experience.

An inconclusive result may mean that Greg tested too small a change; he might experiment with a more radical change by removing all but the absolutely necessary fields. Greg could also declutter the visual design, or add or remove explanatory text.

More resources

Case study: A radical redesign for a lead generation form generates a big impact.

More test ideas for B2B and lead generation sites.

Idea 10: Add social proof

Byron's team put together a business intelligence report to help generate higher-quality test ideas. In the report, the team highlights the compelling impact of social proof. They do not currently feature any social proof on the site, but Byron knows the marketing team recently gathered a set of customer use cases and testimonials. He would like to experiment with adding social proof to a high-traffic spot to see if it helps customers convert.

Hypothesis

The payroll feature is relatively new and unproven in the market, but it is being showcased and gets significant traffic. If we add social proof to this page, visitors will see more evidence of success, and conversions on the "Get Started" button will increase.

Pages

The URL of the feature page.

Audiences

Everyone

Metrics

  • Clicks to the Get Started button (primary)

  • Clicks to the details page of the feature (secondary)

  • Pageviews for the details page (secondary)

  • Form submissions (secondary)
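Clicks and pageviews can be set up in the interface; for the form submission metric, one option is to push a custom event when the form is actually submitted. A sketch, assuming a custom event named lead_form_submitted has been created in Optimizely and a hypothetical get-started-form id on the page.

  // Report a successful form submission to Optimizely.
  var form = document.querySelector('#get-started-form'); // hypothetical form id
  if (form) {
    form.addEventListener('submit', function () {
      window['optimizely'] = window['optimizely'] || [];
      window['optimizely'].push({
        type: 'event',
        eventName: 'lead_form_submitted' // placeholder custom event name
      });
    });
  }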

Editor

Original:

Variation:

  1. Select the variation.

  2. Next to Changes, click Create.

  3. Select Insert HTML to add a testimonial. 

  4. You will need to insert your text in relation to another element on the page. Click the selector icon, then click the element you will use to position the new text.

  5. Under HTML, enter your text. Then, under HTML Placement, reposition the inserted text as you like.

  6. Click Save. 

  7. QA with the Preview tool.

  8. When you are ready, click Start Experiment.

Technical skills

None

Interpret results

If the variation wins, the results support Byron's hypothesis. He knows that social proof is compelling to his visitors and helps lift the conversion rate. He might build on this win by adding a short video to boost the impact of the testimonial. If he knows the industry vertical of some of his visitors, he could personalize the page with testimonials from businesses in the same vertical. Or he might look for other opportunities on the site to experiment with social proof.

If the variation loses, the results do not validate his hypothesis. He might segment his results and investigate whether the variation performed better or worse for any particular groups of visitors.

An inconclusive result may mean that the change Byron tested is too small. His next step may be to run a low-effort test that makes a bigger impact. For example, he can add a picture to his testimonial to make it more visually compelling. He could find a more dramatic quote, or feature the testimonial above the fold.

More resources

Use case: comScore increases new leads by 69% with social proof

Bonus: What not to test

Experimentation is both an art and a science. When you are just starting out, you may not know what types of tests to steer clear of. Here are three common mistakes to avoid if you want your tests to make an impact:

  1. Testing parts of your site that get low traffic. If you have a low-traffic site, we have some tips for you. But in general, avoid running experiments on pages where you do not get many visitors. It takes longer to get as many visitors into the experiment as you would on a high-traffic page—and you may not reach statistical significance while the test stalls. This includes rarely visited pages of your site as well as parts of a page most visitors do not see, like the footer of a high-traffic page or a drop-down menu.

    In terms of opportunity cost, the price of testing on low-traffic areas is high. You will spend time waiting for results of a slow test when you might be testing other, impactful ideas. Aim to run tests on high-traffic areas so you see value right away.

  2. Testing changes that are too small to matter. When you are still getting your feet wet, it can be tempting to test small changes. You are not yet sure what impact your testing will have, or how visitors will respond. But this is exactly why you should test a bold change instead of a minor one. The value of experimentation is in the changes in your visitors' behavior that you learn from and take action on.

    Furthermore, a bigger change means your test will likely run faster since it makes more of an impact. If you test a change that is too small, you may not see significance—your program will stall while you wait. Focus on bold, impactful changes that visitors will notice and respond to.

  3. Testing ideas that are too far removed from core goals. For most programs, everything you test on your website will likely tie back to three major goals: increasing engagement, collecting data about your visitors (such as names, emails, and preferences), and driving purchases. Experiments that do not drive towards your company's core business goals won't help your program make an impact. Focus your testing on changes that affect metrics that matter.