Five elements of effective experiment design for travel sites

This topic describes how to:
  • Design good experiments using five key elements of effective tests
  • Explore ideas for your own site

One of the challenges in experimentation is designing a good test. A well-designed experiment can improve your site experience and lift conversion rates; a poorly designed one can stall your program. Once you have a hypothesis, how do you turn it into an experiment that generates real insight and ROI?

At Optimizely Experimentation, we work with the best experimentation programs in the world. We have learned a lot about experiment design and have identified five key elements that most successful experiments include. 

This article walks through an impactful experiment that Hotwire, a leading travel bookings company, ran on its mobile website—and breaks down five components that were key to its success. Use this scenario to learn about designing experiments and explore ideas for your own site.

Hotwire's Car Rentals homepage


Hotwire is a travel company that provides flight, hotel, and car bookings through its website and mobile app. Because travelers often research and reserve on the go, its mobile app and mobile website are important parts of the business.

Hotwire’s Site Optimization team identified the mobile web experience as a prime candidate for redesign. The experience had not been updated in several years, and the design team saw an opportunity to make the car rental product pages more seamless for customers on-the-go.


Mobile-optimized website variation

They decided to test a major redesign that took a big step away from the original experience and baseline.


Hypothesis: “If we refresh the design of our car rentals mobile web product, then we can increase the conversion rate because we can provide a better visual and interaction experience to our mobile users.” The Hotwire team designed the variation to be more consistent with their mobile app experience, which had already solved some of the pain points of browsing and booking from a mobile device.

Below we describe key elements they included to create an effective test.


Pages

Every test you run in Optimizely Web Experimentation is built on top of pages. When adding pages, think carefully about the paths that your visitors take. What is a visitor in a given funnel trying to accomplish? What problem are you trying to solve for them?

Pages that this experiment included: 

  • Car rentals product detail pages

  • Pages in the checkout funnel (note that if you are using Optimizely Web Experimentation to test on a checkout page, you might need to configure your site for PCI compliance)

  • Purchase confirmation page

How did this help?

A considered approach to pages helps you focus on the visitor’s journey when designing your experiment. Hotwire’s goal was to make it easy for customers to use the mobile web experience to rent a car. So, they created a variation that dramatically changed the Car Rentals product detail page and tracked outcomes through the funnel.

By thinking about mobile visitors’ intent and experience on the website, the team was able to focus on addressing pain points in this funnel. Their test was designed to answer a specific question: does a mobile web experience that is consistent with the mobile app help more customers move further down the funnel?
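To make the idea of pages concrete, here is a minimal TypeScript sketch of how URL match conditions map a visitor's current URL onto the pages an experiment targets. The page names, URLs, and match types are illustrative only — Optimizely Web configures pages through its UI, not through code like this.

```typescript
// Illustrative model of page targeting by URL match conditions.
type MatchType = "substring" | "exact";

interface PageCondition {
  name: string;
  matchType: MatchType;
  value: string;
}

// Pages loosely modeled on the Hotwire experiment (names are hypothetical).
const pages: PageCondition[] = [
  { name: "car_rental_detail", matchType: "substring", value: "/car-rentals/" },
  { name: "checkout", matchType: "substring", value: "/checkout" },
  { name: "purchase_confirmation", matchType: "exact", value: "https://example.com/confirmation" },
];

// Return the names of every configured page a visitor's URL activates.
function activatedPages(url: string, conditions: PageCondition[]): string[] {
  return conditions
    .filter((c) =>
      c.matchType === "exact" ? url === c.value : url.includes(c.value)
    )
    .map((c) => c.name);
}

console.log(activatedPages("https://example.com/car-rentals/sfo", pages));
// → ["car_rental_detail"]
```

Mapping the full funnel — detail page, checkout, confirmation — into page conditions like these is what lets the experiment track outcomes past the page being changed.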


Audiences

The team targeted the experiment to mobile visitors. Visitors who viewed the website on a mobile device were eligible to see a variation of the Car Rentals product detail page.

How did this help?

In Optimizely Web Experimentation, audiences are optional. You can show any experiment to all your visitors. But you are far more likely to reach statistical significance with an audience, so we recommend that you add one. Adding audiences forces you to focus on your target market and design experiences for them. Who are you improving this experience for? What are the priorities of that visitor segment? What do you want to learn about those visitors’ expectations and behaviors?

The Hotwire team hypothesized that mobile visitors would be more likely to convert if the site experience was optimized for on-the-go browsing and booking behaviors. Layouts that allow for quick scans of critical details on small screens, an image of the car on the same page, and a Continue option at the top of a long scroll might help mobile visitors evaluate their options more easily.
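Optimizely Web ships built-in device audiences, so you would not hand-roll this check — but the sketch below shows the kind of condition a "mobile visitors" audience encodes. The user-agent heuristic is deliberately simplistic and purely illustrative.

```typescript
// Hypothetical sketch of a "mobile visitors" audience condition.
// Real device detection is more involved; this only illustrates the idea.
const MOBILE_PATTERN = /Android|iPhone|iPad|Mobile/i;

function isMobileVisitor(userAgent: string): boolean {
  return MOBILE_PATTERN.test(userAgent);
}

// Only visitors who match the audience enter the experiment at all;
// everyone else sees the unchanged site.
function bucketVisitor(userAgent: string): "in_experiment" | "excluded" {
  return isMobileVisitor(userAgent) ? "in_experiment" : "excluded";
}

console.log(bucketVisitor("Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X)"));
// → "in_experiment"
```

Gating the experiment this way keeps desktop traffic out of the results, so the data answers a question about the segment the redesign was built for.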


Integrations

Integrations help you leverage a variety of digital tools to get a more in-depth understanding of your users. It is important to link your systems so you can track user behavior across them.

How did this help?

The most successful experimentation programs leverage Optimizely Experimentation’s integrations. By integrating Optimizely with other technology platforms that your company uses, you not only maximize the information that you are getting from different data sources—you gain the ability to segment results for more meaningful business insights.

Not setting up integrations holds your program back from deeper learning and the ability to connect other parts of your analytics landscape to your experimentation efforts.
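Many analytics integrations boil down to one move: tagging each visitor's analytics hits with the experiment and variation they were bucketed into, so reports can be segmented by experiment arm. The sketch below illustrates that pattern with a dataLayer-style queue; the names and shape are hypothetical, not Optimizely's actual integration API.

```typescript
// Hypothetical sketch: push the visitor's experiment decision into a
// dataLayer-style queue that an analytics tool consumes.
interface ExperimentEvent {
  event: "experiment_decision";
  experimentName: string;
  variationName: string;
}

function recordDecision(
  dataLayer: ExperimentEvent[],
  experimentName: string,
  variationName: string
): void {
  // Once every hit carries the variation, downstream metrics (bounce rate,
  // session length, revenue) can be segmented by experiment arm.
  dataLayer.push({ event: "experiment_decision", experimentName, variationName });
}

const dataLayer: ExperimentEvent[] = [];
recordDecision(dataLayer, "mobile_car_rental_redesign", "app_style_variation");
console.log(dataLayer.length); // → 1
```

The payoff is exactly the segmentation described above: the experiment's results can be cross-referenced with every other data source that reads the same queue.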


Metrics

Metrics determine whether your experiment “wins” or “loses.” It is easy to create and add metrics in Optimizely, but the strategy behind an experiment’s metrics is key to its success.

Primary metric: 

  • Clicks to the Continue button

Secondary metrics:

  • Clicks for final purchase

  • Views of the purchase confirmation page

  • Clicks to other progress buttons throughout the funnel

  • Clicks to navigation elements like Back and Home

How did this help?

Primary metric:

The most important decision you will make when designing your experiment is choosing the primary metric. It is the metric that determines the speed of your test—and whether it wins or loses. It can be tempting to choose revenue, but revenue is rarely an effective primary metric because it is generally influenced by many factors. The best primary metric is the action that you want to directly influence with your experiment. Always solve for the primary goal first.

The improvement in click-throughs as visitors traveled down the funnel helped Hotwire confirm that their new design was helping customers carry out their intent: to rent a car.

Secondary and monitoring metrics: 

When choosing secondary metrics, you often balance site traffic against complexity. Optimizely calculates statistical significance for the primary metric separately to ensure that it always reaches statistical significance at top speed, so you can make important business decisions. But for all other goals, the more you add, the more visitors you need to determine a win or loss. If traffic is a limited resource for your program (it is for most), you must balance it against goal complexity.

The Hotwire team prioritized final purchase metrics to track the downstream impact of the experiment. Clicks to the navigation elements at the top of the screen help the team confirm visitor intent and monitor the health of the funnel in general.
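A back-of-the-envelope calculation shows why traffic is the constraint here. The classic fixed-horizon approximation below estimates visitors needed per variation for a two-proportion test at 95% confidence and 80% power — note this is a textbook formula for illustration, not Optimizely's stats engine, which uses sequential testing.

```typescript
// Rough visitors-per-variation estimate for detecting a lift from
// baselineRate to targetRate (two-sided 95% confidence, 80% power).
function visitorsPerVariation(baselineRate: number, targetRate: number): number {
  const zAlpha = 1.96; // two-sided 95% confidence
  const zBeta = 0.84;  // 80% power
  const variance =
    baselineRate * (1 - baselineRate) + targetRate * (1 - targetRate);
  const effect = targetRate - baselineRate;
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / effect ** 2);
}

// Detecting a small lift (5% → 6%) takes several times the traffic
// of detecting a larger one (5% → 7%).
console.log(visitorsPerVariation(0.05, 0.06)); // → 8146
console.log(visitorsPerVariation(0.05, 0.07)); // → 2207
```

Every extra metric you want a confident read on multiplies this kind of requirement, which is why teams with limited traffic keep the secondary metric list short and treat the rest as monitoring metrics.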

Learn more about primary and secondary metrics. If you are experimenting in a purchase funnel, read about choosing between macro and micro-conversions in an experiment.

Share & Evolve

The Hotwire team presented findings from their experiment to the company and started brainstorming ways to apply what they learned to Hotwire’s other lines of business.

How did this help?

Sharing the results of your experiments is important. It helps other teams incorporate what you have learned into their work and raises the visibility of your program. When programs do not share their findings, they risk siloing important insights that may be key to the business’ success.
By sharing their findings, the Hotwire team helped others learn more about customers and target them effectively. This also allowed them to bring the fresh insights they generated to a new cycle of experimentation, and to other parts of the business.

Here are a few templates for sharing your findings.