- Optimizely Web Experimentation
- Optimizely Performance Edge
- Optimizely Personalization
- Optimizely Feature Experimentation
- Optimizely Full Stack (Legacy)
Sharing results—what the team has learned and how it has made an impact—is crucial to the success of every optimization program.
If you do not share your data-driven insights widely, you limit the impact of your testing efforts: you have taken action on customers' behalf, but without sharing that knowledge, you have not enabled other teams to do the same.
Share what you have learned to widen the impact of your testing. Share how you have made an impact to demonstrate the value of your optimization program. Based on test results, communicate which metrics you have helped move. If you do this well, you will evangelize data-driven decision-making within your organization.
Companies that consistently share their test results test more effectively and move the metrics that matter most to their businesses.
Materials to prepare
- Team
- Metrics
- Prioritization criteria and roadmap
- Experiment plan
- Results report (an Excel download, a link to the Experiment Results page, or an internal analytics report)
People and resources
- Program manager
- Analyst
- Decision maker
- Stakeholders
- Executive sponsor (where applicable)
Actions you will perform
- Finalize documentation.
- Store documentation in an accessible place.
- Share results broadly at a cadence that aligns with company practices.
Deliverables
- Shareable, accessible versions of regularly used documentation, including the experiment plan and the roadmap.
- A newsletter, delivered at a regular cadence, that details changes in goals, staffing, the quarterly roadmap, and test results.
- Quarterly Business Review (QBR).
What to watch out for
- Not sharing at each juncture.
- Potentially oversharing with the wrong stakeholders (instead, share results that are relevant to each stakeholder).
- Not being clear about how individual experiments affect top-line goals in program-level reporting, such as in a QBR.
What to share
When you share results, include the following sections:
Purpose – Briefly describe "why" you are running this test, including your experiment hypothesis.
Details – Include the number of variations, a brief description of the differences, the dates when the test was run, the total visitor count, and the visitor count by variation.
Results – Be concrete. Provide the percentage lift or loss compared to the original, conversion rates by variation, and the statistical significance or confidence and improvement intervals (a sketch after this list shows one way to compute these figures).
Lessons Learned – This is your chance to share your interpretation of what the numbers mean and critical insights generated from the data. The most important part of results sharing is telling a story that influences your company's decisions and generating new questions for future testing.
Revenue Impact – Whenever possible, quantify the value of a given percentage lift with year-over-year projected revenue impact.
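For teams that want to sanity-check these figures outside the platform, here is a minimal Python sketch that derives conversion rates, relative lift, a classical two-sided significance test, and a rough year-over-year revenue projection from variation-level counts. The visitor counts, order volume, and order value below are hypothetical, and the fixed-horizon z-test is a textbook approximation rather than the sequential statistics Optimizely's Stats Engine uses, so treat your Experiment Results page as the source of truth.

```python
import math

def conversion_stats(visitors_a, conversions_a, visitors_b, conversions_b):
    """Compare a variation (B) against the original (A).

    Returns conversion rates, relative lift, and the p-value of a
    two-proportion z-test. Illustration only; Optimizely's Stats Engine
    uses sequential methods, so defer to the Experiment Results page.
    """
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    lift = (rate_b - rate_a) / rate_a  # relative lift vs. the original

    # Pooled standard error for the difference in proportions.
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rate_a, rate_b, lift, p_value

def projected_annual_revenue_impact(lift, annual_orders, avg_order_value):
    """Rough year-over-year projection: extra conversions * order value."""
    return lift * annual_orders * avg_order_value

# Hypothetical numbers for a results report.
rate_a, rate_b, lift, p = conversion_stats(10000, 400, 10000, 452)
print(f"Original: {rate_a:.2%}, Variation: {rate_b:.2%}, lift: {lift:+.1%}, p={p:.3f}")
print(f"Projected impact: ${projected_annual_revenue_impact(lift, 120000, 80):,.0f}/yr")
```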
The sections above do not apply only to winning tests. Tests that do not produce a winning variation generate valuable lessons; learning what not to do can be as helpful as knowing what to do. You are more likely to get a nuanced understanding of your visitors' behavior through tests that don't win than those that do.
Share with your testing team
When to share
Update your optimization team on active tests in a weekly, bi-weekly, or monthly meeting, depending on how often your team tests. Do not forget to include:
- Team members involved in execution, such as designers and developers.
- Executive decision-makers who are not typically involved in day-to-day testing.
- Peers who may not be focused on testing but can make valuable contributions to the program's mission.
How to share
Emails and shareable spreadsheets (for example, Google Spreadsheets or Smartsheet) are effective ways to share updates with your team. Your team's wiki page in Atlassian Confluence is another place to store updated results. Email results to your testing team as experiments conclude.
Do not forget
Save your results to a roadmap or spreadsheet where you track the results of all tests in a project. Doing so makes it easier to consult these metrics when you return to past results to brainstorm new tests and campaigns.
Download the CSV file of your experiment results from your Experiment Results page. See Reference for columns in CSV export files for campaigns and experiments.
Add the metrics in your CSV download to a spreadsheet where you track the results of all experiments.
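If your master tracking sheet lives as a CSV file, a small script can fold each new export into it. The sketch below is one possible approach; the file names and column headers are hypothetical, so map TRACKED_COLUMNS to the actual headers documented in the column reference linked above.

```python
import csv
from pathlib import Path

EXPORT = Path("experiment_results_export.csv")   # downloaded from the Results page
MASTER = Path("all_experiment_results.csv")      # running log of every test

# Assumed column names; replace with the headers in your actual export.
TRACKED_COLUMNS = ["Experiment", "Variation", "Visitors",
                   "Conversions", "Conversion Rate", "Improvement"]

# Keep only the tracked columns from each exported row.
with EXPORT.open(newline="") as f:
    rows = [{col: row.get(col, "") for col in TRACKED_COLUMNS}
            for row in csv.DictReader(f)]

# Append to the master log, writing a header only on first creation.
write_header = not MASTER.exists()
with MASTER.open("a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=TRACKED_COLUMNS)
    if write_header:
        writer.writeheader()
    writer.writerows(rows)

print(f"Appended {len(rows)} rows to {MASTER}")
```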
Share with your broader organization
When to share
Share results with the rest of your organization at the end of each test to raise the visibility of your program and champion data-driven decision-making at your company.
How to share
To keep your organization involved, post results to an internal team wiki page (in Atlassian Confluence, for example). Create a centralized, public source for all test results. Link to and share the wiki in internal emails so stakeholders beyond your testing team know where to find it.
Try sending out a "which test won" poll by email to generate interest and engagement with data-driven testing in the rest of your organization.
Share with executive stakeholders
When to share
Executive stakeholders help allocate time and resources for your optimization program. Share your progress with executive stakeholders once per quarter using your company's internal presentation format.
What to share
Review the company goals that guide your testing program and past experiment results. Use these signposts to frame your report. It is also a good idea to include the following information for executive stakeholders:
General:
- Top experiments with significant business results.
- Total number of active experiments.
- Significant takeaways from the project.
- Number of monthly unique visitors to the site (and a projection of future MUV).
- Major tests applied to your funnel and what you have learned.
- Top-level summary of insights gained from experiments and campaigns overall.
Next steps:
- People and culture – What human resources or skillsets are needed for future testing and personalization? How do you plan to build a culture of testing at your company?
- Improved processes – What processes need improvement? For example, how would you collaborate more efficiently with the Marketing team next quarter?
- Strategy – Where do you plan to test and personalize next, and why?
- Execution and resources – What non-human resources do you need for future testing and personalization?
Sharing the results of your tests with different stakeholders will help you spread the insights you have gained and communicate the ROI of your program. Doing this well will help you build visibility at your company and evangelize a data-driven culture.