Metrics hub

  • Optimizely Web Experimentation
  • Optimizely Performance Edge
  • Optimizely Feature Experimentation
Metrics hub is in beta. Contact your Customer Success Manager or Optimizely Support for information.

Metrics hub in Optimizely Experimentation is designed to streamline the creation and management of metrics. Previously, metrics were created individually for each experiment, leading to redundancy and increased configuration time. With metrics hub, you can create and manage reusable metrics across multiple projects and experiments, enhancing efficiency and consistency.

Existing metrics continue functioning with ongoing experiments during the transition to metrics hub.

For new experiments, you must create a metric in metrics hub before adding it to the experiment.

Key features

  • Centralized metrics management with reusable metrics – Create metrics once and use them across different experiments, eliminating the need for repetitive configuration. You can create metrics using the user interface (UI) or the API. The following metric types are available:
    • Cross-project metrics – Available across all projects within an account. These metrics are shared across Web Experimentation and Feature Experimentation.
    • Project-level metrics – Specific to a single project.
    • One-time metrics – Created for a single use within an experiment, not reusable.
  • Roles and permissions – Metrics hub introduces granular roles and permissions to control metric creation, editing, and usage. See Manage collaborators in Web Experimentation and Manage collaborators in Feature Experimentation.
  • Integration with experiments – You can use metrics across different experiments and projects. You can update them once, and the changes propagate wherever the metrics are used.
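
The reuse model above can be sketched in a few lines: experiments hold references to a shared metric rather than private copies, so a single update propagates everywhere. This is an illustrative sketch only, not Optimizely's internal data model.

```python
# Sketch of metric reuse: experiments reference one shared metric object,
# so updating it once propagates to every experiment that uses it.
# Class and field names here are illustrative, not Optimizely's schema.

class Metric:
    def __init__(self, name, event):
        self.name = name
        self.event = event

class Experiment:
    def __init__(self, name, metrics):
        self.name = name
        self.metrics = metrics  # references to shared metrics, not copies

revenue = Metric("Revenue per visitor", event="purchase")
exp_a = Experiment("Homepage test", [revenue])
exp_b = Experiment("Checkout test", [revenue])

revenue.event = "purchase_v2"  # update the metric once...
print(exp_a.metrics[0].event)  # ...and both experiments see the change
print(exp_b.metrics[0].event)
```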

Metric creation requirement

Metrics must be unique within their respective scope.

  • Cross-project metrics – Metrics shared across multiple projects must be unique. Two cross-project metrics cannot have identical event and calculation logic, even if their names differ.
  • Project-level metrics – Metrics specific to an individual project must be unique within that project. However, the same metric can exist in different projects because project-level metrics are not visible across other projects.

Optimizely Experimentation enforces these uniqueness requirements to encourage efficient metric reuse and maintain clarity within your metrics hub.
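
The uniqueness rule can be illustrated with a small sketch: two cross-project metrics count as duplicates when their event and calculation logic match, regardless of name. The field names below (event, aggregator, scope) are hypothetical placeholders, not Optimizely's actual schema.

```python
# Illustrative check of the cross-project uniqueness rule.
# The metric fields used here are hypothetical, not Optimizely's data model.

def metric_signature(metric: dict) -> tuple:
    """A metric's identity is its event and calculation logic, not its name."""
    return (metric["event"], metric["aggregator"], metric["scope"])

def find_duplicates(metrics: list) -> list:
    """Return (existing_name, duplicate_name) pairs that share a signature."""
    seen = {}
    duplicates = []
    for m in metrics:
        sig = metric_signature(m)
        if sig in seen:
            duplicates.append((seen[sig], m["name"]))
        else:
            seen[sig] = m["name"]
    return duplicates

metrics = [
    {"name": "Checkout clicks", "event": "checkout", "aggregator": "count", "scope": "cross-project"},
    {"name": "Purchases",       "event": "checkout", "aggregator": "count", "scope": "cross-project"},
]
# Different names but identical event and calculation logic -> flagged.
print(find_duplicates(metrics))
```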

Permissions and roles

Permissions for creating, editing, and using cross-project metrics are documented in the product-specific collaborator documentation.

The cross-project metric option only displays if you have the proper permissions.

Create metrics

  1. Go to Metrics.
  2. Click Create New Metric.

  3. Enter the metric parameters, including the Name, Description, Event, and measurement criteria.
  4. Select Cross-project metric to create a cross-project metric.

  5. Click Save.

See Create a metric in Optimizely using the metric builder for information on creating different types of metrics. 
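
The parameters entered in the steps above can be pictured as a metric definition payload. The sketch below shows the kind of fields involved; the field names are illustrative assumptions, not Optimizely's exact schema.

```python
# Illustrative metric definition mirroring the UI fields above.
# Field names are hypothetical; see the REST API docs for the real schema.

REQUIRED_FIELDS = {"name", "event", "measurement"}

def build_metric(name, description, event, measurement, cross_project=False):
    metric = {
        "name": name,
        "description": description,
        "event": event,
        "measurement": measurement,  # the measurement criteria, e.g. unique conversions
        "scope": "cross-project" if cross_project else "project",
    }
    missing = REQUIRED_FIELDS - {k for k, v in metric.items() if v}
    if missing:
        raise ValueError(f"Missing metric parameters: {sorted(missing)}")
    return metric

m = build_metric("Signup rate", "Unique signups per visitor",
                 event="signup", measurement="unique_conversions",
                 cross_project=True)
print(m["scope"])  # the Cross-project metric option sets the wider scope
```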

Manage metrics in the UI

You can configure and manage metrics using the Optimizely Experimentation UI. Web Experimentation and Feature Experimentation handle metrics the same way.

Edit a metric

When you edit a metric, existing experiments that use it are affected. If you do not want to affect existing experiments, duplicate the metric instead.

  1. Go to Metrics.
  2. Click the metric you want to edit.

  3. Make your changes on the Edit Metric page.
  4. Click Save.
  5. Click Save Metric on the Save Edit confirmation page.

Archive a metric

Archive metrics to prevent their use in new experiments while maintaining them in ongoing experiments.

  1. Go to Metrics.
  2. Click More options (...) for the metric you want to archive.
  3. Click Archive.

  4. Click Archive Metric on the Archive Metric confirmation page.

Unarchive a metric

You can unarchive metrics to use them in new experiments.

  1. Go to Metrics.
  2. Click Status and select Archived.

  3. Click More options (...) for the metric you want to unarchive.
  4. Click Unarchive.

Delete a metric

Deleting a metric affects any existing experiments that use it and cannot be undone. You can only delete metrics with the Archived status. See the Archive a metric section.

  1. Go to Metrics.
  2. Click Status and select Archived.
  3. Click More options (...) for the metric you want to delete.
  4. Click Delete.

  5. Click Delete Metric on the Delete Metric confirmation page.

Manage metrics with the REST API

You can configure and manage metrics using the REST API. See the Feature Experimentation REST API documentation for available endpoints and parameters.
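
As a rough sketch, a script that creates a metric over a REST API might assemble a request like the following. The base URL, header names, and body fields here are placeholder assumptions for illustration only; use the endpoints and schema documented in the Feature Experimentation REST API reference.

```python
import json

# Placeholder values -- substitute the real endpoint and token from the
# Feature Experimentation REST API documentation.
API_BASE = "https://api.example.com/v2"  # hypothetical base URL
TOKEN = "YOUR_API_TOKEN"                 # personal access token (assumption)

def build_create_metric_request(metric: dict):
    """Assemble (url, headers, body) for a metric-creation call."""
    url = f"{API_BASE}/metrics"
    headers = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps(metric)

url, headers, body = build_create_metric_request(
    {"name": "Signup rate", "event": "signup", "scope": "cross-project"}
)
# To send the request for real, you could use the requests library:
#   requests.post(url, headers=headers, data=body)
print(url)
```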

Best practices

Use these guidelines to streamline your metric management, reduce errors, and enhance collaboration across projects.

  • Consistent names – Use clear and descriptive metric names for easy identification across projects.
  • Governance – Implement roles and permissions to ensure only authorized users can modify critical metrics.
  • Documentation – Maintain documentation for each metric, outlining its purpose and usage.