Comparisons for Audits
Written by Luiza Gircoveanu

Overview

Comparisons allow you to surface answers to your most critical questions:

  • Since making updates to my site, what has changed?

  • Are there any inconsistencies in my implementation?

You can compare results for three tags between the most recent run and the previous run of an Audit to see what has changed. This lets you confirm you have the tag presence you want, check variable consistency, and even verify consistency at the variable-value level. This way you can begin to track the consistency of your implementation over time.

Comparison

To navigate to the Comparison section, use the left-hand menu. You'll see the Comparison option right below File Substitutions.

This is where your comparisons live. You'll be able to see all the comparisons you've created, as well as the history of which Audits they were applied to.

To get this expanded view, click on the comparison you're interested in.

Comparison Settings

Before creating a comparison, you'll need to enable tags for comparison. By default, you can enable three tags for comparison, but you can talk to your account executive about adding more if needed.

To get to the settings, click on the Comparison Settings button in the top right area of the Comparison Library. (You can also access these settings through the Manage Tags section in the Admin area of ObservePoint.)

From here, follow this guide to set up Match Criteria for your Audit. Once your Match Criteria are set up, you're ready to create your first comparison.

Audit Settings

You'll most likely want to adjust any Audit you plan to run comparisons on so that it either includes all the URLs you're interested in, with the scan limited to those (i.e., 50 URLs and a 50-page site scan), or has the Lock URLs toggle turned on in the Audit's advanced settings. Lock URLs means the Audit will take the URLs it scanned and always scan those same URLs.

Run over Run Comparison

To actually have a run over run comparison, you'll need to set up the comparison in the Comparison Library.

Click on Create New in the top right corner of the Comparison Library, which will launch a pop-up that walks you through creating the comparison.

You can give your comparison a name and add labels. This is useful for keeping things organized as you add more comparisons to your account, so try to stick to a naming and labeling convention.

Note: Best practice is to label anything you create with your name.

Then you can select which of the three tags you enabled in your settings you want on this comparison. You'll also select which variables within each tag you want to compare. In most cases you'll want to stick to All (Except Excluded), which compares all variables except the ones you've chosen to exclude in your settings. If you want to override those exclusion settings, select All Variables. If you want to compare only one or more specific variables, go through the list and select the variable(s) you're interested in.
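To make the difference between these options concrete, here is a minimal sketch in Python of how each selection mode narrows the list of variables that get compared. The variable names, exclusions, and mode labels are hypothetical and purely illustrative, not ObservePoint's actual implementation.

```python
# Hypothetical illustration of the three variable-selection modes.
# Variable names and mode labels are made up for this example.
tag_variables = ["pageName", "channel", "campaign", "visitorID", "timestamp"]
excluded_in_settings = ["visitorID", "timestamp"]   # excluded in Comparison Settings
specific_selection = ["pageName", "campaign"]       # hand-picked variables

def variables_to_compare(mode):
    if mode == "all_except_excluded":   # default: honor your exclusions
        return [v for v in tag_variables if v not in excluded_in_settings]
    if mode == "all_variables":         # override the exclusions
        return list(tag_variables)
    if mode == "specific":              # only the chosen variables
        return [v for v in tag_variables if v in specific_selection]

print(variables_to_compare("all_except_excluded"))  # ['pageName', 'channel', 'campaign']
print(variables_to_compare("all_variables"))        # all five variables
print(variables_to_compare("specific"))             # ['pageName', 'campaign']
```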

After you select your tags and variables, you'll want to go to the Apply tab to select which Audits you want to run this comparison on. You can assign one or more Audits by clicking the plus icon to the left of their names, or you can add all the available Audits using the button at the top of the list.

After you've selected which Audits you want to apply the comparison to, you can Save & Complete or move on to the third tab. In the Match Tags section you can override the alignment criteria you set in the global settings. This is useful if you want to switch to comparing a different request. Remember, you need at least one variable unique to the request you're interested in. Your global settings will populate into this field, but you can adjust them.

After all that, make sure to click Save & Complete!

Comparison results will not be available until the next run of your Audit. Once a comparison has been set up, the Audit needs to run at least once; that run will then be compared to the previous run. In the Comparison Library, you can click to expand any comparison you've set up to see where that comparison was run. If an Audit is currently running, it will show a rotating symbol; if it's complete, it will show a checkmark.

To jump to that Audit and see the results, click on Applied to Audit (Name). This will take you to the specified Audit. Then, in your left-hand reports menu, click on Tag & Variable Comparisons.

At the top of this report, you can select which comparison you're interested in if you have multiple comparisons applied to one Audit.

You'll get an average score composed of three criteria: Tag Level parity, Variable Level parity, and Value Level parity. Tag Level parity checks whether the tags selected for this comparison are present on the same pages in the current run as they were in the previous run. Variable Level parity compares variable presence between runs. Value Level parity compares which variables had values and how those values matched between runs.
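As a rough illustration of how those three criteria roll up into one number, here is a short Python sketch that assumes the overall score is a simple average of the three parity percentages. The sample percentages and the averaging formula are assumptions for illustration, not ObservePoint's documented scoring method.

```python
# Illustration only: assumes a simple average of the three parity percentages,
# which is a simplification, not ObservePoint's exact formula.
tag_level_parity = 100.0      # tags appear on the same pages in both runs
variable_level_parity = 90.0  # 90% of compared variables present in both runs
value_level_parity = 75.0     # 75% of compared values match between runs

average_score = (tag_level_parity + variable_level_parity + value_level_parity) / 3
print(f"Average comparison score: {average_score:.1f}%")  # 88.3%
```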

At the bottom of the report, you can also see which pages are missing from the baseline (earlier) run versus pages missing from the destination (newest) run. In most cases, you'll want to run comparisons on Audits that use a set list of URLs so you don't have URLs changing between runs. You can lock down your Audit by toggling on the Lock URLs switch in the advanced settings. This will ensure your Audit uses the same set of URLs every run.

One Time Comparison

A One Time Comparison can be used to compare one run to another, regardless of the Audit.

Some examples of use cases include:

  • Comparing your pre-production and production environments

  • Comparing mobile and desktop experiences

  • Comparing different browsers and operating systems

  • Testing your Consent Manager by comparing opted in and opted out Audits (configured in the user session of an Audit)

To get started, click Create New in the top right corner of the Comparison section.

Select One Time Comparison and click Next.

Then a modal will appear where you will name the comparison and optionally add labels. You'll then choose the Audits you want to compare. The Baseline Audit is the reference Audit, and the Destination Audit is the one where you will go to see the report. Once you know which Audits to compare, you'll pick the runs you want to compare. Usually this will be the latest run, but not always.

It is also worth noting that you could choose the same Audit for both. For example, you may have never set up a Run over Run Comparison, but you want to quickly see the differences in the data between recent runs without taking the time to run the whole Audit again.

Next, you'll select which tags and optionally select any variables you want to compare.

Note: If you haven't enabled any tags for comparisons, you'll see a yellow warning message, as seen in the image below.

You also have the option of selecting one of the checkboxes below:

  • Ignore URL Domain: Checking this box will ignore domain names when comparing data, so essentially pages with matching file paths will be compared (see the sketch after this list).

Ex. stage.observepoint.com/blog/validate-tags and www.observepoint.com/blog/validate-tags would be evaluated as the same URL.

  • Ignore Query Parameters: Checking this box will ignore query strings when comparing data.
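Here is an illustrative Python sketch of how these two checkboxes could normalize URLs before matching pages between runs. The normalization logic is an assumption for illustration only, not ObservePoint's actual comparison code.

```python
# Illustrative sketch: build a comparison key for a URL, optionally dropping
# the domain and/or the query string before matching pages.
from urllib.parse import urlparse

def comparison_key(url, ignore_domain=False, ignore_query_params=False):
    parsed = urlparse(url)
    domain = "" if ignore_domain else parsed.netloc
    query = "" if ignore_query_params else parsed.query
    return (domain, parsed.path, query)

a = "https://stage.observepoint.com/blog/validate-tags?utm_source=email"
b = "https://www.observepoint.com/blog/validate-tags"

# With both boxes checked, the two URLs are treated as the same page.
print(comparison_key(a, ignore_domain=True, ignore_query_params=True) ==
      comparison_key(b, ignore_domain=True, ignore_query_params=True))  # True
```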

When you're done, you can click Continue to specify variables to match tags, click Save to save the configuration, or click Save & Run to save the configuration and start processing the comparison.

Once you have saved the One Time Comparison, you can choose to run it later from the Comparison Library list view by hovering over the comparison and clicking the play icon.

To see the report, you can either view it in the Destination Audit or click the link in the History column to go straight there.

The comparison report should look the same as the report shown above under Run over Run Comparison.
