Audit sitemap scanning helps you discover URLs listed in sitemaps and robots.txt files and include them in your audits, providing greater page coverage and helping ensure you don't miss important issues on your website. Sitemaps can expose pages that are not easily discoverable through regular crawling, expanding your audit's reach.
How to Use Audit Sitemap Scanning
Enable Sitemap Discovery
Go to your audit settings.
Add sitemap or robots.txt URLs as Starting URLs.
Or, toggle on Find Pages in Sitemaps and we'll automatically look for your robots.txt file and any sitemap files.
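For context, automatic discovery relies on the Sitemap: directive that the sitemaps.org protocol defines for robots.txt files. ObservePoint's internal implementation isn't public, but a minimal Python sketch of how such directives can be extracted looks like this (the example.com URLs are placeholders):

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return sitemap URLs declared via Sitemap: directives in robots.txt text.

    The directive is case-insensitive and may appear multiple times,
    anywhere in the file.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Split on the first colon only, so URLs (which contain colons) stay intact.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            sitemaps.append(value.strip())
    return sitemaps

# Example robots.txt body with two sitemap declarations.
robots = """User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/news-sitemap.xml
"""
print(find_sitemaps(robots))
# → ['https://www.example.com/sitemap.xml', 'https://www.example.com/news-sitemap.xml']
```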
How Does ObservePoint Sitemap Scanning Work?
Audits can be configured to automatically look for sitemaps listed in robots.txt files.
ObservePoint will then scan sitemaps for page URLs.
If multiple sitemaps are found, each will be scanned.
URLs discovered from these sources will be added to the audit and treated like any other page, obeying all inclusion and exclusion filters.
URLs found in sitemaps will also be added to the Pages report for your account.
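The scanning and filtering steps above can be sketched in Python. This is an illustrative approximation, not ObservePoint's actual code; the regex-based include/exclude filters and the example.com sitemap are assumptions for the example:

```python
import re
import xml.etree.ElementTree as ET

# Namespace-qualified tag for <loc> entries, per the sitemaps.org protocol.
LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def scan_sitemap(xml_text: str, include=None, exclude=None) -> list[str]:
    """Collect page URLs from a sitemap and apply regex include/exclude filters."""
    root = ET.fromstring(xml_text)
    urls = [loc.text.strip() for loc in root.iter(LOC)]
    if include:
        urls = [u for u in urls if re.search(include, u)]
    if exclude:
        urls = [u for u in urls if not re.search(exclude, u)]
    return urls

# Example sitemap body (example.com is a placeholder domain).
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog/post-1</loc></url>
  <url><loc>https://www.example.com/internal/staging</loc></url>
</urlset>"""

print(scan_sitemap(sitemap, exclude=r"/internal/"))
# → ['https://www.example.com/', 'https://www.example.com/blog/post-1']
```

In a real audit, URLs that survive the filters would then be queued and scanned like any other starting page.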
Where Do I See Results?
Sitemap-sourced pages appear alongside other URLs in the audit results.
These pages are included in all standard reporting and validation processes.
Frequently Asked Questions
Does this feature validate sitemaps or robots.txt files?
No. This feature processes the URLs listed in these files; validation must be performed with third-party tools.
Can it handle large sitemaps?
Yes. It supports sitemaps up to the protocol-defined limits (e.g., 50 MB or 50,000 URLs per sitemap).
Do I need to update existing audits?
Yes. You must enable the setting in each audit's configuration to include sitemaps.
Is there any cost associated with scanning sitemaps?
No. There is no additional cost to scan sitemaps; however, pages discovered from sitemaps and robots.txt files count against your existing audit page volume.