Why do I get an error in Page Performance that the crawler is blocked by robots.txt?

Last updated: June 6, 2016

Available For:

Product: HubSpot Marketing
Subscription: Basic, Professional, & Enterprise

When attempting to refresh page data or check a page for SEO errors in Reports > Page Performance, the following error may appear:
(Image: Page Performance error message referencing robots.txt)

This error may appear if the page is hosted on a HubSpot staging domain:

  • If the URL indicates that the page is still on the HubSpot staging domain, the page is automatically prevented from being indexed by search engines via the robots.txt file. To build SEO value for your website and pages, your content should live on a domain or subdomain that is part of your website.
  • You can still see total page views for pages on the HubSpot staging domain, and the URLs can be shared publicly; however, the pages are blocked from being crawled by search engines.
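As a generic illustration of how staging domains are kept out of search results (this is not HubSpot's exact staging file), a robots.txt file that blocks all crawlers from an entire domain looks like this:

```
User-agent: *
Disallow: /
```

The wildcard user agent applies the rule to every crawler, and `Disallow: /` covers every path on the domain.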

Another reason this error may appear is that the page is blocking HubSpot's two crawlers:

  • HubSpot uses two crawlers: one for pages and one for links. The user agent for each crawler is:
    • Pages: "HubSpot Crawler 1.0"
    • Links: "HubSpot Links Crawler 1.0"
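As a quick way to check how a robots.txt file treats these user agents, you can use Python's standard `urllib.robotparser` module. The robots.txt content below is a hypothetical example for illustration, not HubSpot's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt content used for illustration only
ROBOTS_TXT = """\
User-agent: HubSpot Crawler 1.0
Disallow:

User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The pages crawler matches its own entry, whose empty Disallow allows everything
print(parser.can_fetch("HubSpot Crawler 1.0", "https://example.com/private/page"))  # True

# Any other crawler falls under the wildcard rule and is blocked from /private/
print(parser.can_fetch("SomeOtherBot", "https://example.com/private/page"))  # False
```

Running a check like this against your live robots.txt can confirm whether either crawler is being denied before you contact support.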

To allow the HubSpot crawlers to crawl your page, you may need to modify your robots.txt file, or you may need to adjust how you block crawlers at the server level.
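For example, a robots.txt file that explicitly permits both HubSpot crawlers might include entries like the following sketch (an empty Disallow directive means nothing is blocked for that agent):

```
User-agent: HubSpot Crawler 1.0
Disallow:

User-agent: HubSpot Links Crawler 1.0
Disallow:
```

Per-agent entries like these take precedence over a wildcard `User-agent: *` group for the crawlers they name, so other rules in the file can remain unchanged.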
