Why was I unable to crawl my HubSpot pages with an external crawler?

Last updated: August 21, 2017

Available For:

Marketing: Basic, Pro, Enterprise
Sales: N/A
Service: N/A

If you have attempted to crawl your HubSpot pages using an external SEO tool such as Moz, OnPage, or SEMrush, you may find that the crawl fails. If this is the case, there are a few things you can check:

  1. Robots.txt: check whether your pages have been added to the robots.txt file in your content settings, which would prevent them from being crawled or indexed. You can find more on this here
  2. Meta tags: check whether a meta tag such as "noindex" has been added to the Head HTML of your pages, which would prevent them from being indexed or crawled.
  3. Googlebot: HubSpot does not allow crawling of HubSpot pages by requests that identify as Googlebot but originate from non-Google IP addresses. If your tool attempts to crawl your HubSpot site as Googlebot, you will likely see a 403 error.
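To see how a robots.txt entry blocks a well-behaved crawler (point 1 above), you can test a rule locally with Python's standard-library robots.txt parser. The robots.txt content, bot name, and URLs below are hypothetical examples, not taken from any specific HubSpot account:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content disallowing a single landing page.
robots_txt = """\
User-agent: *
Disallow: /landing-page
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler that honors robots.txt will skip the disallowed path
# but can still fetch everything else.
print(parser.can_fetch("ExampleBot", "https://example.com/landing-page"))  # False
print(parser.can_fetch("ExampleBot", "https://example.com/blog"))          # True
```

If your SEO tool reports pages as blocked, checking the live file at yourdomain.com/robots.txt against a test like this can confirm whether a Disallow rule is the cause.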

If you are interested in preventing certain pages from being indexed or crawled, you can find methods for doing so here.
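As a sketch of one such method: a "noindex" meta tag placed in a page's Head HTML tells compliant search engine crawlers not to index that page. The snippet below is a generic example, not HubSpot-specific markup:

```html
<!-- Added to the page's Head HTML; compliant crawlers will not index the page -->
<meta name="robots" content="noindex">
```

Note that this only affects indexing by crawlers that respect the tag; it does not password-protect or hide the page itself.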
