How to customize the robots.txt file

Last updated: February 9, 2018

Available For:

Product: HubSpot Marketing
Subscription: Basic, Professional, & Enterprise
Add-Ons: Website

You can customize the contents of the robots.txt file for your website pages in the page publishing options for your domain.

What can you do with a robots.txt file for your subdomain?

  • You can prevent thank-you pages with content offers from being indexed by search engines.
  • You can prevent landing pages that are tied to email-only campaigns from being indexed by search engines.
  • You can prevent any pages containing duplicate content from being crawled.
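For example, a robots.txt that covers all three cases might look like the following. The directory names here are placeholders for illustration; substitute the actual paths used on your subdomain:

User-agent: *
Disallow: /thank-you/
Disallow: /email-only-offers/
Disallow: /duplicate-content/

Note that robots.txt only asks well-behaved crawlers not to crawl these paths; it does not password-protect the pages themselves.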

Follow the steps below to learn where and how to customize your robots.txt file in HubSpot.

Locate your robots.txt file

From your HubSpot dashboard, navigate to Content > Content Settings.


Customize the robots.txt section

Scroll down through the page publishing options to the Robots.txt section.

If you don't know how robots.txt files work, read the official robots.txt documentation. You can also use a robots.txt generator tool.


Example: To exclude all files except one

Let's say you want to exclude certain files using your robots.txt. Below is an example; more examples are available in the robots.txt documentation.

The original robots.txt standard has no "Allow" field (though some crawlers, such as Googlebot, do support one), so the easy way to exclude files is to move all files to be disallowed into a separate directory, say "stuff", and leave the one file that should remain crawlable in the level above this directory:

User-agent: *
Disallow: /~joe/stuff/

Alternatively, you can explicitly disallow each page:

User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
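To check that rules like these behave as intended before publishing them, you can test them locally with Python's standard-library robots.txt parser. A minimal sketch, using the per-page rules above (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The explicit per-page rules from the example above.
rules = """\
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Listed pages are blocked for all crawlers.
print(parser.can_fetch("*", "https://example.com/~joe/junk.html"))   # False
# Anything not listed remains crawlable.
print(parser.can_fetch("*", "https://example.com/~joe/index.html"))  # True
```

This is handy for catching typos in Disallow paths, since a mistyped rule silently leaves the page crawlable.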

Save your changes

Click Save to finish updating your robots.txt file.

