You can add content that hasn't yet been indexed by search engines to your robots.txt file to prevent crawlers from crawling it. Note that robots.txt blocks crawling, not indexing: it won't remove pages that search engines have already indexed from search results.
To edit your robots.txt file in HubSpot:
In your HubSpot account, click the settings icon in the main navigation bar.
In the left sidebar menu, navigate to Website > Pages.
Select the domain whose robots.txt file you want to edit:
To edit the robots.txt file for all connected domains, click the Choose a domain to edit its settings dropdown menu and select Default settings for all domains.
To edit the robots.txt file for a specific domain, click the Choose a domain to edit its settings dropdown menu and select the domain. If necessary, click Override default settings. This will override any robots.txt default settings for this domain.
Click the SEO & Crawlers tab.
In the Robots.txt section, edit the content of the file. There are two parts to a robots.txt file:
User-agent: defines the search engine or web bot that a rule applies to. By default, this is set to include all search engines, shown with an asterisk (*), but you can target specific search engines instead. If you're using HubSpot's site search module, you will need to include HubSpotContentSearchBot as a separate user-agent so the search feature can crawl your pages.
Disallow: tells a search engine not to crawl files or pages with a specific URL slug. For each page you want to add to the robots.txt file, enter Disallow: /url-slug (e.g., www.hubspot.com/welcome would appear as Disallow: /welcome).
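As a sketch of how the two directives above combine, the following example (using Python's standard urllib.robotparser module, not a HubSpot tool) parses a hypothetical robots.txt that blocks /welcome for all crawlers while leaving HubSpot's site search bot unrestricted, then checks which crawlers may fetch the page. The www.hubspot.com/welcome URL is the example slug from this article; the Googlebot name is just an illustrative user-agent.

```python
from urllib import robotparser

# Hypothetical robots.txt content: blocks /welcome for every crawler
# matched by the default "*" entry, but gives HubSpotContentSearchBot
# its own entry with an empty Disallow (i.e., allow everything).
robots_txt = """\
User-agent: *
Disallow: /welcome

User-agent: HubSpotContentSearchBot
Disallow:
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A generic crawler falls through to the "*" entry and is blocked.
blocked = parser.can_fetch("Googlebot", "https://www.hubspot.com/welcome")

# HubSpot's site search bot matches its own entry and may crawl the page.
allowed = parser.can_fetch("HubSpotContentSearchBot", "https://www.hubspot.com/welcome")

print(blocked, allowed)
```

This is why the HubSpotContentSearchBot entry matters: a user-agent with its own entry ignores the default * rules, so without it, a Disallow under * would also hide pages from HubSpot's site search.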