Yes. To block an individual file in your file manager, such as a PDF document, from being indexed by search engines, you'll need to select a connected subdomain for the file and use that file URL in your content. You can then add that file URL to your robots.txt file.
The default file URL is served from the cdn2.hubspot.net subdomain and looks something like this (the portal ID and filename below are placeholders):
https://cdn2.hubspot.net/hubfs/123456/example-file.pdf
Once you've selected a connected subdomain, the file URL will instead use your own domain, for example:
https://www.your-company.com/hubfs/example-file.pdf
To select a connected subdomain for your file:
- In your HubSpot Marketing Hub Basic, Professional, or Enterprise account, navigate to Content > File Manager.
- Locate the file you wish to block from crawlers and click on the file name.
- In the right sidebar, click the File URL dropdown. A connected subdomain will display your company's domain, such as www.your-company.com or info.your-company.org. Depending on the domains connected to your HubSpot account, you may have multiple options in this dropdown.
- Select the file URL containing your company's connected subdomain, then click Copy URL.
You can now use this URL when blocking the file from being indexed by search engines via a robots.txt file. To learn how to set up a robots.txt file, see HubSpot's robots.txt documentation.
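For reference, a minimal robots.txt rule blocking a single file might look like this (the path below is a placeholder; substitute the path portion of the URL you copied):

```
User-agent: *
Disallow: /hubfs/example-file.pdf
```

Note that the Disallow rule takes a path relative to the domain root, not the full URL, and the robots.txt file must live at the root of the connected subdomain that serves the file.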
Please note: if you previously shared the file online using the cdn2.hubspot.net subdomain or another connected subdomain, the file may already have been indexed. If the file appears in search results, follow the steps above to select a connected subdomain for the file, add the file URL to your robots.txt file, and then request a re-crawl of your site (for example, via Google Search Console).
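As a sanity check before requesting a re-crawl, you can verify that your robots.txt rule actually blocks the file URL. The sketch below uses Python's standard-library `urllib.robotparser`; the domain and file path are placeholders, and the rules are parsed inline so the example runs without fetching your live robots.txt.

```python
# Verify that a robots.txt Disallow rule blocks a given file URL.
# The domain and file path are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# Normally you'd call parser.set_url("https://.../robots.txt") and
# parser.read(); parsing the rules directly keeps this self-contained.
parser.parse([
    "User-agent: *",
    "Disallow: /hubfs/example-file.pdf",
])

# can_fetch() returns False when the URL is disallowed for that user agent.
blocked = not parser.can_fetch("*", "https://www.your-company.com/hubfs/example-file.pdf")
print(blocked)  # True means crawlers that honor robots.txt will skip the file
```

To check against your live site instead, replace the `parse()` call with `parser.set_url(...)` followed by `parser.read()`.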