How robots.txt files work
Your robots.txt file tells search engines how to crawl pages hosted on your website.
The two main components of your robots.txt file are:
- User-agent – Defines the search engine or web bot that a rule applies to. An asterisk (*) can be used as a wildcard with User-agent to include all search engines.
- Disallow – Advises a search engine not to crawl and index a file, page, or directory.
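For example, a minimal robots.txt file combining these two directives might look like the sketch below. The /private/ directory is a placeholder; substitute the paths you actually want to block on your own site.

```
# Apply the following rule to all crawlers
User-agent: *

# Ask crawlers not to request anything under /private/
Disallow: /private/
```

Leaving the Disallow value empty (Disallow:) tells crawlers that nothing on the site is off-limits.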
To learn more about how to set up your robots.txt file, check out the robots.txt documentation, which includes additional details and examples of how robots.txt files work. A few common scenarios are covered below.
You can also use a robots.txt generator tool to create your file.
Update your robots.txt file in HubSpot
- In your HubSpot account, click the settings icon in the main navigation bar.
- In the left sidebar menu, click Marketing, then click Web pages.
- Use the Modifying dropdown menu to select the domain you want to update.
- Scroll down to the Robots.txt section and edit your robots.txt file in the text field (see the example below these steps).
- Click Save to save your changes.
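As an illustration of what you might paste into the Robots.txt text field, the rules below block all crawlers from a hypothetical /drafts/ directory and block one specific, hypothetical bot from the entire site. The directory path and bot name are placeholders; replace them with values that match your own site.

```
# Block all crawlers from a hypothetical drafts directory
User-agent: *
Disallow: /drafts/

# Block one specific (hypothetical) bot from the entire site
User-agent: ExampleBot
Disallow: /
```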