The robots.txt file asks web crawlers, such as Googlebot, not to crawl certain pages of your site. The file contains a list of commands (directives) and URLs that dictate the behavior of web crawlers on a website. Keep in mind that not all web crawlers adhere to the directives in robots.txt. There are several ways to edit the robots.txt file. This article provides the steps for editing the robots.txt file through the Content Management System (CMS).
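Crawlers that honor robots.txt read its directives before fetching pages. As a quick illustration (using Python's standard `urllib.robotparser` module, which is not part of the CMS; the rules and URLs below are hypothetical), you can check how a compliant crawler would interpret a Disallow rule:

```python
import urllib.robotparser

# Hypothetical robots.txt contents, one directive per line.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler skips disallowed paths but may fetch others.
print(rp.can_fetch("Googlebot", "https://website.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://website.com/about"))         # True
```

Note that this only models well-behaved crawlers; robots.txt is advisory, not an access control.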
Editing the robots.txt file
- Go to Admin Console.
- Click View Website.
- Toggle Content Editor to On.
- Select the desired website from the list of websites.
- Click Content Tree.
- Click Edit.
- Click Home and expand the content tree.
- Click the Robots.txt file.
- Click Edit this page to make edits to the page.
The robots.txt file may already contain some sitemap URLs, which are generated by the system and cannot be edited.
- In the Text field, if necessary, add the Disallow: directive and press Enter or Return to move to the next line.
If necessary, increase the size of the text box by dragging the resize area in the lower right corner.
- Add the URLs that web crawlers should ignore. Enter each URL on its own line; blank lines between entries are not necessary.
Each URL needs to be preceded by Disallow: and given as a path relative to the site root. For example: Disallow: /pagename
- After all desired URLs have been added, click Save. The newly added URLs now appear on the page.
- Click Publish.
- In the Publishing window, choose whether to publish immediately or at a future time, then click Publish.
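After publishing, the file might look something like the sketch below. The Sitemap line stands in for the system-generated entries mentioned above, and the Disallow paths are hypothetical examples; per the robots.txt convention, Disallow lines apply to the crawlers matched by the preceding User-agent line.

```
Sitemap: https://website.com/sitemap.xml

User-agent: *
Disallow: /private-page
Disallow: /drafts
```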