Web Robots Files
The robots.txt file tells search engines how to access your site and can request that they not index specific areas. Web robots are programs that automatically crawl the web to collect data. You can ask web robots not to collect data from your site, though compliance is voluntary: well-behaved crawlers honor robots.txt, but it is not an enforcement mechanism.
- If you haven't already done so, log in and go to the Dashboard. You are taken to the Site Content tab.
- Select the Settings tab from the horizontal navigation bar at the top of the page.
- Choose Web Robots File from the left navigation bar. If you know of web robots that you wish to prevent from accessing your site, you can enter them here.
- To learn more about web robots select the More about robots.txt link.
- When finished select Save.
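
As a reference for what to enter, here is a minimal robots.txt sketch. The bot name `ExampleBot` and the paths are illustrative assumptions, not values from your site:

```
# Block one specific crawler entirely (ExampleBot is a placeholder name)
User-agent: ExampleBot
Disallow: /

# Ask all other crawlers to skip a private area (example path)
User-agent: *
Disallow: /private/
```

Each `User-agent` line names a crawler (`*` matches any), and the `Disallow` lines beneath it list path prefixes that crawler is asked not to fetch; an empty `Disallow:` value allows everything.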