Robots.txt File

A text file, placed at the root of a website, that tells search engine crawlers which parts of the site they may or may not crawl. The file typically contains a list of URL paths that are either allowed or disallowed for crawling, grouped by user agent. Note that robots.txt controls crawling, not indexing: a disallowed page can still appear in search results if other sites link to it. Used well, the file supports search engine optimization by steering crawlers away from low-value or duplicate URLs so crawl budget is spent on important content.
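A minimal example of what such a file might look like (the paths and sitemap URL are illustrative, not from any real site):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/        # keep crawlers out of the admin area
Disallow: /search        # avoid crawling internal search result pages
Allow: /admin/public/    # exception within a disallowed path

# Stricter rules for a specific crawler
User-agent: ExampleBot
Disallow: /

# Point crawlers at the sitemap (absolute URL required)
Sitemap: https://www.example.com/sitemap.xml
```

Crawlers read the group matching their user agent (falling back to `*`) and apply the longest matching path rule when `Allow` and `Disallow` conflict.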
