X-robots

X-robots, sent as the X-Robots-Tag HTTP response header, is a directive used to control how search engine crawlers index a page or file. It can instruct search engines not to index a resource (noindex), not to follow the links it contains (nofollow), or to apply other indexing rules such as noarchive or nosnippet; for example, a response may include the header X-Robots-Tag: noindex, nofollow. Because it is delivered in the HTTP response rather than in the HTML source, X-Robots-Tag is an alternative to the robots meta tag and can also be applied to non-HTML resources such as PDFs and images, giving more granular control over crawler behavior.
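
The header is added to the HTTP response itself rather than to the page markup. As a minimal sketch, the example below uses Python's standard-library http.server purely for illustration (the handler class name and port are arbitrary); in practice the header is more often set in the web server or CDN configuration.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class NoIndexHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Ask crawlers not to index this response or follow its links.
            self.send_header("X-Robots-Tag", "noindex, nofollow")
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(b"This response carries an X-Robots-Tag header.\n")

    if __name__ == "__main__":
        # Serve locally so the header can be inspected, e.g. with curl -I.
        HTTPServer(("localhost", 8000), NoIndexHandler).serve_forever()

Sending the header from application code is only one option; the same directives can be attached to specific file types (such as PDFs) through server configuration, which is where the header's advantage over the robots meta tag is most apparent.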