SiteSentry's x-robots-tag header check

The x-robots-tag HTTP header is an optional header sent by your web server. It serves a similar purpose to the [robots meta tag](/docs/search-engine-indexability-robots-meta tag/), and uses the same directives to control how search engines crawl and index your site, but has two key differences:

You need access to your web server, and some server-specific knowledge, to set up an x-robots-tag HTTP header, which makes it trickier to use than the [robots meta tag](/docs/search-engine-indexability-robots-meta tag/). However, it does mean that you only need to set it up once to block search engines from indexing an entire site.
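For instance, on an nginx server a single directive can attach the header to every response. This is a minimal sketch, assuming nginx and an illustrative staging hostname; the exact setup depends on your server software:

```nginx
server {
    listen 80;
    server_name staging.example.com;  # illustrative hostname

    # One line blocks indexing for the entire site: the "always"
    # flag attaches the header to every response, including errors.
    add_header X-Robots-Tag "noindex, nofollow" always;
}
```

Apache and other servers have equivalent mechanisms (e.g. Apache's `Header set` directive), so the same once-per-site approach applies there too.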

There are good reasons for telling search engines not to crawl or index a site, or specific types of content, but you would rarely want to block search engines completely.

x-robots-tag example
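A response carrying the header might look like this (the header value shown is illustrative):

```http
HTTP/1.1 200 OK
Content-Type: text/html
X-Robots-Tag: noindex
```

Here the single noindex directive tells all search engines not to index the page, which is exactly the case the check below flags as an error.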

So, if SiteSentry's x-robots-tag header check finds noindex (or none, which also blocks all search engines), it treats it as an error and sends you a notification. If it finds anything else, i.e. one or more specific search engines are blocked or there are general restrictions (e.g. nofollow, nosnippet or noarchive), it warns you.
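The error/warning distinction above can be sketched as follows. This is a hypothetical illustration of the classification logic, not SiteSentry's actual implementation; the function name and return values are made up for the example:

```python
def classify_x_robots_tag(header_value):
    """Classify an X-Robots-Tag value as 'error', 'warning', or 'ok'.

    The header may carry several comma-separated directives, optionally
    scoped to one crawler (e.g. "googlebot: noindex").
    """
    if header_value is None:
        return "ok"  # no header at all: nothing is blocked

    directives = [d.strip().lower() for d in header_value.split(",")]

    # A directive scoped to a named crawler only blocks that one engine,
    # so it rates a warning rather than an error.
    scoped = any(":" in d for d in directives)
    # Drop any crawler prefix ("googlebot: noindex" -> "noindex").
    bare = [d.split(":")[-1].strip() for d in directives]

    if not scoped and any(d in ("noindex", "none") for d in bare):
        return "error"    # all search engines blocked from indexing
    if scoped or any(d in ("nofollow", "nosnippet", "noarchive") for d in bare):
        return "warning"  # specific engines blocked, or general restrictions
    return "ok"
```

The key design point is that an unscoped noindex (or none) blocks every search engine, so it is the only case severe enough to notify on; everything else is merely restrictive.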

Related

Check out our docs for information on SiteSentry's checks for issues with the robots meta tag and for how SiteSentry treats the directives it finds.