SEO, in its most fundamental sense, relies on one thing above all others: search engine spiders crawling and indexing your website.
However, almost every site has pages you don't want included in this exploration.
In a best-case scenario, these pages are doing nothing to actively drive traffic to your website, and in a worst-case, they could be diverting traffic away from more important pages.
Luckily, Google lets webmasters tell search engine bots what pages and content to crawl and what to ignore. There are several ways to do this, the most common being a robots.txt file or the meta robots tag.
We have an excellent and detailed explanation of the ins and outs of robots.txt, which you should certainly read.
However, in top-level terms, it's a plain text file that lives in your site's root and follows the Robots Exclusion Protocol (REP).
Robots.txt provides crawlers with instructions about the site as a whole, while meta robots tags include directives for specific pages.
Some meta robots tags you might employ include index, which tells search engines to add the page to their index; noindex, which tells them not to add a page to the index or include it in search results; follow, which instructs a search engine to follow the links on a page; nofollow, which tells it not to follow links; and a whole host of others.
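As a minimal sketch, a meta robots tag combining two of these directives sits in a page's <head> like this:

```html
<!-- Tell search engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```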
Both robots.txt and meta robots tags are useful tools to keep in your toolbox, but there's also another way to instruct search engine bots to noindex or nofollow: the X-Robots-Tag.
What Is The X-Robots-Tag?
The X-Robots-Tag is another way for you to control how your webpages are crawled and indexed by spiders. Sent as part of the HTTP header response for a URL, it can control indexing for an entire page, as well as for specific elements on that page.
And whereas using meta robots tags is fairly straightforward, the X-Robots-Tag is a bit more complicated.
But this, of course, raises the question:
When Should You Use The X-Robots-Tag?
According to Google, "Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag."
While you can serve robots directives with both the meta robots tag (in a page's HTML) and the X-Robots-Tag (in the HTTP response headers), there are certain situations where you would want to use the X-Robots-Tag. The two most common are when:
- You want to control how your non-HTML files are being crawled and indexed.
- You want to serve directives site-wide instead of on a page level.
For example, if you want to block a specific image or video from being crawled, the HTTP response method makes this easy.
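As a sketch, blocking a single video file could look like this in an Apache .htaccess file (assuming mod_headers is enabled; the filename is illustrative):

```apache
# Hypothetical example: keep one specific video out of the index
<Files "intro-video.mp4">
  Header set X-Robots-Tag "noindex"
</Files>
```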
The X-Robots-Tag header is also useful because it allows you to combine multiple tags within an HTTP response, specified as a comma-separated list of directives.
Perhaps you don't want a certain page to be cached and also want it to be unavailable after a particular date. You can use a combination of the "noarchive" and "unavailable_after" directives to instruct search engine bots to follow these instructions.
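A sketch of how that combined header could be set in Apache (the date is illustrative):

```apache
# Hypothetical: don't cache this response, and drop it from results after the date below
Header set X-Robots-Tag "noarchive, unavailable_after: 25 Jun 2025 15:00:00 GMT"
```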
Essentially, the power of the X-Robots-Tag is that it is much more flexible than the meta robots tag.
The advantage of using an X-Robots-Tag with HTTP responses is that it allows you to use regular expressions to apply crawl directives to non-HTML files, as well as to apply directives on a larger, global level.
To help you understand the difference between these directives, it's helpful to categorize them by type. That is, are they crawler directives or indexer directives?
Here's a handy cheat sheet:
Crawler Directives

- Robots.txt: uses the user-agent, allow, disallow, and sitemap directives to specify where on-site search engine bots are allowed to crawl and not allowed to crawl.

Indexer Directives

- Meta robots tag: allows you to specify which pages search engines should show in search results, and to prevent them from showing particular pages.
- Nofollow: allows you to specify links that should not pass on authority or PageRank.
- X-Robots-Tag: allows you to control how specified file types are indexed.
Where Do You Put The X-Robots-Tag?
Let's say you want to block specific file types. An ideal approach would be to add the X-Robots-Tag to an Apache configuration or an .htaccess file.
The X-Robots-Tag can be added to a site's HTTP responses in an Apache server configuration via an .htaccess file.
Real-World Examples And Uses Of The X-Robots-Tag
So that sounds great in theory, but what does it look like in the real world? Let's take a look.
Let's say we wanted search engines not to index .pdf file types. This configuration on Apache servers would look something like the below:
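A sketch of that rule for an Apache configuration or .htaccess file (assuming mod_headers is enabled):

```apache
# Keep PDFs out of the index and don't follow links inside them
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```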
In Nginx, it would look like the below:
location ~* \.pdf$ {
    add_header X-Robots-Tag "noindex, nofollow";
}
Now, let's look at a different scenario. Let's say we want to use the X-Robots-Tag to block image files, such as .jpg, .gif, .png, etc., from being indexed. You could do this with an X-Robots-Tag that would look like the below:
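As a sketch, an equivalent Apache rule matching several image extensions (again assuming mod_headers; extend the extension list as needed):

```apache
# Keep common image formats out of the index
<Files ~ "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</Files>
```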
Please note that understanding how these directives work, and the impact they have on one another, is crucial.
For example, what happens if both an X-Robots-Tag and a meta robots tag are present when crawler bots discover a URL?
If that URL is blocked in robots.txt, then those indexing and serving directives cannot be discovered and will not be followed.
If directives are to be followed, then the URLs containing them cannot be disallowed from crawling.
How To Check For An X-Robots-Tag
There are a few different methods you can use to check for an X-Robots-Tag on a site.
The easiest way to check is to install a browser extension that shows you X-Robots-Tag information for the URL you're viewing.
Screenshot of Robots Exclusion Checker, December 2022
Another plugin you can use to determine whether an X-Robots-Tag is being used is the Web Developer plugin.
By clicking the plugin in your browser and navigating to "View Response Headers," you can see the various HTTP headers being used.
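If you'd rather script the check, here is a minimal Python sketch (not from the article) that extracts X-Robots-Tag directives from raw response headers, such as the output of `curl -sI`:

```python
def parse_x_robots(raw_headers: str) -> list[str]:
    """Return the comma-separated directives from any X-Robots-Tag header lines."""
    directives = []
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        # Header names are case-insensitive per the HTTP spec
        if name.strip().lower() == "x-robots-tag":
            directives.extend(d.strip() for d in value.split(","))
    return directives

headers = (
    "HTTP/1.1 200 OK\n"
    "Content-Type: application/pdf\n"
    "X-Robots-Tag: noindex, nofollow\n"
)
print(parse_x_robots(headers))  # ['noindex', 'nofollow']
```

This only inspects text you feed it, so it works the same whether the headers come from curl, a crawler export, or a log file.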
Another method, which scales well for pinpointing issues on sites with millions of pages, is Screaming Frog. After running a site through Screaming Frog, you can navigate to the "X-Robots-Tag" column.
This will show you which sections of the site are using the tag, along with which specific directives.
Screenshot of Screaming Frog Report. X-Robots-Tag, December 2022

Using X-Robots-Tags On Your Site

Understanding and controlling how search engines interact with your site is the cornerstone of search engine optimization. And the X-Robots-Tag is a powerful tool you can use to do just that.

Just be aware: It's not without its risks. It is very easy to make a mistake and deindex your entire site.

That said, if you're reading this piece, you're probably not an SEO novice. So long as you use it wisely, take your time, and check your work, you'll find the X-Robots-Tag to be a useful addition to your arsenal.

Featured Image: Song_about_summer/