If you’ve been wondering why a site you want to visit isn’t showing up in Google search results, it might be because the webmaster has configured the site not to appear. This is done with a “noindex” tag added to the HTML code of specific pages on a website. The tag tells search engines like Google to exclude those pages from their index.
“noindex” Meta Tag
A common way to prevent search engines from crawling specific website pages is to disallow them in your robots.txt file. Note, however, that robots.txt only controls crawling; a blocked URL can still end up indexed if other sites link to it. To keep a page out of search results reliably, add a “noindex” meta tag directly to the page’s HTML:
<meta name="robots" content="noindex" />
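For comparison, a crawl block in robots.txt might look like the sketch below (the /checkout/ path is purely illustrative):

```text
# robots.txt — asks all crawlers not to fetch anything under /checkout/
User-agent: *
Disallow: /checkout/
```

Unlike the meta tag, this rule stops compliant bots from fetching the pages at all, which also means they never see any “noindex” tag you place on them.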
Before taking any further SEO steps, it is crucial to check whether this tag is present on your website’s important pages, so you know whether they are being kept out of the index.
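As a rough illustration of such a check, a page’s HTML can be scanned for the tag using only the Python standard library. The `has_noindex` helper below is a hypothetical sketch, not a production crawler:

```python
from html.parser import HTMLParser


class NoindexDetector(HTMLParser):
    """Flags a page if a <meta name="robots"> tag carries a noindex directive."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        # Googlebot-specific robots tags are honored the same way.
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True


def has_noindex(html: str) -> bool:
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex


# A page with the tag would be excluded from search results.
print(has_noindex('<head><meta name="robots" content="noindex" /></head>'))  # True
print(has_noindex('<head><title>Indexable page</title></head>'))  # False
```

In practice you would fetch each important URL and run its HTML through a check like this, but fetching and crawl etiquette are left out of the sketch.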
Here are some legitimate use cases for “noindex” directives:
- Pages containing sensitive information
- Admin dashboards
- Shopping cart or checkout pages on an eCommerce website
- Alternate versions of pages for active A/B or split tests
- “Staging” (or in-progress) page versions not yet ready for public use
Nick Berns is a web developer & SEO specialist.