Google runs a video series on YouTube called Ask Googlebot. In one episode, Google's John Mueller answers a question about how a site owner can keep content from being indexed in search, and whether any website can do it.

Mueller's answer is summarized below.

He says there are three ways to hide a website from search results:
- Use a password
- Block crawling
- Block indexing
If you password-protect your website, you can easily keep it out of Google's index. Likewise, if you want the site indexed but want certain content hidden, you can put that content behind a password so Googlebot can't see it.

This is not against Google's webmaster guidelines, as long as the content is blocked for users as well. It would violate the guidelines if you blocked the content from Googlebot but still showed it to users.

Showing one version of a page to Googlebot and a different version to users is called "cloaking." It is against Google's guidelines and is considered a black-hat technique.
Here are 3 Ways To Hide Content From Search Engines
1. Password Protection
If you want to keep your website private and hide your content from both general users and Googlebot, password protection is one of the best approaches. A password keeps out random web users as well as Googlebot.

This is most commonly used while a website is in development. If you need to show the site to a client or your manager, you can add a password that only they can use, and keep the site away from Googlebot's crawling in the meantime.
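As an illustration, here is a minimal way to password-protect a directory with HTTP Basic Authentication, assuming an Apache server with `mod_auth_basic` enabled (the paths and username are hypothetical; your hosting setup may differ):

```apacheconf
# .htaccess — require a username and password for this whole directory
AuthType Basic
AuthName "Restricted Area"
# Password file created beforehand with: htpasswd -c /path/to/.htpasswd client
AuthUserFile /path/to/.htpasswd
Require valid-user
```

With this in place, both visitors and Googlebot receive a login prompt instead of the content, so nothing behind it can be crawled or indexed.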
2. Block from Crawling
Another way to stop Googlebot is to block crawling with a robots.txt file. With this method, people can still access your website via a direct link, but search engines won't crawl its content.

However, Mueller says this isn't the best option, because search crawlers can still discover your URLs without accessing the content. Blocking crawling doesn't stop a page's address from appearing in search results; it only stops search engine bots from reading the page's content.
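To see how crawlers interpret these rules, Python's standard library includes a robots.txt parser. The sketch below uses a hypothetical robots.txt that blocks every crawler from a `/private/` section and checks which URLs a well-behaved bot like Googlebot would fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block all crawlers from the /private/ section
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A compliant crawler checks these rules before fetching a URL
blocked = parser.can_fetch("Googlebot", "https://example.com/private/report.html")
allowed = parser.can_fetch("Googlebot", "https://example.com/blog/post.html")
print(blocked, allowed)  # False True
```

Note that this only governs crawling: a human with the direct link to `/private/report.html` can still open it in a browser.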
3. Block Website/Web Page from Indexing
Last but not least, the final option for keeping a page out of Google is to add a noindex meta tag to the page.

The noindex directive tells search engine bots that you don't want the page included in search results.

Users won't see meta tags directly on the website; they can access the page normally. To see the tags, they would need to view the page's source code.
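For example, a page can opt out of indexing with a robots meta tag in its head section; the page below is a minimal illustration:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tells search engine bots not to include this page in search results -->
  <meta name="robots" content="noindex">
  <title>Private page</title>
</head>
<body>
  <p>Visitors can still open this page normally.</p>
</body>
</html>
```

One caveat: for the tag to work, the page must remain crawlable. If the same page is also blocked in robots.txt, the bot never fetches it and never sees the noindex directive.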
Mueller’s Final Thoughts
Mueller wraps up the video by saying that Google's top recommendation is to go the password route:
“Overall, for private content, our recommendation is to use password protection. It’s easy to check that it’s working, and it prevents anyone from accessing your content.
Blocking crawling or indexing are good options when the content isn’t private. Or if there are just parts of a website which you’d like to prevent from appearing in search.”
See the full video below: