This Is Why Google Has Decided To Stop Supporting Unsupported Directives In Robots.txt Files
- 5 July, 2019
- Jason Ferry
- Robots.txt
SEO experts and webmasters should take note that starting 1 September, Google will no longer support unpublished and unsupported rules in the Robots Exclusion Protocol. With this change, the search engine giant will stop honouring the noindex directive when it appears in robots.txt files. Google has decided to proceed with this change to maintain a healthy ecosystem and to prepare for potential open source releases in the future.
Google has presented various alternatives. One of them is the Search Console Remove URL tool, which can temporarily remove a URL from the search results. Another is to return 404 or 410 HTTP status codes, as these will remove URLs from Google's index once they are crawled and processed.
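The most direct replacement for a robots.txt noindex line is a robots meta tag on the page itself, which Google documents as a supported way to keep a crawled page out of the index. A minimal sketch (the page path is just an illustration):

```html
<!-- Placed in the <head> of the page you want kept out of the index.
     Googlebot must be able to crawl the page to see this tag. -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the same signal can be sent as an `X-Robots-Tag: noindex` HTTP response header instead.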
This change comes after Google announced its plan to standardise the Robots Exclusion Protocol. The aim is to reduce the mistakes that hurt a website's presence in the search results because of unsupported implementations. Examples of these unsupported rules are nofollow, noindex, and crawl-delay.
Therefore, SEO professionals and website owners who still use the noindex directive in robots.txt should switch to one of the alternative methods before 1 September. The same applies to those using the crawl-delay or nofollow rules.
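As a rough sketch of what that migration looks like in the robots.txt file itself (the `/private/` path is only an example), the unsupported line is dropped and a supported Disallow rule takes its place:

```text
# No longer honoured by Googlebot after 1 September 2019:
User-agent: *
Noindex: /private/

# Supported directive — blocks crawling of the path instead:
User-agent: *
Disallow: /private/
```

Note that Disallow prevents crawling rather than indexing, so a blocked URL can still appear in results if other pages link to it; where full de-indexing is required, it should be paired with one of the alternatives Google has presented.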
Details of this article came from https://searchengineland.com/google-to-stop-supporting-noindex-directive-in-robots-txt-319003. Click the link to see the full story.
By working with a reliable SEO company with affordable packages, you do not have to spend a huge amount just to increase your online rankings. Visit our homepage right now and see all our available services.