This Is How Google Ranks Pages Blocked By Robots.txt


In a recent Google Webmaster Central hangout, John Mueller was asked how Google determines query relevancy for pages blocked by robots.txt. It’s a good question, especially for site owners and SEO experts who are curious about which search queries these pages can rank for.

In response, Mueller acknowledged that Google obviously can’t view the content of a page that’s blocked. Instead, it has to rely on other signals, such as comparing the URL with other URLs that rank for those queries.
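To make “blocked” concrete: a page is blocked when its URL matches a Disallow rule in the site’s robots.txt. As a rough illustration (using Python’s standard urllib.robotparser, with a hypothetical /private/ path), this is how a crawler decides whether it may fetch a URL:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; "/private/" is an assumed path for illustration.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot may not fetch anything under /private/, so Google never
# sees that page's content, only its URL and any links pointing to it.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Since the crawler never fetches the blocked URL, signals like the URL itself and inbound links are all Google has to work with.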

It’s important to note, though, that Google will prioritise indexing the website’s other, more accessible pages. In other words, if you have great content available for crawling and indexing, that’s what Google will utilise.

However, pages blocked by robots.txt can still rank in the search results if Google deems them worthwhile. For instance, if there are links pointing to a blocked page, Google will assume its content is valuable.

Ultimately, Mueller said that utilising robots.txt to block content is not advisable. But if it does happen, Google will still try to rank the page in the search results.

This SEO news came from https://www.searchenginejournal.com/googles-john-mueller-explains-how-pages-blocked-by-robots-txt-are-ranked/316061/. Follow the link to learn more.

Getting professional SEO services from a trustworthy provider is one way to increase your site traffic and, eventually, earn higher rankings. To find out how we can help you, visit our website today.