
Pages blocked by Robots.txt can still Rank

Wondering how pages blocked by robots.txt rank in Google? John Mueller has an answer for you. Google has stated that content blocked by robots.txt can still be indexed. But how does Google know which queries to rank those pages for?

This question came up in a Google Webmaster Central Hangout. In response, John Mueller of Google said that the ranking of such a page is determined by the links pointing to it. It generally isn't wise to block content with robots.txt, but if you do have content blocked this way, Google will do its best to figure out how to rank it. Google obviously cannot look at the content itself, since crawling is blocked. More often than not, Google will prioritize indexing other pages of the site that are not blocked. Sometimes, however, a page blocked by robots.txt will still rank in the SERPs if Google considers it worthy based on the links pointing to it.
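As a rough sketch of how a crawler interprets a Disallow rule, Python's standard `urllib.robotparser` can check whether a given URL is blocked. The domain and paths below are hypothetical examples, not from the video:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the /private/ section for all crawlers.
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blocked URL: crawling is disallowed, but the URL itself can still be
# indexed and ranked based on links pointing to it.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False

# An unblocked URL: crawling (and therefore content-based ranking) is allowed.
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))  # True
```

Note that a Disallow rule only prevents crawling; it does not remove a URL from the index, which is exactly why blocked pages can still appear in search results.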

Here is John’s full video. If you wish to hear the full answer, start at the 21:49 mark.

July 11th, 2019 | SEO Update