Mueller Discusses SEO Links That Point To Old Press Releases
Quality SEO backlinks are one of the most important aspects of search engine optimisation, so it’s no wonder that the SEO community has many questions about them. Recently, Google’s John Mueller answered a query regarding the benefits of SEO links that point to old press release pages.
According to the person asking the question, they have a large number of old press releases that no longer gain any traffic. However, over time those press releases earned many inbound links from quality websites – sites that are credible sources of news and highly relevant to their niche.
The person sought advice on how to reorganise their website so that Google would stop crawling the press releases, but in a way that let their site keep the advantages of the quality inbound links. One solution they proposed was to move the pages to an archive, while still having their SEO benefit from the links.
Mueller replied, advising the person to redirect the old URLs to other parts of their website. He noted that websites usually have an archive section for older posts; if they move the old press releases there and redirect the old URLs, they are essentially telling Google to forward the value of those links to the archive section as well.
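In practice, this kind of move is usually done with a permanent (301) redirect at the server level. As a minimal sketch – assuming an Apache server with mod_alias enabled, and with purely illustrative file paths – it might look like this:

```
# Hypothetical example: permanently redirect an old press release
# to its new home in the archive section.
# "Redirect 301" tells search engines the move is permanent,
# which signals that link value should follow the new URL.
Redirect 301 /press-releases/2015-product-launch.html /archive/2015-product-launch.html
```

Other servers achieve the same thing with their own directives (for example, `return 301` in nginx); the key point is that the redirect must be permanent for search engines to treat it as a signal to transfer link equity.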
The SEO who originally asked the question explained that some of their old press releases were kept for legal reasons and that Google doesn't need to access the content all the time. He added that he was also considering “disallowing” the folder containing those pages – using the robots.txt protocol to tell search engines not to crawl them.
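A disallow rule of the kind the SEO describes would live in the site's robots.txt file. As a minimal sketch – with an illustrative folder name, not one taken from the discussion – it could look like this:

```
# Hypothetical robots.txt entry: ask all crawlers not to crawl
# the folder containing the old press releases.
User-agent: *
Disallow: /press-releases/
```

Note that, as Mueller's answer below makes clear, disallowing a folder only stops crawling – it does not stop the blocked URLs from appearing in search results if other sites link to them.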
The person continued with their queries, asking whether the built-up SEO power would simply be ignored once they redirected the old URLs to other parts of their website.
Mueller replied that Google automatically crawls pages less often once it decides they are no longer relevant to the website, but this does not mean it ignores the links to them entirely. However, if a website owner blocks the pages with robots.txt, then search engines will never know the content of those pages.
When a lot of backlinks point to a page that's blocked by robots.txt, Google can still include the page in search results, showing it with a title inferred from the links and some text informing users that the search engine does not know the actual content of the page. Crucially, though, Google cannot forward the value of those links on to the website's primary content.
Mueller’s advice to SEOs: never block relevant pages with robots.txt, since readers and other high-quality websites may still be linking to them.
He also discussed old press releases in general. Such content is usually time-limited; it is meant to be linked to once, around the time of publication. At first glance, the number of inbound links might give the impression that there is a lot of value to be reclaimed from these articles. In reality, because they are old press releases already sitting in archived sections, the links pointing to them are no longer very useful – the content loses relevance as time passes.
The SEO community, including webmasters and businesses alike, has many theories about inbound links because it is somewhat unclear how Google really uses them. Moreover, the search engine giant frequently updates its algorithms, making it harder to pinpoint exactly how inbound links are weighted. For this reason, it’s crucial to work with SEO specialists who can adapt their strategies on the fly.
Here at Position1SEO, you can be sure to get an effective SEO solution that fully complies with Google’s guidelines, ensuring that you get only desirable results for your website. We offer a free SEO audit, link building packages, content writing services, and more.
If you want to know more about what we can do for your business, dial 0141 404 7515 or email firstname.lastname@example.org. We look forward to working with you!