What You Need To Know About Google Stopping Support For HTTP and JSON-RPC Requests On Search Console API

In 2018, Google told SEO experts and developers that it would soon stop supporting HTTP batch and JSON-RPC requests for its Search Console API. Recently, Google announced on its Google Webmasters Twitter account that this change will take effect shortly.

HTTP & JSON-RPC requests. Those using the Google Search Console API with batch HTTP or JSON-RPC requests should note that these will soon cease to work.

You might be using this method. According to Google, if you see requests to the Search Console API in that format, you might currently be using these methods.

In light of the change, the search engine suggests that users “check your implementations” to make sure they’re not using HTTP batch or JSON-RPC requests.
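If you’re unsure whether your own tools are affected, a quick scan of your codebase for signs of the old global batch endpoint or the JSON-RPC protocol can flag implementations worth reviewing. The following is a minimal, hypothetical sketch in Python; the search patterns and file types are assumptions, so adapt them to however your own code issues requests.

```python
import pathlib

# Hypothetical patterns that may indicate use of the deprecated global batch
# endpoint or the JSON-RPC protocol; adjust these to match your own codebase.
SUSPECT_PATTERNS = ("www.googleapis.com/batch", "jsonrpc", "json-rpc")

# Scan Python source files in the current project and report any matches.
for path in pathlib.Path(".").rglob("*.py"):
    text = path.read_text(errors="ignore").lower()
    for pattern in SUSPECT_PATTERNS:
        if pattern in text:
            print(f"{path}: contains '{pattern}', review this implementation")
```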

The announcement. Here’s the tweet posted by Google:

[Embedded tweet from the Google Webmasters account announcing the change]

Why we care. A lot of site admins and SEO companies use Google Search Console’s API in different ways to make their SEO work more efficient. Whether it’s for tracking issues, client reporting, dashboard tools or anything else, the API is extremely useful. It’s therefore in your best interests to update your API implementation so your tools and applications don’t stop working as a result of this change.
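If your tooling does rely on the old transports, one route is to switch to plain REST calls through an official client library. Below is a minimal sketch in Python using google-api-python-client; the service account file, property URL and date range are placeholders rather than anything from Google’s announcement.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and scope - substitute your own.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)

# Build a Search Console (Webmasters API v3) client that issues plain REST calls.
service = build("webmasters", "v3", credentials=credentials)

# Fetch top queries for a date range via the Search Analytics endpoint.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2020-06-01",
        "endDate": "2020-06-30",
        "dimensions": ["query"],
        "rowLimit": 10,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```

Each call here is sent as an individual HTTPS request, so nothing depends on the deprecated global batch endpoint or the JSON-RPC protocol.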

This Is What We Know About Referral Links And 50% Traffic Loss

After a query from a publisher, Google’s John Mueller commented on the connection between referral links and a drop in traffic. The publisher reported over five hundred referring pages from two domains showing up in Google Search Console (GSC). These referrals allegedly coincided with a 50% traffic drop, which alarmed a whole host of site admins and SEO companies UK-wide.

According to the publisher, GSC was showing referrals from two domains, amounting to up to four links to each page of their website. When the publisher reviewed the referring pages, they were completely empty, with no content at all.

The publisher claimed that the appearance of those referral links had translated to a 50% drop in traffic.

The publisher asked:

“…is this a scenario where the disavow tool makes sense or does Google detect them as unnatural and will ignore them as a ranking signal?”

Mueller addressed the mystery of the “empty” pages and what those might be:

 “It’s really hard to say what you’re seeing here. It’s certainly possible there are pages out there that show an empty page to users and then they show a full page to Googlebot”.

This practice is called cloaking, where a page serves one version of its content to Googlebot and a different version to other visitors.

Mueller explained that there’s a possibility the pages in question might be cloaking, and that this could be the issue the publisher is encountering, separate from the secondary question of the rankings.

The referral pages aren’t a bold attempt to sabotage the publisher’s rankings, but rather just a simple technical mistake, according to Mueller.

He said:

“From that point of view, I would just ignore those pages”.

Mueller suggested one course of action could be to examine the pages with Google’s Mobile-Friendly Test to check how they appear when Googlebot views them. This is a good test for cloaking, establishing whether the page appears one way to Googlebot and another way to non-Googlebot visitors.
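A rough way to approximate that check from your own machine is to request a suspect page twice, once with an ordinary browser user-agent and once with a Googlebot user-agent, and compare the two responses. The sketch below is a hypothetical Python example using the requests library; the URL is a placeholder, and because it only spoofs the user-agent string rather than Googlebot’s IP addresses, the Mobile-Friendly Test remains the more reliable check.

```python
import requests

# Placeholder URL for one of the "empty" referring pages.
URL = "https://example.com/suspect-page/"

BROWSER_UA = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
GOOGLEBOT_UA = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

# Fetch the same page as a normal browser and as Googlebot.
browser_html = requests.get(URL, headers=BROWSER_UA, timeout=10).text
googlebot_html = requests.get(URL, headers=GOOGLEBOT_UA, timeout=10).text

# A large difference suggests the page is serving different content
# to Googlebot than to ordinary visitors (i.e. cloaking).
print("Browser response length:  ", len(browser_html))
print("Googlebot response length:", len(googlebot_html))
print("Responses identical:", browser_html == googlebot_html)
```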

Mueller then addressed the supposed connection between the referral links and the 50% drop in traffic:

“I don’t think this is something that you need to disavow.

It probably looks weird in the links report but I really wouldn’t worry about this.

With regards to the drop in traffic that you’re seeing, from my point of view that would probably be unrelated to these links.

There’s no real situation… where I could imagine that essentially empty pages would be causing an issue with regards to links.

So I would just ignore that.

If you decide to put them in the disavow file anyway… just keep in mind that this would not affect how we show the data in search console. So the links report would continue to show those.

I don’t think there’s any reason to use a disavow file in this particular case. So I would just leave them be”.

What did the publisher see?

In fact, what the publisher saw is probably referral spam, a fairly old but still common phenomenon. The practice dates back to the early 2000s, when certain free analytics programs published lists of referrers in the form of links on publicly accessible pages.

This created an opportunity to send fake referrals from a spam website to a different website in order to generate a link back to the spam site from that public analytics page.

But the analytics page wasn’t linked from any particular page of the site; it was just an automatically generated URL.

Few websites still publish those analytics pages. However, the practice continues, probably because spammers hope that if enough publishers click on the links, Google will view those clicks as a form of popularity and boost their rankings.

As for the original question, it looks like the publisher could have been looking at a manufactured referral.

But that referrer wasn’t real, and the link doesn’t exist either, so it’s most likely a case of referrer spam.

Does referrer spam hurt rankings?

Referrer spam is not new, and it’s always been an issue. However, it’s categorically not the reason behind the drop in rankings.

Information found in this SEO UK blog came from https://www.searchenginejournal.com/google-bad-links-traffic-loss/368512/ and https://searchengineland.com/google-search-console-api-to-stop-supporting-http-and-json-rpc-requests-334766. Click the links to read the full articles.

If you want to boost your traffic and your presence online, sourcing quality, top-notch SEO services is the answer. Head over to Position1SEO to get started.

Author: Jason Ferry
Jason Ferry is an SEO specialist based in the United Kingdom. Taking pride in his many years of experience in the search engine optimisation industry, he has honed his skills in various digital marketing processes. From keyword marketing, website auditing and link-building campaigns to social media monitoring, he is well-versed in all of them. Jason Ferry’s excellent SEO skills, combined with his vast experience, make him one of the best professionals to work with in the industry today. Not only does he guarantee outstanding output for everyone he works with, but he also values a deep relationship with them.
