
This Is How Google Identifies Low Traffic Webpages

John Mueller, Senior Webmaster Trends Analyst at Google, offers some insight into which low-traffic webpages UK-based SEO agencies and site admins should block from being indexed, and which they can safely leave in place.

In a Google Webmaster Hangout, in answer to a query, Mueller distinguished the types of low-traffic pages that are fine to keep from those that should no longer be indexed.

Are Low Traffic Pages Harmful?

The common assumption is that low-performing webpages should be removed or noindexed, since they generate little traffic and may drag down a site's perceived quality.

To shed some light on this topic, Mueller answered a question that was focused on a particular news site, though he addressed it from a broader perspective.

The question was:

“We’re publishing news and articles.

“For example, we have 100 new articles every day, and ten of them give us 95% of the organic search traffic. Another 90 go nowhere.

“We’re afraid that Google can decide our website is interesting only for 10%.

“There’s an idea to hide some boring local news under noindex tag to make the overall quality of all publishing content better.

“What do you think?”

How Does Google Analyse Website Quality?

Mueller first talked about how Google's algorithm reviews individual webpages and the website as a whole to get a grasp of the site's overall quality level.

This answer, he said, was applicable to all kinds of websites, and not limited to the type of site the question was focused on.

He explained:

“In general, we do look at the content on a per page basis.

“And we also try to understand the site on an overall basis, to understand how well is this site working, is this something that users appreciate. If everything is essentially working the way that it should be working.

“So it’s not completely out of the question to think about all of your content and think about what you really want to have indexed”.

Then, Mueller turned his attention to news websites. According to him, traffic volume alone is not what determines whether a page is judged to be low quality.

Mueller stated:

“But especially with a news website, it seems pretty normal that you’d have a lot of articles that are interesting for a short period of time, that are perhaps more of a snapshot from a day to day basis for a local area.

“And it’s kind of normal that they don’t become big, popular stories on your website.

“So from that point of view, I wouldn’t necessarily call those articles low quality articles, for example”.

Therefore, the level of popularity of a news article doesn’t define its quality.

That said, Mueller then explained how to tell if a page of content really is of a low quality.

He emphasised that factors such as hard-to-read sentences, poor grammar and badly structured content are the kinds of signals used to define a low-quality page. He then explained the steps to take if your website has a mix of poor and good quality content.

This is what he said:

“On the other hand, if you’re publishing articles from … hundreds of different authors and they’re from varying quality and some of them are really bad, they’re kind of hard to read, they’re structured in a bad way, their English is broken.

“And some of them are really high quality pieces of art, almost that you’re providing. Then creating that kind of a mix on a website makes it really hard for Google and for users to understand that actually you do have a lot of gems on your website…

“So that’s the situation where I would go in and say, we need to provide some kind of quality filtering, or some kind of quality bar ahead of time, so that users and Google can recognize, this is really what I want to be known for.

“And these are all things, maybe user-submitted content, that is something we’re publishing because we’re working with these people, but it’s not what we want to be known for.

“Then that’s the situation where you might say, maybe I’ll put noindex on these, or maybe I’ll initially put noindex on these until I see that actually they’re doing really well.

“So for that, I would see it making sense that you provide some kind of quality filtering.

“But if it’s a news website, where… by definition, you have a variety of different articles, they’re all well-written, they’re reasonable, just the topics aren’t that interesting for the long run, that’s kind of normal.

“That’s not something where I’d say you need to block that from being indexed. Because it’s not low quality content. It’s just less popular content”.
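Mueller mentions putting noindex on pages you don't want to be known for. As a purely illustrative sketch, the two standard mechanisms for this are a robots meta tag in the page's head, or an X-Robots-Tag HTTP response header (useful for non-HTML files); the quality flag here is a hypothetical input, not anything Google has published:

```python
# Sketch: conditionally emit a noindex directive for pages flagged as
# lower quality. The "ok_to_index" flag is a hypothetical editorial
# decision; the meta tag and header are the standard noindex mechanisms.

def robots_meta_tag(ok_to_index: bool) -> str:
    """Return the robots meta tag to place in the page's <head>."""
    directive = "index, follow" if ok_to_index else "noindex"
    return f'<meta name="robots" content="{directive}">'

def robots_header(ok_to_index: bool) -> dict:
    """Return the equivalent HTTP response header, if any."""
    return {} if ok_to_index else {"X-Robots-Tag": "noindex"}

print(robots_meta_tag(False))  # emits the noindex meta tag
print(robots_header(False))    # emits the X-Robots-Tag header
```

Either form works on its own; the header variant is typically applied at the server or CDN level, which makes it easier to toggle in bulk (e.g. Mueller's "initially put noindex on these until I see that actually they're doing really well").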

In conclusion

John Mueller's advice, essentially, is that when you have a page with low traffic, you need to look more closely at it and determine, as Google does, whether the cause is poor writing or simply an unpopular subject.

As mentioned above, an unpopular webpage is not necessarily low quality. And content that's well written, but just not frequently visited, won't negatively impact the website as a whole.

Low traffic can sometimes be a useful signal that there is an underlying problem, but it isn't an issue in itself.

Review your content and check whether the low traffic is caused by:

  • Outdated information on the page (which should be improved)
  • Thin content on the page (which is not good)
  • An unpopular topic (which is not necessarily bad)

This Is Why John Mueller Is Not Reporting Issues To The Webspam Team

In a recent tweet, Google’s John Mueller stated that he’s often consulted by the public regarding spam issues, but doesn’t necessarily report that back to the webspam team. According to Mueller, “in most cases I run across here, where people ask us for input, or where it looks like sites would do better if prompted, I don’t forward things to the webspam team“.

Mueller’s statement was actually an answer to a question concerning the SEMRush guest blog post link. He further said that, “maybe they’ll notice it anyway, maybe others will flag it, but maybe the site will just improve and be awesomer“.

While Mueller has confirmed that he himself isn’t reporting spam issues to the relevant team at Google, it’s more than likely that they’re already aware of it anyway.

Mueller's responses were posted on Twitter in replies to Dave Ashworth and Barry Schwartz.

Information in this SEO blog came from https://www.searchenginejournal.com/google-explains-content-pruning/371252/ and https://www.seroundtable.com/google-john-mueller-report-spam-to-webspam-29560.html. To read the full articles, just click the links.

Take your business to the next level online by hiring Position1SEO. You can get affordable and quality SEO packages from us. With our team of SEO professionals, you can ensure you get value for money, with maximised online visibility and improved search rankings. To get started, visit our homepage.
