
What You Need To Know About The Update On Nofollow Links This March


Last year, Google announced a two-part update to the way it treats nofollow links. The first part has already rolled out and affected rankings. SEO companies and webmasters should take note that the second part, which affects crawling and indexing, takes effect on 1st March 2020.

1st March 2020 Nofollow Update: What is it?

The update is going to affect the crawling and indexing of nofollow links. Originally, Google treated nofollow links as a directive: it obeyed the nofollow attribute and did not crawl or index them.

On 1st March 2020, this will not be the case anymore. Starting on that day, Google will treat nofollow links as a hint for indexing and crawling purposes.

This is what Google's announcement said:

“For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020”.

Review Your Nofollow Policy

Some publishers used nofollow to prevent pages from being crawled. Pages typically linked with a nofollow include user profiles, login pages and sections of a website that could be construed as thin content.

Using nofollow to stop Google from crawling and indexing a page was never a great practice. There are more reliable ways to keep pages out of the index (such as the meta robots noindex directive).
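
As a rough illustration (the URL here is only a placeholder), a nofollow is an attribute on the link itself, whereas noindex is a directive placed in the head of the page you want kept out of the index:

    <!-- A link marked nofollow: from 1st March 2020, only a hint to Google -->
    <a href="https://www.example.com/login" rel="nofollow">Log in</a>

    <!-- A more reliable way to keep a page out of the index: add this to the <head> of that page -->
    <meta name="robots" content="noindex">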

Will Nofollow Update Affect Rankings?

It is hard to tell how the update to crawling and indexing nofollow links may impact rankings. It might depend on the pages that Google will decide to crawl and index.

Google may create guidelines for choosing which pages to index. It is not far-fetched that it will decide not to index pages of low quality.

Takeaways

  1. Familiarise yourself with the Nofollow Hint Update

In terms of money and search rankings, ignorance is not bliss. Remember that this change is occurring on 1st March 2020.

If rankings or traffic start changing, then the nofollow crawling and indexing update might be something to check out.

  2. Review Use of Nofollow

Poor nofollow deployment may result in unwanted consequences. It is advisable to review how nofollow links are used on your web pages and decide when it is time to remove them in favour of a meta robots noindex directive.

Why You Should Use JavaScript Less Often Or Not At All For SEO

Providers of SEO services and web developers have been debating about SEO and JavaScript for quite a long time.

Search engines have made, and continue to make, considerable improvements in indexing JavaScript websites.

Having said that, whether the leading search engines can properly render pages built with JavaScript remains unclear.

The Good: New Developments Ease Compatibility

Google and Bing made SEO announcements last year regarding JavaScript, revealing improvements to ease compatibility.

Google said that it has begun using the most recent version of Google Chrome to render web pages, executing JavaScript, style sheets and more.

Bing stated that they're using the new Microsoft Edge as the Bing Engine to render pages.

Bingbot will now render web pages with the same underlying web platform technology already utilised by Googlebot, Google Chrome, as well as other Chromium-based browsers.

Both major search engines also announced that they are going to make their solution evergreen by regularly committing to updating their web page rendering engine to the latest stable version of their browser.

These constant updates will guarantee support for the most recent features, a substantial change from the prior versions.

Search Engines Actually Are Simplifying SEO by Leveraging the Same Rendering Technology

These developments from Bing and Google allow web developers to easily ensure that their website and their web content management system work across both browsers without spending a lot of time investigating every solution thoroughly.

With the exception of files that are robots.txt disallowed, the secondary content developers see and experience in their new Google Chrome or Microsoft Edge browser is the same content that search engines will see and experience.

For developers and SEOs, this means far less time and money spent on compatibility work.

For instance:

  • No longer any need to keep Google Chrome 41 around to test how Googlebot renders pages.
  • No longer any need to escalate rendering issues to Bing.
  • No longer any need to maintain a compatibility list of which JavaScript functions and style sheet directives work for each search engine.

And this list goes on.

With this terrific news and the time it frees up, does that mean a green light for JavaScript?

Most probably not.

The Bad: JavaScript Is Still Facing Many Risks and Limitations

To put it simply, JavaScript can complicate the ability of search engines to read your page, and this can increase the risks for errors, which may be damaging for SEO.

When a search engine downloads a web file and begins analysing it, the first thing it does is determine the document type.

If the file is a non-HTML file (e.g. an HTTP redirect, PDF, video or image), there is no need to render the document with a JavaScript stack, as these content types do not include JavaScript.

For HTML documents, search engines will attempt to render the document with their optimised browser rendering solution, provided they have sufficient resources.

Issues will begin to arise when JavaScript is not embedded in the document directly.

Search engines must then download the file in order to read and execute it.

If the content is robots.txt disallowed, this will be impossible to do.

If they are allowed, search engines still need to download the file successfully, contending with per-site crawl quotas and site availability issues.

Typically, search engines do not perform complex actions such as clicking a button. It is therefore best to reference the file with a basic HTML <script> link.
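
As a rough sketch (the file and function names here are purely illustrative), a plain <script> reference in the HTML is easy for crawlers to discover, whereas content that only loads after a click usually is not:

    <!-- Easy for crawlers: the JavaScript file is referenced directly in the HTML -->
    <script src="/scripts/product-list.js"></script>

    <!-- Hard for crawlers: the extra content only loads after a user clicks the button -->
    <button onclick="loadMoreProducts()">Load more products</button>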

Another potential issue is that the JavaScript file may not be in sync with the cached version of the web page. Search engines usually cache resources for lengthy periods so that they do not have to fetch every resource on a page each time they visit.

JavaScript might also make HTTP requests to load extra data files, which multiplies the chances of running into the problems mentioned above.

JavaScript included in these JavaScript files or HTML may not be compatible with the JavaScript engine utilised by search engines.

If this is the case, the search engine will not be able to read it, and the content it produces will not be indexed.

With search engines' recent decision to use the same technology, and their commitment to keeping their browsers up to date, dealing with this should become simpler in the future.

Moreover, always take note that the handling of JavaScript by the search engines is limited:

  • Search engines normalise URLs containing a #, dropping everything after the # (except for the legacy #! standard). For example, example.com/page#section is treated as example.com/page.
  • Search engines do not usually click buttons or perform other complex actions.
  • Search engines do not wait long for pages to render.
  • Search engines do not render complex interactive web pages in full.

JavaScript shouldn't be the new Flash!

Remember that every piece of JavaScript must be downloaded and executed. Used too heavily, it will slow the site down, and slow pages tend to rank worse.

The Uncertainty: For Optimal SEO, Use JS Practically, Sparingly Or Perhaps, Not at All

For huge websites and those that want to get the most out of search engines, it is better to detect search engine crawlers by their user agent (Bingbot, Googlebot) and output basic HTML with little or no JavaScript.

Additionally, allow crawlers to access the HTML and text you are planning to index in a single HTTP request.
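
As a minimal sketch of this approach (assuming a Node.js server using Express; the bot pattern, port and pre-rendered HTML below are purely illustrative, not a drop-in implementation), the server checks the visitor's user agent and sends known crawlers plain, pre-rendered HTML in a single request, while human visitors get the normal JavaScript-driven page:

    // Illustrative dynamic rendering sketch only; the bot list and HTML strings are assumptions.
    const express = require('express');
    const path = require('path');
    const app = express();

    // Hypothetical pattern of crawler user agents to detect.
    const BOT_PATTERN = /googlebot|bingbot/i;

    app.get('/', (req, res) => {
      const userAgent = req.get('User-Agent') || '';

      if (BOT_PATTERN.test(userAgent)) {
        // Crawlers receive simple, pre-rendered HTML carrying the same text content,
        // delivered in one HTTP request with no JavaScript to execute.
        res.send('<html><body><h1>Our products</h1><p>Full product descriptions here...</p></body></html>');
      } else {
        // Human visitors receive the normal client-side rendered page.
        res.sendFile(path.join(__dirname, 'index.html'));
      }
    });

    app.listen(3000);

The key point, echoed in the Google and Bing statements quoted below, is that both versions must carry essentially the same text content.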

In addition, there is a concern that if a website differentiates the experience it serves to bots, or varies it with JavaScript, it might be penalised for spammer cloaking.

Fortunately, both Google and Bing suggest there is nothing to worry about as long as you output the same text and content that your human customers see.

Google says:

“Currently, it’s difficult to process JavaScript and not all search engine crawlers are able to process it successfully or immediately. … we recommend dynamic rendering as a workaround solution to this problem. Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents”.

Bing says:

“When it comes to rendering content specifically for search engine crawlers, we inevitably get asked whether this is considered cloaking… and there is nothing scarier for the SEO community than getting penalized for cloaking … The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking”.

Do or Do Not?

For SEO specialists, it is acceptable not to output JavaScript when search engine crawlers visit your web pages, as long as the HTML text content and formatting you return are almost the same as those available to your human visitors.

If JavaScript serves a purpose on the website or page, then there is no problem with using it.

Just remember to have a clear idea of the technical implications so your documents can be correctly indexed, or consult a technical SEO expert.

Search engines are incentivised to index your content to maintain customer satisfaction.

If problems arise when using JavaScript, SEO specialists and webmasters are advised to investigate them using the search engines' webmaster tools or to get in touch with the search engines directly.

All details of this post were gathered from https://www.searchenginejournal.com/nofollow-crawl-indexing-update/349994/ and https://www.searchenginejournal.com/seo-javascript-good-bad-uncertainty/346708/. For more information, click the links.

Never think twice about working with SEO experts if you want to make huge improvements to your website rankings. Learn how we at Position1SEO can help you with this by visiting our homepage today.

Author: Jason Ferry
Jason Ferry is an SEO specialist based in the United Kingdom. Taking pride in his many years of experience in the search engine optimisation industry, he has honed his skills in various digital marketing processes. From keyword marketing, website auditing and link building campaigns to social media monitoring, he is well-versed in all of them. Jason Ferry's excellent skills in SEO, combined with his vast experience, make him one of the best professionals to work with in the industry today. Not only does he guarantee outstanding output for everyone he works with, but he also values a deep relationship with them.
