by judethomas

Why Javascript Is(n’t) A Problem For Your SEO Anymore

Since its beginnings, SEO has developed considerably, both from a strictly content-related point of view and, most importantly, from a technical one. SEO has become a heavily code-driven discipline, where front-end development and content marketing intersect to improve a site's organic traffic. Let's dive straight into the current state of JavaScript for your SEO.

The Indexing Myth

JavaScript has long been seen as a major problem for SEO, because it was known to slow down the entire crawling and indexing process, which is exactly what a technical SEO executive wants to avoid. This was due to the fact that the crawler only gathered HTML and CSS, which automatically excluded all the content rendered by any JS-based application (whether hand-coded or CMS-based).

As Google has illustrated, JS is indeed crawled and indexed in a second round by Googlebot, which should quiet all the people who were saying that SEO-optimised sites must be built in HTML and CSS alone.
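To illustrate the problem the second round solves, here is a minimal, hypothetical sketch: a page whose main copy is injected by a script. A first-wave crawler that only parses the raw HTML sees an empty container, while the rendering wave sees the final text. The page markup, file names and product copy below are invented for the example.

```javascript
// Raw HTML as a crawler downloads it: the product copy is NOT in the markup.
const rawHtml = '<div id="app"></div><script src="app.js"></script>';

// Naive first-wave "crawler": drop scripts, strip tags, keep the text.
function extractVisibleText(html) {
  return html.replace(/<script[\s\S]*?<\/script>/gi, '')
             .replace(/<[^>]+>/g, '')
             .trim();
}

// What app.js would do once a browser (or Googlebot's rendering wave) runs it.
function renderWithJs(html) {
  const content = 'Hand-made leather boots - free UK delivery';
  return html.replace('<div id="app"></div>',
                      '<div id="app">' + content + '</div>');
}

console.log(extractVisibleText(rawHtml));               // '' - nothing to index
console.log(extractVisibleText(renderWithJs(rawHtml))); // the product copy
```

The same page therefore yields two very different results depending on whether the crawler executes JS, which is exactly why the second indexing round matters.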



This has been said in many interviews, webinars, Reddit AMAs and public talks: Googlebot is old and it's evolving at an incredibly slow pace. It has been stated, in fact, that the bot uses Chrome 41 to render websites, so newer syntax such as ES6 is only partially supported. Chrome 64, the newest version at the time of writing, should help bring the bot up to pace, but no one knows when it will be implemented into the crawling process.
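As a hedged illustration of the kind of gap this creates (exact feature support in old Chrome builds varied, so treat the specifics as assumptions), an ES6 function using arrow syntax and rest parameters may fail to parse in an older renderer, while the transpiled ES5 equivalent runs everywhere:

```javascript
// ES6 version: arrow functions and rest parameters, which an older
// renderer may not be able to parse at all.
const sumPricesEs6 = (...prices) => prices.reduce((a, b) => a + b, 0);

// ES5 equivalent, roughly what a transpiler like Babel would emit:
// plain functions and the arguments object instead of rest parameters.
function sumPricesEs5() {
  var prices = Array.prototype.slice.call(arguments);
  return prices.reduce(function (a, b) { return a + b; }, 0);
}

console.log(sumPricesEs6(10, 5)); // 15
console.log(sumPricesEs5(10, 5)); // 15
```

Shipping a transpiled (or server-side rendered) bundle is the usual way to make sure the rendering bot can execute the page even when its browser engine lags behind.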


With this in mind, it's important to act accordingly and understand how your JS sections are performing from an SEO point of view, especially if you run an e-commerce store, a type of site known for being very JS-heavy.

Timing Optimisation

With both the crawling process and the indexing one in mind, there is one major optimisation rule, stated by John Mueller: “There’s no specific timeout value that we define, mostly because the time needed for fetching the resources is not deterministic or comparable to a browser (due to caching, server load, etc). 5 seconds is a good thing to aim for, I suspect many sites will find it challenging to get there though.”
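Since there is no official cutoff to query, one rough, hypothetical sanity check is to compare your own measured resource timings against the 5-second figure Mueller mentions. The function and sample data below are invented for illustration; real numbers would come from the browser Performance API or a lab tool such as Lighthouse.

```javascript
// Mueller's suggested target, in milliseconds. Not a hard Google limit:
// he explicitly says there is no specific timeout value.
const RENDER_BUDGET_MS = 5000;

// timings: hypothetical { name, durationMs } entries for the resources
// that must load before the page's content is rendered.
function checkRenderBudget(timings) {
  const totalMs = timings.reduce((sum, t) => sum + t.durationMs, 0);
  return {
    totalMs: totalMs,
    withinBudget: totalMs <= RENDER_BUDGET_MS,
    // The heaviest resource is the first candidate for deferring or splitting.
    slowest: timings.slice().sort((a, b) => b.durationMs - a.durationMs)[0],
  };
}

const report = checkRenderBudget([
  { name: 'app.bundle.js', durationMs: 3200 },
  { name: 'product-feed.json', durationMs: 2400 },
]);
console.log(report.totalMs, report.withinBudget); // 5600 false
```

A page over the budget is a signal to defer non-critical scripts or split the bundle, rather than a guarantee that Googlebot will drop the content.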

Given all the previously mentioned variables, it's important to factor the crawl budget into the planning of every new page.

To Conclude

JS as a whole has developed heavily and Google clearly hasn't kept up with it, but, compared to its former stance on the crawling and indexing process, we can safely say that, if done right, basic JS isn't a problem anymore. Of course, this technical optimisation process has had a big impact on the SEO scene, especially in the UK, where the digital marketing sphere is incredibly big at the moment. In fact, as discussed by the biggest SEO agency Manchester currently has, JS should be a top priority whenever the crawl budget is discussed.
