How to Diagnose and Solve JavaScript SEO Issues in 6 Steps

It’s fairly common for companies to build their websites using modern JavaScript frameworks and libraries like React, Angular, or Vue. It’s obvious by now that the web has moved away from plain HTML and has entered the era of JS.

While there’s nothing unusual about a business wanting to take advantage of the latest technologies, we need to address the stark reality of this trend: most migrations to JavaScript frameworks aren’t planned with users or organic traffic in mind.

Let’s call it the JavaScript Paradox:

  1. Big brands jump on the JavaScript hype train after hearing all the buzz about JavaScript frameworks creating amazing UXs.
  2. Reality shows that JavaScript frameworks are really complex.
  3. Big brands completely butcher their migrations to JavaScript. They lose organic traffic and often have to cut corners rather than creating that amazing UX journey for their users (I’ll mention some examples in this article).

Since there’s no turning back, SEOs need to learn how to deal with JavaScript websites.

But that’s easier said than done, because making JavaScript websites successful in search engines is a real challenge for both developers and SEOs.

This article is meant to be a follow-up to my comprehensive Ultimate Guide to JavaScript SEO, and it’s intended to be as easy to follow as possible. So grab yourself a cup of coffee and let’s have some fun. Here are six steps to help you diagnose and solve JavaScript SEO issues.

Step 1: Use the URL Inspection tool to see if Google can render your content

The URL Inspection tool (formerly Google Fetch and Render) is an excellent free tool that lets you check whether Google can properly render your pages.

The URL Inspection tool requires your website to be connected to Google Search Console. If you don’t have an account yet, check Google’s Help pages.

Open Google Search Console, then click on the URL inspection button.

In the URL form field, type the full URL of the page you want to audit.

Then click on TEST LIVE URL.

Once the test is done, click on VIEW TESTED PAGE.

And finally, click on the Screenshot tab to view the rendered page.

Scroll through the screenshot to make sure your web page is rendered properly. Ask yourself the following questions:

  • Is the main content visible?
  • Can Google see the user-generated comments?
  • Can Google access areas like related articles and products?
  • Can Google see other important elements of your page?

Why does the screenshot look different from what I see in my browser? Here are some possible reasons:

Step 2: Make sure you didn’t block JavaScript files by mistake

If Google can’t render your page properly, you should make sure you didn’t block important JavaScript files for Googlebot in robots.txt.

TL;DR: What is robots.txt?

It’s a plain text file that tells Googlebot or any other search engine bot whether it is allowed to request a page or resource.
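For illustration, here’s a hypothetical robots.txt (the directory and file names are made up) showing how a broad Disallow rule can block a rendering-critical script, and how a more specific Allow rule can carve out an exception:

```
# Hypothetical robots.txt
User-agent: *

# This broad rule blocks every script in /scripts/, including the
# bundle that renders the main content of the page:
Disallow: /scripts/

# Google honors the most specific (longest) matching rule, so this
# Allow carves out an exception for the critical bundle:
Allow: /scripts/bundle.js
```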

Fortunately, the URL Inspection tool points out all the resources of a rendered page that are blocked by robots.txt.

But how can you tell if a blocked resource is important from the rendering perspective?

You have two options: basic and advanced.


In most cases, it’s a good idea to simply ask your developers about it. They created your website, so they should know it well.

Obviously, if a script is named content.js or productListing.js, it’s probably relevant and shouldn’t be blocked.

Unfortunately, as of now, URL Inspection doesn’t tell you about the severity of a blocked JS file. The old Google Fetch and Render had such an option:


Now, we can use Chrome Developer Tools for that.

For educational purposes, we will be checking the following URL:

Open the page in the latest version of Chrome and go to Chrome Developer Tools. Then move to the Network tab and refresh the page.

Finally, select the desired resource (in our case it’s YouShallNotGo.js), right-click, and choose Block request URL.

Refresh the page and see if any important content disappeared. If so, you should think about deleting the corresponding rule from your robots.txt file.

Step 3: Use the URL Inspection tool to fix JavaScript errors

If you see that Google Fetch and Render isn’t rendering your page properly, it may be due to JavaScript errors that occurred while rendering.

To diagnose it, click on the More info tab in the URL Inspection tool.

Then show these errors to your developers so they can fix them.

Just ONE error in the JavaScript code can stop rendering for Google, which in turn makes your website non-indexable.

Your website may work fine in the most recent browsers, but if it crashes in older browsers (the Google Web Rendering Service is based on Chrome 41), your Google rankings may drop.
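As a sketch of how this happens, consider untranspiled modern JavaScript (the markup and the /api/products.json endpoint below are made up for illustration). A modern browser runs it fine, but the same script breaks in Chrome 41:

```html
<div id="app"></div>
<script>
  // Arrow functions throw a SyntaxError in Chrome 41 (they arrived in
  // Chrome 45), so this entire script never runs there and the page
  // stays blank for the Web Rendering Service:
  const render = (products) => {
    document.getElementById('app').innerHTML = products
      .map((p) => '<h2>' + p.name + '</h2>')
      .join('');
  };

  // fetch() is also unavailable in Chrome 41; without a polyfill it
  // would break rendering even if the syntax were transpiled to ES5:
  fetch('/api/products.json')
    .then((response) => response.json())
    .then(render);
</script>
```

Transpiling to ES5 (for example, with Babel) and adding a fetch polyfill would let the same logic run in Chrome 41.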

Need some examples?

  • A single error in the official Angular documentation caused Google to be unable to render our test Angular website.
  • Once upon a time, Google deindexed some pages of the official Angular 2+ website.

If you want to know why it happened, read my Ultimate Guide to JavaScript SEO.

Side note: if for some reason you don’t want to use the URL Inspection tool for debugging JavaScript errors, you can use Chrome 41 instead.

Personally, I prefer using Chrome 41 for debugging purposes, because it’s more universal and offers more flexibility. However, the URL Inspection tool is more accurate in simulating the Google Web Rendering Service, which is why I recommend it for people who are new to JavaScript SEO.

Step 4: Check if your content has been indexed in Google

It’s not enough to just see if Google can render your website properly. You have to make sure Google has properly indexed your content. The best option for this is to use the site: command.

It’s a very simple and very powerful tool. Its syntax is pretty straightforward: site:[URL of a website] “[fragment to be searched]”. Just be careful not to put a space between site: and the URL.

Let’s assume you want to check if Google indexed the text “Develop across all platforms”, which is featured on the homepage.

Type the following command into Google: site:[domain] “Develop across all platforms”

As you can see, Google indexed that content, which is what you want, but that’s not always the case.


  • Use the site: command whenever possible.
  • Check different page templates to make sure your entire website works fine. Don’t stop at one page!

If everything is fine, go to the next step. If that’s not the case, there may be a couple of reasons why this is happening:

  • Google still didn’t render your content. Rendering can happen up to a few days or weeks after Google visited the URL. If the characteristics of your website require your content to be indexed as fast as possible, implement SSR.
  • Google encountered timeouts while rendering the page. Are your scripts fast? Do they remain responsive when the server load is high?
  • Google is still requesting old JS files. Google tries to cache a lot to save computing power, so CSS and JS files may be cached aggressively. If you can see that you fixed all the JavaScript errors and Google still can’t render your website properly, it may be because Google is using old, cached JS and CSS files. To work around it, you can embed a version number in the filename, for example, name it bundle3424323.js. You can read more in Google’s guides on HTTP Caching.
  • While indexing, Google may not fetch some resources if it decides that they don’t contribute to the essential page content.
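The cache-busting workaround mentioned above can be as simple as changing the script’s filename on every deploy. The paths below are made up; bundlers such as webpack can generate hashed filenames like this automatically:

```html
<!-- Before: the filename never changes, so Google may keep using a
     stale cached copy of the script -->
<script src="/static/bundle.js"></script>

<!-- After: the version number in the filename changes whenever the
     file changes, forcing a fresh fetch -->
<script src="/static/bundle3424323.js"></script>
```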

Step 5: Make sure Google can discover your internal links

There are a few simple rules you should follow:

  1. Google needs proper <a href> links to discover the URLs on your website.
  2. If your links are added to the DOM only when somebody clicks on a button, Google won’t see them.

As simple as that is, plenty of big companies make these mistakes.

Proper link structure

Googlebot, in order to crawl a website, needs traditional “href” links. If they aren’t provided, many of your web pages will simply be unreachable for Googlebot!

I think it was explained well by Tom Greenway (a Google representative) during the Google I/O conference:

Please note: if you have proper <a href> links with some additional parameters, like onClick, data-url, or ng-href, that’s still fine for Google.
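To make the distinction concrete, here’s a hypothetical sketch (the URLs and the navigate() and goTo() helpers are made up) of link markup Googlebot can and cannot follow:

```html
<!-- Googlebot CAN follow these: real <a href> links, even with extra
     attributes like an onclick handler -->
<a href="/products/phones">Mobile phones</a>
<a href="/products/phones" onclick="navigate('/products/phones')">Mobile phones</a>

<!-- Googlebot CANNOT follow these: no href, so the URL is invisible -->
<span onclick="goTo('/products/phones')">Mobile phones</span>
<a data-url="/products/phones">Mobile phones</a>
```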

A common mistake made by developers: Googlebot can’t access the second and subsequent pages of pagination

Not letting Googlebot discover pages from the second page of pagination onward is a common mistake that developers make.

When you open the mobile versions of Gearbest, AliExpress, and IKEA, you’ll quickly notice that they don’t, in fact, let Googlebot see the pagination links, which is really weird. When Google enables mobile-first indexing for these websites, they will suffer.

How do you check it on your own?

If you haven’t already done so, download Chrome 41.

Then navigate to any page. For the sake of the tutorial, I’m using the mobile version of AliExpress. For educational purposes, it’s good if you follow the same example.

Open the mobile version of the Mobile Phones category of AliExpress.

Then right-click on View More and select Inspect to see how it’s implemented.

As you can see, there are no <a href> or <link rel> links pointing to the second page of pagination.

There are over 2,000 products in the mobile phone category on AliExpress. Since mobile Googlebot is able to access only 20 of them, that’s just 1 percent!

That means 99 percent of the products from that category are invisible to mobile Googlebot! That’s crazy!

These mistakes are caused by an improper implementation of lazy loading. Many other websites make similar mistakes. You can read more in my article “Popular Websites that May Fail in Mobile First Indexing”.
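A lazy-loading setup can stay crawlable if the “View more” control is a real link that JavaScript merely intercepts. Here’s a sketch under that assumption (the URLs and the loadNextPage() helper are hypothetical):

```html
<ul id="product-list">
  <li><a href="/phones/product-1">Product 1</a></li>
  <!-- ...the first 20 products... -->
</ul>

<!-- A real link: Googlebot discovers page 2, while JavaScript takes
     over the click for regular users -->
<a id="view-more" href="/phones?page=2">View more</a>

<script>
  document.getElementById('view-more').addEventListener('click', function (event) {
    event.preventDefault(); // keep users on the current page
    loadNextPage();         // hypothetical helper that appends the next batch
  });
</script>
```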

TL;DR: using link rel=”next” alone is too weak a signal for Google

Note: it’s common to use link rel=”next” to indicate a pagination series. However, the findings from Kyle Blanchette seem to show that link rel=”next” alone is too weak a signal for Google and should be strengthened by traditional <a href> links.

John Mueller discussed this further:

“We can understand which pages belong together with rel=”next”, rel=”previous”, but if there are no links on the page at all, then it’s really hard for us to crawl from page to page. (…) So using the rel=”next” rel=”previous” in the head of a page is a great idea to tell us how these pages are connected, but you really need to have on-page, normal HTML links.”

Don’t get me wrong: there’s nothing wrong with using <link rel=”next”>. On the contrary, it’s helpful, but it’s best to combine these tags with traditional <a href> links.
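Combined, the markup for a middle page of a paginated series might look like this sketch (the URLs are hypothetical):

```html
<!-- On /phones?page=2 -->
<head>
  <link rel="prev" href="/phones?page=1">
  <link rel="next" href="/phones?page=3">
</head>
<body>
  <!-- ...product listing... -->
  <nav>
    <a href="/phones?page=1">Previous page</a>
    <a href="/phones?page=3">Next page</a>
  </nav>
</body>
```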

Checking if Google can see menu links

Another important step in auditing a JavaScript website is to make sure Google can see your menu links. To check this, use Chrome 41.

For the purpose of the tutorial, we will use the following example:

To start, open any browser and pick some links from the menu:

Next, open Chrome 41. In Chrome Developer Tools (press Ctrl + Shift + J), navigate to the Elements tab.

The results? Fortunately enough, Google can pick up the menu links.

Now, check if Google can pick up the menu links on your own website and see if you’re on the right track too.

Step 6: Check if Google can discover content hidden under tabs

I have often observed that, in the case of many e-commerce stores, Google can’t discover and index content that’s hidden under tabs (product descriptions, reviews, related products, etc.). I know it’s weird, but it’s very common.

It’s a crucial part of every SEO audit to make sure Google can see content hidden under tabs.

Open Chrome 41 and navigate to any product page; for example, a Muscle Fit Vest.

Click on Details & Care to see the product description:

“94% Cotton, 6% Elastane. Muscle Fit Vest. Model is 6’1″ and wears UK size M.”

Now it’s time to check if it’s in the DOM. To do so, go to Chrome Developer Tools (Ctrl + Shift + J) and click on the Network tab.

Make sure the Disable cache option is enabled.

Press F5 to refresh the page. Once refreshed, navigate to the Elements tab and search for the product description:

As you can see, in this case Google is able to see the product description.

Perfect! Now take the time to check if your website is okay.

Wrapping up

Obviously, JavaScript SEO is a fairly complex topic, but I hope this tutorial was helpful.

If you’re still struggling with your Google rankings, you might want to think about implementing dynamic rendering or hybrid rendering. And, of course, feel free to reach out to me on Twitter about this or other SEO needs.
