What’s this about?
This is an extract from TWIS SEO Update 2nd June 2017.
This post covers actionable insights into how JavaScript can cause issues with crawling and indexing. If you would like to discuss how this may impact your web business, please feel free to contact me.
#SEO #JavaScript #Googlebot #GoogleSearchConsole
Some JavaScript Prevents Google Indexing & Slows Down Googlebot Crawling
Summary:
- A few weeks ago I tweeted about JavaScript causing indexing and caching issues; now there is an in-depth test on Moz which demonstrates the problems.
- Some frameworks used in a number of web deployments simply do not present content for Google to fetch, render and then index.
- Remember – no indexation = no display in SERPs.
- Crawling and indexing content are critical to being listed.
- Frameworks that do render can still cause big slowdowns on mobile, and occasionally on desktop. This is usually caused by too much processing, or by memory and cache limits being exceeded.
- JavaScript has brought many dynamic improvements to the web, but it still needs to be used carefully.
Actions to take:
- Read the Moz article to understand the particular frameworks affected and how they are affected.
- Test critical pages with Fetch and Render in Google Search Console, and make sure Google can see and index all critical content (a quick raw-HTML check is sketched after this list).
- Review alternatives to JavaScript usage. If you don’t *need* to use it, don’t. Traffic tests show increased usage for non-JavaScript implementations.
- Always have a <noscript> alternative; Googlebot and screen readers will thank you for it (see the second sketch after this list).
- Where possible, ditch JavaScript where it drives critical, cannot-fail functions on your website.
- Contact me if you would like to chat about JavaScript and if it is causing issues with your site’s indexation and performance.
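As a quick complement to Fetch and Render, the sketch below (my own illustration, not part of the Moz test) fetches a page's raw HTML, i.e. what a non-rendering crawler sees before any JavaScript runs, and checks that critical content is already there. The URL and phrases are placeholders for your own pages, and it assumes Node 18+ for the built-in fetch:

```javascript
// check-raw-html.js (Node 18+, which ships a global fetch)
// Fetches the raw HTML of a page, i.e. what a crawler sees before any
// JavaScript runs, and checks that critical content is already present.

const PAGE_URL = 'https://www.example.com/'; // placeholder: one of your critical pages
const CRITICAL_PHRASES = [
  // placeholders: text that must be indexable, e.g. headline, nav labels
  'Products',
  'Contact us',
];

async function checkRawHtml(url, phrases) {
  const response = await fetch(url, {
    headers: { 'User-Agent': 'raw-html-check/1.0' },
  });
  const html = await response.text();

  for (const phrase of phrases) {
    console.log(`${html.includes(phrase) ? 'OK     ' : 'MISSING'} "${phrase}"`);
  }
}

checkRawHtml(PAGE_URL, CRITICAL_PHRASES).catch(console.error);
```

Anything that shows as MISSING here only exists after JavaScript runs in the browser, and that is exactly the content to double-check in Fetch and Render.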
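And a minimal sketch of the <noscript> pattern from the list above: a script-driven widget gets a plain-HTML fallback, so Googlebot and screen readers still get usable content when the script never runs. The widget, file names and store pages are invented for illustration:

```html
<!-- A script-driven widget... -->
<div id="store-locator"></div>
<script src="/js/store-locator.js" defer></script>

<!-- ...with a static fallback: same information, no JavaScript required. -->
<noscript>
  <h2>Our stores</h2>
  <ul>
    <li><a href="/stores/london">London</a></li>
    <li><a href="/stores/manchester">Manchester</a></li>
  </ul>
</noscript>
```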
Discussion:
Using JavaScript to do your processing for you is almost the ultimate distributed / cloud computing experiment. On desktop, with loads of RAM, super-quick processors and plenty of resources, it tends to work reasonably seamlessly for the user. On mobile, however, it causes failed loads, back-outs and user frustration, because phones struggle to cope with the vast amounts of data, scripts and resources being flung down the pipes at them.
There is now a very good case for stripping JavaScript back to the functions that are absolutely necessary. I have seen instances where a client's entire navigation was rendered in JavaScript and was unreadable to Google (illustrated in the sketch below); the client's CMS also didn't allow for the creation of an XML sitemap. This kind of thinking is behind the AMP project, and it almost takes us back to flat, static HTML.
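To make the navigation problem concrete, here is an illustrative before-and-after (IDs and paths invented): if the links only exist after a script builds the menu, a crawler that does not execute the script finds no links at all.

```html
<!-- Fragile: the navigation only exists after JavaScript runs.
     A crawler that does not execute this script finds no links. -->
<div id="nav"></div>
<script>
  document.getElementById('nav').innerHTML =
    '<a href="/products">Products</a> <a href="/about">About</a>';
</script>

<!-- Crawlable: the same links are plain, server-rendered HTML.
     JavaScript can still enhance the menu (dropdowns, highlighting)
     without being responsible for creating it. -->
<nav>
  <a href="/products">Products</a>
  <a href="/about">About</a>
</nav>
```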
Don’t believe developers who claim that Google can read and index all JavaScript. It can’t, and it may not even make the effort to do so.
The State of SEO Mid-2017 Released
TL;DR
- Read The State of SEO in mid-2017.
- JavaScript can still stop your site from being indexed and crawled. Use it wisely, young padawan.
- More traffic / conversions / revenue is possible with less JavaScript.
Thanks for reading. If you would like to discuss what these changes mean for your web property, or would like to know how to implement them, please feel free to contact me.