In a recent Google Search Central SEO office-hours hangout, a question was submitted to Google’s Search Advocate John Mueller asking if it’s bad for a website to be dependent on JavaScript for basic functionality.
Might this have a negative effect on Googlebot when it comes to crawling and indexing?
Mueller observed that it’s probably fine but also suggested things to do to make sure that both Google and users have no problems with the site.
Site Is Not User Friendly Without JavaScript
The person asking the question noted that a great deal of functionality of the site depended on JavaScript and was concerned about the impact on both user-friendliness and SEO-friendliness.
This is the question:
“Our website is not very user friendly if JavaScript is turned off.
Most of the images are not loaded. Our flyout menu can’t be opened.
However, with the Chrome Inspect feature, all menu links are there in the source code.
Might our dependence on JavaScript still be a problem for Googlebot?”
What the person means by the “Chrome Inspect feature” is probably the View Page Source code inspection tool built into Chrome.
So what they mean to say is that, although the links are not accessible when JavaScript is turned off in a browser, the links are still there in the HTML code.
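That kind of check can also be done programmatically. The sketch below (in Python, using hypothetical markup and menu URLs) parses a page’s static HTML source — what exists before any JavaScript runs — and reports which expected menu links are present:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in raw (pre-JavaScript) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_in_static_html(html, expected_links):
    """Return the expected links that appear in the static HTML source."""
    parser = LinkExtractor()
    parser.feed(html)
    found = set(parser.links)
    return [link for link in expected_links if link in found]

# Hypothetical flyout menu: JavaScript is needed to open it visually,
# but the links themselves exist in the HTML source.
page = """
<nav class="flyout">
  <a href="/products">Products</a>
  <a href="/about">About</a>
</nav>
"""
print(links_in_static_html(page, ["/products", "/about", "/careers"]))
# → ['/products', '/about']
```

If the links show up in this kind of static check, they exist in the HTML that crawlers receive before rendering, which is what the person asking the question observed.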
Mueller Recommends Site Testing
Mueller’s answer acknowledged that Google could probably handle the site.
But left unspoken was the fact that many sites depend on JavaScript for their functionality, and the experience the person described is pretty much the norm.
Visit almost any site with JavaScript turned off in a browser and many images won’t load, the layout may break, and some of the menus won’t work.
Below is a screenshot of SearchEngineJournal as viewed with JavaScript disabled:
While Mueller hinted at this in his answer, it deserves to be stated upfront: most sites are user-unfriendly without JavaScript enabled in a browser, so the experience of the person asking the question is not out of the ordinary but in fact quite common.
Mueller acknowledged that everything would probably be fine.
He said:
“And, from my point of view …I would test it.
So probably everything will be okay.
And probably, I would assume if you’re using JavaScript in a reasonable way, if you’re not doing anything special to block the JavaScript on your pages, then probably it will just work.”
Test To See How Site Performs
Mueller next encouraged the person to run tests to be sure the site is functioning optimally, and mentioned that “we” have tools, though he didn’t name specific ones.
Presumably he was referring to the tools in Google Search Console that can provide feedback on whether Google is able to crawl pages and images.
Mueller continued his answer:
“But you’re much better off not just believing me, but rather using a testing tool to try it out.
And the testing tools that we have available are quite well documented.
There are lots of …variations on things that we recommend with regards to improving things if you run into problems.
So I would double-check our guides on JavaScript and SEO and think about maybe, …trying things out, making sure that they actually work the way that you want and then taking that to improve your website overall.”
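Alongside Google’s documented testing tools, one quick informal spot check is to request a page with Googlebot’s user-agent string and confirm the server returns the same HTML it serves to browsers. A minimal Python sketch (the URL is a placeholder, and note that Google’s own tools also render the page, which this does not):

```python
import urllib.request

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def googlebot_request(url):
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("https://www.example.com/")
# urllib.request.urlopen(req) would fetch the page; it is omitted here
# to keep the sketch network-free.
print(req.get_header("User-agent"))
```

If the HTML returned to this request differs substantially from what a browser receives, that is worth investigating, since serving crawlers different content can cause indexing problems.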
User Friendly Site Experiences
Mueller next discussed the issue of user-friendliness because the person asking the question mentioned that the site is user-unfriendly with JavaScript turned off.
The overwhelming majority of sites on the Internet use JavaScript; W3Techs reports that 97.9% of websites use it.
HTTPArchive, which uses real-world Chrome data from opted-in users, notes in its annual report on JavaScript use that the median mobile page loads 20 JavaScript resources, rising to 33 first-party and 34 third-party scripts at the 90th percentile of websites.
HTTPArchive further points out that for the median website, 36.2% of the JavaScript delivered to a visitor’s browser goes unused; it’s just wasted bandwidth.
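That unused-bytes figure comes from measurements of the kind Chrome DevTools’ Coverage panel reports: downloaded versus executed bytes per script. A small sketch of the arithmetic, using hypothetical byte counts chosen to mirror the 36.2% statistic:

```python
def unused_js_share(scripts):
    """Given (total_bytes, used_bytes) pairs per script, return the
    share of JavaScript bytes that were downloaded but never executed."""
    total = sum(t for t, _ in scripts)
    used = sum(u for _, u in scripts)
    return (total - used) / total if total else 0.0

# Hypothetical page: three scripts totaling 500 KB, of which 319 KB
# was actually executed.
scripts = [(200_000, 150_000), (180_000, 120_000), (120_000, 49_000)]
print(f"{unused_js_share(scripts):.1%} of JavaScript bytes go unused")
# → 36.2% of JavaScript bytes go unused
```

The byte counts above are invented for illustration; on a real site the Coverage panel supplies the per-script numbers.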
As you can see, the issue is not users visiting a site with JavaScript turned off, which is what the person asking the question worried about. That concern was misplaced.
The real problem centers on users encountering a site that is forcing too much JavaScript on site visitors and thereby creating a poor user experience.
Mueller didn’t mention the nuance of how the person’s concerns were misplaced. But he did recommend useful ways to figure out if users are having a negative experience due to JavaScript issues.
Mueller continued his answer:
“And you mentioned user-friendly with regards to JavaScript, so from our point of view, the guidance that we have is essentially very technical in the sense that we need to make sure that Googlebot can see the content from a technical point of view, and that it can see the links on your pages from a technical point of view.
It doesn’t primarily care about user-friendliness.
But of course your users care about user-friendliness.
And that’s something where maybe it makes sense to do a little bit more so that your users are really for sure having a good experience on your pages.
And this is often something that isn’t just a matter of a simple testing tool.
But rather something where maybe you have to do a small user study or kind of interview some users or at least do a survey on your website to understand where do they get stuck, what kind of problems are they facing.
Is it because of these …you mentioned the fly-out menus. Or is it something maybe completely different where they’re seeing problems, that maybe the text is too small, or they can’t click the buttons properly, those kinds of things which don’t really align with technical problems but are more, kind of, user-side things that if you can improve those and if you can make your users happier, they’ll stick around and they’ll come back and they’ll invite more people to visit your website as well.”
Test For Users And Google
Mueller didn’t explicitly reference any tools for carrying out the recommended tests. It’s fairly obvious that Search Console is the best tool for diagnosing crawling issues with Google. Search Console alerts publishers to how many URLs have been discovered, for example.
As for user experience tools, one of the best is Microsoft Clarity, a free, GDPR-compliant user experience analytics tool. It provides insights into how users experience your site and can signal when they’re having a bad experience.
So it can be very useful for diagnosing the kinds of site issues that John Mueller discussed.
Citation
Watch John Mueller at the 10:23 minute mark:
Featured Image: Elle Aon/Shutterstock