Face It: You Can’t Rely on JavaScript

I’ve been cautioning folks against over-reliance on JavaScript for the better part of a decade. In that time, I harped a lot on Lala.com (which was eventually bought by Apple and shuttered) because they loaded all of their content via Ajax. If you showed up to the page with JavaScript disabled, you were greeted with a curt “you must be this high to ride” type message and, my favorite feature, a “loading” indicator:

Of course, without JavaScript, nothing was loading; the site was devoid of content and completely unusable. Even the search box was pointless as it had no submit button and relied on predictive typing to find anything.

That was four years ago. Skip ahead to the relaunch of the Gawker Media platform and you have a company (that should really know better) putting all of their eggs in the JavaScript basket yet again. True, they certainly haven’t been the only ones to launch a site design that relied 100% on JavaScript since Lala, but their epic fail yesterday proved, yet again, that you can’t rely on JavaScript (and Ajax).

So why can’t you rely on JavaScript? Let’s go through the list:

  1. Users may choose to turn JavaScript off in their browser (for performance reasons, as a low-fi way to block pop-ups and ads, or because they subscribe to the age-old misconception that JavaScript is inaccessible).
  2. Network administrators may block JavaScript at the firewall (usually because they think it’s insecure).
  3. An issue as simple as a typo can trigger a fatal error that aborts JavaScript execution entirely.
  4. In the case of Ajax, the service you rely on to deliver content to the browser may itself experience an error and return nothing, or return an error instead of the expected content.

For these reasons, you should always build your website following progressive enhancement: start with the reliable baseline of HTTP and good copywriting; add semantic HTML (and microformats); apply CSS in layers to create visual hierarchies; use Hijax and other progressively-enhanced JavaScript patterns to improve the interactivity; and cap it off with accessibility enhancements in the form of ARIA roles and states.
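
As a minimal sketch of the Hijax step (the `loadContent` loader is hypothetical, not part of any particular library): the anchor is a real link that works without JavaScript, and the script layer upgrades it to an Ajax call.

```javascript
// Hijax in miniature: enhance an existing, fully functional link so that,
// when JavaScript is available, it loads the same URL via Ajax instead of
// triggering a full page load. Without JavaScript, the link still works.
function hijax(link, loadContent) {
  link.addEventListener('click', function (event) {
    event.preventDefault();      // cancel the full-page navigation
    loadContent(link.href);      // fetch the same resource via Ajax
  });
}
```

Because the `href` is a real URL, users without JavaScript (and search engines) get the content the old-fashioned way; everyone else gets the enhanced experience.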

For musings on the Gawker redesign, progressive enhancement, and JavaScript-focused “hash-bang” URLs, read Jeremy’s excellent post and Mike’s in-depth analysis.


Comments

  1. Although I agree that one cannot depend on JavaScript for core site functionality, I have to disagree with your four points.

    1) The majority of users don’t know, or care, what JavaScript is. Most will look for plug-ins for noscript functionality based on something they saw on Facebook or some other site. There is a very small percentage of users (<5%) that turn it off manually.

    2) Most network administrators don’t block JavaScript. Sure, it happens, but JavaScript is one of the least worrisome attack vectors on a corporate network. Blocking JS also requires packet inspection, as it cannot be blocked without fully parsing every HTTP/HTTPS packet and, in some cases, rebuilding the TCP/IP conversation completely to determine the content. Most companies don’t want that overhead, and some cannot afford it.

    3) Errors should be covered by a good development cycle. That being said, some error conditions may slip through… but what about bugs in the OS, the browser, or other software packages? Nothing is 100% error free, so why pick on JavaScript?

    4) Again, proper error handling should make this a moot point.

    Further to these points, the sites you bring to light here are ‘fun’ sites. So even if a network admin kills JavaScript at the firewall, the user probably has no business being on them anyway.

    From experience as a network administrator, if you block anything from the users, they will use their mobile devices to waste their day.

    What it all comes down to is the target market for your site. If you are targeting the corporate user base, then make your site accessible as if behind a restrictive policy. If your site is for fun, then of course ramp up the JavaScript functionality and enjoy it, because most end-users will be sitting at home or in a coffee shop.

  2. I agree with the premise more than the reasoning. With the goal being 100% accessible content for all, the reasons just aren’t strong enough. I throw out the first 3 reasons, and here’s why:

    1 - Again, while the goal is accessible content, users turning off JavaScript (or user agents that don’t support JavaScript) is probably a negligible percentage for the majority of sites. I’ve seen the stats (on JS disabled) range from 1-5%, and we can speculate on both sides of the argument as to what those stats represent, so the question is, based on the project, is the client/agency willing to forgo those users in favor of a mandatory JS-enabled site? Most of the time that answer is yes.

    There are some team/client education points here, but there can be extra work involved in making sites accessible to users with JS disabled. This is not just something webdevs can build into their development process. Why? Because when you work for an agency, changing the way a site looks/functions is not your prerogative. There have to be rounds of client-approved designs so that the designers/UX and client can agree on what the site/brand will look like in those instances. The tech team then has to agree on technology and approach, since, again, at a large agency you may be dealing with specific roles/functions where one dev doesn’t do it all; a team of devs creates a site, and the work may be split along client-side and server-side lines. So yeah, if you’re creating a blog, or a personal portfolio site, or you’re a webdev who handles everything—from design to development—for your client roster, then absolutely you can/should build this into your process. But if you’re, say, a large shoe manufacturer, and you engage a large agency, you may wind up positioned in the experience-over-accessibility category. And this may all happen before a dev ever sees the site (not saying there aren’t issues with this scenario, but it’s the reality).

    2 - If the sys-admin is turning off JS, those users have bigger problems, since that’s using a sledgehammer instead of a surgical knife. These ‘corporate’ users probably shouldn’t be surfing the net anyway (since it’s obvious somebody doesn’t approve), but if they do, see my response to #1.

    3 - Yep, simple typos or bad content can take an entire site down, but that’s true of server-side technologies too. You do something wrong, the site fails to load. That’s not a JavaScript issue, that’s a dev/QA problem and shouldn’t be part of the discussion. You make a typo in a PHP request or a variable name in .Net, or a mistake in an .htaccess file, and it’s bye-bye site. Nobody complains about typos in other languages being the death of the internet, so why single out JavaScript here? These things can be caught and fixed in QA, but things get missed and slip out, as they do no matter what the language.

    In the case of #4, that’s about the only thing that is in the realm of the web dev and should be accounted for by the tech team: what happens on error? What’s the experience? How do we get the user back on track? This is at least a use case we can and should plan for.

    Again, there are no compelling reasons here why we shouldn’t plan for and rely on JS. It’s even becoming a standard on handheld devices. But I do agree that more thought needs to go into the planning stages, starting with a base of accessible content and enhancing the experience from there. It’s just not cut and dried.

    • mpaige
  3. True, but won’t you end up doing so much work making things work without JavaScript for 0.1% of the users?
    Nowadays things don’t even work right in IE6…

    OK, but in a dream world things would work fine even in Lynx.

  4. Glad this discussion is happening! Thanks for the nice clean summary + thoughts + good links. I was glad to hit your post ‘first’ as I read up on this topic.

    I’d like to point out an assumption being made by you and Jeremy and a little by Mike: we are making webpages. Not a bad assumption, but an assumption nonetheless that isn’t necessarily universal…

    In particular, with the rise of folks who want to do ‘mobile apps’ but don’t have Java skills (or Objective-C or whatever) AND were told that HTML5 can all but make a mobile app.

    Personal opinion: this catalyzed an up-and-coming approach of making not web _pages_ but web _apps_. The fundamental difference, to me, is state—pages try to be stateless, apps tend to use state the way app-makers can in Java/C.

    If one is making a web-app-still-using-URLs, then forcing JavaScript is reasonable, I would think. But only from a web-APP approach, not a web-PAGE approach.

    Nonetheless, I really like Mike’s statement: “You are already using JavaScript, so you can do this damage much later with JavaScript using a click handler on the link.” (That is, use a ‘normal’ link and then put a click handler on it to make it an Ajax call instead.) Seems like a great paradigm, and a great solution to the ridiculous use of Ajax for what is stateless (a web-PAGE) anyway (the fuel of this topic).

  5. I couldn’t agree more with this article. I’m glad to see such a large movement finally speaking out against this nonsense.

    <blockquote cite="#comment-530">There is a very small percentage of users (<5%) that turn it off manually.</blockquote>

    Where did this number come from? I’ve seen it mentioned all over, but I’m curious about the source. That number is surprising to me because almost everyone I know browses with JavaScript disabled by default. I expected it to be at least higher than 5%.

  6. To all the people saying that JavaScript errors are a QA/dev problem: that’s true if it’s a script on your site… but what about all the 3rd party analytics, tracking and ad scripts that infest every site?

    Errors in these scripts will cause the same problem, and there’s often nothing you can do about it.

    • Derek
  7. re: “Where did this number come from? I’ve seen it mentioned all over, but I’m curious about the source.”

    There are various sources available via a Google search. The norm seems to range from 0.02% to 3.2% according to most of the ones I’ve found so far.

    My own site shows 0.059%, and some of that amount is bots, so the measured number is already higher than the actual one.

    I would also like to point out that using external JavaScript (e.g., analytics scripts) is no worse than using external fonts like this site does. There is a long pause in the load time for me while it fetches fonts from TypeKit. Although the fallback has default fonts, a 15-second load time for a simple blog post is still a usability problem.

  8. Obviously the quality of Gawker’s implementation was questionable.

    But does that mean we should all stick with a web of documents for eternity?

    It’s great that they’re pushing on the boundaries of what’s possible with web standards, and that shouldn’t provoke a backlash against web apps in general.

    GMail’s been going strong for years, and I don’t recall anyone complaining that it didn’t use progressive enhancement or that it had cryptic internal URL states.

  9. It’s like saying I disabled images and now I can’t see any pictures.

    • drew
  10. I’m really happy to see so much discussion around this topic as I think it’s an incredibly important conversation to have.

    <blockquote cite="#comment-530">There is a very small percentage of users (<5%) that turn it off manually.</blockquote>

    I hear this statistic a lot, probably because it’s the most recent (2008) figure on JavaScript use published by W3Schools. I see a few problems with quoting stats like that, though:

    1. every site is different and every site has a different audience; only your JavaScript stats matter (as Kev C points out);
    2. many statistics-tracking packages rely on JavaScript to collect data and, therefore, don’t always produce the clearest picture of JavaScript penetration.

    It’s important to remember that, when you are looking at your own stats, your choice to rely on JavaScript from the get-go can skew your stats by driving people away who don’t have it turned on.

    It’s also important to keep in mind that many mobile browsers (pre-Webkit) run without JavaScript support or have it turned off by default. Blackberries, for instance, shipped with JavaScript off. The Blazer browser (seen on older Palm devices) offered an optimized browsing mode that turned off images and JavaScript as well. We may like designing and developing for mobile Webkit implementations (iOS, Android, webOS), but that’s not all that’s out there.

    <blockquote cite="#comment-531">The question is, based on the project, is the client/agency willing to forgo those users in favor of a mandatory JS-enabled site? Most of the time that answer is yes.</blockquote>

    In my opinion, that only reflects their own ignorance. It’s also part of the reason I left the agency world four years ago. When it comes to websites that we, as an agency, do for our clients, that isn’t a choice we offer our clients/partners. We build stuff to be accessible, to be progressively enhanced. Most of the time, the client is unaware, but it’s something that’s part and parcel of our development process. Most of our clients recognize that we are the experts and trust us to build them amazing websites that serve _all_ of their users and it’s why our clients have received gushing praise from their blind users and other people with disabilities. There are few things that make a client’s day like getting complimented for something they didn’t even realize they’d done.

    <blockquote cite="#comment-531">There are some team/client education points here, but there can be extra work involved in making sites accessible to users with JS disabled. This is not just something webdevs can build into their development process. Why? Because when you work for an agency, changing the way a site looks/functions is not your prerogative.</blockquote>

    I respectfully disagree. In our experience, practicing progressive enhancement is actually cheaper, because testing does not take nearly as long. When working with large (or small) development teams, you all need to be on the same page with regard to how to handle interactions without JavaScript, but that’s completely doable with a little bit of education and the occasional sit-down (led by a knowledgeable dev) to walk through the JS and no-JS interactions.

    And in terms of client approval: if the client is interested, you can involve them in a conversation about how that interaction works, but if they aren’t, they should trust you to execute it well, based on the branding they’ve established for the remainder of the website.

    <blockquote cite="#comment-530">Errors should be covered by a good development cycle.</blockquote>

    I 100% agree; they _should_. That said, based on my experience working at places where I was not in charge of the development cycle (read: not here at Easy), development teams often schedule too little time for testing of application code before releasing it to the public. And, as Derek pointed out:

    <blockquote cite="#comment-535">…that’s true if it’s a script on your site… but what about all the 3rd party analytics, tracking and ad scripts that infest every site?</blockquote>

    Amen.
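
    One mitigation worth sketching (the URL below is made up for illustration): inject third-party scripts asynchronously with an error handler, so a tracker that fails to load cannot take the rest of the page down with it.

```javascript
// Defensive third-party loading: the script is injected with async set, so a
// slow or unreachable host can't block rendering, and load failures are
// swallowed rather than surfacing as page-breaking errors.
function loadThirdPartyScript(src) {
  var script = document.createElement('script');
  script.src = src;
  script.async = true;             // don't block HTML parsing
  script.onerror = function () {
    // the tracker failed to load; the page keeps working without it
  };
  document.head.appendChild(script);
  return script;
}
```

    It doesn’t protect you from runtime errors inside a script that loads successfully, but it at least keeps the loading step itself from being a single point of failure.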

    On the topic of my fourth point, regarding Ajax services being unavailable or having errors: I absolutely agree with Kev C that proper error handling should be able to shield users from experiencing errors. But what if, as with Gawker and Lala, your entire site relies on that Ajax call to show _any_ content? A failure to receive or properly load that data is the programmatic equivalent of going to school naked.
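
    The least you can do in that situation is fail over to a regular page load. A minimal sketch (the `render` callback is hypothetical, and the `fetch` API stands in for whatever Ajax transport you use):

```javascript
// Graceful recovery: try to load the content via Ajax; if the request fails
// for any reason, fall back to a normal full-page navigation to the same URL,
// letting the server render the page instead.
function loadWithFallback(url, render) {
  fetch(url)
    .then(function (response) {
      if (!response.ok) { throw new Error('HTTP ' + response.status); }
      return response.text();
    })
    .then(render)
    .catch(function () {
      window.location.href = url;   // the URL is real, so this always works
    });
}
```

    This only makes sense when the Ajax URL is also a real, server-rendered page, which is exactly what the hijax approach gives you.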

    <blockquote cite="#comment-533">I’d like to point out an assumption being made by you and Jeremy and a little by Mike: we are making webpages. Not a bad assumption, but an assumption nonetheless that isn’t necessarily universal…</blockquote>

    That is, in fact, my assumption. And, specifically, I am addressing my comments to *public web sites*. As I’ve said many times before, if you are running a web app (e.g. Basecamp), I think you should be able to shape your audience’s expectations with regard to which browsers and technologies you’re supporting. The same goes for intranets and private websites. On the _public web_ however, I think you have a responsibility (if not to your users, at least to your content) to ensure your site works for people. Period.

    For what it’s worth, we recently built a web app (for the Chrome Store) that did rely 100% on JavaScript, but we only did so because it was specifically targeted at the Chrome Store and the project requirements specifically outlined the HTML5 APIs they wanted used. But, again, that was a web app (and one specifically designed to be an in-browser app at that), not a web site.

    <blockquote cite="#comment-537">GMail’s been going strong for years, and I don’t recall anyone complaining that it didn’t use progressive enhancement or that it had cryptic internal URL states.</blockquote>

    Again, I’d consider GMail to be a web-based app, rather than a public website, so my above statement still applies. GMail’s user base, however, is far larger than an app like Basecamp’s, so there’s an imperative to provide a non-JS version, which they do. In that case, following the hijax method of layering JavaScript atop a working HTML page allows you to build and maintain a single codebase rather than two.

    • Aaron Gustafson
  11. Not to say you’re completely wrong on this, but I’m not entirely convinced there’s a significant difference between JS typos and link-text typos—both completely fail to load the desired content. One might argue that the probability of one is higher than the other, but testing should be sufficient to catch/fix both. Or, for that matter, between unavailable Ajax resources and “normal” HTTP server problems; again, both will fail to return the requested content. With Ajax failures, you can at least make an attempt to recover gracefully.

    Though I’m becoming less and less concerned about those who *choose* to turn off JS (to me they sound more and more like people who place their foot in front of the door and then complain it won’t open when they pull), I remain concerned about those without a choice. I wish there existed a reliable way to measure that group. It would help inform client meetings where, increasingly, I’m finding myself on the opposite side of this discussion with clients who press me to “stop wasting time making it work w/o JS” because they’re obsessed with the bright/shiny.

    • Arlen