JavaScript and HTML: Forgiveness by Default

Frankly, I feel that developing against Firefox is the source of many of these script errors. I’m just as sick of the CSS hell it fosters, and this practice is breaking the web.

Whatever you think about IE… too bad, it’s the de facto standard. Get over it, or get fired. Is there a place to report Firefox ninnies for employer action? There surely ought to be.

You’ll never have a problem with XHTML if you use DOM-to-XML writers, but if you don’t, you eventually will, no matter how careful you think you are. I don’t see any point to using it over HTML in any other case anyway, except as a more-dangerous alternative. You’re not parsing the files as an attempted semantic web, you’re not sparking a well-formed revolution, you’re just getting them to display in browsers, and as long as you use good HTML practices they’ll work cross-browser and cross-platform better than forcing XHTML will. (And since a large amount of HTML is generated by people who will never be programmers, applying programmers’ rules will doom it, as it doomed XHTML.)
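The “dom-to-xml writer” point can be sketched concretely: if markup is only ever produced by a serializer, escaping and tag-closing can never be forgotten, whereas hand-concatenated strings eventually slip. A minimal sketch (the `el` helper and object shape are hypothetical, not any real library’s API):

```javascript
// Minimal DOM-to-XML writer sketch: elements are plain objects, and
// serialization is the single place markup text is produced, so the
// output is well-formed by construction.
function el(tag, attrs = {}, children = []) {
  return { tag, attrs, children };
}

function escapeXml(s) {
  return String(s)
    .replace(/&/g, "&amp;")   // must run first, before other entities
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function serialize(node) {
  if (typeof node === "string") return escapeXml(node);
  const attrs = Object.entries(node.attrs)
    .map(([k, v]) => ` ${k}="${escapeXml(v)}"`)
    .join("");
  if (node.children.length === 0) return `<${node.tag}${attrs}/>`;
  const inner = node.children.map(serialize).join("");
  return `<${node.tag}${attrs}>${inner}</${node.tag}>`;
}

const page = el("p", { class: "note" }, ["Tom & Jerry", el("br")]);
console.log(serialize(page));
// <p class="note">Tom &amp; Jerry<br/></p>
```

A writer like this can’t emit an unescaped ampersand or a dangling open tag, which is exactly why generated XHTML stays valid while hand-typed XHTML rots.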

The thing I miss from XHTML is auto-closed tags, mainly just because `<textarea />` looks a lot cleaner.

JavaScript needs to break more because it’s more than just a glorified markup language, and it does, even if it looks so permissive. But it doesn’t have memory-lifetime issues or a need for top performance; similarly forgiving languages like ASP and PHP have certainly become so popular because they bend over backwards to do the right thing, and finding the right balance is important for the success of a language.

Ironically, Web Developer toolbar on FF is throwing JS errors on the main page of this site…

The obvious problem here is that we must make sure every interpreter reacts the same when it encounters incorrect code. This is close to impossible. But it will result in users screaming: “$MY_BROWSER shows it right, yours must be broken!” when actually it’s the code that’s broken. In the real world, $MY_BROWSER will be IE. We all have to live with that.

“The permissive, flexible tolerance designed into HTML and JavaScript is alien to programmers who grew up being regularly flagellated by their compiler for the tiniest of mistakes. Some of us were punished so much so that we actually started to like it.”

Riiiight. And when programming in C or C++, do you:
a) Turn all warnings off and ask the compiler to be as lenient as possible, or
b) Turn all possible warnings on and make the compiler as strict as possible?

Having the compiler warn you about all your possible mistakes allows you to fix them. A warning on that infinite loop would have allowed the developer to fix it before a client ever had it lock their browser. Having a compiler/parser barf on a bad colour would allow the developer to fix the problem before a user ever saw “sea green” as blue (or even green), ruining your company’s precious branding.
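The “sea green” point can be illustrated with two toy colour parsers: a strict one that fails loudly on an unknown name, and a lenient one that silently coerces it to something arbitrary. A hypothetical sketch (the keyword set is a tiny illustrative subset, and the fallback is made up, not any browser’s actual algorithm):

```javascript
// Tiny illustrative subset of known colour keywords.
const KNOWN_COLOURS = new Set(["seagreen", "blue", "green", "red"]);

// Strict: fail loudly so the developer sees the mistake before a user does.
function parseColourStrict(name) {
  const key = name.toLowerCase();
  if (!KNOWN_COLOURS.has(key)) {
    throw new Error(`Unknown colour: "${name}"`);
  }
  return key;
}

// Lenient: guess something and carry on -- the user sees the wrong
// colour and the developer never hears about it.
function parseColourLenient(name) {
  const key = name.toLowerCase().replace(/\s+/g, "");
  return KNOWN_COLOURS.has(key) ? key : "blue"; // arbitrary fallback
}

parseColourLenient("sea green"); // "seagreen" -- happens to work
parseColourLenient("seegreen");  // "blue" -- silent branding disaster
```

The lenient parser “works” right up until a typo slips through, at which point nobody is told; the strict one turns the typo into a bug report during development.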

The problem with code containing errors is that it fails differently on different compilers/interpreters. If you never know about the error, you’re never going to realise that the fact that it fails in a way that looks OK on your system is a fluke, and that it might do anything on anyone else’s.

Warnings and errors are good for developers. The lack of that kind of environment for JS is harmful to the people writing JS code. And the thing that’s harmful (well, annoying at least) to users of JS is bad JS written by developers who don’t have a JS environment that tells them when their code is bad.

The problem is that the specification only defines what to do when the HTML is valid.

If you want browsers to gracefully degrade and handle crap malformed HTML then they’ll do it inconsistently and you’ve opened another cross-browser compatibility issue.

One alternative would be to define how to handle rubbish in the spec. This means a bigger spec which adds complexity to the parsers and renderers…

Too many people publish HTML littered with errors they either don’t realise or don’t care about because it “works”.

Later on they find out it doesn’t.

You wouldn’t want your C# compiler to let you ship binaries with a best-guess for each error (that also varies per runtime implementation) so why on earth would you advocate it for HTML?


javascript: in the URL bar also gets you a console, and has since Netscape 2.

People need to give JavaScript a break. It has essentially been the glue of the internet for 10+ years and has grown way beyond its initial design (the sign of a good design, or at least a useful language). Many JavaScript errors come from ads, ad blockers, or just bad, non-cross-browser script. It’s the most visible glue of the web, and thus you see more problems with it. But if JavaScript is a problem for you, I’m not sure you should be developing for the web at all.

It’d be nice if badly written pages were penalized by browsers in a way that would get users on the developers’ backs without breaking sites. E.g., render quirks-mode pages s l o w l y (unfortunately users would blame the browsers that did this). A better plan might be a standard for logging the error back to the website - e.g., have browsers do a HEAD request for ‘/quirks?etc…(spam filter stops what I typed)’ if they drop into quirks mode. It’s no worse than favicon.ico requests, but it might keep the designers honest.
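The reporting idea above can be sketched: `document.compatMode` really is `"BackCompat"` in quirks mode and `"CSS1Compat"` in standards mode, so a hook could fire a fire-and-forget HEAD request when a page falls into quirks mode. The `/quirks` path is the commenter’s suggestion, not a standard, and the stubbed document and transport below are hypothetical:

```javascript
// Hypothetical quirks-mode reporter: if the document fell into quirks
// mode, hit the site's (made-up) /quirks endpoint so the error shows
// up in the site's own logs. `sendHead` stands in for whatever
// transport would actually issue the HEAD request.
function reportQuirksMode(doc, sendHead) {
  // Real browsers expose document.compatMode:
  //   "BackCompat"  -> quirks mode
  //   "CSS1Compat"  -> standards mode
  if (doc.compatMode === "BackCompat") {
    sendHead("/quirks?page=" + encodeURIComponent(doc.pagePath));
    return true;
  }
  return false;
}

// Usage sketch with a stubbed document (no real browser here);
// `pagePath` is an illustrative stand-in for the page's location.
const fakeDoc = { compatMode: "BackCompat", pagePath: "/index.html" };
reportQuirksMode(fakeDoc, (url) => console.log("HEAD", url));
```

As the comment notes, a HEAD request is about as cheap as the favicon.ico fetch every browser already makes, so the cost falls on the offending site rather than the user.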

It might be nice if the browser displays something even when there is an error. The problem is, tons of those pages with JavaScript errors DON’T ACTUALLY FUNCTION. Since the person responsible for creating them never noticed the error, the user of the page is just S.O.L. I don’t know what the correct solution is/was, but I hate going to a page with a JavaScript error, trying to submit a form or click a button with JavaScript attached, and getting nothing because of the error. I don’t see how that is a win.

Surely in part the staggering success of the internet has been because anyone can create some html and a bit of javascript and get it to work.

Web pages are small, javascript programs are small, there are very few interdependencies, the need for structure and discipline to make a large application hold together without collapsing into chaos just isn’t there.

How can we forget THE golden rule, “Keep It Simple, Stupid”? Apply it to the whole development process, not just the end result.

I remember when every computer came with a tolerant, usable programming language as standard, and it was expected a lot of people would have a go, even if it was just scrolling their name round the screen. How many of those for-fun programmers started careers this way? Truly, I lament the fact that Windows has never had a usable programming language as standard, especially one as tolerant as a typical web page.

Vive la différence, that’s what I say.

“[Where VBScript is concerned,] the end user of a web page script is someone who has absolutely no ability to understand the error. If they do understand the error, they have no ability to go to the web server and fix it. Therefore, never produce error messages unless you absolutely have to.”
– Eric Lippert

The vast majority of end-users judge their web browsers like this. Pretty much nothing is going to change this. The behaviour kind of goes like this: “I want to see Website X. When I use firefox, I can’t see it. When I use IE, it works. IE must be better than Firefox, because it allows me to accomplish my goals.” The same behaviour is in action when people turn off all the options that protect their web browser from attack. Only two or three responders seem to have absorbed this basic lesson of human behaviour.

At a guess, as we increase the amount of automatically generated DECENT, VALID client-side script/markup, so will we see an increase in APPLIED web standards. Struts was a good move, and .NET moves in the right direction too. A deployed end-user environment like Flash will work too: when users install the plugin, things start working, thus fulfilling their goals. The fact that none of these are quite as good as we’d like doesn’t mean that they won’t be in the future, or that someone who does fulfill these things won’t move in.

I think SAS might be more forgiving than HTML/JavaScript. The SAS manuals actually give a long list of spelling errors that will pass as the original word.

For example, for “if” you could spell it “i f” or “iif” or “iff” or “fi” and it will still go through as “if”. It’s the most forgiving language I have ever seen.
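A toy version of that kind of keyword forgiveness might look like the following sketch. The alias table is purely illustrative, based on the misspellings listed above; it is not SAS’s actual list or algorithm:

```javascript
// Toy forgiving-keyword lookup, in the spirit of the SAS behaviour
// described above: map common misspellings back to the intended keyword.
const KEYWORD_ALIASES = {
  "if": "if",
  "iif": "if",
  "iff": "if",
  "fi": "if",
};

function normalizeKeyword(token) {
  // Strip internal whitespace ("i f" -> "if") and normalize case.
  const key = token.toLowerCase().replace(/\s+/g, "");
  return KEYWORD_ALIASES[key] ?? null; // null: genuinely unknown token
}

normalizeKeyword("i f"); // "if"
normalizeKeyword("fi");  // "if"
```

Of course, the comment thread above is exactly about the downside: “fi” might have been a typo for something else entirely, and this parser will never tell you.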

I think developers get off lightly in respect to JavaScript errors. I’ve worked with a lot of developers who don’t take JavaScript coding seriously and who often code it with the attitude of “that’s good enough”. Many developers do not write client-side code with the same care as they would server-side code, even though the consequences of mistakes can often be equally grim. If you wouldn’t gamble with unhandled exceptions in server-side code, why would you in client-side code?

I believe JavaScript skills are more of a requirement for web developers than an option. Server-side programming alone cannot provide the usability that web users demand these days, and I think many developers miss that. I write a good deal of client-side code and I take it just as seriously as the server code.

I completely agree with your basic premise of ‘forgiveness by default’ being better for the web. I prefer the term ‘degrade gracefully’, and classify it as one of three styles of error handling - the other two being fail fast (what you call the ‘Draconians’) and ignoring errors.

I’ve written about the relationship between error handling and reliability (see ), but I have not really thought about it in the context of usability and user adoption, so I really liked how you presented that viewpoint.

Question: how did the developer upload a broken site if the clients were displaying errors? Shouldn’t he have developed against the browsers his audience is using, and seen the error message?

This led me to thinking: the developer should be using the strict browser as his primary testing platform, so that he’d catch those errors.

Which THEN led to thinking: why doesn’t someone make a developer-oriented browser, which strictly catches things like JavaScript and HTML/CSS errors, so that we can produce working code and not care about what kind of client our users are running?
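The closest thing JavaScript itself offers to that “strict developer browser” is strict mode, which turns some silently tolerated mistakes into thrown errors. A sketch of the classic case, a misspelled variable that sloppy mode would quietly turn into a global:

```javascript
// In sloppy mode, assigning to an undeclared name silently creates a
// global variable; under strict mode the same typo throws a
// ReferenceError -- exactly the "catch it before the client does"
// behaviour the comment wants from a developer browser.
function misspelledAssignment() {
  "use strict"; // strict mode scoped to this function
  let counter = 0;
  try {
    countre = counter + 1; // typo: silent global in sloppy mode
  } catch (err) {
    return err.name; // "ReferenceError"
  }
  return "no error";
}

console.log(misspelledAssignment()); // "ReferenceError"
```

It’s not a whole browser, but opting into strictness per script is the same trade the thread is arguing about: more breakage at development time, less mystery breakage for users.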

Anyone less lazy than me want to take up that project?

I just noticed, while visiting a badly designed site, that the Script Debugger will also fail on other malformations (e.g., “Img” as a tag in a claimed-to-be-XHTML page). Sometimes the bugs are so bad that I can’t even get my cursor off the page, or the failures are in a loop that won’t quit and I have to use the Task Manager to kill the browser.

This is the whole problem of whether or not HTML was intended to be written by humans, and whether sites that humans hand-coded in Notepad had to be accepted.

I do know that I am wasting my time reporting buggy sites and meanwhile I have a sucky experience. The answer for me is to turn off the script debugger except when visiting pages of my own.

The irony, of course, as already noted, is that the standard IE-sucks litany will be echoed by those producing sites that are unbearably defective, while being righteous about how we should start strict-checking the zillions of pages that are out there.

It is also interesting that many RSS feeds and blog sites don’t validate the markup embedded from comments and entries, claiming it is XHTML when it isn’t. So even RSS readers have to be forgiving, because those DTDs are often bogus and the entries are a mess. These items are all produced by software; it is not something under the blog author’s control, or even above their level of awareness.

OK, going to turn off the debugger now.

If browsers suddenly started punishing web devs the way they ought to, chaos on the web would reign. It would be a rough time for everyone. The upside to this doomsday scenario is that we would all be forced to fix our broken code and learn the RIGHT way. Like I said though, chaos for months, especially that first week - ouch.

Only chaos for the developers though, the users would just see a lack of updates for a couple weeks. :wink:

I discovered this as well: coders need to realize that you have to check for certain variables and features to ensure the user is seeing what you think they’re seeing.
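That “check for certain variables” habit is classic feature detection: probe for a capability before using it, instead of assuming every visitor’s browser matches the development machine. A sketch with a stubbed `document` (the stub objects are illustrative, since there is no real browser here):

```javascript
// Feature-detection sketch: only call a capability after confirming
// it exists, and degrade quietly when it doesn't.
function getById(doc, id) {
  if (doc && typeof doc.getElementById === "function") {
    return doc.getElementById(id);
  }
  return null; // older or stubbed environment: degrade, don't throw
}

// Stubbed "documents" standing in for a modern and an ancient browser:
const modernDoc = { getElementById: (id) => ({ id }) };
const ancientDoc = {};

getById(modernDoc, "menu");  // { id: "menu" }
getById(ancientDoc, "menu"); // null, instead of a script error
```

The same pattern works for any API the page depends on; the point is that the check happens in the developer’s code, not in the user’s error console.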

Those who are unwilling to forgive others cannot truly forgive themselves…