The Cost of Best Practices on the Web

Posted in Front-End

Those of us who were building websites a decade ago remember the Internet Explorer tax.

Project managers budgeted an additional 25% to 50% of engineering effort to make a site compatible with IE6 and 7. Because the majority of computers ran Windows XP, IE users had to be supported. Unfortunately, it was precisely those two browsers that were saddled with a legacy rendering engine that did not conform to web standards[1]. To make matters worse, they shipped with only a limited set of developer tools.

The best practices for web UI development at the time were also largely necessary practices. The list of things to remember and use read like Magic: The Gathering errata: often, it came down to Googling for a blog post that prescribed a series of arcane characters to coax a particular browser into behaving. Few people truly understood why the code worked, but most went along anyway, because life was too short to reverse-engineer divergent implementations of the box model.
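For flavor, two of the better-known hacks of that era looked something like this (a sketch; the selector and pixel values are illustrative). The `* html` selector matched only in IE6, and the star and underscore property prefixes were parsed by IE7-and-below and IE6 respectively, while standards-compliant browsers discarded them as invalid:

```css
/* Targets IE6 only: standards browsers never match "* html",
   since <html> has no ancestor element. */
* html #sidebar {
    width: 200px; /* compensate for IE6's non-standard box model */
}

/* Property-prefix hacks: IE6 honored the underscore prefix,
   IE7 and below honored the star prefix; other browsers
   dropped the invalid declarations entirely. */
#sidebar {
    margin-left: 10px;
    *margin-left: 8px;  /* IE7 and below */
    _margin-left: 6px;  /* IE6 only */
}
```

Whole stylesheets accreted layers of these exceptions, which is exactly why the knowledge felt like errata rather than engineering.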

As time went on and new browsers evolved towards the standard, these “best practices” became less necessary. The criteria for supporting legacy browsers boiled down to market share; if not enough people visited the site on old crappy IE, it was no longer worth the effort to maintain compatibility. It was as much a business decision as it was an exercise in sanity preservation.

So when I see people argue for aggressive graceful degradation of sites and accessibility for niche computing devices, I think of the tradeoffs and opportunity costs that we wrestled with in the not-too-distant past.

Footnotes

  1. To be fair, IE6 was built before the current process of web standardization existed. Microsoft can’t be fully blamed for the deplorable state of IE6, but then again, they didn’t make things that much better with IE7 and 8 either.