Is HTML Too Big To Fail?

The Forks At Helm's Deep

Evil geniuses, whether in history or fiction, tend to distinguish themselves through some unique trait that marks their personal expertise beyond the generic, run-of-the-mill apparatus of evil. Dracula has dread, Torquemada has torture, Carly Rae Jepsen has “Call Me Maybe”. If ever an evil genius were to rise from the ranks of the standards community, I have not a shadow of a doubt that her evil speciality would be tedium.

Standards are tedious to read and tedious to write. Discussing them involves many a tedious detail, testing them involves a lot of tedious double-checking, and as a group we tend to be drearily tedious about the insipid rules which govern how we go about our everyday tedium.

But while the above can be tolerated through an acquired taste for the lackadaisical, nothing can ever quite equal just how tiresome our community's bickering can be. I would in fact posit that one reason Web standards have yet to produce their own distinguished evil genius is because we produce our best evil as a group. Rumour has it that a few ambitious souls have attempted to produce tedium duller than the drudgery of our drama, flame wars, and infighting; and that today they are in intensive recovery at a never-ending accounting convention, watching the paint dry on tic-tac-toe games against themselves.

Such bickering reaches a climax of dullness when discussing WHATWG versus W3C. Personally, I see very good aspects to both. But more importantly, I see both, at least in some of their most important activities, as lagging well behind the best that we can do today as a community. If today's Web developers, who are closest to our users' issues, were put in charge of Web standards, this is not how they would handle them. And the boring bickering does nothing but disenfranchise them more than they already are.

One aspect that standards people treat as a point of contention is the consensus model. I really do like the WHATWG's approach to consensus. The editor of a document gets full responsibility for it. And if she screws up, she gets fired. This is a simple process with the right responsibilities and escalation[1].

This is a great model. But as with all good, simple mechanisms, one has to be careful to avoid introducing distorting side-effects.

For most definitions and implications of "firing", the cost of firing a specification editor is high (I don't mean monetarily, but socially for the cohesion of the community, and humanly for the people involved). This implies that any issue perceived to be less (immediately) problematic than the cost involved in firing the editor will not be adequately addressed through this process. That is a limiting factor and a first source of distortion. In order to maximise the value of this approach we need to operate with a definition of "firing" that is as cheap as possible for all involved.

The cost of replacing an editor is also roughly proportional to the complexity (as approximated by size) of the document she is handling; what's more, even major issues may seem small when set against a particularly large document.

A concrete case in point, the mother of all specifications: HTML. 940 pages, 120k lines of source, 1.3MB over the wire. If Hixie were to do a generally outstanding job (which he does) but fail painfully on a given section and refuse to be convinced (we all have our blind spots), would you fire him from the whole thing? No, you wouldn't, and nor should you. That points to a lack of granularity in the WHATWG process's built-in safety mechanism; the inability to fence off toxic content from the rest creates a structural Too Big To Fail distortion.

This, however, can be solved. For this we need two things.

First, splitting. By this I don't mean architectural modularity and other such astronautics that have often been called for. Rather, just editorial modularity. Hooks. A classic example is the DOM's clone operation which has a clear hook for other applicable specifications to use. It would be painful for the DOM to special-case the template element's cloning; it is much nicer to have it use this hook.
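To give a feel for what such an editorial hook amounts to, here is a loose programming analogy — a minimal sketch in which a core clone operation exposes an extension point and the "template" part registers its extra behaviour there, rather than being special-cased in the core. All names and data structures here are invented for illustration; none of this is taken from any actual specification text.

```python
# A loose analogy for specification hooks: the core module defines a
# generic cloning operation with an extension point, and the "template"
# module plugs its own behaviour into that point. The core never needs
# to mention templates by name. All names here are illustrative.

from typing import Callable

# Registered "cloning steps", keyed by element name: this is the hook.
cloning_steps: dict[str, Callable[[dict, dict], None]] = {}

def clone(node: dict) -> dict:
    """Core clone operation: copies the node, then runs any applicable
    cloning steps registered by other 'specifications'."""
    copy = {"name": node["name"], "attrs": dict(node["attrs"])}
    step = cloning_steps.get(node["name"])
    if step:
        step(node, copy)
    return copy

# Elsewhere, the 'template' part hooks in its own cloning behaviour:
def template_cloning_steps(original: dict, copy: dict) -> None:
    # Templates carry hidden content that must be cloned along.
    copy["content"] = [clone(child) for child in original.get("content", [])]

cloning_steps["template"] = template_cloning_steps
```

With the hook in place, cloning a template node also clones its content, and the core `clone` function stays entirely ignorant of templates — which is exactly the kind of seam that lets two documents be edited (or forked) independently.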

I will not claim that such splitting is easy, nor that hooks are a silver bullet — but it is possible to improve the situation.

The other thing we need is forking. Today we Web hackers have better options than firing people from open projects: we fork. As I explained in Go Fork and Multiply:

Fragmentation is bad. That much is well established, it hurts the full constituency of the Web at every level. We tend to consider that forking is bad because we work on the assumption that forking leads to fragmentation. I don't, however, believe that this assumption is verified.

If the relationship between standards and deployment were a static, mechanical one forking would indeed automatically fragment. But this is an idealised view of standards that does not match reality. Put differently, preventing forks is a solution to the fragmentation problem that only works for perfectly elastic spherical standards moving in a vacuum.

Making forking the default manner in which standing disagreements are captured addresses many shortcomings with the way in which standards are done today. It places a cost on the dissenter that is proportional to the breadth of their disagreement. This avoids attrition of the discussion in which whoever keeps complaining loudest and longest carries the day simply because everyone else is sick and tired.

It is generally believed that specification forks will confuse implementers and lead to fragmentation, but I think that implementers are, nowadays, smarter than that. Observing a strong, actively maintained rift in the specified continuum will send a strong signal that a specific area is not ready to ship (I'm pretty sure for instance that under this model AppCache would not have become part of the platform) and could benefit from experimentation.

These costs are in the right places to provide strong incentives for the community to help find an agreeable solution for all (or to eventually walk away from stubborn, unhelpful minorities) in order to minimise forking over time.

In addition to correcting the structural distortions that the Web standards community currently labours under, promoting forkability also leads to a more sustainable communal model. The scarce resource in standards is quality editors. There is a lot of work to be done on the Web platform. An awful lot. And it can only be done, at any acceptable level of quality, if we have the editorial humanpower to do it.

Quality editors don't just materialise by magic. Every large successful community is set up to produce new contributors. We all wax emotional about how the view source principle and the many great resources we have make it easy for new Web developers to join the fold (that's certainly how I started, and I love it dearly), but where can one view the source of the Hello World Web standard?

Bootstrapping new editors is a lot easier with smaller, manageable specifications. It enables social learning constructs such as #GoodFirstBug and its many variants. A specification should be an inviting repository you can fork and experiment with, to improve anything from your own knowledge to the whole wide world; it should not be an Ivory Tower that scares people away.

Again, the simplicity of the principles that guide this thinking should not be mistaken for simplicity of implementation. Such an approach would require a usable and organised ecosystem: a place to list the specifications that are part of the canon (maybe a wide set of GitHub repositories under a single organisation?), some ground rules (CC license, patent protection, open tooling…). But I've found the Test The Web Forward / Web Platform Tests joint projects to be successful and extremely pleasant to work with (despite limited resources). Maybe we can meet with similar success in a Spec The Web Forward / Web Platform Specs project? Would a political structure inside of which companies from the Web industry and Web developers could talk be helpful? I do not know yet, but the "Webizen" project is an intriguing idea that I would like to see hold some promise.

Enabling forkability provides the right kind of familiar, organic dynamics to make Web standards properly open — and why indeed would Web specifications be any less open than the rest of what makes the Web tick?

We're Web hackers. That is how we work. As Monty Python might have put it, "Fork the forking forkers".

1. Yes, I am aware that there are people, including the source itself, claiming that this is not a consensus approach. But in fact it is; it just uses a different implementation from the W3C's.