User adoption and its effect on technological evolution

The IPv6 specification has been around for about 9 years now. The problems with IPv4, most notably address exhaustion, are well understood. Yet the world is struggling to move to IPv6.

Ruby has been around for about 14 years now. It had enough time to evolve before it was widely adopted.

Java has been around for about 12 years. Adoption was quite fast. Now, although newer versions of Java are being released, it is slowly losing its charm. There are some known problems, the type-erasure compromise behind generics, for instance, that ideally would not have been there in the first place, but are now too late or too difficult to correct.

Browsers were meant for browsing documents containing hypertext, and HTTP is a request/response application-level protocol. People are now using both for things they were not designed to do: browsers as application platforms (mashups pulling from multiple sites) and HTTP for pushing data from the server, aka Comet (sketched below).
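
To make the Comet point concrete, here is a minimal, modern sketch of long polling, one common Comet technique: the browser keeps a request pending and the server responds only when it has something to "push", after which the client immediately reconnects. The /events endpoint and the message handling are hypothetical, and the TypeScript/fetch style is just one way to illustrate the idea, not anything the HTTP specification prescribes.

    // Long polling: hold a request open until the server has data, then reconnect.
    async function pollForEvents(url: string, onMessage: (data: unknown) => void): Promise<void> {
      while (true) {
        try {
          const response = await fetch(url);
          if (response.ok) {
            // The server finally answered because it had something to "push".
            onMessage(await response.json());
          } else {
            // Unexpected status: wait a little before retrying.
            await new Promise((resolve) => setTimeout(resolve, 5000));
          }
        } catch {
          // Network error: back off briefly before reconnecting.
          await new Promise((resolve) => setTimeout(resolve, 5000));
        }
      }
    }

    // Usage: start polling the hypothetical endpoint and log whatever arrives.
    pollForEvents("/events", (data) => console.log("server pushed:", data));

Notice that nothing here is "push" at the HTTP level; it is an ordinary request that the server deliberately answers late, which is exactly the kind of loophole being discussed.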

I could go on. But do you see a pattern here? Or are these just random bits of history?

The point I am trying to make is that it is very difficult for a technology to evolve once it has been widely adopted. In such cases, it is considered okay to work around the way the technology was meant to function, without breaking it, by identifying loopholes: NAT for IPv4, Comet for HTTP.

So do we blame adoption that happens before a technology matures? Or should there be a way for a technology to mature and evolve without carrying its sins forward? Is it possible to mandate a 'big bang', a point at which a specification moves from one version to another, or the technology changes radically, without backward compatibility? And how could that be achieved with minimal side effects? Does the answer lie in The Tipping Point?!

Seems like an interesting problem to solve.