Computing Epochs

Jun 19, 2010 20:31

The world didn't exist before January 1, 1970. Don't believe me? Ask your computer. That completely arbitrary date, often referred to in computing circles simply as "The Epoch", is the date from which nearly all computer systems on earth measure current time. Interestingly, this results in a Y2K-like problem called the Year 2038 problem, but that's a story for Wikipedia to tell, not me.
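To see The Epoch in action, here's a quick sketch of mine in Python (anachronistic for 1970, I know) showing the seconds-since-1970 counter and exactly where a signed 32-bit version of it runs out:

    import time
    from datetime import datetime, timezone

    # Seconds elapsed since The Epoch: 1970-01-01 00:00:00 UTC.
    print(time.time())

    # A signed 32-bit counter of those seconds maxes out at 2**31 - 1,
    # which is the heart of the Year 2038 problem.
    last_second = 2**31 - 1
    print(datetime.fromtimestamp(last_second, tz=timezone.utc))
    # -> 2038-01-19 03:14:07+00:00

One second after that, a 32-bit counter wraps around and your computer thinks it's 1901.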

More generally, the term "epoch" means a certain period of time. For instance, scientists would say that we are currently living within the Holocene epoch. But those epochs only change on a geologic time scale. The computing world moves much, much faster. Still, in honor of both Unix and Geology, I'm going to commandeer the term "epoch" for what I actually want to talk about here, an issue near and dear to my heart: the history of programming paradigms.

The first epoch actually does start before January 1, 1970, despite my claims to the contrary. In 1951, the ORDVAC was turned on at the nearby Aberdeen Proving Ground and became the first computer with a compiler. Basically, instead of having to manually flip switches or rewire components, there was now a language that people -- programmers -- could write and feed into the computer, and the machine would automatically carry out the instructions it specified. By the end of the decade, two of the most widely used compiled programming languages in history had been produced: FORTRAN in 1957, and COBOL in 1960. These languages, and others like them, offered mostly a one-to-one mapping of machine instructions to a slightly more human-readable format. There was relatively little abstraction away from the architecture of the machine, at least compared to many of today's languages. I'll refer to this as the epoch of Imperative Programming, and though within the field of computing that term has an even broader meaning, I think it definitely applies to this era.

Then in 1965, Intel co-founder Gordon Moore observed that the number of components that could be squeezed onto a chip -- and with it, computing power -- was doubling every couple of years. As a result, programs could do more and more, and so they were asked to do more and more. Programmers, grappling with the growing complexity of these programs using their old habits, made more and more mistakes. And so in 1968, prominent computer scientist Edsger Dijkstra published the seminal letter "Go To Statement Considered Harmful". These two events ushered in the second programming epoch, which focused on Structured Programming. Abstract constructs such as loops and subroutines, which had been mere design patterns of imperative programming before, became standard in languages. Now, more than 40 years later, structured programming has nearly eradicated GOTO-laden imperative programs in all but the most highly specialized of domains.
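To make the shift concrete, here's a little sketch of mine: the GOTO-and-labels flow of the first epoch (as pseudocode in the comments), followed by the same logic in the structured style, in Python:

    # GOTO-era pseudocode, roughly what pre-structured code looked like:
    #       i = 0
    #   10  IF i >= 5 GOTO 20
    #       PRINT i
    #       i = i + 1
    #       GOTO 10
    #   20  ...

    # The same control flow with structured constructs: a loop inside a subroutine.
    def count_to(n):
        for i in range(n):
            print(i)

    count_to(5)

The structured version reads top to bottom with no jumping around, which is exactly the property Dijkstra was arguing for.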

But Moore's law marched on, and much as with computing speed, computer memory was also increasing exponentially while decreasing in cost. Programs again grew more and more complex as they were asked to do more and more complex things. People began to demand that computers more closely model the real world, such as in 1981 when Xerox shipped its Star computer, one of the first commercial systems with a graphical user interface (GUI). Since humans perceive the real world as consisting of objects, programmers began to explicitly model those objects in their computer code. Researchers had been working on this paradigm since the early 1970s, most famously in a language with the cute name Smalltalk, which was brought to a much wider audience by Byte magazine in 1981, the same year as the Star's GUI. Thus marks the third programming epoch, the Object Oriented (OO) Epoch. C++ was released in 1985 and became the first widely used OO language. While OO has largely displaced non-OO structured languages (especially in large corporate environments), it hasn't done so to the same extent that structured programming replaced unstructured imperative ones. There are still some ardent anti-OO practitioners. Regardless, the ideas of OO still have a large impact on programming languages today, and practically every language in use since the mid 1980s has some sort of OO facility either built into it or bolted onto it. OO is here to stay, and truthfully, I'm not even sure whether we've completely left this epoch yet, or whether it overlaps with the one to come after it.
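For a taste of the paradigm, here's a toy sketch (the BankAccount example is mine, purely for illustration): a real-world object becomes a class that bundles its state together with its behavior, in Python:

    # Model a real-world object as a class: state (attributes)
    # and behavior (methods) travel together.
    class BankAccount:
        def __init__(self, owner, balance=0):
            self.owner = owner
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount

    account = BankAccount("Ada")           # one concrete object
    account.deposit(100)
    print(account.owner, account.balance)  # Ada 100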

In 1990, Sir Tim Berners-Lee created his HyperText Markup Language (HTML), and by certain standards the World Wide Web was born. However, despite its name, HTML isn't technically a programming language. Instead, it opened up a whole new ecosystem where programming languages could develop and flourish. In many ways, this epoch exists side-by-side with the previous one, which is still going on off the net and behind the scenes. Most of the work I do, for instance, is not involved with the World Wide Web and definitely heavily uses the OO paradigm. But on the WWW, with its (initially) largely text-based interactions and extremely rapid development and time-to-market cycles, a new epoch was born: the epoch of Dynamic Languages. Once again, as with Imperative languages, I'm using a term that has a broader and much-debated meaning within software development, but it's my essay so deal with it :-). Perl was one of the first widely used dynamic languages on the web. Originally conceived as a system administration language for heavily text-based Unix-like systems, it transitioned easily to the WWW and is still a relatively strong language today, though it is losing ground to young whippersnappers like Ruby and Python. These languages reside mainly on the "server side" of the WWW. The most recent development of dynamic languages in this epoch is the near-absolute dominance of JavaScript on the "client side" of the web. Much of the recent interactivity of websites comes from a technique called "AJAX", where the J stands for "JavaScript". The name is unfortunate because it leads newbies and outsiders to confuse it with another programming language that gained popularity during this time: Java. Both were released in 1995. In a way, Java is the bridge between the OO epoch and the Dynamic epoch: it runs on a virtual machine, much as "interpreted" languages like Python do, and is often used on the "server side" of gigantic internet-based systems.
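Since I keep using "dynamic" loosely, here's a little sketch of mine of the two things people usually mean by it, again in Python:

    # Types are checked at run time, so a name can be rebound freely...
    x = 42         # an integer
    x = "hello"    # ...and now a string; no declarations, no recompiling

    # ...and any object with the right methods will do ("duck typing").
    def shout(thing):
        return thing.upper() + "!"

    print(shout("hi"))   # works: strings have .upper()
    # print(shout(42))   # would fail, but only when it actually runs

That trade-off -- flexibility now, errors at run time -- is a big part of why these languages thrive where development speed matters most, like the web.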

All this history raises the question: what's next? Well, I wouldn't have written all this if I didn't have an opinion about what's next. But this history lesson has gone on long enough that I'll wait until you've had a chance to read and digest it before posting about my ideas for the next epoch :-)

