On Software Design and the Missing Revolution

An Exploration into the Past, Present, and Future of Software Design Paradigms

Revolution 0: The Beginning

The history of computing machines can be traced back almost indefinitely--Charles Babbage, Blaise Pascal, or the ancient Greeks, as far as one wishes to go--but the beginning of the revolutions can reasonably be assigned a specific date: Feb 14, 1946. That date--although not long remembered in the annals of history--has great significance for the world at large and for the art of computer programming, for on that day the ENIAC [a], the world's first general-purpose programmable electronic computer, began its life [1].

Six women--whose previous jobs as army "Computers" consisted of calculating by hand the complex differential equations associated with ballistics trajectories--were chosen to become the world's first computer programmers [2]. Programming the ENIAC involved manually wiring cable connections and setting three thousand switches on the function tables [3]. This was tedious work to be sure, but it was a huge improvement over the human computers, who would have taken one or two days to complete a calculation that the ENIAC could perform in 20 seconds [1]. This was the beginning, and all that followed would build upon it.

Revolution 1: Stored Program Computing

During the design and construction of the ENIAC, it became necessary to freeze its design and specifications; therefore, when the designers of the ENIAC realized what its limitations would be, and came up with methods to overcome them, it was too late to incorporate those features into the ENIAC machine. Thus another machine was devised, called the EDVAC, work on which began a full two years before the ENIAC was finally completed [4].

The biggest problem with the ENIAC was all that switch flipping: calculations could be performed incredibly fast, but since the programs had to be set up by hand one at a time, the system was still limited by the speed of its human operators. The EDVAC changed all that by becoming the world's first Stored Program Computer, meaning that instead of having instructions defined by switch positions, EDVAC's instructions were represented numerically and stored in the computer's memory. Rather than flipping switches for each new program, operators stored programs on magnetic wire (and later punch cards), which were then read into EDVAC's memory and executed from there; when a program was complete, switching to another job was as easy as reading in a new set of punch cards [4][b].

The EDVAC was finally operational in August of 1949, and it brought with it the first major revolution in computer programming, essentially inventing the concept of software design and creating the paradigms on which every subsequent computer system would be built.

Revolution 2: The Programming Language

When the EDVAC was developed, it stored program instructions digitally rather than using switches as the ENIAC had. Programming the EDVAC consisted of knowing exactly which binary bits represented each instruction or value, and coding those bits onto wire or punch cards for input into the machine. Improvement though this was over switches, yet another improvement was looming on the horizon: the concept of the programming language.

The first ever programming language was developed in 1949, and was known as Short Code. After completing a program in Short Code, the programmer then had to convert the program by hand into binary for input into the computing machine [6]. This was the beginning of language abstraction, and it led to the development of the first real programming languages, to which all modern programming languages can trace their roots.

Revolution 3: Language Compilers

It wasn't long after the arrival of Short Code that people realized software could be written to automatically convert a programming language into machine code. The term compiler was coined by Grace Hopper, who in 1951 wrote the first one, a program called A-0. Compilers allowed programs to be completed faster by having the computer do the tedious work that had previously been the job of the programmer. The level of abstraction was increasing, and would soon increase even more, as the next revolution had already begun.

Revolution 4: High-level Languages

The first generation of programming languages was nothing but one-to-one translations of machine instructions into language instructions. Since machine instructions were unique to each type of computer, each machine had its own assembly language, which had to be learned before one could program for that machine [7]. Low-level languages would soon be replaced by higher-level languages, in which a single statement could translate into several--sometimes dozens, or even hundreds--of machine instructions. These languages were also largely machine independent, so that a program did not have to be rewritten for each different computer it was to be executed on.
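
To make the contrast concrete, here is a small hypothetical C/C++ fragment (the function and names are invented purely for illustration, not drawn from any machine discussed above): a single high-level statement that a compiler expands into many machine instructions, written without any reference to a particular computer.

    /* A hypothetical illustration: the single statement inside this
       function expands into a whole sequence of loads, multiplies,
       adds, and stores, and the same source compiles unchanged on
       machines with completely different instruction sets. */
    double polynomial(double a, double b, double c, double x)
    {
        return a * x * x + b * x + c;   /* one high-level statement,
                                           many machine instructions */
    }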

In 1957 IBM completed its formula translating system, known as FORTRAN. A simple language by today's standards, FORTRAN was leaps and bounds above its predecessors, including such remarkable programming constructs as IF, DO, and GOTO statements [6]. In 1972, the C language was developed by Dennis Ritchie at Bell Labs. Described by its author as "quirky, flawed, and an enormous success", C would go on to become one of the most popular programming languages of all time [8].

This was the fourth major revolution in computer programming, and with it the art of software design was beginning to take shape. Programmers could now focus their time on the program itself, instead of having to worry about the intricate details of the machine it would be running on. Constructs such as function calls, recursion, and data structures were now in common use, and many of the languages of the era are still used today.

Revolution 5: Object-Oriented Programming

The origins of Object-Oriented programming--or OOP, as it's sometimes called--can be traced back to the SIMULA programming language, developed by Norwegian scientists Ole-Johan Dahl and Kristen Nygaard in the early 1960s [9]. In the early 1980s, Bjarne Stroustrup integrated several Object-Oriented concepts into a set of extensions for the C language he called "C with Classes," which eventually grew into the full-featured C++ language, released in 1983 [6].

OOP changed forever the way programmers thought about their programs. Once a person gets into an Object-Oriented mindset, the whole world seems nothing but a collection of objects and their associated interfaces. Object-Orientation also helped bring about the long-dreamed-of concept of large-scale software reuse, since a collection of classes can easily be transferred from one system to another, something that just wasn't feasible before.
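
As a minimal sketch of that mindset (the class and names below are invented for illustration, not taken from any particular system), a C++ class bundles its data together with the interface used to manipulate it, and the whole class can be dropped unchanged into another program:

    // A hypothetical example of the object-and-interface view of the world.
    // The object's state is hidden; callers interact with it only through
    // its public interface, and the class can be reused as-is elsewhere.
    #include <iostream>

    class Counter {
    public:
        Counter() : value(0) {}
        void increment() { ++value; }            // part of the public interface
        int current() const { return value; }    // read access, nothing more
    private:
        int value;                               // internal state, hidden from callers
    };

    int main()
    {
        Counter hits;                            // an object: data plus behaviour
        hits.increment();
        hits.increment();
        std::cout << hits.current() << std::endl;   // prints 2
        return 0;
    }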

Revolution 6: Missing in Action?

In 1984, a year after C++ was released, Apple Computer released the Macintosh, a new model of low-cost, GUI-based PC that would change the computer industry forever [10][c]. After the Macintosh, computers were no longer for scientists or even for hobbyists; they were for anyone and everyone who wanted one, and everyone did.

In 1994, ten years after the debut of the Macintosh, the world experienced another revolution, when the internet exploded onto the stage. The internet changed forever the way users interacted with their computers [d]. Within a few years, it would be rare to find a computer anywhere that did not have the ability to connect itself to the rest of the world.

It has now been almost ten years since the coming of the internet, and twenty since the introduction of the Macintosh. In the space of those years computer programming--both the types of programs being produced and the machines on which they are used--has changed dramatically. So what new software design method, what change in programming paradigms, what new revolution has arisen out of this environment? The answer, strangely enough, is none. There simply has not been one. This is the sixth great software revolution: the revolution that did not occur.

It is unclear why the sixth revolution never materialized; several promising new prospects have emerged, but most never took off, some rose and fell, and still others are yet being touted as "the next big thing." Revolutions in user interface design have changed forever the way users interact with their software, but underneath it all, that software is still being written just like the command line software of the pre-GUI world. Instead of developing new programming languages and constructs to design user interfaces, we tried to force the existing languages and methods to do the job, often with pitiful results. Intricate, complicated editors were designed to allow programmers to create user interfaces visually, but underneath it was all still managed in old-fashioned C++; most of the GUI widgets in our most revolutionary programs are still managed in the back-end with null-terminated character arrays.
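
To illustrate the point (the class and names below are invented; this is not the code of any real toolkit), what a visual interface editor ultimately produces still looks something like the following: ordinary C++, with the button's label stored in a plain null-terminated character array.

    // Hypothetical sketch of a GUI widget's back end.  The names are made up,
    // but the pattern is the familiar one: an old-fashioned C++ class whose
    // label lives in a null-terminated char array.
    #include <cstring>
    #include <iostream>

    class Button {
    public:
        explicit Button(const char* label)
        {
            std::strncpy(text, label, sizeof(text) - 1);
            text[sizeof(text) - 1] = '\0';       // keep the array null-terminated
        }
        const char* caption() const { return text; }
    private:
        char text[64];                           // the widget's "revolutionary" label
    };

    int main()
    {
        Button ok("OK");                         // what the drag-and-drop editor
        std::cout << ok.caption() << std::endl;  // boils down to behind the scenes
        return 0;
    }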

It is unclear why all this hasn't happened, but one thing can be said: having completely missed the sixth revolution, we can now only wait for the seventh.

Revolution 7: The Future

The problem with predicting the future is that one is almost always wrong [e]. Revolutions tend to sneak up on you, appearing from nowhere and then exploding onto the scene with life-changing consequences. Predicting the form of the next one is therefore usually a fruitless activity. However, since I have nothing to lose, I shall here make a prediction: Aaron Andersen says that the seventh software revolution will be about--specialization.

In the future, so says me, a single programming language will no longer be sufficient to perform all of the tasks required in an advanced software project [f]. Instead, different realms of the program will be handled by different languages, each perfectly suited to the task it is accomplishing. C/C++ will probably still be used for speedy back-end calculations, but user interfaces will be designed in something else--some sort of an XML schema perhaps--with still another language--like the W3C's Cascading Style Sheets--making it all look pretty, and providing easy access to the skinning features that are becoming popular these days. A separate mechanism will be used for the integrated database capabilities, and still another will make internet communication a breeze. Finally, some futuristic compiler will be able to turn it all into a single application we can run on our desktops.

Will all this really happen? I don't know. Do I wish it would happen? Definitely. One way or another the future of programming (and the world) will arrive, and we can either wait for it to show up, or, if we wish, become part of it.


Notes:

[a] ENIAC is an acronym for "Electronic Numerical Integrator And Computer". EDVAC is an acronym for "Electronic Discrete Variable Automatic Computer". It seems that the obsession with acronyms in the computer world started at the very beginning.

[b] Because of delays in the design and construction of the machine, the EDVAC was technically not the first stored program computer to be completed, as other computers based on the EDVAC design were actually finished first; it was, however, the first major stored program computer project to be undertaken, and thus is usually given the title [5].

[c] Ironically, the release of the Macintosh is said to have changed the advertising industry as much as it did the PC industry, but that is another story.

[d] Actually, the internet changed just about everything forever, not just the computer industry, but this paper is about programming, so that's what I focus on here.

[e] Every time a revolution in anything occurs, several people who "knew" it was going to happen end up getting rich. These people then get remembered as visionaries, or as geniuses. However, what doesn't get shown on the news--nor remembered in the history books--is the thousands of people who "knew" something else was going to be the next big thing, and were wrong, sometimes losing their life savings in the process.

Despite what they would like you to think, most people who predicted that a certain something was going to change the world, or that a certain stock was about to skyrocket, simply got lucky. I myself have a project I am working on which I think could very well become something big; most likely, that won't happen, but if it does, I will be happy to stand in front of the cameras and listen to everyone talk about what a genius I was to see the revolution coming.

[f] What I mean is that a single language is already not sufficient to do everything we need, and that in the future I hope people will realize this and develop systems for specialization, instead of trying to force one language to do it all.

References:

[1] Looking Back At ENIAC: Computers Hit Half-Century Mark. Neeraja Sankaran. http://www.the-scientist.library.upenn.edu/yr1995/august/birth_950821.html

[2] The ENIAC Programmers. Kathryn A. Kleiman. http://www.witi.org/center/witimuseum/halloffame/1997/eniac.shtml

[3] The ENIAC. Kevin W. Richey. http://ei.cs.vt.edu/~history/ENIAC.Richey.HTML

[4] Electronic Computers Within the Ordnance Corps, Chapter III: EDVAC. Karl Kempf. http://ftp.arl.mil/~mike/comphist/61ordnance/chap3.html

[5] The First Stored Program Computer -- EDVAC. http://www.maxmon.com/1946ad.htm

[6] The History of Computer Programming Languages. Steve Ferguson. http://www.princeton.edu/~ferguson/adw/programming_languages.shtml

[7] History of Programming Languages. Anthony M. Sloane. http://www.ics.mq.edu.au/schoolies/resources/workshops/slides/history.ppt

[8] The Development of the C Language. Dennis M. Ritchie. http://cm.bell-labs.com/cm/cs/who/dmr/chist.html.

[9] How Object-Oriented Programming Started. Ole-Johan Dahl and Kristen Nygaard. http://www.ifi.uio.no/~kristen/FORSKNINGSDOK_MAPPE/F_OO_start.html.

[10] Personal Computers: History and Development. Paul Stranahan. http://www.digitalcentury.com/encyclo/update/pc_hd.html


Copyright © 2002 XulPlanet.com