C Programming Language History

The first version of Unix was written in the low-level PDP-7 assembly language. Soon after, a language called TMG was created for the PDP-7 by R. M. McClure. Intending to use TMG to develop a FORTRAN compiler, Ken Thompson instead ended up developing a compiler for a new high-level language he called B, based on the earlier BCPL language developed by Martin Richards. Where it might take several pages of detailed PDP-7 assembly code to accomplish a given task, the same functionality could typically be expressed in a higher-level language like B in just a few lines. B was thereafter used for further development of the Unix system, which made the work much faster and more convenient.

When the PDP-11 computer arrived at Bell Labs, Dennis Ritchie built on B to create a new language called C, which inherited Thompson’s taste for concise syntax and had a powerful mix of high-level functionality and the detailed features required to program an operating system. Most of the components of Unix were eventually rewritten in C, culminating with the kernel itself in 1973. Because of its convenience and power, C went on to become the most popular programming language in the world over the next quarter century.

This development of Unix in C had two important consequences:

  • Portability. It made it much easier to port Unix to newly developed computers, because it eliminated the need to translate the entire operating system to the new machine’s assembly language by hand:
    • First, write a C-to-assembly language compiler for the new machine.
    • Then use the new compiler to automatically translate the Unix C language source code into the new machine’s assembly language.
    • Finally, write only a small amount of new code where absolutely required by hardware differences with the new machine.
  • Improvability. It made Unix easy to customize and improve by any programmer who could learn the high-level C programming language. Many did learn C, and went on to experiment with modifications to the operating system, producing many useful new extensions and enhancements.

Open systems. By 1982, the minicomputer industry was beginning to grow. Several computer companies began to develop commercial versions of Unix, some based on System V and some on BSD Unix. Each vendor differentiated its system by adding unique features, but they also recognized a common interest in preventing AT&T from monopolizing the market. Several efforts were made in the 1980s to develop open Unix specifications and standards, such as by the IEEE POSIX group and a European group of companies called X/Open, with some limited success.

In 1988, in response to AT&T’s alliance with Sun described above, several vendors formed a group called the Open Software Foundation (OSF) to develop a new Unix operating system from open specifications and end their dependence on the AT&T code. The OSF/1 system was released in 1991, but it wasn’t as mature as the established systems, so adoption was slow, with only some of its components taken up by AT&T’s biggest competitors, such as DEC and IBM.

In 1993, a lot of the fight went out of the Unix wars when AT&T left the computer business and sold System V to Novell, which then assigned the rights to the Unix trademark to X/Open. In 1996, OSF and X/Open merged into The Open Group, which still promotes open system standards today.

In the late 1990s, interest began to coalesce around Linux, the first fully open Unix-like system, released under the free-software GNU license, which might finally unify the Unix family after three decades of development.
