The Staffordshire University Computing Futures Museum Computer Software Page



Binary code was used in early computer programming by directly entering the binary digits "0" and "1" into the computer memory. The meaning of a byte, a string of eight binary digits (bits), depends on the context, e.g. as a numerical value, part of a computer order code, part of a memory address, a register address, an ASCII character, etc. Since the "point" is not represented in computer hardware, the software must interpret a numerical value in terms of fixed point or floating point, sign, and number of integer places before the actual value is known.
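As a sketch of this context-dependence, the short Python fragment below reads one byte pattern in four different ways (the bit pattern and the split into four integer and four fraction bits are arbitrary choices for illustration):

```python
# A single byte has no intrinsic meaning: the same eight bits can be
# read as an unsigned number, a signed (two's-complement) number, an
# ASCII character, or a fixed-point fraction, depending on convention.
bits = "01000001"          # the bit pattern under consideration
value = int(bits, 2)       # 65 when read as an unsigned integer

unsigned = value                                   # range 0..255
signed = value - 256 if value >= 128 else value    # two's complement, -128..127
ascii_char = chr(value)                            # the character 'A'
# Fixed point with an assumed binary point after 4 integer bits:
fixed_point = value / 16                           # 4 integer + 4 fraction bits

print(unsigned, signed, ascii_char, fixed_point)   # 65 65 A 4.0625
```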

Since it is difficult to remember or copy a binary string, the binary code is often expressed in the octal (base 8 = 2^3) or hexadecimal (base 16 = 2^4) notation. Thus programmers had to be proficient in these bases, and had to be able to convert numbers between any two of base 2, 8, 10 or 16 notations, e.g. to convert between bases 16 and 10 the quickest route is 16 > 2 > 8 > 10 since the 8 times table is usually known, while the 16 times table is not. Different conversion routines are also required for the integer part and the fraction part of any number.
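Modern languages make the integer conversions trivial, but the point about fraction parts still holds: they need a separate routine. A small Python sketch (the `fraction_digits` helper is illustrative, not part of any standard library):

```python
# Integer conversion between bases 2, 8, 10 and 16 using Python's built-ins.
n = 0b10111010            # 186 decimal
print(oct(n), hex(n))     # 0o272 0xba

# The fraction part needs a different routine: repeatedly multiply by the
# target base and peel off the integer digit that appears.
def fraction_digits(frac, base, places):
    """Digits of a fraction (0 <= frac < 1) in the given base."""
    digits = []
    for _ in range(places):
        frac *= base
        digit = int(frac)
        digits.append("0123456789abcdef"[digit])
        frac -= digit
    return "".join(digits)

print(fraction_digits(0.625, 2, 4))   # 1010  (0.625 decimal = 0.101 binary)
```

Because most decimal fractions have no exact binary representation, a routine like this is only a sketch; converting arbitrary decimal fractions exactly needs more care.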

Early computers often had 1Kbyte of internal memory or less. For this reason programs had to be as compact as possible, and engineers often took pride in reducing the size of a program, e.g. by replacing two instructions in a program with one different instruction and saving three bytes. Compare this attitude to present-day profligate programming methods, where a program may take many Mbytes! Binary programming was very useful for first generation computers, for minicomputers such as the PDP-8, and for microprocessors such as the Intel 8080 and Motorola 68000.

Machine Code and Assembly Language

Assembly languages, dating from the 1950s, are low-level languages in which machine instructions are written as mnemonics and symbolic addresses, which are translated into their binary equivalents, usually in a one-to-one fashion. Low-level computer programming then becomes easier, because the programmer does not have to remember the binary order codes for the desired instructions. The assembly languages implement a symbolic representation of the numeric machine codes and other constants needed to program a particular CPU architecture. This representation is usually defined by the hardware manufacturer, and is based on abbreviations (called mnemonics) that help the programmer remember individual instructions, registers, etc. An assembly language is therefore specific to a particular hardware computer architecture. This is in contrast to programs implemented in most high-level languages, which are usually portable, i.e. the program can be run on several hardware architectures (but beware - the numerical precisions of these various architectures may not be the same, and the results may therefore be different).

A utility program called an assembler is used to translate assembly language statements into the target computer's machine code. The assembler performs a more or less isomorphic translation (a one-to-one mapping) from mnemonic statements into machine instructions and data. This is in contrast with high-level languages, in which a single statement generally results in many machine instructions (and for different architectures the number of instructions will be different for the same high-level language instruction).

Sophisticated assemblers offer additional mechanisms to facilitate program development, to control the assembly process, and to aid debugging. There are two types of assemblers based on how many passes through the source are needed to produce the executable program: one-pass assemblers are fast to execute but go through the source code once only, and they assume that all symbols will be defined before it is necessary to assemble any instruction that references them; while two-pass assemblers create a table with all unresolved symbols in the first pass, then use the second pass to resolve these addresses (symbols can be defined anywhere in the program source).
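The two-pass scheme can be sketched in a few lines. The miniature assembler below is purely illustrative - the mnemonics, opcodes, and one-word-per-instruction layout are invented - but it shows why a second pass lets a symbol be referenced before it is defined:

```python
# A toy two-pass assembler for an invented three-instruction machine.
OPCODES = {"LOAD": 0x10, "ADD": 0x20, "JMP": 0x30}

def assemble(source):
    # Pass 1: record the address of every label without emitting code.
    symbols, address = {}, 0
    for line in source:
        if line.endswith(":"):              # a label definition, e.g. "loop:"
            symbols[line[:-1]] = address
        else:
            address += 1                    # each instruction occupies one word
    # Pass 2: every symbol is now known, so emit (opcode, operand) words.
    code = []
    for line in source:
        if line.endswith(":"):
            continue
        mnemonic, operand = line.split()
        value = int(operand) if operand.isdigit() else symbols[operand]
        code.append((OPCODES[mnemonic], value))
    return code

# "end" is referenced before it is defined - the second pass resolves it.
print(assemble(["JMP end", "LOAD 7", "end:", "ADD 1"]))
# [(48, 2), (16, 7), (32, 1)]
```

A one-pass assembler given the same program would have to either reject the forward reference to "end" or patch it up afterwards.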

Assembly languages were first developed in the 1950s, when they were referred to as second generation programming languages. They eliminated much of the error-prone and time-consuming first-generation (binary) programming needed with the earliest computers, freeing the programmer from tedium such as remembering numeric codes and calculating addresses. They were widely used for all kinds of programming. However, by the 1990s their use had largely been supplanted by high-level and structured programming languages, in the hope of improved programming productivity. Assembly language is still used for direct hardware manipulation and improved performance, e.g. in device drivers, gaming systems, low-level embedded systems on consumer chips, and real-time systems.

Historically, a large number of programs have been written entirely in assembly language. Operating systems were almost exclusively written in assembly language until the widespread acceptance of C. COBOL and FORTRAN eventually displaced much of this assembly language work for commercial and engineering applications respectively, although a number of large organizations retained assembly-language application infrastructures well into the 1990s. The most important reasons for retaining an assembly language capability are minimal size of program, greater speed, and reliability. Also, large teams of programmers using high level languages produce more complex, more costly, and not necessarily more reliable software than smaller teams of programmers using assembly language.


The IBM mathematical FORmula TRANslating System (FORTRAN) is a general-purpose procedural imperative programming language that is especially suited to numeric computation and scientific computing. Originally developed by IBM at their campus in San Jose, California in 1957, it became dominant for scientific and engineering applications, and has been in continual use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, computational physics, and computational chemistry. It is one of the most popular languages in the area of high-performance computing and is the language used for programs that benchmark and rank the world's fastest supercomputers.
In late 1953 John W. Backus submitted a proposal for the development of a more practical alternative to assembly language for programming IBM's 704 mainframe computer. A draft specification for The IBM Mathematical Formula Translating System was completed by mid-1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. This was an optimizing compiler, because customers were reluctant to use a high-level programming language unless its compiler could generate code whose performance was comparable to that of hand-coded assembly language.
The increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. For these reasons, FORTRAN is considered to be the first widely used programming language supported across a variety of computer architectures.


LISt Processing (LISP) is a family of computer programming languages with a long history and a distinctive, fully parenthesized syntax. Originally specified in 1958, Lisp is the second-oldest high-level programming language in widespread use today; only Fortran is older. Like Fortran, Lisp has changed a great deal since its early days, and a number of dialects have existed over its history. Today, the most widely known general-purpose Lisp dialects are Common Lisp and Scheme.

Lisp was originally created as a practical mathematical notation for computer programs, influenced by the notation of Alonzo Church's lambda calculus. It quickly became the favoured programming language for artificial intelligence (AI) research. As one of the earliest programming languages, Lisp pioneered many ideas in computer science, including tree data structures, automatic storage management, dynamic typing, and the self-hosting compiler. Linked lists are one of Lisp's major data structures, and Lisp source code is itself made up of lists. As a result, Lisp programs can manipulate source code as a data structure, giving rise to the macro systems that allow programmers to create new syntax or even new domain-specific programming languages embedded in Lisp.

The interchangeability of code and data also gives Lisp its instantly recognizable syntax. All program code is written as s-expressions, or parenthesized lists. A function call or syntactic form is written as a list with the function or operator's name first, and the arguments following; for instance, a function f that takes three arguments might be called using (f x y z).

Lisp was the first homoiconic programming language: the primary representation of program code is the same type of list structure that is also used for the main data structures. As a result, Lisp functions can be manipulated, altered or even created within a Lisp program without extensive parsing or manipulation of binary machine code. This is generally considered one of the primary advantages of the language with regard to its expressive power, and makes the language amenable to metacircular evaluation. This recursive structure has led to a joking reference to the language as "a Lot of Infuriatingly Silly Parentheses".
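The code-as-data idea can be sketched outside Lisp as well. In the illustrative Python fragment below, s-expressions are modelled as nested lists and a tiny evaluator walks them, so a "program" can be edited like any other list before it is run:

```python
# S-expressions as nested Python lists, with a minimal evaluator for
# two operators.  This is a sketch of the Lisp idea, not real Lisp.
def evaluate(expr):
    if not isinstance(expr, list):       # an atom: here, just a number
        return expr
    op, *args = expr                     # operator first, arguments after
    values = [evaluate(a) for a in args]
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown operator {op!r}")

program = ["+", 1, ["*", 2, 3]]          # the s-expression (+ 1 (* 2 3))
print(evaluate(program))                 # 7

# Because the program is an ordinary list, it can be rewritten before
# it is run - the essence of what Lisp macros do:
program[0] = "*"                         # now (* 1 (* 2 3))
print(evaluate(program))                 # 6
```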

It is not generally realised that the ubiquitous if-then-else structure, now taken for granted as an essential element of any programming language, was invented by John McCarthy for use in Lisp. This structure was inherited by ALGOL, which popularized it.

Lisp was invented by John McCarthy in 1958 while he was at the Massachusetts Institute of Technology (MIT). McCarthy published its design in a paper in Communications of the ACM in 1960, entitled "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". He showed that with a few simple operators and a notation for functions, one can build a Turing-complete language for algorithms. The Information Processing Language was the first AI language, from 1955 or 1956, and already included many of the concepts, such as list-processing and recursion, which came to be used in Lisp. Lisp was first implemented by Steve Russell on an IBM 704 computer. Russell had read McCarthy's paper, and realized (to McCarthy's surprise) that the Lisp eval function could be implemented in machine code. The result was a working Lisp interpreter which could be used to run Lisp programs, or more properly, to evaluate Lisp expressions. The first complete Lisp compiler, itself written in Lisp, was implemented in 1962 by Tim Hart and Mike Levin at MIT. This compiler introduced the Lisp model of incremental compilation, in which compiled and interpreted functions can intermix freely. The language used in Hart and Levin's memo is much closer to modern Lisp style than McCarthy's earlier code.

Lisp was a difficult system to implement with the compiler techniques and hardware of the 1970s. Garbage collection routines, developed by then-MIT graduate student Daniel Edwards, made it practical to run Lisp on general-purpose computing systems, but efficiency was still a problem. This led to the creation of Lisp machines: dedicated hardware for running Lisp environments and programs. Advances in both computer hardware and compiler technology soon made Lisp machines obsolete, to the detriment of the Lisp market. During the 1980s and 1990s a great effort was made to unify the work on new Lisp dialects. The new language, Common Lisp, was only partly compatible with the dialects it replaced. In 1994 ANSI published the Common Lisp standard ANSI X3.226-1994 Information Technology Programming Language Common Lisp.

Since its inception, Lisp was closely connected with the artificial intelligence research community, especially on PDP-10 systems. Having declined somewhat in the 1990s, Lisp experienced a regrowth of interest in the 2000s. Most new activity is focused around open source implementations of Common Lisp, and includes the development of new portable libraries and applications. The book Practical Common Lisp by Peter Seibel, a tutorial for new Lisp programmers, was published in 2004 and was briefly Amazon's second most popular programming book; it is also available free online.


The ALGOrithmic Language (ALGOL) is an imperative computer programming language originally developed in 1958 which has greatly influenced many other languages. ALGOL became the de facto way to describe algorithms in textbooks and academic works for the next 30 years. It was designed to avoid some of the problems encountered in FORTRAN. ALGOL introduced code blocks and the "begin" and "end" pairs for delimiting them, and it was also the first language to implement nested function definitions with lexical scope.

An early implementation of ALGOL, for the English Electric DEUCE, was developed at Whetstone, a village in the Blaby district of Leicestershire, UK, 5 miles to the south of Leicester. Whetstone was the site of Frank Whittle's factory, where jet engines were developed. The site of the Whittle factory became one of the factories of the English Electric Company (formed in 1919, and later part of GEC), and a significant part of several nuclear power stations was made there in the 1960s and 1970s. English Electric was one of the largest engineering companies in the Leicester area, employing thousands of workers and training hundreds of apprentices each year. At one point more than 4,000 workers had to be brought in from Middlesex to ease labour shortages, and many settled permanently, causing a population boom in the late 1960s.

The ALGOL work was developed for English Electric's first generation DEUCE computer. A typical coding sheet is shown above.
ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at ETH Zurich, leading to the definition of ALGOL 58. ALGOL was used mostly by research computer scientists in the United States and in Europe. Its use in commercial applications was hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors. ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development.
John Backus developed the Backus Normal Form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur Form. Niklaus Wirth based ALGOL W on the ALGOL 60 standard before developing Pascal, and ALGOL 60 eventually gave rise to many other programming languages, including BCPL, B, Pascal, Simula, C, and others.


The COmmon Business-Oriented Language (COBOL) is one of the oldest programming languages, developed from 1959 for use in business, finance, and administrative systems for companies and governments. The COBOL specification grew out of a meeting on 8th April 1959 of computer manufacturers, users, and academics at the University of Pennsylvania Computing Center. The United States Department of Defense subsequently agreed to sponsor and oversee the next activities. A meeting chaired by Charles A. Phillips was held at the Pentagon on 28th - 29th May 1959 and committees were set up.

It was the Short Range Committee, chaired by Joseph Wegstein of the US National Bureau of Standards, that during the following months created a description of the first version of COBOL. The committee was made up of members representing six computer manufacturers and three government agencies. The six computer manufacturers were Burroughs Corporation, IBM, Minneapolis-Honeywell (Honeywell Labs), RCA, Sperry Rand, and Sylvania Electric Products. The three government agencies were the US Air Force, the David Taylor Model Basin, and the National Bureau of Standards (now the National Institute of Standards and Technology).

The decision to use the name "COBOL" was made at a meeting of the committee held on 18th September 1959, a subcommittee was charged with completing the specifications for COBOL, and this was done by December 1959. The specifications were to a great extent inspired by the FLOW-MATIC language invented by Grace Hopper - commonly referred to as "the mother of the COBOL language" - the IBM COMTRAN language, and the Honeywell FACT language. The first compilers for COBOL were implemented during 1960, and on 6th and 7th December 1960 essentially the same COBOL program was run on two different makes of computers, an RCA computer and a Remington-Rand Univac computer, demonstrating that compatibility could be achieved.

After 1959 COBOL underwent several modifications and improvements. In an attempt to overcome the problem of incompatibility between different versions of COBOL, the American National Standards Institute (ANSI) developed a standard form of the language in 1968. This version was known as American National Standard (ANS) COBOL. In 1974 ANSI published a revised version of (ANS) COBOL, containing a number of features that were not in the 1968 version. In 1985 ANSI published another revised version that had new features not in the 1974 standard, most notably structured language constructs ("scope terminators"), including END-IF, END-PERFORM, END-READ, etc. The language continued to evolve. In the early 1990s it was decided to add object-orientation in the next full revision of COBOL: the result was adopted as an ANSI standard and made available in 2002. Work is progressing on the next full revision of the COBOL Standard. It is expected to be approved and available in the early 2010s.

COBOL programs are in use globally in governmental and military agencies and in commercial enterprises, using a variety of operating systems. In 1997 the Gartner Group reported that 80% of the world's business ran on COBOL.


Acorn BBC BASIC, notable for its Mode 7 screen graphics

The Beginner's All-purpose Symbolic Instruction Code (BASIC) was designed in 1964 by John George Kemeny and Thomas Eugene Kurtz at Dartmouth College in New Hampshire, USA to provide computer access for non-science students. At the time, nearly all use of computers required the writing of custom software, often in binary, which was something only engineers, scientists, and mathematicians tended to be able to do. The BASIC language and its variants became widespread on microcomputers in the late 1970s and 1980s. BASIC remains popular to this day in a handful of highly modified dialects, particularly Structured BASIC, and new languages influenced by BASIC such as Microsoft Visual Basic.

The original BASIC language, first implemented in 1964, was designed to allow students to write programs. The eight design principles of BASIC were:
1. easy for beginners to use.
2. general-purpose programming language.
3. advanced features to be added for experts, while keeping the language simple for beginners.
4. interactive.
5. clear and friendly error messages.
6. quick to write and to get results for small programs.
7. no need for understanding of computer hardware.
8. shield the user from the operating system.
The language was based partly on FORTRAN II and partly on ALGOL 60, with additions to make it suitable for timesharing. Initially BASIC concentrated on supporting straightforward mathematical work as a batch language, with full string functionality added by 1965. At the time of its introduction, it was a compiled language. It was also quite efficient, beating FORTRAN II and ALGOL 60 implementations on several computationally intensive programming problems. The designers of the language decided to make the compiler available free of charge so that the language would become widespread. They also made it available to schools and put a considerable amount of effort into promoting the language. As a result, knowledge of BASIC became relatively widespread, and BASIC was implemented by a number of manufacturers, becoming fairly popular on newer minicomputers like the DEC PDP-8. In later instances the language tended to be implemented as an interpreter. Several years after its release some highly respected computer professionals, notably Dijkstra, expressed the opinion that the use of GOTO statements, which existed in many languages including BASIC, promoted poor programming practices.

Although BASIC had already been implemented on several minicomputers, it was the introduction of the MITS Altair 8800 "kit" microcomputer in 1975 that provided BASIC with a route to universality. Most programming languages required suitable text editors, large amounts of memory and available disk space, whereas the early microcomputers had no resident editors, limited memory, and often substituted recordable audio tapes for disk space. These constraints suited a language like BASIC in its interpreted form, with a built-in code editor, which could operate within them. BASIC also had the advantage that it was fairly well known to the young designers and computer hobbyists who took an interest in microcomputers, many of whom worked in the electronics industries of the day. Kemeny and Kurtz's earlier proselytizing paid off in this respect, and the few hobbyists' journals of the era were filled with columns that mentioned the language, often with BASIC program examples.

In 1975 MITS released Altair BASIC, developed by college drop-outs Bill Gates and Paul Allen in the company which grew into today's corporate giant, Microsoft. Tiny BASIC, a simple BASIC implementation, was originally written by Dr. Li-Chen Wang, ported onto the Altair by Dennis Allison, and then published (design and full source code) in 1976 in Dr Dobb's Journal by Bob Albrecht. Versions of Microsoft BASIC were soon bundled with the original floppy disk-based CP/M computers, which became widespread in small business environments. Commodore Business Machines paid a one-time fee for an unlimited licence to a version of Microsoft BASIC that was ported to the MOS 6502 chip in their PET computer, while the Apple II and TRS-80 both introduced new, largely similar versions of the language. The Atari 8-bit family had its own Atari BASIC, modified in order to fit on an 8KByte ROM cartridge. The BBC published BBC BASIC, developed for them by Acorn Computers Ltd, incorporating many extra structuring keywords. Most of the home computers of the 1980s had a ROM-resident BASIC interpreter, allowing the machines to boot directly into BASIC. Because of this legacy, there are more dialects of BASIC than of any other programming language. As the popularity of BASIC grew in this period, magazines published complete source code in BASIC for games, utilities, and other programs. Given BASIC's straightforward nature, it was a simple matter to type in the code from the magazine and then execute the program.

As early as 1979 Microsoft was in negotiations with IBM to supply IBM Cassette BASIC (BASIC C) in the ROM of the IBM PC. Microsoft sold several versions of BASIC for MS-DOS/PC-DOS including IBM Disk BASIC (BASIC D), IBM BASIC A (BASIC A), GW-BASIC (a BASIC A-compatible version that did not need IBM's ROM), and QuickBASIC. The Turbo Pascal publisher Borland published Turbo BASIC 1.0 in 1985 (successor versions are still being marketed by the original author under the name PowerBASIC). Microsoft wrote the window-based AmigaBASIC that was supplied with the multitasking GUI Amiga computers (late 1985/early 1986), although the product unusually did not bear any Microsoft marks. These languages introduced many extensions to the original home computer BASIC, such as improved string manipulation and graphics support, access to the file system, and additional data types. More important were the facilities for structured programming, including additional control structures and proper subroutines supporting local variables. The new graphical features of these languages also helped lay the groundwork for PC video gaming, with BASIC programs like DONKEY.BAS showing what the PC could do.

However, by the latter half of the 1980s computers had progressed from a hobbyist interest to tools used primarily for applications written by others, and programming became less important for most users. BASIC started to recede in importance, though numerous versions remained available. BASIC's fortunes reversed once again with the introduction of Visual Basic by Microsoft. It is somewhat difficult to consider this language to be BASIC, because of the major shift in its orientation towards an object-oriented and event-driven approach. The only significant similarity to older BASIC dialects was familiar syntax. Syntax itself no longer "fully defined" the language, since much development was done using "drag and drop" methods, radio buttons, check boxes and scrollbars, for which the source code was not available to the developer. Visual Basic or "VB" had been promoted as a language for hobbyists, but the language had come into widespread use for small custom business applications shortly after the release of VB version 3.0, which is widely considered the first relatively stable version. While many advanced programmers still scoffed at its use, VB met the needs of small businesses efficiently wherever processing speed was less of a concern than easy development. By that time, computers running Windows 3.1 had become fast enough for many business-related processes to be completed "in the blink of an eye" even using a "slow" language, as long as massive amounts of data were not involved. Many small business owners found they could create their own small yet useful applications in a short time to meet their own specialized needs.

More BASIC dialects sprang up in the 1990s and 2000s, including True BASIC, a direct successor to Dartmouth BASIC from a company controlled by Kurtz; RealBasic, released in 1998 for Macintosh computers but also used for Microsoft Windows, Mac OS X, and 32-bit x86 Linux (from the same object-oriented source code); and MiniBasic, an open source interpreter written in C.


The Basic Combined Programming Language (BCPL) is a computer programming language designed by Martin Richards of the University of Cambridge in 1966. Originally intended for writing compilers for other languages, BCPL is no longer in common use. However, its influence is still felt, because a stripped down and syntactically changed version of BCPL, called B, was the language upon which the C programming language was based. BCPL was the first curly bracket programming language, and the curly brackets survived the syntactical changes and have become a common means of denoting program source code statements.

The first compiler implementation, for the IBM 7094 under CTSS, was written while Richards was visiting Project MAC at MIT in the spring of 1967. The language was first described in a paper presented to the 1969 Spring Joint Computer Conference. The language was powerful and portable. It therefore proved possible to write small and simple compilers for it which would run in as little as 16KBytes. The Richards compiler was itself written in BCPL and was easily portable. BCPL was therefore a popular choice for bootstrapping a system.

A major reason for the compiler's portability lay in its structure. It was split into two parts: the front end parsed the source and generated O-code for a virtual machine, and the back end translated the O-code into code for the target machine. Only 1/5th of the compiler's code needed to be rewritten to support a new machine, a task that usually only took a few man-months. This approach became common practice later, e.g. for Pascal or Java, but the Richards BCPL compiler was the first to define a virtual machine for this purpose.
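The split can be sketched as follows, with a front end that lowers expressions to a machine-independent stack code (standing in for O-code) and a back end that realises it on one "target" - here simply an interpreter. The instruction names are invented for illustration:

```python
# Front end: lower a nested tuple like ('+', 1, ('*', 2, 3)) to a flat,
# machine-independent stack code, playing the role of O-code.
def front_end(expr):
    if not isinstance(expr, tuple):
        return [("PUSH", expr)]
    op, left, right = expr
    return front_end(left) + front_end(right) + [("OP", op)]

# Back end: one possible target - execute the stack code directly.
# Porting to a new machine would mean rewriting only this half.
def back_end(ocode):
    stack = []
    for instr, arg in ocode:
        if instr == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack.pop()

ocode = front_end(("+", 1, ("*", 2, 3)))
print(ocode)              # [('PUSH', 1), ('PUSH', 2), ('PUSH', 3), ('OP', '*'), ('OP', '+')]
print(back_end(ocode))    # 7
```

A second back end emitting real machine instructions instead of interpreting would reuse the front end unchanged, which is the portability argument in miniature.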

By 1970 BCPL implementations existed for the Honeywell 635 and 645, the IBM 360, the Univac 1108, the PDP-9, the KDF 9 and the Atlas 2. There was also a version produced for the BBC Micro in the mid 1980s by Richards Computer Products. The BBC Domesday Project made use of the language. Versions of BCPL for the Amstrad CPC and Amstrad PCW computers were also released in 1986 by UK software house Arnor Ltd.

In 1979 implementations of BCPL existed for at least 25 architectures. The philosophy of BCPL can be summarised as: "The philosophy of BCPL is not one of the tyrant who thinks he knows best and lays down the law on what is and what is not allowed; rather, BCPL acts more as a servant offering his services to the best of his ability without complaint, even when confronted with apparent nonsense. The programmer is always assumed to know what he is doing and is not hemmed in by petty restrictions." As an epitaph, the design and philosophy of BCPL strongly influenced B, which in turn influenced C.


Pascal is an influential imperative and procedural programming language, designed in 1968 - 1969 and published in 1970 by Niklaus Wirth as a small and efficient language intended to encourage good programming practices, using structured programming and data structuring. A derivative known as Object Pascal (Delphi) was designed for object oriented programming. Object Pascal is used for the user interface of Skype.

Pascal is based on the ALGOL programming language and named in honour of the French mathematician and philosopher Blaise Pascal. Leading up to Pascal, Wirth developed the language Euler, followed by Algol-W. Initially, Pascal was largely, but not exclusively, intended to teach students structured programming. A generation of students used Pascal as an introductory language in undergraduate courses. Pascal was the primary high-level language used for development in the Apple Lisa, and parts of the original Macintosh operating system were hand-translated into Motorola 68000 assembly language from the Pascal sources.


The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna. Unix derived its name as a joking pun on Multics, an earlier experimental operating system that had proved slow and unwieldy. Unix was first released in 1971 and was initially written entirely in assembly language, a common practice at the time. Later, in a key pioneering step in 1973, Unix was re-written in the programming language C by Dennis Ritchie, with the exception of some kernel and I/O routines, which remained in assembly language. The availability of an operating system written in a high-level language allowed easier portability to different computer platforms. Since a legal constraint obliged AT&T to license the operating system's source code, Unix quickly grew and became widely adopted by academic institutions and businesses.

MINIX is a Unix-like computer operating system based on a microkernel architecture created by Andrew S. Tanenbaum for educational purposes; MINIX also inspired the creation of the Linux kernel by Linus Torvalds. MINIX (from "mini-Unix") was first released in 1987, with its complete source code made available to universities for study in courses and research. MINIX has been free and open source software since it was re-licenced in April 2000. Like UNIX, MINIX was written in the C programming language and was intended to be easy to port to various computers. The initial implementation was for the IBM PC.


C is a general-purpose computer programming language developed in 1972 at the Bell Telephone Laboratories for use with the Unix operating system. Although C was designed for implementing system software, it is also widely used for developing portable application software. C is one of the most popular programming languages and there are very few computer architectures for which a C compiler does not exist. C has greatly influenced many other popular programming languages, most notably C++, which began as an extension to C.

C is an imperative (procedural) systems implementation language. It was designed to be compiled using a relatively straightforward compiler, to provide low-level access to memory, to provide language constructs that map efficiently to machine instructions, and to require minimal run-time support. C was therefore useful for many applications that had formerly been coded in assembly language. Despite its low-level capabilities, the language was designed to encourage machine-independent programming. A standards-compliant and portably written C program can be compiled for a very wide variety of computer platforms and operating systems with little or no change to its source code. The language has become available on a very wide range of platforms, from embedded microcontrollers to supercomputers.

C's design is tied to its intended use as a portable systems implementation language. It provides simple, direct access to any addressable object (for example, memory-mapped device control registers), and its source-code expressions can be translated in a straightforward manner to primitive machine operations in the executable code. Like most imperative languages in the ALGOL tradition, C has facilities for structured programming and allows lexical variable scope and recursion, while a static type system prevents many unintended operations. In C, all executable code is contained within functions. Function parameters are always passed by value.

The initial development of C occurred at AT&T Bell Labs between 1969 and 1973, with most work being done in 1972. It was named "C" because its features were derived from an earlier language called "B", a stripped-down version of the BCPL programming language. The origin of C is closely tied to the development of the Unix operating system, originally implemented in assembly language. During the late 1970s and 1980s, versions of C were implemented for a wide variety of mainframe computers, minicomputers, and microcomputers, including the IBM PC, as its popularity began to increase significantly.

In 1983 the American National Standards Institute (ANSI) formed a committee to establish a standard specification of C. In 1989 the standard was ratified as ANSI X3.159-1989 "Programming Language C". This version of the language is often referred to as ANSI C, Standard C, or sometimes C89. C's primary use is for "system programming", including implementing operating systems and embedded system applications. One consequence of C's wide acceptance and efficiency is that compilers, libraries, and interpreters of other programming languages are often implemented in C. C is also efficient for numeric and scientific computing. C is sometimes used as an intermediate language by implementations of other languages. This approach may be used for portability or convenience; by using C as an intermediate language, it is not necessary to develop machine-specific code generators. Note, however, that the C language itself has no input/output instructions: input and output are performed by library functions, which must be implemented for each target machine. C has also been widely used to implement end-user applications, but as applications became larger, much of that development has shifted to other languages.


Ada is a structured, statically typed, imperative, wide-spectrum, and object-oriented high-level computer programming language, extended from Pascal and other languages. It was originally designed by a team at CII Honeywell Bull under contract to the United States Department of Defense (DoD) from 1977 to 1983 to supersede the hundreds of programming languages then used by the DoD. Ada is strongly typed and compilers are validated for reliability in mission-critical applications, such as avionics software. Ada is an international standard. Ada was named after Augusta Ada King, Countess of Lovelace (1815–1852), who is often credited as being the first computer programmer.

Ada was originally targeted at embedded and real-time systems. The Ada 95 revision, designed between 1992 and 1995, improved support for systems, numerical, financial, and object-oriented programming (OOP). Notable features of Ada include: strong typing, modularity mechanisms (packages), run-time checking, parallel processing (tasks), exception handling, and generics. Ada 95 added support for object-oriented programming. Ada supports run-time checks to protect against access to unallocated memory, buffer overflow errors, off-by-one errors, array access errors, and other detectable bugs. These checks can be disabled in the interest of runtime efficiency, although they can often be compiled into efficient code anyway. Ada also includes facilities to help program verification. For these reasons, Ada is widely used in critical systems, where any anomaly might lead to very serious consequences, e.g. accidental death or injury. Examples of systems where Ada is used include avionics, weapon systems (including thermonuclear weapons), and spacecraft.

Ada also supports a large number of compile-time checks to help avoid bugs that would not be detectable until run-time in some other languages, or that would require explicit checks to be added to the source code. Ada is designed to detect small problems in very large software systems. For example, Ada detects each misspelled variable (because every variable name must be declared before use), and Ada pinpoints unclosed if-statements, which must be closed with "END IF" rather than matching any arbitrary "END" token. Also, Ada can spot procedure calls with incorrect parameters, a common problem in large, complex software where most of the statements are procedure calls. The syntax of Ada is simple, consistent and readable. It minimizes the number of ways to perform basic operations, and prefers English keywords to symbols for basic logical operations, using only "+", "-", "*" and "/" for basic mathematical operations. Code blocks are delimited by words such as "declare", "begin" and "end". Conditional statements are closed with "end if", avoiding the dangling else that could pair with the wrong nested if-statement in other languages. The semicolon (";") is a statement terminator, and the null or no-operation statement is "null;". A single ";" without a statement to terminate is not allowed. This allows for better quality error messages.

In the 1970s the US Department of Defense (DoD) was concerned by the number of different programming languages being used for its embedded computer system projects, many of which were obsolete or hardware-dependent, and none of which supported safe modular programming. In 1975, the High Order Language Working Group (HOLWG) was formed with the intent to reduce this number by finding or creating a programming language generally suitable for the department's requirements. The working group created a series of language requirements documents—the Strawman, Woodenman, Tinman, Ironman and Steelman documents. Many existing languages were formally reviewed, but the team concluded in 1977 that no existing language met the specifications. The result was a proposal to design a new programming language, and four contractors were hired to develop their proposals under the names of Red (Intermetrics, led by Benjamin Brosgol), Green (CII Honeywell Bull, led by Jean Ichbiah), Blue (SofTech, led by John Goodenough), and Yellow (SRI International, led by Jay Spitzen). In April 1978, after public scrutiny, the Red and Green proposals passed to the next phase. In May 1979, the Green proposal, designed by Jean Ichbiah at CII Honeywell Bull, was chosen and given the name Ada—after Augusta Ada, Countess of Lovelace. The preliminary Ada reference manual was published in ACM SIGPLAN Notices in June 1979. The Military Standard reference manual was approved on 10 December 1980 (Ada Lovelace's birthday), and given the number MIL-STD-1815 in honor of the year of her birth.

Because Ada is a strongly typed language and has other safety-critical support features, it has found a niche outside the military in commercial aviation projects, where a software bug can cause fatalities. The fly-by-wire system software in the Boeing 777, the Canadian Automated Air Traffic Control, and the French TVM in-cab signalling system on the LGVs have all been written in Ada.

Ada is an ALGOL-like programming language featuring control structures with reserved words such as if, then, else, while, for, and so on. However, Ada also has many data structuring facilities and other abstractions which were not included in the original ALGOL 60, such as type definitions, records, pointers, and enumerations. Such constructs were in part inherited from, or inspired by, Pascal.


HyperText Markup Language (HTML) is the predominant markup language for web pages. It provides a means to create structured documents by denoting structural semantics for text such as headings, paragraphs, lists, links, quotes and other items. It allows images and objects to be embedded and can be used to create interactive forms. It is written in the form of HTML elements consisting of "tags" surrounded by angle brackets within the web page content. It can embed scripts in languages such as JavaScript which affect the behaviour of HTML webpages.

In 1980 the physicist Tim Berners-Lee, who was then a contractor at CERN, proposed and prototyped ENQUIRE, a system for CERN researchers to use and share documents. In 1989 Berners-Lee wrote a memo proposing an Internet-based hypertext system. He specified HTML and wrote the browser and server software in the last part of 1990. The first publicly available description of HTML was a document called HTML Tags, first mentioned on the Internet by Berners-Lee in late 1991.

HTML is a text and image formatting language used by web browsers to dynamically format web pages. In 1994 the IETF created an HTML Working Group, which in 1995 completed "HTML 2.0", the first HTML specification intended to be treated as a standard on which future implementations should be based. However, browser vendors, such as Microsoft and Netscape, chose to implement different subsets of HTML as well as to introduce their own extensions to it. These included extensions to control stylistic aspects of documents, contrary to the belief (of the academic engineering community) that such things as text colour, background texture, font size and font face were outside the scope of a language whose only intent was to specify how a document should be structured. From 1996 the HTML specifications were maintained, with input from commercial software vendors, by the World Wide Web Consortium (W3C). The last HTML specification published by the W3C was the HTML 4.01 Recommendation, published in late 1999. In 2000 HTML became an international standard (ISO/IEC 15445:2000). HTML 5 will remove most explicit style tags in favour of standardised style sheets.

In this page, which is itself written in HTML, the angle brackets (which actuate HTML features) have been replaced by curly brackets { } so that the text will be displayed:

HTML documents are composed entirely of HTML elements that, in their most general form have three components: a "start tag"; content between these tags which is to be rendered; and a matching "end tag":
i.e. {tag}content to be rendered{/tag}

An HTML element is everything between and including the tags. A tag is a keyword enclosed in angle brackets. The most general form of an HTML element is:
{tag attribute1="value1" attribute2="value2"}content to be rendered{/tag}
An unassigned attribute will in most cases be assigned a default value.

Links to other documents require the use of an anchor element,
e.g. {a href=""}Example{/a}
and an image may also be incorporated in the hyperlink,
e.g. {a href=""}{img src="exampleimage.jpg" alt="alternative text to be displayed if the image is unavailable" width="50" height="50"}{/a}.
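Putting these pieces together, a complete minimal page has the following skeleton (an illustrative sketch, shown with the same curly-bracket convention as the examples above so that the tags are displayed rather than interpreted; the title and heading text are invented for illustration):

```
{html}
{head}
{title}Example page{/title}
{/head}
{body}
{h1}A heading{/h1}
{p}A paragraph containing a {a href=""}link{/a}.{/p}
{/body}
{/html}
```

The head element holds information about the document (such as its title), while everything to be rendered in the browser window appears between the body tags.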


Linux refers to the family of Unix-like computer operating systems developed from 1991 that use the Linux kernel. Their development is one of the most prominent examples of free and open source software collaboration. Typically all the underlying source code can be used, freely modified, and redistributed, both commercially and non-commercially. Linux can be installed on a wide variety of computer hardware, ranging from mobile phones, tablet computers and video game consoles, to mainframes, servers and supercomputers. Typically Linux is packaged in a format known as a Linux distribution for desktop and server use. Linux distributions include the Linux kernel and the supporting software required to run a complete system, such as utilities and libraries, the X Window System, the GNOME and KDE desktop environments, and the Apache HTTP Server. Commonly used applications with desktop Linux systems include the Mozilla Firefox web browser and the office application suite.

The name "Linux" comes from the Linux kernel, originally written in 1991 by Linus Torvalds.


Java is a programming language originally developed by James Gosling at Sun Microsystems (now a subsidiary of Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java bytecode instructions are analogous to machine code, but are intended to be interpreted by a virtual machine (VM) written specifically for the host hardware. Java is general-purpose, concurrent, class-based, and object-oriented, and is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere". Java is considered by many to be one of the most influential programming languages of the 20th century, and is widely used, from application software to web applications.

James Gosling and Patrick Naughton initiated the Java language project in June 1991 for use in one of Sun's many set-top box projects. Java was originally designed for interactive television, but it was too advanced for the industry at the time. The language, initially called Oak after an oak tree that stood outside Gosling's office, also went by the name Green, and was later renamed Java, from a list of random words. Gosling aimed to implement a virtual machine and a language that had a familiar C/C++ style of notation. There were five primary goals in the creation of the Java language:
"simple, object oriented, and familiar"
"robust and secure"
"architecture neutral and portable"
it should execute with "high performance"
"interpreted, threaded, and dynamic".

The original and reference implementation Java compilers, virtual machines, and class libraries were developed by Sun as Java 1.0 in 1995. It promised to be a "Write Once, Run Anywhere" (WORA) system, providing no-cost run-times on popular platforms. Fairly secure by design, it allowed network and file access to be restricted. Major web browsers soon incorporated the ability to run Java applets within web pages, and Java quickly became popular. With the advent of Java 2 in 1998–1999, new versions had multiple configurations built for different types of platform.

Java remains a de facto standard, controlled through the Java Community Process. At one time, Sun made most of its Java implementations available without charge, despite their proprietary software status. In 2006 Sun released much of Java as open source software under the terms of the GNU General Public Licence (GPL). In 2007, in compliance with the specifications of the Java Community Process, Sun relicensed most of its Java technologies, making Java's core code available under free software/open-source distribution terms, except for a small portion of code to which Sun did not hold the copyright. Sun's vice-president Rich Green has said that Sun's ideal role with regard to Java is as an "evangelist". Sun Microsystems officially licenses the Java Standard Edition platform for Linux, Mac OS X, and Solaris. Although in the past Sun licensed Java to Microsoft, the licence has expired and has not been renewed. As a result, Microsoft no longer ships Java with Windows, and in recent versions of Windows, Internet Explorer cannot support Java applets without a third-party plugin. Sun, and others, have made available free Java run-time systems for those and other versions of Windows.

Java uses an automatic garbage collector to manage memory in the object lifecycle. The programmer determines when objects are created, and the Java runtime is responsible for recovering the memory once objects are no longer in use. Once no references to an object remain, the unreachable memory becomes eligible to be freed automatically by the garbage collector. One of the ideas behind Java's automatic memory management model is that programmers should be spared the burden of having to perform manual memory management (in some languages memory for the creation of objects is implicitly allocated on the stack, or explicitly allocated and deallocated from the heap; either way, the responsibility of managing memory resides with the programmer). In Java garbage collection may happen at any time. Ideally, it will occur when a program is idle. It is guaranteed to be triggered if there is insufficient free memory on the heap to allocate a new object, and this can cause a program to stall for a short time. Explicit memory management is not possible in Java.

Cloud Computing

Cloud Computing is a term meaning the obtaining on demand of shared resources (e.g. backing store, software, information and processing power) over the Internet from servers or computers, much as electricity is obtained from the electricity grid. Cloud computing is the latest step in the progression from mainframe, to client–server, to Internet computing. Details are abstracted from the users, who no longer need expertise in, or control over, the technology infrastructure "in the cloud" that supports them. Cloud computing is based on a consumption and delivery model for IT services based on the Internet, and it typically involves Internet provision of dynamically scalable virtual resources. The term "cloud" is used as a metaphor for the Internet, similar to the cloud drawing used in the past to represent the telephone network, and it is an abstraction of the underlying infrastructure it represents. Typical cloud computing providers deliver common business applications (e.g. Open Office) online that are accessed from another web service, or from software like a web browser, while the required software and data are stored on servers. The major cloud-only service providers include Amazon and Google.

The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a public utility". The modern characteristics of cloud computing are online elastic provision as a utility, and the illusion of infinite supply; this is similar in concept to the electricity industry and its use of private, community, public and government forms. The actual term "cloud" was derived from telecommunications companies, who until the 1990s primarily offered dedicated point-to-point telephone circuits. They began offering Virtual Private Network (VPN) services with a comparable quality of service but a much lower cost than point-to-point services. By switching traffic to balance utilization as they saw fit, they were able to utilise their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between the domain of the provider and the domain of the user. The concept was strengthened by the invention of Voice Over Internet Protocol (VOIP) and the use of packet switching over the internet to replace physical copper wires. Cloud computing extends this domain boundary to cover servers as well as the network infrastructure.

Amazon played a key role in the development of cloud computing by modernizing their data centres after the dot-com bubble. In 2007 Google, IBM, and a number of universities embarked on a large scale cloud computing research project. In general, Cloud computing customers do not own the physical infrastructure; instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Some cloud-computing services use a traditional utility services model, charging according to time of day and resources used, whereas others charge on a subscription flat fee model. Sharing computing power among multiple users can improve utilization rates, because servers are less often idle. A side-effect of this approach is that overall computer usage rises dramatically, as customers do not have to worry about peak load limits.

The Intercloud is an interconnected global "cloud of clouds", an extension of the Internet "network of networks" idea. The term was first used in the context of cloud computing in 2007 when Kevin Kelly opined that "eventually we'll have the intercloud, the cloud of clouds. This Intercloud will have the dimensions of one machine comprising all servers and attendant cloudbooks on the planet". The idea became popular in 2009. Servers do not have infinite physical resources, but the Intercloud scenario allows each cloud to use the computational and storage resources of the virtualization infrastructures of other clouds.

Software-as-a-Service (SaaS) applications are accessed simply by navigating to them with a web browser. Examples are Google Mail and Google Docs.

Platform-as-a-Service (PaaS) is a set of lower level services such as an operating system, computer language interpreter, or web server offered by a cloud provider, on which developers can build custom applications. Examples are Microsoft Windows Azure and Google App Engine.

Infrastructure-as-a-Service (IaaS) is the provision of servers or virtual servers for use on a pay-as-you-go basis. Examples are: Amazon's Elastic Compute Cloud (EC2) for virtual servers running Linux or Windows, and the Simple Storage Service (S3) for storing files in the cloud; Google's email, spreadsheet, word processor, presentation graphics, mapping services, payments, and custom applications written in Java; Microsoft's Windows email, blogging and online file storage, and the Windows Azure platform for custom applications, online file storage and database services; and's multi-tenanted customer relationship management (CRM) service, with custom applications written in its Apex programming language and integration with Twitter, Facebook and Java.

A Rich Internet Application (RIA) runs in a web browser and offers rich graphics, fast script engines, and plug-ins. Examples are Adobe Flash and advanced HTML applications.

Multi-tenancy is a mode of operation where cloud-hosted applications are shared on a single infrastructure by multiple customers, each having access only to their own data. This is cost-effective, since the software and storage are shared.'s CRM service is an example.

A private cloud is a cloud-like infrastructure in a private data centre; this gives the benefits of cloud computing without entrusting private data to a third party. A public cloud refers to providers such as Amazon, Google and A hybrid cloud uses both public and private services.

Virtualisation is a 1970s concept for emulating different computers on a single platform: ICL used this concept for their 2900 series of computers, which used firmware to emulate both English Electric System 4 and ICT 1900 series computers. Applied to cloud computing, one or more emulated computers can run simultaneously on a single physical computer. Virtual machines can even be moved between different company premises or cloud providers.

Is Cloud Computing really anything new? Or has it, as Sun has commented, been media over-hyping of "everything we have already been doing"?


Send all comments, updates and queries for The Staffordshire University Computing Futures Museum Computer Software Page to Dr John Wilcock

Version: 03 04 March 2012 updated by Dr John Wilcock