The Staffordshire University Computing Futures Museum Central Processing Unit Page
Computer Hardware items are listed in order of date.
The Jacquard loom has a chain of 8 x 26 punched cards that effectively forms a program, controlling the loom mechanism to create patterns on the woven cloth. It was constructed in 1801 by Joseph Marie Jacquard.
The 1822 Difference Engine, invented by Charles Babbage, used the method of finite differences to calculate the next term in a polynomial expression.
The 1823 Analytical Engine, invented by Charles Babbage, would have been the first computer if it had been constructed in its entirety. The machine was constructed out of columns of decimal toothed wheels, providing the functions of memory, CPU, ALU (using Pascal's and Leibniz's mechanisms), input (setting the positions of toothed wheels), and output (reading the settings of toothed wheels). Unfortunately no machine tool industry was in existence, so the wheels had to be made by hand, and the Government grant rapidly ran out; the project was shelved and was only shown to have been feasible in the 20th Century. Babbage's assistant Ada Augusta Lovelace, acknowledged to be the first computer programmer, probably understood the capabilities of the machine better than Babbage himself.
Boolean Algebra was invented in 1847 by George Boole, with the contribution of de Morgan's Laws from Augustus de Morgan. The algebra is useful in designing and minimising logic circuitry in computers (manual minimisation being practical for up to about six variables), whether implemented by thermionic valves, transistors or microchips.
A truth table is a mathematical table used in logic, specifically in connection with Boolean algebra, boolean functions, and propositional calculus, to express the output values of logical expressions for all possible combinations of their input values. In practice a truth table has one column for each input variable (e.g. p, q), and one column for the output values of the logical operation that the table is meant to represent (e.g. p v q). For n input variables there are 2^n rows in the truth table, each row containing one of the 2^n possible combinations of the truth values of the n input variables (for instance, p=true q=false), and the corresponding output value for that combination of input values. The values are often transferred to a Karnaugh Map for logic simplification.
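The 2^n-row structure described above can be sketched in Python (the function and variable names here are illustrative, not from any museum exhibit):

```python
from itertools import product

def truth_table(n_vars, fn):
    """Return rows of (inputs..., output) for all 2^n input combinations."""
    rows = []
    for values in product([False, True], repeat=n_vars):
        rows.append(values + (fn(*values),))
    return rows

# p v q: two variables give 2^2 = 4 rows
table = truth_table(2, lambda p, q: p or q)
for row in table:
    print(row)
```

The first two columns of each row hold the input values of p and q, and the last holds the output, exactly as in the tabular layout described above.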
The 1876 Thomson Integrator uses electromechanical discs to integrate functions in a Differential Analyser, as used by Vannevar Bush and Douglas Hartree.
Venn Diagrams were invented in 1880 by John Venn. They are useful in visualising logic problems of up to four variables, but do not constitute a proof; for larger problems Boolean Algebra is more effective.
1904 Typical diode construction
1906 Typical triode construction
1930 Typical triode with Bakelite base
1950 Typical computer vacuum tubes
A vacuum tube or thermionic valve was a device used to amplify, switch or create an electrical signal by thermionic emission and control of the movement of electrons in a low-pressure space. Hard tubes had as high a vacuum as possible, while soft tubes were filled with low-pressure gas.
The 19th century saw increasing research with evacuated tubes by scientists. These tubes were mostly for specialized scientific applications, and the only commercial success was the light bulb. The groundwork laid by these scientists and inventors, however, was critical to the development of vacuum tube technology. Though the thermionic emission effect was originally reported in 1873 by Frederick Guthrie, it is Thomas Edison's 1884 investigation of the Edison Effect that is most important. In 1904 the English physicist John Ambrose Fleming developed a device he called an "oscillation valve" (because it passed current in only one direction) for the Marconi company. Later known as the Fleming valve, or diode, it could be used as a rectifier of alternating current and as a radio wave detector. In 1906 Robert von Lieben patented a three-electrode amplifying vacuum tube with a beam-focusing electromagnet. In 1907 Lee De Forest placed a bent wire serving as a screen, later known as the grid (later control grid) electrode, between the filament (electron emitter or cathode) and the plate electrode (anode). As a small change in the voltage from negative to positive was applied to the grid, there was a large change in the number of electrons flowing from the cathode to the anode. Thus the grid was said to electrostatically "control" the plate current. The resulting three-electrode device was therefore an amplifier of voltages. The device is now known as the triode. These developments led to great improvements in telecommunications technology.
When triodes were first used in radio transmitters and receivers, it was found that they had a tendency to oscillate due to parasitic anode-to-grid capacitance. It was discovered that the addition of a second grid, located between the control grid and the plate and called a screen grid could solve these problems. A positive voltage slightly lower than the anode voltage was applied to it, and at high frequencies the screen grid was shorted to ground with a capacitor. This arrangement decoupled the anode and the control grid at high frequencies, completely eliminating the oscillation problem. This two-grid tube is called a tetrode, and it was common by 1926.
However, the tetrode has some new problems. In any tube, electrons strike the anode hard enough to create secondary electrons. In a triode these less energetic electrons cannot reach the grid or cathode, and are re-captured by the anode. But in a tetrode, they can be captured by the screen grid, reducing the plate current and the amplification of the circuit. Since secondary electrons can outnumber the primary electrons, particularly when the anode voltage dips below the screen voltage, the plate current can actually decrease with increasing anode voltage. Another consequence of this effect is that under severe overload, the current collected by the screen grid can cause it to overheat and melt, destroying the tube. Again, the solution was to add another grid, called a suppressor grid. This third grid was set either at ground or at cathode voltage and its negative voltage relative to the anode electrostatically suppressed the secondary electrons by repelling them back toward the anode. This three-grid tube is called a pentode, and it was common from 1928.
Early vacuum tubes had a glass envelope and a Bakelite base. The 1938 miniature vacuum tube made tubes smaller by eliminating the base and fusing the external contact pins into the glass base of the envelope. Making tubes smaller reduced the voltage that they could work at, and also the power of the filament, so the older and larger Bakelite-based style continued to be used for high power rectifiers, valve amplifier output stages and radio and radar transmitting tubes.
Colossus and its successor Colossus Mk2 were built at Bletchley Park during World War II to substantially speed up the task of breaking the German high-level Lorenz encryption. Based on 1500 vacuum tubes, Colossus replaced the Heath Robinson (an earlier machine based on relay and switch logic, named after the cartoonist of improbable contraptions). Colossus was able to break in a matter of hours messages that had previously taken several weeks. Colossus Mk2 used a total of around 2000 vacuum tubes. Colossus was the first ever use of vacuum tubes on such a large scale for a single machine, and the reliability of the tubes became an important consideration. The Colossus computer's designer, Dr Tommy Flowers, had a theory that most of the unreliability was caused during power-down and, more importantly, power-up operations. Thus once a Colossus machine was built and installed it was left switched on, running from dual-redundant diesel generators (wartime mains supply being considered too unreliable) and consuming 15 kW of power, 24 hours a day, 365 days a year, nearly all of it for running the tube heaters. The prototype Colossus was only switched off for the addition of 500 tubes and conversion to Colossus Mk2. Another nine Colossus Mk2s were built, and all 10 machines ran with a surprising degree of reliability.
Vacuum tubes continued to be developed throughout World War II and well into the 1960s. To make radios more rugged, aircraft and army radios began to integrate the tube envelopes into the radio's cast aluminium or zinc chassis. The radio then had a printed circuit for the non-tube components soldered to the chassis that contained all the tubes. Also in 1944 rugged metal vacuum tubes were mounted in anti-aircraft shells, creating a proximity fuse which made anti-aircraft shells 6 times more effective. They were widely used in 1950s military and aviation electronics.
Vacuum tubes were critical to the development of electronic technology, which drove the expansion and commercialization of radio broadcasting, television, radar, sound reproduction, large telephone networks, analog and digital computers, and industrial process control. For most purposes, the vacuum tube has been replaced by solid-state devices such as transistors and solid-state diodes, which last much longer, are smaller, more efficient, more reliable, and cheaper. However, tubes are still used in specialized applications: for engineering reasons, as in high-power radio frequency transmitters; or for their aesthetic appeal and distinct sound signature, as in audio amplification. Cathode ray tubes can be considered to be large vacuum tubes. A specialized form of the vacuum tube, the magnetron, is the source of microwave energy for radar installations, and was later used in domestic microwave ovens, where magnetrons continue to produce microwave heating of hundreds of watts. The klystron, a powerful narrow-band radio-frequency amplifier, is used as a high-power UHF television transmitter. It is also an important fact that vacuum tubes are less susceptible than solid-state components to the electromagnetic pulse effect of nuclear explosions, and this property has kept them in use for military applications long after transistors replaced them in conventional electronics. Vacuum tubes are also still used for very high-powered applications such as industrial radio-frequency heating, generating large amounts of RF energy for particle accelerators, and power amplification for broadcasting. Musicians also claim that vacuum tubes produce a sound quality that is superior to that produced by transistors.
Bush Differential Analyser
Thomson Differential Analyser as constructed at MIT in 1927 by Vannevar Bush
Hartree Differential Analyser
The 1934 Hartree Differential Analyser at Manchester University. This analog computer was a copy of the 1876 Thomson Differential Analyser constructed at MIT in 1927 by Vannevar Bush. Differential Analysers are analog computers which use positions or voltages to represent variables, and which use such components as the 1876 Thomson Integrator.
Finite State Machine
The ideas about the Finite State Machine (FSM) were developed from 1935. A FSM is a model of the behaviour of a machine composed of a finite number of states, transitions between those states, and the relevant inputs and outputs. The behaviour is expressed in a State Table and State Diagram. The technique was developed by Alan Turing (1936), George H. Mealy (1955), Edward F. Moore, E.J. McCluskey (1965), Marvin Minsky (1967), Z. Kohavi (1978) and Claude Shannon.
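The state-table idea can be sketched as a small program. Below is a minimal Moore-style machine in Python, using the classic turnstile as a hypothetical example (the states and inputs are invented for illustration):

```python
# Minimal Moore-style finite state machine: a finite set of states,
# a transition table keyed on (state, input), and an output per state.
class FSM:
    def __init__(self, start, transitions, outputs):
        self.state = start
        self.transitions = transitions  # (state, input) -> next state
        self.outputs = outputs          # state -> output

    def step(self, symbol):
        self.state = self.transitions[(self.state, symbol)]
        return self.outputs[self.state]

turnstile = FSM(
    start="locked",
    transitions={
        ("locked", "coin"): "unlocked",
        ("locked", "push"): "locked",
        ("unlocked", "push"): "locked",
        ("unlocked", "coin"): "unlocked",
    },
    outputs={"locked": "barred", "unlocked": "open"},
)
print(turnstile.step("coin"))  # -> open
```

The `transitions` dictionary is a direct transcription of a State Table; drawing the same information as nodes and arrows gives the State Diagram.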
The theoretical Turing Machine was proposed by Alan Turing in 1936. This was later extended into the concept of the Universal Turing Machine.
Turing Machine Emulator, constructed as a student project in 1999.
Electronic Developments 1940 - 1970
Development of switching devices from 1940 (top left to right):
1940s Electromechanical Relay
More recent relay
1940s Pentode Thermionic Vacuum Tube in metal outer casing
1950s Miniature Pentode
(bottom left to right):
50p coin for scale
1940s Thermionic Diode (see right also)
Development of diodes from 1950 to 1970:
(Bottom) 1940s Thermionic Diode (see left also)
(Top left) 1950s Germanium Semiconductor Diode
(Top right) 1970s Silicon Semiconductor Diode
Development of transistors from 1951 to 1970:
(Top, left to right:) 1957 Germanium
(Bottom left:) 1953 Germanium
(Bottom right:) For comparison, three 1970s Silicon Integrated Circuits; the visible silicon wafer in its plastic mount on the right is 5mm x 5mm (700 transistors and 1000 resistors)
Animated image of a Dekatron in operation
The 1951 Harwell Dekatron Computer.
Left to right: storage rack, ALU rack, and CPU (two racks of relays)
The Harwell Dekatron Computer at Wolverhampton as WITCH
2010 The Harwell Dekatron Computer being restored at Bletchley Park.
From the left there are two storage racks, the ALU rack, the CPU (two racks of relays with curved covers), and the power supply.
The Dekatron (also called three-phase gas counting tube, glow-transfer counting tube or cold cathode tube) is a gas-filled (hydrogen, neon or argon) decade or octal counting tube. Dekatrons were used in computers, calculators and other counting-related devices from the 1940s to the 1970s. The base-10 dekatron was useful for computing, calculating and frequency-dividing purposes because one complete revolution of the neon dot in a dekatron meant 10 pulses on the guide electrode(s), and a signal could be derived from one of the ten cathodes in a dekatron to send a pulse, possibly for another counting stage. Dekatrons generally have a maximum input frequency in the high kHz range, with 1 MHz being around the maximum possible.
Internal designs vary by the model and manufacturer, but generally a dekatron has ten cathodes arranged in a circle, guide electrodes between each pair of cathodes, and a common anode. When the guide electrode is pulsed, the neon gas will activate near the guide pins then "jump" to the next cathode.
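The counting behaviour described above — ten steps per revolution, with a carry pulse on wrap-around feeding the next stage — can be sketched in Python (an illustrative model, not a circuit simulation):

```python
class DecadeCounter:
    """One dekatron stage: the glow steps through cathodes 0-9;
    stepping past cathode 9 emits a carry pulse for the next stage."""
    def __init__(self):
        self.cathode = 0

    def pulse(self):
        self.cathode = (self.cathode + 1) % 10
        return self.cathode == 0  # carry on wrap-around

# Two cascaded stages count 0-99, just as chained dekatrons did
units, tens = DecadeCounter(), DecadeCounter()
for _ in range(37):
    if units.pulse():
        tens.pulse()
print(tens.cathode, units.cathode)  # -> 3 7
```

Each complete revolution of the units stage delivers exactly one pulse to the tens stage, which is how a chain of dekatrons stored a multi-digit decimal number.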
Karnaugh Maps The Karnaugh map (k-map) was invented in 1952 by Edward W. Veitch and developed further in 1953 by Maurice Karnaugh, a telecommunications engineer at Bell Labs. It is a method to simplify Boolean Algebra expressions. The Karnaugh map reduces the need for extensive calculations by taking advantage of the human pattern-recognition capability, permitting the rapid simplification of logic circuitry. The boolean variables are transferred from a truth table and ordered according to the principles of Gray code, in which only one variable changes between adjacent squares. Once the outputs are transcribed into the map, the data can be combined into the largest possible groups containing 2^n cells (n = 0, 1, 2, 3, ...) and the minimised expression is generated through the laws of Boolean Algebra.
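The Gray code ordering used for the map's rows and columns — adjacent entries differing in exactly one bit, so adjacent squares differ in one variable — can be generated with the standard binary-reflected formula; a minimal Python sketch:

```python
def gray_code(n):
    """n-bit binary-reflected Gray code: entry i is i XOR (i >> 1),
    so consecutive entries differ in exactly one bit."""
    return [i ^ (i >> 1) for i in range(2 ** n)]

# Two-variable ordering for a Karnaugh map edge
codes = gray_code(2)
print([format(c, "02b") for c in codes])  # -> ['00', '01', '11', '10']
```

Note the contrast with plain binary counting ('00', '01', '10', '11'), where the step from '01' to '10' changes two variables at once and would break the map's adjacency property.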
Microprogram Control Store The microprogram control store is part of the CPU control unit that stores microsteps. In the design of the CPU it was found that economies could be made by breaking down instructions (order codes) into their constituent microsteps; a particular microstep could occur in several different instructions, so instead of storing it several times within those instructions it could be stored once only, provided that the different "next step" routes could be selected for the different instructions. The microprogram control store holds the microstep actions, and also uses the order code to select the "next step" destinations.
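The shared-microstep idea can be sketched as a lookup table: each microstep is stored once and names its possible successors, with the order code selecting which "next step" route is taken. The microstep names, register names and order codes below are invented for illustration:

```python
# A hypothetical control store: each microstep appears once; the order
# code (ADD or SUB here) selects the "next step" destination.
control_store = {
    "fetch_operand": {"action": "read memory into MBR",
                      "next": {"ADD": "alu_add", "SUB": "alu_sub"}},
    "alu_add":      {"action": "ACC <- ACC + MBR", "next": {}},
    "alu_sub":      {"action": "ACC <- ACC - MBR", "next": {}},
}

def run(order_code, start="fetch_operand"):
    trace, step = [], start
    while step:
        micro = control_store[step]
        trace.append(micro["action"])
        step = micro["next"].get(order_code)  # order code picks the route
    return trace

print(run("ADD"))
```

Both ADD and SUB share the single stored copy of the operand-fetch microstep, which is exactly the economy described above.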
Transistors A transistor is a semiconductor device used to amplify and switch electronic signals. It has replaced the triode vacuum tube. It is made of a solid piece of semiconductor material, with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals changes the current flowing through another pair of terminals. Because the controlled (output) power can be much more than the controlling (input) power, the transistor provides amplification of a signal. Some transistors are packaged individually but many more are found embedded in integrated circuits.
Modules Typical plug-in modules:
Integrated Circuit An integrated circuit (also known as IC, microcircuit, microchip, silicon chip, or chip) is a miniaturized electronic circuit (consisting mainly of semiconductor devices, e.g. transistors, diodes, resistors, and capacitors) that has been manufactured in the surface of a thin substrate of semiconductor material. Integrated circuits are used in almost all electronic equipment in use today and have revolutionized the world of electronics.
Initial Input Initial Input was a primitive form of bootstrap used in 1960s second generation machines, whereby throwing a switch on the paper tape reader (specially guarded by a sleeve to prevent accidental operation) caused a binary program to be loaded from the punched paper tape into the fixed initial program area of memory and then executed. This bootstrap program then caused more detailed programs to be loaded into other areas of memory. An example was the KDF9 Christmas program which read further ASCII characters from the input paper tape, and then output to a paper tape punch a binary pattern representing legible letters on an 8 x 8 dot matrix, e.g. "A MERRY CHRISTMAS AND A HAPPY NEW YEAR TO ALL AT RCAT SALFORD", which was then put up on the wall to entertain the engineers.
Instruction Pipeline An instruction pipeline is a technique used in the CPU to increase instruction throughput (the number of instructions that can be executed in a unit of time). The fundamental idea is to split the processing of a computer instruction into a series of independent microsteps, with storage at the end of each microstep. This allows the computer's control circuitry to issue instructions at the processing rate of the slowest microstep, which is a much faster rate than the time needed to complete an instruction. The term pipeline refers to the fact that each microstep is carrying (processing) data at once (like water), and each microstep is connected to the next (like the sections of a pipe).
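The claim that a pipeline issues instructions at the rate of its slowest microstep can be checked with simple arithmetic (the stage times below are hypothetical):

```python
# Hypothetical stage times in nanoseconds for four microsteps.
# Unpipelined, each instruction takes one full pass through all stages;
# pipelined, a new instruction can be issued every slowest-stage period.
stage_times = [2, 3, 5, 2]

unpipelined_period = sum(stage_times)  # ns per instruction, no pipeline
pipelined_period = max(stage_times)    # ns per instruction, pipelined
speedup = unpipelined_period / pipelined_period

print(unpipelined_period, pipelined_period, round(speedup, 2))  # -> 12 5 2.4
```

The speedup falls short of the stage count (4x here) precisely because the slowest microstep, not the average, sets the issue rate — which is why designers try to balance stage lengths.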
Acorn RISC Machine Reduced Instruction Set Computer (1984 Acorn ARM chip, managed by Chris Curry)
Send all comments, updates and queries for The Staffordshire University Computing Futures Museum CPU Page to Dr John Wilcock
Hydrogen dekatrons require high voltages ranging from 400 to 600 volts on the anode for proper operation; dekatrons with an inert gas usually require about 350 volts. When a dekatron is first powered up, a glowing dot will appear at a random cathode; the tube must be reset to zero state by driving a negative pulse into the designated starting cathode. Dekatrons fell out of practical use when transistor-based counters became reliable and affordable.
The Harwell Dekatron Computer, an early dekatron and relay-based computer, was originally built and used at the Atomic Energy Research Establishment at Harwell, Oxfordshire. The first name for the computer was the "Harwell Computer", but it was changed to the "Harwell Dekatron Computer" when a transistorised computer was purchased. Construction started in 1949, and the Dekatron Computer was operational in April 1951. It was found to be very reliable and performed calculations required for the design of the first UK atomic bomb. At the end of its life at Harwell it was retired for the first time in 1957; the Oxford Mathematical Institute ran a competition to award it to the college that could produce the best case for its future use. This competition was won by the Wolverhampton and Staffordshire College of Technology (which later became Wolverhampton University) where it was used to teach computing until 1973; it was redesignated the Wolverhampton Instrument for Teaching Computing from Harwell (WITCH). Retired a second time, the WITCH computer was donated to the Birmingham Museum of Science and Industry in 1973 and was displayed there for many years, until being disassembled and moved into storage at the Birmingham City Council Museums Collection Centre. In 2008 the Computer Conservation Society began a project to record its history; remarkably, it was found that all the pieces were still identifiable, so that a reconstruction was a realistic possibility. In September 2009 the machine was loaned to The National Museum of Computing at Bletchley Park, where it is to be restored to working order.
The machine used dekatrons for volatile memory (similar to RAM in a modern computer) and paper tape for both input and program storage. Output was to either a Friden teleprinter or to a paper tape punch. The machine used decimal numbers and initially had 20 8-digit dekatron registers for internal storage, later increased to 40, which appeared to be enough for nearly all calculations. It was assembled from components more commonly found in a British telephone exchange. Although it could on occasions act as a true stored-program computer, this was not its normal mode of operation, since instructions were usually read one at a time from paper tape, even for repetitive subroutines (which used a loop of paper tape). Frequent use of the tapes, using paper tape readers which sensed holes by metal plungers, eventually caused extra holes to appear, so modifying the instructions or data; the damaged tapes then had to be replaced with new tapes, or special linen tapes were sometimes used. The multiplication time was between 5 and 10 seconds. However, it was very reliable, and could be left running for periods of days or even weeks, a remarkable achievement for a machine using thermionic tubes.
1953 Wilkes Microprogram Control Store
1971 EPROM Control Store chip
A control store is implemented as a diode-array of read-only memory. One of the first engineers to demonstrate this technique was Maurice Wilkes at the University of Cambridge. The original IBM System/360 had a read-only control store, but later System/370 and successor models loaded their microprograms from floppy disks into a writeable control store consisting of ultra-high-speed random-access read-write memory.
Typical discrete transistors developed from 1954
Early ideas for the transistor were patented by physicist Julius Edgar Lilienfeld in Canada in 1925, and Oskar Heil in Germany in 1934. In 1947 John Bardeen and Walter Brattain at AT&T's Bell Labs in the United States observed that when electrical contacts were applied to a crystal of germanium, the output power was larger than the input. AT&T's Solid State Physics Group leader William Shockley saw the potential in this, and over the next few months worked to greatly expand the knowledge of semiconductors, to such an extent that he could be described as the "father of the transistor". Germanium was, however, quickly replaced by silicon as the preferred semiconductor material, and the first silicon transistor was produced by Texas Instruments in 1954.
Transistors are still produced individually packaged, mostly for power applications, and are referred to as discrete transistors. However, the vast majority of transistors now produced are in integrated circuits (ICs), also referred to as microchips or simply chips, which contain complete electronic circuits made up of transistors, diodes, resistors, and capacitors. A logic gate consists of up to about twenty transistors, whereas a 2009 advanced microprocessor used as many as 2.3 billion transistors.
The key advantages that have allowed transistors to replace their vacuum tube predecessors in most applications are: small size and weight; suitability for automated manufacturing processes, resulting in low per-unit cost; low operating voltages, making transistors suitable for small, battery-powered applications; no need for warm-up period for cathode heaters; low power dissipation; high reliability and greater physical ruggedness; insensitivity to mechanical shock and vibration; and long life.
However, transistors do have disadvantages: because they operate at low voltages, they are less suitable for power applications than high-voltage vacuum tubes operating at up to tens of thousands of volts, e.g. cathode ray tubes; high-power, high-frequency operation, e.g. television broadcasting, is better achieved by vacuum tubes; and, most importantly, transistors are much more sensitive than vacuum tubes to an electromagnetic pulse, such as that generated by an atmospheric nuclear explosion.
(Top)1955 DEUCE, using vacuum tubes
(Bottom) 1956 Ferranti Pegasus
1958 Integrated Circuit Microchip
1965 RAM chip
The first integrated circuits contained only a few transistors. Called Small-Scale Integration (SSI), digital circuits containing transistors numbering in the tens provided a few logic gates. SSI circuits and early aerospace projects were interdependent. Both the Minuteman missile and the Apollo programme needed lightweight digital computers for their inertial guidance systems: the Apollo guidance computer led to and motivated SSI integrated technology, while the need for thousands of Minuteman missiles forced SSI into mass-production. These programmes purchased almost all the available integrated circuits between 1960 and 1963. SSI began to appear in consumer products by 1970, a typical application being FM inter-carrier sound processing in television receivers.
The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called Medium-Scale Integration (MSI). These chips were attractive economically because while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, and they required less assembly work.
Further development, driven by the same economic factors, led to Large-Scale Integration (LSI) in the mid 1970s, with tens of thousands of transistors per chip. Integrated circuits such as 1Kbit RAMs, calculator chips, and the first microprocessors began to be manufactured in the early 1970s. Initially they had under 4000 transistors, but true LSI circuits with up to 10,000 transistors began to be produced around 1974 for computer main memories and second-generation microprocessors.
The final step in the development process, starting in the 1980s and continuing up to the present, was Very Large Scale Integration (VLSI). The development started with hundreds of thousands of transistors in the early 1980s, and by 2009 it had reached several billion transistors per chip. The term Ultra Large Scale Integration (ULSI) has been coined for chips of complexity of more than 1 million transistors.
Moore's law describes a long-term trend in the history of computing hardware, in which the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years. The capabilities of many digital electronic devices are strongly linked to Moore's law: Cost per transistor; Computing performance (processing speed) per unit cost; Power consumption; Hard disk storage cost per unit of information; RAM storage capacity; Number of pixels in digital cameras; and Network capacity. The trend has continued for more than half a century and is not expected to stop until 2015 or later. This empirical law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper which noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965. He predicted that the trend would continue "for at least ten years". His prediction has proved to be uncannily accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development. However, this fact supports an alternative view that the "law" unfolds as a self-fulfilling prophecy, where the goal set by the prediction charts the course for realized capability, or "which comes first, the chicken or the egg?"
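The doubling arithmetic behind the law is easy to check. A small Python sketch, taking the roughly 2,300 transistors of a 1971-era microprocessor as a starting point (an approximate, illustrative figure) and assuming a clean two-year doubling period:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count forward assuming doubling every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# 20 years = 10 doublings = a factor of 1024
print(round(transistors(1991)))  # -> 2355200
```

Because the count multiplies by a fixed factor per period, plotting it against time on a logarithmic scale gives the straight line mentioned in the Kurzweil discussion below.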
Raymond Kurzweil's Fifth Paradigm (Moore's Law)
Raymond Kurzweil believes that the exponential improvement described by Moore's law implies a future technological singularity, i.e. a period where progress in technology would occur almost instantly. Kurzweil also believes that by 2019 the current strategy of ever-finer photolithography will have run its course, i.e. tracks will have reached the limit of one molecule wide, but he speculates that this does not mean the end of Moore's law: "Moore's law of Integrated Circuits was not the first, but the Fifth Paradigm to forecast accelerating price-performance ratios. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to the relay-based "Heath Robinson" machine that cracked the Nazi built Lorenz cipher, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer." Kurzweil speculates that it is likely that some new type of technology (possibly optical or quantum computers) will replace current integrated-circuit technology, and that Moore's Law will hold true long after 2020 with this new technology. This Law of Accelerating Returns described by Kurzweil has in many ways altered the public's perception of Moore's Law. It is a common (but mistaken) belief that Moore's Law makes predictions regarding all forms of technology, when it actually concerns only semiconductor circuits. The trend is in fact nothing more than an exponential progression, which may be plotted as a straight line on a logarithmic scale.
There must be a limit to this technological escalation, which will arise from the physical limit of the speed of electricity (speed of light) and/or the practical problems of making reliable printed circuit tracks only one molecule wide. But, as Kurzweil predicts, another technology may arise.
See also: Storage Development
Early PCs had a basic input/output system (BIOS) implemented in firmware, designed to be the first code run by a PC when powered on. The initial function of the BIOS is to identify, test, and initialize system devices. The BIOS sets the machine hardware into a known state. This process is known as bootstrapping. BIOS programs are stored on a chip and are built to work with various devices that make up the complementary chipset of the system. They provide a small library of basic input/output functions to operate and control the peripherals such as the keyboard, text display, hard disk, etc.
1961 Generic 4-stage Pipeline
Maurice Wilkes used a pipeline at the University of Cambridge. The 1961 IBM Stretch Project proposed the terms "Fetch, Decode, and Execute", which became common usage. Supercomputers also needed the pipeline to achieve faster and faster processing in the late 1970s. One of the early supercomputers was the Cyber series built by Control Data Corporation. Its main architect was Seymour Cray, who later resigned from CDC to form Cray Research. Cray developed the X-MP line of supercomputers, using pipelining for both the multiplication and addition/subtraction functions. Later, Star Technologies took pipelining to another level by adding parallel pipelines. In 1984 Star Technologies made another breakthrough with the pipelined division circuit. Most of these circuits can now be found embedded within modern microprocessors.
The generic pipeline is broken into four stages with a set of flipflops between each stage:
1. Instruction fetch
2. Instruction decode and register fetch
3. Execute
4. Register write back
However, when a programmer writes assembly code, the assumption is made that each instruction has been fully executed before execution of the subsequent instruction is begun; but this assumption is invalid for pipelines. In particular, a conditional jump instruction does not know its destination until the result of the dependent condition is known, and this may depend on completion of the previous instruction. Since programs often operate in loops of instructions, the pipeline assumes that there will be a return to the beginning of the loop, and assembles that destination into the pipeline. When this causes a program to behave incorrectly, the situation is known as a hazard: the pipeline must be emptied, and restarted at the correct place in the program.
A non-pipelined architecture is inefficient because some CPU components (modules) are idle while another module is active during the instruction cycle. Pipelining does not completely eliminate idle time in a CPU, but making the modules work in parallel improves program execution time significantly.
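The four stages above can be visualised with a minimal sketch (no particular CPU is modelled; the stage abbreviations IF/ID/EX/WB are the conventional shorthand for the four stages listed): each cycle, every in-flight instruction advances one stage, so once the pipeline is full, one instruction completes per cycle.

```python
# Sketch of the generic four-stage pipeline: print which stage each
# instruction occupies in each clock cycle. Instruction i enters the
# pipeline at cycle i, one cycle behind its predecessor.

STAGES = ["IF", "ID", "EX", "WB"]  # fetch, decode, execute, write back

def timeline(n_instructions):
    """One line per instruction, showing its stage in each cycle."""
    n_cycles = n_instructions + len(STAGES) - 1  # fill time + drain time
    lines = []
    for i in range(n_instructions):
        cells = []
        for cycle in range(n_cycles):
            s = cycle - i  # stage index this instruction is in, if any
            cells.append(STAGES[s] if 0 <= s < len(STAGES) else "..")
        lines.append(" ".join(cells))
    return lines

for line in timeline(4):
    print(line)
# IF ID EX WB .. .. ..
# .. IF ID EX WB .. ..
# .. .. IF ID EX WB ..
# .. .. .. IF ID EX WB
```

Reading down any column shows all four modules busy at once in the middle cycles, which is exactly the idle time that a non-pipelined design wastes.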
03 05 December 2010 updated by Dr John Wilcock
The Karnaugh map (K-map) was invented in 1952 by Edward W. Veitch and developed further in 1953 by Maurice Karnaugh, a telecommunications engineer at Bell Labs. It is a method for simplifying Boolean Algebra expressions. The Karnaugh map reduces the need for extensive calculations by taking advantage of the human pattern-recognition capability, permitting the rapid simplification of logic circuitry. The boolean variables are transferred from a truth table and ordered according to the principles of Gray code, in which only one variable changes between adjacent squares. Once the outputs are transcribed into the map, the cells can be combined into the largest possible groups of 2^n cells (n = 0, 1, 2, 3, ...) and a minimal expression is generated through the laws of Boolean Algebra.
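A small sketch of the Gray-code layout step, for a three-variable function (the variable names a, b, c and the example function are chosen for illustration): the truth table is folded into a 2 x 4 grid whose adjacent cells differ in exactly one variable, so groups of 1s can be spotted by eye.

```python
# Lay out a 3-variable truth table as a 2 x 4 Karnaugh map.
# Rows are indexed by a; columns by (b, c) in Gray-code order, so
# neighbouring cells differ in exactly one input variable.

def gray(n):
    """Gray-code sequence of n-bit values, e.g. 00, 01, 11, 10 for n=2."""
    return [i ^ (i >> 1) for i in range(2 ** n)]

def kmap(f):
    """Return the K-map grid for a boolean function f(a, b, c)."""
    cols = gray(2)  # (b, c) column order: 00, 01, 11, 10
    grid = []
    for a in (0, 1):
        row = []
        for bc in cols:
            b, c = bc >> 1, bc & 1
            row.append(int(f(a, b, c)))
        grid.append(row)
    return grid

# Example: f = a AND b (true whenever a=1 and b=1, regardless of c)
grid = kmap(lambda a, b, c: a and b)
print(grid)  # [[0, 0, 0, 0], [0, 0, 1, 1]]
```

The two 1s land side by side, forming one group of 2^1 cells that reads off directly as the product term a.b, with c eliminated.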
Microprogram Control Store
The microprogram control store is part of the CPU control unit that stores microsteps. In the design of the CPU it was found that economies could be made by breaking down instructions (order codes) into their constituent microsteps; a particular microstep could occur in several different instructions, so instead of storing it several times within those instructions it could be stored once only, provided that the different "next step" routes could be selected for the different instructions. The microprogram control store holds the microstep actions, and also uses the order code to select the "next step" destinations.
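The sharing scheme described above can be sketched as a small table (the microstep names, actions, and the two order codes here are invented for illustration, not taken from any real control store): each entry is stored once, and the order code selects which microstep follows it.

```python
# Sketch of a microprogram control store. Each entry holds one
# micro-operation plus a "next step" table keyed by the order code
# (opcode), so a shared microstep such as "fetch" is stored only once.

CONTROL_STORE = {
    "fetch":     {"action": "load instruction", "next": {"ADD": "read_regs",
                                                         "JMP": "set_pc"}},
    "read_regs": {"action": "read operands",    "next": {"ADD": "alu_add"}},
    "alu_add":   {"action": "add in ALU",       "next": {"ADD": "fetch"}},
    "set_pc":    {"action": "load jump target", "next": {"JMP": "fetch"}},
}

def run_instruction(opcode, start="fetch"):
    """Follow the microstep chain for one instruction, back to fetch."""
    step, trace = start, []
    while True:
        entry = CONTROL_STORE[step]
        trace.append(entry["action"])
        step = entry["next"][opcode]
        if step == "fetch":
            return trace

print(run_instruction("ADD"))  # ['load instruction', 'read operands', 'add in ALU']
print(run_instruction("JMP"))  # ['load instruction', 'load jump target']
```

Both instructions route through the single stored "fetch" microstep, which is the economy the entry describes.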
A transistor is a semiconductor device used to amplify and switch electronic signals. It has replaced the triode vacuum tube. It is made of a solid piece of semiconductor material, with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals changes the current flowing through another pair of terminals. Because the controlled (output) power can be much more than the controlling (input) power, the transistor provides amplification of a signal. Some transistors are packaged individually but many more are found embedded in integrated circuits.
An integrated circuit (also known as IC, microcircuit, microchip, silicon chip, or chip) is a miniaturized electronic circuit (consisting mainly of semiconductor devices, e.g. transistors, diodes, resistors, and capacitors) that has been manufactured on the surface of a thin substrate of semiconductor material. Integrated circuits are used in almost all electronic equipment in use today and have revolutionized the world of electronics.
Initial Input was a primitive form of bootstrap used in 1960s second generation machines, whereby throwing a switch on the paper tape reader (specially guarded by a sleeve to prevent accidental operation) caused a binary program to be loaded from the punched paper tape into the fixed initial program area of memory and then executed. This bootstrap program then caused more detailed programs to be loaded into other areas of memory. An example was the KDF9 Christmas program, which read further ASCII characters from the input paper tape and then output to a paper tape punch a binary pattern representing legible letters on an 8 x 8 dot matrix, e.g. "A MERRY CHRISTMAS AND A HAPPY NEW YEAR TO ALL AT RCAT SALFORD", which was then put up on the wall to entertain the engineers.
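A sketch in the spirit of that Christmas program, rendering text as legible letters on an 8 x 8 dot matrix (the two-letter "font" here is invented for illustration; the original KDF9 patterns are not preserved):

```python
# Render text as 8 x 8 dot-matrix letters, one byte per character row,
# one bit per dot column: '#' marks a punched dot, '.' a blank.

FONT = {  # hypothetical 8-row bit patterns, top row first
    "H": [0x81, 0x81, 0x81, 0xFF, 0xFF, 0x81, 0x81, 0x81],
    "I": [0x7E, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x7E],
}

def banner(text):
    """Return the text as 8 rows of '#' and '.' dots."""
    rows = []
    for r in range(8):
        line = ""
        for ch in text:
            bits = FONT[ch][r]
            line += "".join("#" if bits & (0x80 >> c) else "."
                            for c in range(8))
        rows.append(line)
    return rows

for row in banner("HI"):
    print(row)
```

On the real machine each row would have been punched across the paper tape; printing to the screen here stands in for the tape punch.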
An instruction pipeline is a technique used in the CPU to increase instruction throughput (the number of instructions that can be executed in a unit of time). The fundamental idea is to split the processing of a computer instruction into a series of independent microsteps, with storage at the end of each microstep. This allows the computer's control circuitry to issue instructions at the processing rate of the slowest microstep, which is much faster than issuing one instruction per complete instruction time. The term pipeline refers to the fact that each microstep is carrying (processing) data at the same time (like water), and each microstep is connected to the next (like the sections of a pipe).
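The throughput gain can be put in rough figures (the stage count and instruction count below are illustrative, not measurements of any real CPU): a k-stage pipeline needs k cycles to fill, then completes one instruction per cycle, so its speedup approaches k for long instruction streams.

```python
# Back-of-envelope cycle counts for pipelined vs non-pipelined
# execution, assuming every stage takes one clock cycle.

def cycles(n_instructions, n_stages, pipelined=True):
    """Clock cycles to finish n instructions."""
    if pipelined:
        return n_stages + n_instructions - 1  # fill once, then 1/cycle
    return n_stages * n_instructions          # each runs start to finish

n, k = 1000, 4
print(cycles(n, k))                      # 1003
print(cycles(n, k, pipelined=False))     # 4000
print(round(cycles(n, k, pipelined=False) / cycles(n, k), 2))  # 3.99
```

With 1000 instructions through 4 stages the speedup is already 3.99, essentially the stage count; hazards that flush the pipeline eat into this figure.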
Acorn RISC Machine
Reduced Instruction Set Computer (1984 Acorn ARM chip, managed by Chris Curry)
Send all comments, updates and queries for The Staffordshire University Computing Futures Museum CPU Page to Dr John Wilcock