The Staffordshire University Computing Futures Museum Input Page


Computer Hardware items are listed in order of date.

For magnetic drum, magnetic tape and optical disk see the Memory Page


1994 Microsoft Natural Keyboard

2000 Genius Keyboard

Standard keyboard layout

Foldable keyboard
QWERTY is the most widely used modern keyboard layout on English-language computer and typewriter keyboards. It takes its name from the first six letters at the left of the keyboard's top row of letters. The QWERTY design is based on a layout devised by Christopher Latham Sholes in 1874 for the Sholes and Glidden typewriter and sold to Remington in the same year, when it first appeared in typewriters. It was designed to minimize typebar clashes. The layout remains in use on electronic keyboards (where typebar clash is no longer an issue) because of commercial inertia, with millions of typists trained to touch-type using the standard layout, and because alternatives have failed to demonstrate significant advantages.
The first computer terminals, such as the Teletype, were typewriters that could produce and be controlled by various computer codes. These used QWERTY layouts and added keys such as escape (Esc) which had special meanings to computers. Later keyboards added function keys and arrow keys. Since the standardization of PC-compatible computers and Windows after the 1980s, most full-sized computer keyboards have followed this standard. This layout has a separate numeric keypad for data entry at the right, 12 function keys across the top, and a navigation section between the main block and the keypad with keys for Insert, Delete, Home, End, Page Up and Page Down, and cursor arrows in an inverted-T arrangement.
Different computer operating systems provide support for input of different languages such as Chinese, Hebrew or Arabic. QWERTY is designed for English, a language without diacritical marks. QWERTY keyboards therefore need a method of typing accents, and many different systems have evolved. Some languages need advanced software for keyboard input: Chinese has as many as 60,000 ideographic symbols, and Japanese uses four writing systems simultaneously. The QWERTZ layout is widely used in Germany and much of Central Europe. Its main difference from QWERTY is that Y and Z are swapped, and most special characters such as brackets are replaced by diacritical characters. Workers at Bletchley Park had to learn this layout in order to transcribe the German messages they were deciphering. The AZERTY layout is used in France, Belgium and some neighbouring countries. It differs from QWERTY in that A and Q are swapped, Z and W are swapped, and M is moved from the right of N to the right of L.
Several alternatives to QWERTY have been developed over the years, claimed by their designers and users to be more efficient, intuitive and ergonomic. Nevertheless, none has seen widespread adoption, due partly to the sheer dominance of available keyboards and training.
In computing, a keyboard is an input device, partially modelled on the typewriter keyboard, which uses an arrangement of buttons or keys acting as mechanical levers or electronic switches. A keyboard typically has characters engraved or printed on the keys, and each press of a key typically corresponds to a single written symbol. However, producing some symbols requires pressing and holding several keys simultaneously or in sequence. While most keyboard keys produce letters, numbers or signs (characters), other keys or simultaneous key presses can produce actions or computer commands.
Keyboards on laptops and notebook computers usually have a shorter travel distance for the keystroke and a reduced set of keys. The size of a standard keyboard is dictated by the practical consideration that the keys must be large enough to be easily pressed by fingers. To reduce the size of the keyboard, the numeric keypad to the right of the alphabetic section can be removed, or the size of the keys can be reduced, which makes it harder to enter text. Foldable (also called flexible) keyboards are made of soft plastic or silicone and can be rolled or folded on themselves for travel. Projection keyboards project a virtual keyboard (an image of keys) onto a flat surface. The device then uses a camera or infrared sensor to "watch" where fingers move; the keys cannot be felt when pressed, since they are just projected images.
The Escape key (often abbreviated Esc) is used to initiate an escape sequence, most often meaning Stop. Certain key combinations are special: for example, Ctrl-Alt-Delete invokes system software that can terminate processes. The modern PC keyboard contains more than just switches: it also includes a control processor and indicator lights to give the user feedback about the keyboard's state. When a key is pressed, it "bounces" against its contacts several times before settling into firm contact, and bounces again on release before reverting to the open state. To resolve this problem, the processor in the keyboard (or computer) "debounces" the keystrokes.
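A software debounce of the kind described above can be sketched in a few lines; the fixed-rate sampling scheme and the stability threshold are illustrative assumptions, not taken from any real keyboard controller.

```python
def debounce(samples, stable_count=3):
    """Collapse a bouncy stream of raw switch samples (1 = contact closed,
    0 = open), taken at a fixed rate, into clean key events. A new state is
    accepted only after stable_count identical consecutive samples; the
    threshold is illustrative, not from a real keyboard controller."""
    events = []
    last_emitted = None
    run_value, run_length = None, 0
    for s in samples:
        if s == run_value:
            run_length += 1
        else:
            run_value, run_length = s, 1   # state changed: restart the count
        if run_length >= stable_count and run_value != last_emitted:
            events.append(run_value)       # stable long enough: accept it
            last_emitted = run_value
    return events

# A press with contact bounce on both edges yields just two clean transitions:
print(debounce([0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0]))  # → [0, 1, 0]
```

Real controllers do the same thing against a timer rather than a sample count, but the principle is identical.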
There are several ways of connecting a keyboard by cable, including the AT connector found on early motherboards, its smaller PS/2 successor, and the USB connection that eventually replaced both. Wireless keyboards instead use a combined transmitter and receiver unit, such as a Bluetooth dongle.

Phelps Electro Motor Printing Telegraph

George May Phelps (1820–1888) was a 19th-century American inventor of automated telegraphy equipment. He combined the designs of several existing printers into this 1880 Electro Motor Printing Telegraph, which became the dominant apparatus for automated reception and transmission of telegraph messages. Note the keyboard for input; output is by printing received messages.

Punched Cards Input

Hollerith Punched Card as used in the 1890 USA Census by Herman Hollerith.
Hollerith's company went on to become a constituent of the
Computing-Tabulating-Recording Company (CTR) in 1911,
which in 1924 was renamed the International Business Machines Corporation (IBM).

1964 punched card using the EBCDIC code

Annual Sales of BTMC punched cards 1948 - 1965

1969 Article showing that punched cards were then still in use

1972 Punched Cards

1973 Punched Cards

1974 Card Gauge


1983 Vic 20 Joystick

1984 Amstrad Joystick

1984 Binatone Pong Joystick
A joystick is an input device consisting of a stick that pivots on a base and reports its angle or direction to the device it is controlling. Joysticks are often used to control video games, and usually have one or more push-buttons whose state can also be read by the computer.
The name "joystick" is thought to originate with early 20th century French pilot Robert Esnault-Pelterie. Joysticks were originally controls for an aircraft's ailerons and elevators. The first electrical 2-axis joystick was probably invented around 1944 in Germany. The device was developed for targeting the glide bomb Henschel Hs 293 against ship targets; the joystick was used by an operator to steer the missile towards its target. The invention was also used at Peenemünde to steer the Wasserfall missile, a variant of the V-2 rocket. In the 1960s the use of joysticks became widespread in radio-controlled airplane modelling systems such as the Kwik Fly produced by Phil Kraft (1964). Kraft Systems eventually became an important supplier of joysticks to the computer industry. Ralph H. Baer, inventor of television video games, created the first video game joysticks in 1967. The Atari standard joystick, developed for the Atari 2600, was a digital joystick with a single 'fire' button. Joysticks were commonly used as controllers in first and second generation game consoles, but then gave way to the familiar game pad with the Nintendo Entertainment System and Sega Master System in 1985 and 1986.
Most joysticks are two-dimensional, having two axes of movement (similar to a mouse), but one- and three-dimensional joysticks do exist. A joystick is generally configured so that moving the stick left or right signals movement along the X axis, and moving it forward (up) or back (down) signals movement along the Y axis. In joysticks configured for three-dimensional movement, twisting the stick left (counter-clockwise) or right (clockwise) signals movement along the Z axis. These three axes (X, Y and Z) correspond, in an aircraft, to roll, pitch and yaw. An analog joystick has continuous states, i.e. it returns an angle measure of the movement in any direction in the plane or the space (usually using potentiometers), while a digital joystick gives only on/off signals for four different directions and the mechanically possible combinations (such as up-right, down-left, etc.). Digital joysticks were very common as game controllers for the video game consoles, arcade machines and home computers of the 1980s. Additionally, joysticks often have one or more fire buttons, used to trigger some kind of action; these are simple on/off switches. Some joysticks have haptic feedback capability. These are thus active devices, not just input devices: the computer can return a signal to the joystick that causes it to resist movement with a returning force or makes the joystick vibrate. Many I/O interface cards for PCs had a joystick (game control) port; modern joysticks mostly use a USB interface for connection to the PC.
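The analog/digital distinction above can be illustrated with a short sketch. The 10-bit ADC range, the axis mapping and the dead-zone value are hypothetical choices for the example, not taken from any particular joystick.

```python
def analog_axes(adc_x, adc_y, adc_max=1023):
    """Map raw potentiometer ADC readings (0..adc_max, here a hypothetical
    10-bit converter) onto the continuous -1.0..+1.0 range of each axis."""
    to_unit = lambda v: (v / adc_max) * 2.0 - 1.0
    return to_unit(adc_x), to_unit(adc_y)

def digital_state(x, y, dead_zone=0.5):
    """Reduce analog axis values to the four on/off signals of a digital
    joystick; combinations such as up-right simply set two flags at once."""
    return {
        "left":  x < -dead_zone,
        "right": x >  dead_zone,
        "up":    y >  dead_zone,
        "down":  y < -dead_zone,
    }

# Stick pushed fully right and fully back (down):
x, y = analog_axes(1023, 0)          # (1.0, -1.0)
state = digital_state(x, y)          # right and down are True, others False
```

A digital joystick skips the analog stage entirely and wires four switches straight to the equivalent of `digital_state`.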

Optical Character Recognition

The 1976 Kurzweil Reading Machine

Optical Character Recognition (OCR) is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. OCR is a field of research in pattern recognition, artificial intelligence and computer vision.
In 1929 Gustav Tauschek obtained a patent on OCR in Germany. Tauschek's machine was a mechanical device that used templates and a photodetector. In 1950 David H. Shepard, a cryptanalyst at the Armed Forces Security Agency in the United States, built a machine to convert printed messages into machine language. Shepard then founded the Intelligent Machines Research Corporation (IMR), which went on to deliver the world's first OCR systems used in commercial operation. The United States Postal Service used OCR machines to sort mail from 1965, based on technology devised primarily by Jacob Rabinow. The first use of OCR in Europe was by the British General Post Office (GPO). In 1965 it began planning an entire banking system, the National Giro, using OCR technology, a process that revolutionized bill payment systems in the UK.
In 1974 Ray Kurzweil started the company Kurzweil Computer Products, Inc. and led development of the first omni-font optical character recognition system, a computer program capable of recognizing text printed in any normal font. He decided that the best application of this technology would be a reading machine for the blind, which would allow blind people to have text read to them by computer. This device required the invention of two enabling technologies: the flatbed scanner and the text-to-speech synthesizer. On 13 January 1976 the successful finished product was unveiled at a widely reported news conference headed by Kurzweil and the leaders of the National Federation of the Blind. In 1978 Kurzweil Computer Products began selling a commercial version of the optical character recognition computer program. Two years later Kurzweil Computer Products became a subsidiary of Xerox; it was later known as ScanSoft and is now part of Nuance Communications.
Early OCR systems required calibration to read a specific font. OCR software uses analytical artificial intelligence techniques that consider sequences of characters rather than whole words or phrases, making "best guesses" at characters using database look-up tables. Typical software is marketed by ABBYY and OmniPage. The accurate recognition of Latin-based typewritten or printed text is now considered largely a solved problem where clear imaging is available, with accuracy rates exceeding 99%; total accuracy can only be achieved by human review. Other areas, including recognition of cursive handwriting and of printed text in scripts other than Latin (especially those with very large numbers of characters), are still the subject of active research.
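The "best guess" look-up idea can be shown with a toy template matcher. The 3×5 bitmap font and the pixel-difference measure below are invented purely for illustration and bear no relation to commercial OCR engines.

```python
# Toy "best guess" character recogniser: each glyph is a 3x5 bitmap
# flattened to a 15-character string ('#' = ink, '.' = blank). The font
# and the distance measure are invented for this example.
TEMPLATES = {
    "I": "###" + ".#." + ".#." + ".#." + "###",
    "L": "#.." + "#.." + "#.." + "#.." + "###",
    "T": "###" + ".#." + ".#." + ".#." + ".#.",
}

def recognise(bitmap):
    """Return the template character with the fewest differing pixels."""
    def distance(a, b):
        return sum(p != q for p, q in zip(a, b))
    return min(TEMPLATES, key=lambda ch: distance(TEMPLATES[ch], bitmap))

# An 'L' with one corrupted pixel is still recognised as 'L':
noisy = "#.." + "#.." + "#.." + "#.." + "##."
```

Real engines work on feature vectors rather than raw pixels, but the principle of picking the closest stored pattern is the same.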

Paper Tape Input

1943 5-Channel Paper Tape

1956 Data Preparation using 5-channel paper tape

1961 LEO III paper tape decks

1965 KDF9 Paper Tape Reader

1970 Blank reel of 8-Channel Paper Tape

1970 8-channel Punched Paper Tape


Typical Touchscreen, showing finger directly reacting to displayed graphics

2005 TOMTOM Go 910 Touchscreen

A touchscreen is an electronic visual display that can detect the presence and location of a touch within the display area. The term generally refers to touching the display of the device with a finger or hand. Touchscreens can also sense other passive objects, such as a stylus. The touchscreen has two main attributes: it enables direct interaction with what is displayed, rather than using indirect action with a cursor controlled by a mouse or touchpad; and also no hand-held intermediate device is required. Such displays can be attached to computers, and they are very common in the design of digital appliances such as the personal digital assistant (PDA), satellite navigation devices (e.g. TOMTOM), mobile phones, and video games.
The first touchscreens were described in the 1960s. The 1983 HP-150 did not have a touchscreen in the strict sense; instead, it had a Cathode Ray Tube (CRT) surrounded by infrared transmitters and receivers, which detected the position of any non-transparent object on the screen.
The position of the finger on the touchscreen can be determined by a number of technologies:
A resistive touchscreen panel has two thin, metallic, electrically conductive layers separated by a narrow gap. When an object such as a finger presses down on a point on the panel's outer surface, the two layers become connected at that point, which is registered as a touch event. The controller then drives a voltage gradient alternately across each layer and reads the voltage at the contact point to determine the X and Y coordinates.
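A minimal sketch of the readout stage, assuming an idealised 4-wire panel with a perfectly linear voltage gradient; the reference voltage and the 320×240 screen size are example values.

```python
def touch_position(v_x, v_y, v_ref=4.0, width=320, height=240):
    """Convert the two voltages read from an idealised 4-wire resistive
    panel into pixel coordinates. v_x is sampled on one layer while a
    0..v_ref gradient is driven across the other, and vice versa; the
    reference voltage and screen size here are illustrative."""
    x = (v_x / v_ref) * width
    y = (v_y / v_ref) * height
    return x, y

# A touch a quarter of the way across and halfway down the screen:
touch_position(1.0, 2.0)  # → (80.0, 120.0)
```

Real panels are not perfectly linear, so controllers apply a calibration step on top of this mapping.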
Surface acoustic wave (SAW) technology uses ultrasonic waves that pass over the touchscreen panel. When the panel is touched, a portion of the wave is absorbed, and this change in the ultrasonic waves registers the position of the touch event.
A capacitive touchscreen panel consists of an insulator such as glass, coated with a transparent conductor such as indium tin oxide. As the human body is also a conductor, touching the surface of the screen results in a distortion of the body's electrostatic field, measurable as a change in capacitance. The sensor's controller can determine the location of the touch indirectly from the change in the capacitance as measured from the four corners of the panel.
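The four-corner measurement can be sketched as a simple linear weighting. A real controller applies calibration and a more involved electrical model, so this is only an illustration of the idea.

```python
def surface_cap_position(i_tl, i_tr, i_bl, i_br):
    """Estimate the touch position (0..1 along each axis, origin at the
    top-left) from the currents measured at the four corners of a
    surface-capacitive panel. A touch nearer a corner draws proportionally
    more current through it; this linear weighting stands in for the
    calibrated model a real controller would use."""
    total = i_tl + i_tr + i_bl + i_br
    x = (i_tr + i_br) / total   # share of current through the right corners
    y = (i_bl + i_br) / total   # share of current through the bottom corners
    return x, y

# Equal corner currents place the touch at the centre of the panel:
surface_cap_position(1.0, 1.0, 1.0, 1.0)  # → (0.5, 0.5)
```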
Projected Capacitive Touch (PCT) technology uses an X-Y grid of electrodes in a single layer, or two separate, perpendicular layers of conductive material with parallel lines or tracks to form the grid (like the pixel grid of LCD displays).
An infrared touchscreen uses an array of X-Y infrared LED and photodetector pairs around the edges of the screen to detect a disruption in the pattern of LED beams. A major benefit of such a system is that it can detect essentially any input, including a finger, gloved finger, stylus or pen.
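Since the interrupted beams give the coordinates directly, the position logic is simple; the beam-index representation here is an assumption made for illustration.

```python
def ir_touch(broken_x_beams, broken_y_beams):
    """Locate a touch on an idealised infrared grid. The inputs are the
    index lists of the interrupted beams on each axis; the centre of each
    run of broken beams is taken as the touch coordinate."""
    if not broken_x_beams or not broken_y_beams:
        return None   # nothing is blocking the grid
    x = sum(broken_x_beams) / len(broken_x_beams)
    y = sum(broken_y_beams) / len(broken_y_beams)
    return x, y

# A fingertip wide enough to break three X beams and two Y beams:
ir_touch([4, 5, 6], [10, 11])  # → (5.0, 10.5)
```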
In the strain gauge configuration, dating from the 1960s and also called force panel technology, the screen is spring-mounted at the four corners and strain gauges are used to determine deflection when the screen is touched. This technology can also measure Z-axis movement and the force of the touch.
Optical sensing is a relatively modern development in touchscreen technology, in which two or more image sensors are placed around the edges (usually the corners) of the screen. Infrared backlights are placed in the cameras' field of view on the other side of the screen. A touch casts a shadow, and the views from a pair of cameras can then be triangulated to locate the touch.
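The triangulation step can be sketched with idealised geometry, assuming two cameras at the top corners that each report the angle at which they see the touch shadow.

```python
import math

def triangulate(angle_left, angle_right, width=1.0):
    """Locate a touch from two corner cameras on an idealised screen of the
    given width. Each camera reports the angle (radians, measured down from
    the top edge) at which it sees the shadow; the touch lies where the two
    sight lines intersect."""
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    # Left ray:  y = x * tl            (from corner (0, 0))
    # Right ray: y = (width - x) * tr  (from corner (width, 0))
    x = width * tr / (tl + tr)
    return x, x * tl

# Both cameras seeing the shadow at 45 degrees puts the touch mid-screen:
triangulate(math.pi / 4, math.pi / 4)
```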

Tracker Ball

The world's first tracker ball, invented by Tom Cranston, Fred Longstaff and Kenyon Taylor working on the Royal Canadian Navy's DATAR project in 1952. It used a standard Canadian bowling ball.

The 2006 Kensington Expert Mouse trackball. It can use a standard American billiards/snooker ball.

A 1995 Apple Pippin games paddle with tracker ball

A tracker ball (also called a trackball) is a pointing device consisting of a ball held by a socket containing sensors to detect a rotation of the ball about two axes: it is like an upside-down mouse with an exposed protruding ball. The user rolls the ball with the thumb, fingers, or palm of the hand to move a cursor on the screen. Large tracker balls are common on CAD workstations for easy precision. Before the advent of the touchpad small trackballs were common on portable or laptop computers, where there was often no desk space on which to run a mouse. The tracker ball was invented by Tom Cranston and Fred Longstaff as part of the Royal Canadian Navy's DATAR system in 1952, eleven years before the mouse was invented. This first tracker ball used a Canadian bowling ball.

While early mice used a mechanical design with slotted "chopper" wheels which interrupted a beam of light to measure rotation, tracker balls had the advantage of being in contact with the user's hand, in a cleaner environment than the desk or mousepad, less likely to drag dirt and lint into the internal mechanism. However, the late 1990s replacement of rubber mouseballs by direct optical tracking put tracker balls at a disadvantage, restricting them to niche markets where their high-precision positioning remained more important, e.g. CAD/CAM, air traffic control, marine and military applications. Most tracker balls now have direct optical tracking which follows dots on the ball.

As with modern mice, most tracker balls now have an auxiliary device primarily intended for scrolling. Some have a scroll wheel like mice, but the most common type is a “scroll ring” which is spun around the ball. Kensington's SlimBlade Trackball tracks the ball itself in three dimensions for scrolling. Three major companies Logitech, A4Tech, and Kensington currently produce tracker balls. Microsoft was a major producer, but it has since discontinued all its tracker ball products.

Large tracker balls are sometimes seen on computerized special-purpose workstations, such as the radar consoles in an air-traffic control room or sonar equipment on a ship or submarine. Modern installations of such equipment may use mice instead. However, military mobile anti-aircraft radars and submarine sonars tend to continue using tracker balls, since they can be made more durable and better suited to fast emergency use. Trackballs are provided as the pointing device in some public internet access terminals, vending machines and telephones. Unlike a mouse, a trackball can be built into a console, and cannot be ripped away or easily vandalised. This simplicity and ruggedness also makes trackballs ideal for use with industrial computers in dirty environments.

Because trackballs for personal computers are stationary, they require less space for operation than a mouse, and may simplify use in confined or cluttered areas such as a small desk or a rack-mounted terminal. They are generally preferred in laboratory settings for the same reason. They are also useful to people with a mobility impairment, and they need not always be used on a flat surface: they can be used wirelessly from an armchair or bed. They are also usable with either hand. Some disabled or elderly people have difficulty holding a mouse stationary while double-clicking; a tracker ball allows them to let go of the cursor-manipulating ball while using the buttons.

Speech Recognition

Speech recognition or "voice recognition" converts spoken words to text. Accuracy can often be improved by training the software for a particular speaker, i.e. by speaking the same word several times, with human correction of any mistakes made by the software. The first speech recognizer appeared in 1952 and consisted of a device for the recognition of single spoken digits. Success has been limited, usually to small, constrained vocabularies. The user speaks into a microphone and the computer creates a text file of the words spoken, with obvious advantages for word processing and for disabled users. Although the accuracy of these systems has improved in recent years (to around 95%), they are still far from perfect, and systems that can recognise any word spoken by any person in any language remain some years away.
IBM developed a prototype speech recognition product named VoiceType. In 1997 its successor, ViaVoice, was first introduced to the general public. In 1999 IBM released a free-of-charge version of ViaVoice. In 2003 IBM gave ScanSoft, which owned the competing product Dragon NaturallySpeaking, exclusive global distribution rights to ViaVoice Desktop products for Windows and Mac OS X. Two years later Nuance merged with ScanSoft, and the software is now distributed by Nuance.


2010 Epson flatbed scanner

2010 typical slide mount for high-density flatbed scanner

A scanner is a device that optically scans images, printed text, handwriting, or an object, and converts it into a digital image. Common examples are the flatbed scanner where the document is placed on a glass window for scanning, high-density slide and film scanners, hand-held scanners, 3D scanners, rotary scanners used for high-speed document scanning and transmission, and mechanically driven scanners that move the document rather than a scanning head.
The first image scanner was the 1957 US National Bureau of Standards drum scanner. It produced a 176 × 176 pixel black-and-white image.
Modern scanners typically use a charge-coupled device (CCD), whereas older drum scanners used a photomultiplier tube as the image sensor. A flatbed scanner is usually composed of a glass pane (or platen), under which there is a bright light (often xenon or cold cathode fluorescent) which illuminates the pane, and a moving optical array of CCD scanning elements. CCD-type scanners typically contain three rows (arrays) of sensors with red, green and blue filters. In a modified arrangement with back illumination, "slide" positive or negative film can be scanned at high density. Uncut film strips of up to six frames, or four mounted slides, are inserted in a carrier, which is moved by a stepper motor across a lens and CCD sensor inside the scanner. Hand-held document scanners are manual devices that are dragged across the surface of the image to be scanned. Scanning documents in this manner requires a steady hand, since an uneven scanning rate produces distorted images. Hand-held 3D scanners rely on the placement of reference markers, typically adhesive reflective tabs, which the scanner uses to align elements and mark positions in space.
The size of the file created increases with the square of the resolution. A resolution must be chosen that is within the capabilities of the equipment, preserves sufficient detail, and does not produce a file of excessive size. The file size can be reduced for a given resolution by using "lossy" compression methods such as JPEG, at some cost in quality. Reduced-quality files of smaller size can be produced from such an image when required (e.g. a much smaller file to be displayed as part of a fast-loading web page). Images are usually stored on a hard disk. Pictures are normally stored in image formats such as uncompressed BMP, "non-lossy" compressed TIFF, and "lossy" compressed JPEG. Documents are best stored in TIFF or PDF format, since JPEG is particularly unsuitable for text.
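The square-law growth in file size can be checked with a little arithmetic; the page size, resolutions and 24-bit colour depth below are example values.

```python
def raw_file_size(width_in, height_in, dpi, bytes_per_pixel=3):
    """Uncompressed size in bytes of a scan of a width_in x height_in page:
    the pixel count, and hence the file size, grows with the square of the
    resolution. 24-bit colour (3 bytes per pixel) is assumed."""
    return int(width_in * dpi) * int(height_in * dpi) * bytes_per_pixel

# Doubling the resolution of an 8 x 10 inch scan quadruples the file size:
size_300 = raw_file_size(8, 10, 300)   # 21,600,000 bytes
size_600 = raw_file_size(8, 10, 600)   # 86,400,000 bytes, i.e. 4x as many
```

Lossy compression such as JPEG then reduces these raw sizes, at some cost in quality.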

Graphics Tablets

1957 Styalator graphics tablet

1964 RAND Tablet

1980 Summagraphics BitPad graphics tablet

A graphics tablet is a computer input device that allows hand-drawn images and graphics to be input. It may also be used to capture handwritten signatures, or to trace an image from a piece of paper laid on the surface. Capturing data in this way is called digitizing.
The first electronic handwriting tablet was the Telautograph, patented by Elisha Gray in 1888. The first graphics tablet resembling modern tablets and used for handwriting recognition by a computer was the Styalator in 1957. The RAND Tablet (also called the Grafacon, for Graphic Converter), often mis-stated to be the first digitizer tablet, was introduced in 1964. The RAND Tablet employed a grid of wires under the surface of the pad that encoded horizontal and vertical coordinates in a magnetic signal. The stylus received the magnetic signal, which was then decoded back into coordinate information by the computer. In the 1970s and early 1980s the ID (Intelligent Digitizer) and BitPad magnetostrictive tablets manufactured by the Summagraphics Corporation became popular. These digitizers were used as the input device for many high-end CAD (Computer Aided Design) systems, were used on the Apple II, and were also bundled with PC-based CAD software such as AutoCAD.
Technologies used by tablets have been electromagnetic, magnetostrictive, optical, acoustic or capacitive. Compared with a touchscreen, a graphics tablet generally offers much higher precision, the ability to track an object which is not touching the tablet, and much more information about the stylus, but is typically more expensive and can only be used with its special stylus or other accessories. The stylus may be a pen-like object, but may also look like a mouse with crosshairs and buttons (referred to as a "puck").


1986 Sun SPARC Workstation

A workstation is a high-end microcomputer designed for technical or scientific applications. Intended primarily to be used by one person at a time, workstations are commonly connected to a local area network and run multi-user operating systems. The term workstation has also been used to refer to a mainframe computer terminal or a PC connected to a network.
Early examples of workstations were generally dedicated minicomputers; a system designed to support a number of users would instead be reserved exclusively for one person. A notable example was the PDP-8 from Digital Equipment Corporation, regarded to be the first commercial minicomputer. In the early 1980s, with the advent of 32-bit microprocessors such as the Motorola 68000, Sun Microsystems created Unix-based workstations based on this processor. As RISC microprocessors became available in the mid-1980s, these were adopted by many workstation vendors. Workstations tended to be very expensive, typically several times the cost of a standard PC and sometimes costing as much as a new car, or even a house. The high expense comes from costlier components that run faster than those found in cheaper computers, high-speed networking and sophisticated graphics.


Printable ASCII Characters

The full ASCII in binary and hexadecimal

The American Standard Code for Information Interchange (ASCII) is an encoding of characters based on the English alphabet. The development of ASCII started in 1960 and the first edition of the standard was published in 1963. A major revision followed in 1967 and the latest update was in 1986. ASCII defines codes for 128 characters, 95 of which are printable and 33 of which are non-printing control characters. The American National Standards Institute (ANSI) based ASCII on the earlier Cluff-Foster-Idelson teleprinter encoding system, created around 1956 by Ivan Idelson at Ferranti in the UK. His work established the coding of characters on 7-track paper tape for a British Standards Institution (BSI) committee.
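The printable range (codes 32 to 126) and the character/code correspondence are easy to verify programmatically; the `describe` helper is just an illustration.

```python
# The 95 printable ASCII characters are codes 32 (space) to 126 ('~');
# the 33 control characters are codes 0 to 31 plus 127 (DEL).
printable = [chr(c) for c in range(32, 127)]
assert len(printable) == 95

def describe(ch):
    """Show a character alongside its decimal, hex and 7-bit binary code."""
    c = ord(ch)
    return f"{ch!r}: dec {c}, hex {c:02X}, bin {c:07b}"

print(describe('A'))  # → 'A': dec 65, hex 41, bin 1000001
```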


Although the mouse was first invented in 1963, Douglas Engelbart's patent for his human-computer interaction device, a wooden shell with two metal wheels described as an "X-Y position indicator for a display system", was not granted until 1970. Engelbart later revealed that it was nicknamed the "mouse" because of the tail (cable) that hung out of the end.

1981 Mouse

1987 3-button Mouse

1989 Microsoft Mouse

1992 Logitech RS232 Mouse

1993 Chinese Mouse

2006 Dell Optical Mouse


1965 PDP-8 binary data input switches

1975 Altair 8800 binary data input switches

Computer programs were historically input manually to the central processor's internal memory via switches. An address was set first, and then the contents were set and entered with a key. An instruction was represented by a configuration of on (1) / off (0) binary settings. After all the instructions had been input in this manner, the binary start address was set and an execute button was pressed. In general there was no protection for the program area of memory, i.e. it was possible for the processor to modify its own instructions during a program run.
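The deposit procedure described above can be mimicked in a few lines. The auto-incrementing behaviour is modelled on the Altair 8800's DEPOSIT NEXT key, and the memory dictionary and sample instruction words are invented for the example.

```python
def deposit_program(memory, start_address, words):
    """Mimic front-panel program entry: the address switches are set once,
    then for each instruction word the data switches are set and DEPOSIT
    pressed, storing the word and stepping to the next address (the
    auto-increment is modelled on the Altair 8800's DEPOSIT NEXT key).
    memory is a plain dict standing in for core store."""
    address = start_address
    for word in words:
        memory[address] = word   # the switch pattern becomes the stored word
        address += 1
    return address               # next free location

# Enter three invented 8-bit instruction words at octal address 010:
memory = {}
deposit_program(memory, 0o010, [0b10110100, 0b00001111, 0b01110110])
```

Note that nothing stops a later word from overwriting an earlier one, mirroring the lack of memory protection described above.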

X.25 Packet Switched Network

The X.25 Packet Switched Network was developed by GEC Computers and others from 1973 onwards. These techniques were vital to the development of the Internet.

Graphical User Interfaces

1984 Apple GUI

1984 Apple Hypercard GUI

1990 Microsoft GUI

2010 iPad GUI

A graphical user interface (GUI) is a type of user interface that allows the user to interact with programs by means other than typed keyboard commands. The usual components of a GUI are some form of pointing device, windows, graphical icons, pull-down menus, buttons, check boxes, scroll bars, etc. These aspects are summarized by the acronym WIMP, which stands for Windows, Icons, Menus and Pointing device.
GUIs were pioneered from 1963 by researchers at the Stanford Research Institute, led by Douglas Engelbart. They developed the use of text-based hyperlinks manipulated with a mouse, and the concept of hyperlinks was further refined and extended to graphics by researchers at Xerox PARC. Inspired by PARC, the first GUI-oriented computer was the 1981 Xerox 8010 Star Information System, followed by the 1984 Macintosh 128K, the 1985 Atari ST and the 1985 Commodore Amiga. Apple was consistently earlier than Microsoft in adopting and developing GUIs.
The most familiar GUIs are those of Mac OS X and Microsoft Windows. Apple, IBM, Linux and Microsoft all used Xerox's ideas to develop products. For smaller mobile devices such as PDAs and smartphones, the WIMP elements are modified to take into account constraints of space and available input devices; these newer interaction techniques are collectively known as post-WIMP user interfaces. Typically, the user interacts with information by manipulating visual widgets with interactions appropriate to the kind of data they hold. The visible graphical interface features of an application are sometimes referred to as "chrome"; this includes the ability to select the visual appearance ("skin") of the device at will. Three-dimensional graphics are used for GUIs in computer games, social networking sites, and computer-aided design (CAD).


Wi-Fi is a trademark of the Wi-Fi Alliance of companies that manufacture certified products that belong to a class of wireless local area network (WLAN) devices. The Wi-Fi Alliance, a global association of companies, promotes WLAN technology and certifies products if they conform to certain standards of interoperability. An IEEE 802.11 device is installed in many personal computers, video game consoles, smartphones, printers, and other peripherals, and virtually all laptop or palm-sized computers.
Wi-Fi technology has its origins in a 1985 ruling by the U.S. Federal Communications Commission that released several bands of the radio spectrum for unlicensed use. Europe leads overall in uptake of wireless-phone technology, but the US leads in Wi-Fi systems partly because they lead in laptop usage. A Wi-Fi enabled device such as a personal computer, video game console, mobile phone, MP3 player or personal digital assistant can connect to the Internet when within range of a wireless network connected to the Internet. The coverage of one or more (interconnected) access points — called a hotspot — can comprise an area as small as a few rooms or as large as many square miles. In addition to private use in homes and offices, organizations and businesses - such as those running airports, hotels and restaurants - often provide free-use hotspots to attract or assist clients. Routers that incorporate a digital subscriber line modem or a cable modem and a Wi-Fi access point, often set up in homes and other premises, can provide Internet-access and internetworking to all devices connected (wirelessly or by cable) to them, thus enabling places that would traditionally not have network access to be connected, for example bathrooms, kitchens and garden sheds.

Digital Cameras

1985 Digital camera principle

2001 Mobile phone digital camera

2004 Canon PowerShot A95 5Mpixel digital camera

Photographic cameras were a development of the camera obscura ("dark room"), a device dating back to the Book of Optics (1021) of the Iraqi Arab scientist Ibn al-Haytham (Alhacen), which used a pinhole or lens to project an image of the scene outside upside-down onto a viewing surface. Scientist-monk Roger Bacon also studied the matter, and published notes and drawings as Perspectiva in 1267. On 24 January 1544 the mathematician and instrument maker Reinerus Gemma Frisius of Leuven University used a camera to watch a solar eclipse, publishing a diagram of his method in De Radio Astronomico et Geometrico in 1545. In 1558 Giovanni Battista della Porta was the first to recommend the method as an aid to drawing. The first real camera for photography was built by Johann Zahn in 1685, and the first recorded photograph was taken in 1814 by Nicéphore Niépce: he used a discovery by Johann Heinrich Schulze (1724) that a silver and chalk mixture darkens under exposure to light. Niépce used a sliding wooden box camera made by Charles and Vincent Chevalier in Paris, with a pewter plate coated with bitumen. The bitumen hardened where light struck. The unhardened areas were then dissolved away, but the photograph was not permanent.
Before the invention of photographic processes there was no way to preserve the images produced by these cameras apart from manually tracing them. The earliest cameras were room-sized (in fact the word "camera" means "room"), with space for one or more people inside; these gradually evolved into more and more compact models so that by Niépce's time portable handheld cameras suitable for photography were readily available. In 1836 Louis Jacques Daguerre invented the first practical photographic method, which was named the daguerreotype. Daguerre coated a copper plate with silver, then treated it with iodine vapour to make it sensitive to light. The image was developed by mercury vapour and fixed with a strong solution of ordinary salt. In 1840 William Fox Talbot perfected a different process, the calotype. Both Daguerre's and Fox Talbot's methods used cameras that were little different from Zahn's model, with the sensitized plate placed in front of the viewing screen to record the image, with focusing by sliding boxes. Collodion dry plates became available from 1855, and gelatine dry plates from 1871. The shortened exposure times that made photography more practical also necessitated another innovation, the mechanical shutter. The use of photographic film was pioneered by George Eastman, who started manufacturing paper film in 1885 before switching to celluloid in 1889. His first camera, which he called the "Kodak," was first offered for sale in 1888. It was a very simple box camera with a fixed-focus lens and single shutter speed, which along with its relatively low price appealed to the average consumer. The Kodak came pre-loaded with enough film for 100 exposures and needed to be sent back to the factory for processing and reloading when the roll was finished. In 1900 Eastman took mass-market photography one step further with the Brownie, a simple and very inexpensive box camera that introduced the concept of the snapshot. 
The Brownie was extremely popular and various models remained on sale until the 1960s. The 35mm film camera was developed by Leitz from 1913, and Leica and Kodak got into the market from the 1930s. The fledgling Japanese camera industry began to take off in 1936 with the Canon 35mm Rangefinder.
The first practical reflex camera was developed by Franke & Heidecke and was called the Rolleiflex (1928). The 35mm Single Lens Reflex (SLR) design gained popularity from 1933. An entirely new type of camera, the Polaroid Model 95, appeared on the market in 1948. This was the world's first viable instant-picture camera, which used a patented chemical process to produce finished positive prints from the exposed negatives in under a minute. The addition of electronics to the camera began in 1938, and by the 1960s automatic exposure using a selenium light meter was commonplace.
Digital cameras differ from their analog predecessors primarily in that they do not use film, but capture and save photographs on digital memory cards or in internal storage instead. Their low operating costs have relegated chemical cameras to the heritage markets. Digital cameras also include wireless communication capabilities (for example Wi-Fi or Bluetooth) to transfer, print or share photos, and are commonly found in mobile phones. Digital imaging was used for space exploration from 1961 onwards, but the first recorded attempt at building a digital camera was made in 1975 by Eastman Kodak, using a new solid-state CCD image sensor chip developed by Fairchild Semiconductor in 1973. The camera weighed 3.6 kg, recorded black and white images to a cassette tape with a resolution of only 10Kpixels, and took 23 seconds to capture a single still image. Although the image was captured digitally, these first devices were essentially analog, since the images were recorded as continuous signals on cassette tape or floppy disks, and such cameras were useful mostly for news reporting in the late 1980s and early 1990s. The first truly digital camera that recorded images as a computerized file was the Fuji DS-1P of 1988, which recorded to a 16MB internal memory card. The first commercially available digital camera was the 1990 Dycam Model 1, which used a CCD image sensor, stored pictures digitally, and connected directly to a computer for download. The move to digital formats was helped by the formation of the first JPEG and MPEG standards in 1988, which allowed image and video files to be compressed for storage. The first consumer camera with a liquid crystal display screen on its back was the 1995 Casio QV-10, and the first camera to use CompactFlash was the Kodak DC-25 in 1996. In 1997 the first megapixel cameras for consumers were marketed. By 1999 2Mpixel cameras were common, and in the 2000s 5Mpixel cameras became available.
A digital camera now commonly has interchangeable flash memory cards and may be treated as an external form of computer memory: one or more images may be copied or downloaded to internal computer memory using a USB cable for more permanent storage and printing, and the images may then be processed using a variety of image-handling programs.
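As a minimal illustrative sketch, the download step can be modelled in Python; the card and destination paths here are hypothetical, since the actual mount point of a memory card depends on the operating system:

```python
import shutil
from pathlib import Path

def copy_images(card_dir: str, dest_dir: str, pattern: str = "*.jpg") -> int:
    """Copy image files from a mounted camera card to local storage.

    Returns the number of files copied. The paths are placeholders; a real
    card appears at an OS-specific mount point (e.g. a drive letter or /media).
    """
    src = Path(card_dir)
    dst = Path(dest_dir)
    dst.mkdir(parents=True, exist_ok=True)
    count = 0
    for image in sorted(src.glob(pattern)):
        shutil.copy2(image, dst / image.name)  # copy2 preserves timestamps
        count += 1
    return count
```

The copied files can then be opened in any image-handling program, exactly as the text describes.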

Virtual Keyboards

1990 Target tracking camera mounted on laptop for use of the Headmouse

2006 On screen Virtual Keyboard

2009 Projection Virtual Keyboard with 3D tracking of fingers

A virtual keyboard is a software or hardware solution that allows a user to enter characters, other than via a conventional keyboard. Devices may enable disabled users to communicate with a computer, and include the mouse, the headmouse or eyemouse, a visual representation of a keyboard on the computer screen for mouse selection of keys, or a pseudo-keyboard projected on a flat surface, with 3D tracking of finger movements.
A headmouse or eyemouse relies on a silver reflective dot mounted on the user's forehead. An infrared or wireless optical camera mounted above the screen of the computer translates natural movements of the user's head into directly proportional movements of the mouse pointer, so the device works just like a computer mouse. It connects to the computer through a USB port and operates using standard mouse drivers.
On a desktop PC, besides providing an alternative input mechanism for users with disabilities, a virtual keyboard is useful for bi- or multi-lingual users, who continually need to switch between different character sets or alphabets. Although hardware keyboards are available with dual layouts (for example Cyrillic/Latin letters in various national layouts), the on-screen keyboard provides a handy substitute for switching between fonts using hot keys. On devices which lack a physical keyboard (such as personal digital assistants or touchscreen equipped cell phones), it is common for the user to input text by tapping a virtual keyboard built into the operating system of the device.
Virtual keyboards reduce the risk of keystroke logging: for example, online banking services often use a virtual keyboard or drop-down menus for password entry, because it is more difficult for malware to monitor the display and mouse to obtain the data entered via the virtual keyboard than it is to monitor real keystrokes. However, it is still possible for malware to log virtual keyboard entries by recording screenshots at regular intervals or upon each mouse click.
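Some banking virtual keypads go further and shuffle the position of the keys each session, so that recorded mouse coordinates alone do not reveal the password. A minimal Python sketch of the idea (illustrative only, not the implementation of any particular service):

```python
import random

def shuffled_keypad(seed=None):
    """Return the digits 0-9 in a random order for display as an on-screen
    keypad. Because each digit's screen position changes every session,
    malware that logs only click coordinates cannot recover the PIN."""
    digits = list("0123456789")
    random.Random(seed).shuffle(digits)  # seeded only to make tests repeatable
    return digits
```

In a real deployment the layout would be generated server-side per session; the seed parameter here exists only so the sketch is testable.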

Webcams, microphones and Skype

1991 Webcam
Typical free-standing microphone
2001 camera and microphone setup
Skype is a software application that allows users to make voice calls over the Internet. The origin of the name was as follows: one of the initial names for the project was "Sky peer-to-peer", which was then abbreviated to "Skyper"; dropping the final "r" left the current title "Skype". Skype was developed by the Estonian developers Ahti Heinla, Priit Kasesalu and Jaan Tallinn. Skype Limited, the company that operates Skype, was founded in 2003 by Niklas Zennström and Janus Friis with headquarters in Luxembourg. eBay acquired Skype Limited in September 2005. On 14 April 2009 eBay announced plans to spin off Skype through an initial public offering in 2010. In 2010 a report by TeleGeography Research said that Skype-to-Skype calls accounted for 13% of all international call minutes in 2009.
Skype is a Voice over IP (VoIP) program. It can be used as just a telephone service using a special handset, but is more often used as a means of making video face-to-face calls using a camera, microphone and headphones.


1994 Touchpad on a Compaq laptop computer

A touchpad (also called a trackpad) is a pointing device or tactile sensor consisting of a specialized surface that can translate the motion and position of a user's fingers to a relative position on the screen. Touchpads are a common feature of laptop computers and are also used as a substitute for a computer mouse where desk space is scarce.
Touchpads operate in one of several ways, including capacitive sensing and conductance sensing, rather like a small graphics tablet. Touchpads are able to sense absolute positions, but their precision is limited by their size. For common use as a pointer device, the dragging motion of a finger is translated into a finer, relative motion of the cursor on the screen, analogous to the handling of a mouse that is lifted and put back on a surface. Buttons below the pad serve as standard mouse buttons.
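The translation from absolute finger positions to relative cursor motion can be sketched in Python; the gain value is illustrative, and real drivers apply acceleration curves on top of this:

```python
def track(positions, gain=0.5):
    """Convert a sequence of absolute (x, y) finger positions reported by a
    touchpad into relative cursor movements, as a pointer driver would.
    A None entry marks the finger lifting off, so the next touch-down does
    not produce a jump (like lifting a mouse and putting it back down)."""
    deltas = []
    last = None
    for pos in positions:
        if pos is None:       # finger lifted off the pad
            last = None
            continue
        if last is not None:  # only report motion while the finger stays down
            deltas.append(((pos[0] - last[0]) * gain,
                           (pos[1] - last[1]) * gain))
        last = pos
    return deltas
```

Note how replacing the finger after a lift produces no cursor movement, which is exactly the mouse-lifting analogy in the text.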


Bluetooth USB dongle
A personal computer that does not have embedded Bluetooth can be used with a Bluetooth adapter
or "dongle" that will enable the PC to communicate with other Bluetooth devices (such as
mobile phones, mice and keyboards). While some desktop computers and most recent laptops
come with a built-in Bluetooth radio adapter, others will require an external transmitter/receiver in the form of a dongle.

Bluetooth Logo, made up from Viking runes

2002 Bluetooth piconet architecture, showing master/slave relationships of devices

The 1994 Bluetooth system is a proprietary open wireless technology standard for exchanging data over short distances (using short-wavelength radio transmissions) between fixed and mobile devices, creating personal area networks (PANs) with high levels of security. It was invented by Ericsson and was originally conceived as a wireless alternative to RS-232 data cables. It can connect several devices, overcoming problems of synchronization. Bluetooth is managed by the Bluetooth Special Interest Group.
The word Bluetooth is an anglicised version of Old Norse blátönn (Danish Blåtand), the nickname of the 10th Century CE King Harald I of Denmark and parts of Norway (Harald Gormsson), who united Danish tribes speaking different dialects into a single kingdom. The implication is that Bluetooth does the same for communications protocols, uniting them into one universal standard. Although blå in modern Scandinavian languages means blue, during the Viking age it could also mean black. So a historically correct translation of Old Norse Haraldur blátönn could be Harald Blacktooth rather than Harald Bluetooth.
Bluetooth uses a radio technology called frequency-hopping spread spectrum, which chops up the data being sent into packets and transmits the packets on up to 79 bands of 1MHz width in the 2.4GHz short-range radio frequency band (2400 - 2483.5MHz), at data rates of 1Mbit/s or more. The Bluetooth protocol has a master-slave structure, each master communicating with up to 7 slaves in a piconet; all devices share the master's clock, which ticks at 312.5 µs intervals. Two clock ticks make up a slot of 625 µs; two slots make up a slot pair of 1250 µs. In the simple case of single-slot packets the master transmits in even slots and receives in odd slots; the slave, conversely, receives in even slots and transmits in odd slots. Packets may be 1, 3 or 5 slots long, but in all cases the master begins its transmission in an even slot and the slave in an odd slot. Transmitting devices hop pseudo-randomly between channels 1600 times a second in order to avoid contention: if a channel is blocked, a clear one is found almost immediately. Bluetooth devices typically have a range of about 10m and are able to transfer data through walls, pockets and briefcases.
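The slot timing and alternation described above can be sketched in Python. The hop function here is a toy stand-in: the real hop-selection kernel is defined in the Bluetooth Core Specification, and this sketch only shows that two devices sharing the same clock can follow the same pseudo-random channel sequence.

```python
SLOT_US = 625  # one slot = two 312.5 µs ticks of the master's clock

def transmitter(slot_index):
    """Who begins transmitting in a given 625 µs slot of a piconet:
    the master uses even-numbered slots, the slave answers in odd ones."""
    return "master" if slot_index % 2 == 0 else "slave"

def hop_channel(clock, mix=0x9E3779B9):
    """Toy stand-in for hop selection: derive one of the 79 1 MHz channels
    from the shared piconet clock, so master and slave (which share the
    clock) land on the same channel. NOT the real Bluetooth algorithm."""
    return ((clock * mix) >> 16) % 79
```

With 625 µs slots there are 1 000 000 / 625 = 1600 slots per second, which is exactly the hop rate quoted above.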
Bluetooth provides a secure way to connect and exchange information between devices such as faxes, mobile phones, telephones, laptops, personal computers, wireless mice and keyboards, printers, Global Positioning System (GPS) receivers, digital cameras, and video game consoles. Bluetooth is a standard communications protocol primarily designed for low power consumption, with a short range, and based on low-cost transceiver microchips in each device.
Bluetooth and Wi-Fi are not interchangeable terms:
Wi-Fi is intended as a replacement for cabling for general local area network access in resident work areas. The types of applications suitable for Wi-Fi are those which use a wireless local area network (WLAN). Wi-Fi functions much like a traditional wired Ethernet network, and requires configuration to set up shared resources, to transmit files, and to set up audio links (for example, headsets and hands-free devices). Wi-Fi uses the same radio frequencies as Bluetooth, but with higher power, resulting in a stronger connection; it is sometimes called "wireless Ethernet". Wi-Fi requires more setup but is better suited to operating full-scale networks, enabling a faster connection and better range from the base station.
Bluetooth on the other hand is intended for non-resident equipment and its applications. The types of applications suitable for Bluetooth are those applications which use a wireless personal area network (WPAN). Bluetooth is a replacement for cabling in a variety of portable applications in any external situation and can also support fixed location applications such as smart energy functionality in the home (thermostats, lighting, closing curtains, etc.).


1996 USB Plug Type A

The Universal Serial Bus (USB) is a specification, developed by a group of companies led by Intel, to establish communication between devices and a host controller (usually a personal computer). USB is intended to replace many varieties of serial and parallel ports. It can connect computer peripherals such as mice, keyboards, digital cameras, printers, personal media players, flash drives, and external hard drives; for many of these devices USB has become the standard connection method. USB was designed for personal computers, but it has become commonplace on other devices such as smartphones, PDAs and video game consoles.
Development of USB began in 1994 with a group of seven companies: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel. USB was intended to make it fundamentally easier to connect external devices to PCs by replacing the multitude of connectors at the back of PCs, addressing the usability issues of existing interfaces, and simplifying the software configuration of all devices connected to USB, as well as permitting greater bandwidths for external devices. The first silicon chip for USB was made available by Intel in 1995, and the USB 1.0 specification was introduced in January 1996 with a data transfer rate of 12 Mbps. USB 1.1 retained 12 Mbps for higher-speed devices such as disk drives, and added a lower 1.5 Mbps rate for low-bandwidth devices such as joysticks. The USB 2.0 specification was released in April 2000. It divided USB devices into three groups: low-speed devices (e.g. keyboards, mice) at 1.5 Mbps; full-speed devices at 12 Mbps; and high-speed devices at 480 Mbps, a fortyfold increase over the 12 Mbps of the original standard.
A USB system has an asymmetric design, consisting of a host, a multitude of downstream USB ports, and multiple peripheral devices connected in a tiered-star topology. Additional USB hubs may be included in the tiers, allowing branching into a tree structure with up to five tier levels. A USB host may have multiple host controllers and each host controller may provide one or more USB ports. Up to 127 devices, including hub devices if present, may be connected to a single host controller. When a USB device is first connected to a USB host, the USB device enumeration process is started. The enumeration starts by sending a reset signal to the USB device. The data rate of the USB device is determined during the reset signaling. After reset, the USB device's information is read by the host and the device is assigned a unique 7-bit address. If the device is supported by the host, the device drivers needed for communicating with the device are loaded and the device is set to a configured state. If the USB host is restarted, the enumeration process is repeated for all connected devices. The host controller directs traffic flow to devices, so no USB device can transfer any data on the bus without an explicit request from the host controller. In USB 2.0, the host controller polls the bus for traffic, usually in a round-robin fashion. The slowest device connected to a controller sets the bandwidth of the interface. For SuperSpeed USB (USB 3.0), connected devices can request service from host. Because there are two separate controllers in each USB 3.0 host, USB 3.0 devices will transmit and receive at USB 3.0 data rates regardless of USB 2.0 or earlier devices connected to that host. Operating data rates for them will be set in the legacy manner.
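The address-assignment step of enumeration can be sketched in Python. This is an illustrative model, not a real driver; it shows only why a single host controller supports at most 127 devices (the 7-bit address space, with address 0 reserved for devices that have just been reset).

```python
class UsbHost:
    """Sketch of USB address assignment during enumeration: after reset a
    device answers at the default address 0 until the host assigns it a
    unique 7-bit address (1-127), so at most 127 devices per controller."""
    MAX_DEVICES = 127

    def __init__(self):
        self.devices = {}  # assigned address -> device name

    def enumerate_device(self, name):
        if len(self.devices) >= self.MAX_DEVICES:
            raise RuntimeError("no free 7-bit addresses on this controller")
        # lowest free address in 1..127; address 0 stays reserved for resets
        address = next(a for a in range(1, 128) if a not in self.devices)
        self.devices[address] = name
        return address
```

A real host would also read the device descriptors and load drivers at this point, as the text describes.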


Typical mouse pointers

The cursor is an indicator, often flashing, used on a computer monitor to show the position of a pointing device such as the mouse pointer, or the character position for text input.
The cursor may be an underscore, a solid rectangle, a vertical line, or some other pointing symbol, which may be flashing or steady, indicating the insertion point where text will be placed, or the icon, menu item etc. which will be selected if a mouse button is pressed. In the days of text-mode displays it was not possible to show a vertical bar between characters to mark where new text would be inserted, so an underscore or block cursor was usually created by periodically inverting the pixels of the character using the Boolean exclusive-or (XOR) function. The blinking of the cursor is usually temporarily suspended while it is being moved; otherwise the cursor might change position while it is not visible, making its location difficult to follow.
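The XOR trick is easy to demonstrate: inverting the pixels of a character cell twice restores the original glyph exactly, which is what makes it suitable for blinking. A minimal Python sketch:

```python
def toggle_cursor(cell):
    """Invert every pixel of a character cell with exclusive-or, the classic
    way a text-mode block cursor was blinked. Pixels are 0 or 1; applying
    the same operation twice restores the original glyph exactly."""
    return [[pixel ^ 1 for pixel in row] for row in cell]
```

A display driver would simply call this on the cell under the cursor at a fixed blink interval.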
The cursor may change shape depending on the next action, e.g. a vertical I bar for text insertion, a hand for scrolling or moving the page, a graphics symbol (pipette, brush, pencil, paint bucket, etc.) for image editing, a directional arrow for dragging the edge or corner of a window, an hour glass to indicate that the processor is busy, or an index finger to indicate a hyperlink. Also a help box might appear at the cursor position giving brief information about what action will happen if the position is selected. Pointer trails are an optional feature to enhance the visibility of the mouse cursor under poor screen contrast conditions, although some people find these irritating (they can be disabled).

Haptic Interface

A force feedback haptic input device

Haptic input is a tactile feedback technology that takes advantage of a user's sense of touch by applying forces, vibrations, and/or motions to the user. This mechanical stimulation may be used to assist in the creation of virtual objects (objects existing only in a computer simulation), for control of such virtual objects, and for the enhancement of the remote control of machines and devices (teleoperators). The word "haptic", from the Greek haptikos, means pertaining to the sense of touch.
An early form of haptic device appeared in the servomechanism systems that operate the control surfaces of large modern "fly-by-wire" aircraft. In earlier aircraft with physical connections between the controls and the aerodynamic surfaces, buffeting was felt as the aircraft approached a stall, a useful warning to the pilot of a dangerous flight condition. This physical connection has been removed in modern aircraft, so that these forces are no longer perceived at the controls. To replace this missing cue, the angle of attack is measured, and when it approaches the critical stall point a "stick shaker" (an unbalanced rotating mass) is engaged, simulating the effects of the buffeting. This is known as "haptic feedback".
Alternatively "force feedback" is an increasing resistance applied to the control as the edge of a simulated mass is approached. This allows the operator to "feel" and work around unseen obstacles.
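A common way to compute such resistance is a penalty-based virtual spring: no force outside the simulated object, and a force proportional to penetration depth inside it. A minimal one-dimensional sketch in Python (the stiffness value is illustrative):

```python
def feedback_force(position, boundary, stiffness=200.0):
    """Penalty-based force feedback in one dimension: once the probe
    crosses the boundary of a simulated object, push back with a force
    proportional to the penetration depth (a virtual spring, F = k * d).
    Outside the object no force is felt."""
    depth = position - boundary
    return stiffness * depth if depth > 0 else 0.0
```

Real haptic devices evaluate a rule like this hundreds or thousands of times per second so the resistance feels continuous to the operator.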
Teleoperators are remote controlled robotic tools, and when simulated contact forces are given to the operator the method is called "haptic teleoperation". The first electrically actuated teleoperators were built in the 1950s at the Argonne National Laboratory in the United States to remotely handle radioactive substances. Since then, the use of force feedback has become more widespread in all kinds of teleoperators such as underwater exploration devices controlled from a remote location, flight simulators, and simulations of medical operations.
When such devices are simulated using a computer (as they are in operator training devices) it is useful to provide the force feedback that would be felt in actual operations. Since the objects being manipulated do not exist in a physical sense, the forces are generated using haptic force generating operator controls.
Simple haptic devices are common in the form of game controllers, in particular joysticks and steering wheels. An example is the simulated automobile steering wheel that is programmed to provide a "feel" of the road. As the user makes a turn or accelerates, the steering wheel responds by resisting turns or slipping out of control. In 2007 Novint released the Falcon, the first consumer 3D touch device with high-resolution three-dimensional force feedback, allowing the haptic simulation of objects, textures, recoil, momentum and the physical presence of objects in games. The vibration of mobile phones (as an alternative to ringing) is a form of haptic feedback, as is the momentary vibration of keys pressed on a simulated keypad or touchscreen. Haptics are gaining widespread acceptance as a key part of virtual reality systems. A robot hand can be supplied with the sense of touch, pressure, and position to reproduce the human grip in all its strength, delicacy, and complexity, and this has been used by NASA for remote control of space robots. In future, haptics will allow a surgeon to operate as a telepresence rather than travel to an operating room; this will allow expert surgeons to operate from across the country, increasing the availability of expert medical care. Haptic technology will provide tactile and resistance feedback to the surgeon as the robotic device is operated. Simulated operations will let surgeons and surgical students practise and train more. The idea behind such a development is that, just as commercial pilots train in flight simulators before they pilot real aircraft, surgeons will be able to practise their first incisions without actually cutting anyone.


ZigBee is a specification for a suite of high level communication protocols using small, low-power digital radios communicating with wireless personal area networks (WPANs), such as wireless headphones connecting with cell phones via short-range radio. The technology defined by the ZigBee specification is intended to be simpler and less expensive than other WPANs, such as Bluetooth. The ZigBee Alliance is a group of companies that maintain and publish the ZigBee standard.
ZigBee-style networks began to be conceived around 1998, when many installers realized that both Wi-Fi and Bluetooth were going to be unsuitable for many applications. In particular, many engineers saw a need for self-organizing ad-hoc digital radio networks. The IEEE 802.15.4 standard underlying ZigBee was completed in May 2003 and ratified on 14 December 2004. ZigBee is administered by the ZigBee Alliance of more than 150 companies. The name ZigBee refers to the zigzag waggle dance by which honey bees communicate after returning to the beehive.
ZigBee is a low-cost, low-power, wireless mesh networking proprietary standard. The low cost allows the technology to be widely deployed in wireless control and monitoring applications, the low power usage allows longer life with smaller batteries, and the mesh networking provides high reliability and larger range. Typical application areas include smart lighting, advanced temperature control, safety and security, films, music, water sensors, power sensors, energy monitoring, smoke and fire detectors, smart appliances and access sensors. Devices connected to ZigBee networks take one of three roles: ZigBee Coordinator (ZC), ZigBee Router (ZR), or ZigBee End Device (ZED), the last containing just enough functionality to talk to its parent node, allowing the end device to be asleep a significant amount of the time and thereby giving long battery life.
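The parent-child relationships between the three device roles can be sketched in Python; the node names and topology here are illustrative, not part of the ZigBee standard:

```python
class ZigBeeNode:
    """Sketch of the three ZigBee device roles: a coordinator (ZC) forms
    the network, routers (ZR) forward traffic, and end devices (ZED) talk
    only to their parent node, sleeping between messages to save battery."""

    def __init__(self, name, role, parent=None):
        assert role in ("ZC", "ZR", "ZED")
        self.name = name
        self.role = role
        self.parent = parent  # ZEDs and ZRs hang off a parent; the ZC has none

    def route_to_coordinator(self):
        """Return the chain of node names from this node up to the ZC."""
        path, node = [], self
        while node is not None:
            path.append(node.name)
            node = node.parent
        return path
```

Because an end device only ever speaks to its parent, its radio can stay off until it has something to report, which is the source of the long battery life mentioned above.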

Human Interface Devices

2009 Techstyle Wireless Presenter HID with laser pointer, for remote control of a PowerPoint presentation, with USB receiver dongle

Human Interface Device (HID) is a general term for the devices by which a human may interact with a computer, including keyboard, mouse, joystick, graphics tablet or device controller. Mice and keyboards are frequently fitted with USB connectors, but because most PC motherboards up to 2007 still retained PS/2 connectors for the keyboard and mouse, they are often supplied with a small USB-to-PS/2 adaptor, allowing use with either USB or PS/2 interface. Joysticks, keypads, tablets, wireless presenters with receiver dongles for remote control of a PowerPoint presentation, and other human-interface devices are also progressively migrating from MIDI, PC game port, and PS/2 connectors to USB.
HIDs for wireless control of PowerPoint presentations at minimum have a button for "next slide" (equivalent to a mouse left click or space bar press) and a button for "previous slide" (equivalent to the slightly more complex mouse right click and selection of Previous from the pull-down menu). More complex HIDs allow remote selection of one of several screens from software running in parallel, e.g. PowerPoint, Excel with data and graphs, Internet, online videos, etc.
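The basic two-button behaviour can be sketched in Python; the button names are illustrative, since each presenter model reports its own key codes:

```python
def presenter_action(button, slide, total):
    """Map the two basic wireless-presenter buttons onto slide navigation,
    clamping at both ends of the deck so spurious presses do nothing."""
    if button == "next" and slide < total:
        return slide + 1
    if button == "previous" and slide > 1:
        return slide - 1
    return slide  # unknown button, or already at first/last slide
```

In practice the presenter's USB receiver simply reports standard keyboard events (such as Page Down), which the presentation software maps to actions like these.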

Back to Top

Send all comments, updates and queries for The Staffordshire University Computing Futures Museum Input Page to Dr John Wilcock

Version: 03 19 June 2010 updated by Dr John Wilcock