Computing in Queen Elizabeth's accession year
The year of Queen Elizabeth's accession was also notable for significant advances in computer hardware and software
The Nicholas was developed by the same team that had previously designed the Elliott Type 152
APEC machines such as the APE(X)C came with a combination of hardware and software advances
An Elliott 402 computer was set up in the BBC's election night studio
An engineer checks the installation of the Elliott 402 ‘Electric Brain’ at the 1955 British Instrument Industries Exhibition
Finger on the pulse of innovation: scientist Alec Reeves
In the early 1950s telex and facsimile machines were the 'endpoint devices' of a datacomms infrastructure
1952 was the year that Princess Elizabeth inherited the throne - and when several innovations fuelled technological leaps for the UK's nascent computer industry.
When Princess Elizabeth became Queen Elizabeth II there were four computers in the UK. But whilst only the most optimistic of monarchists might have envisaged the Queen still on the throne in 2012, no one could have imagined the changes that were being designed in UK computer labs that year. The electronic computer was still very new; discounting wartime electronic machines, the first electronic 'stored program control' computers had only come into operation in 1948. During 1952 the total of UK computers rose by 50 per cent as two new British machines were brought into operation - Elliott Automation's 'Nicholas', and Birkbeck College's 'APE(X)C'.
These early electronic machines used what we now call 'first-generation' computer technology - 'delay lines' or cathode ray tubes for memory, and electronic valves for switching. In 1952 there were crucial developments that brought into play small memory devices and printed circuit boards, and also major advances in software. Perhaps as important, given that early computers were expensive at a time of fiscal austerity, was that they began to be used for commercial applications.
Elliott Automation was for a while one of the most influential - and certainly the most prolific - of the early British computer manufacturers. Its background in military technologies such as shell trajectory calculations ('fire control'), radar, and automation led it directly into the exciting new world of electronic computers, and it made machines for all these uses.
Later, it also tackled the new commercial market for electronic data processing machines, moving computer applications out of the scientific and military spheres and into everyday businesses. Elliott Automation's Nicholas machine was a one-off, designed and built quickly for a 'guided bomb project'. It was both physically and 'logically' a small machine, developed by the same team that had previously designed the Elliott Type 152 naval gunnery control computer. The machines shared design features such as the length and structure of the 'word' they used. The 152 went out of service in 1952 as the Elliott Theory Department's attention turned to the Nicholas. The latter's use of hardware was economical, so the instruction set was limited, with just 30 instructions.
This instance of 1952 design philosophy anticipated the 1970s development of 'RISC' (reduced instruction set computing) machines, which are based on the idea that processors with a limited number of machine instructions can make more effective use of memory - and thus run application software more efficiently - than those with complex instruction sets.
By the standards of 2012 Nicholas was most certainly small in capacity - its memory was 1kB of 32-bit words and it had no secondary memory. It did, however, use quite small nickel delay lines rather than the large, unwieldy and temperamental mercury delay lines used on other early computers.
The hardware designer of Nicholas was Charles Owen, who later left Elliott and joined Ferranti, another British defence and electronics company active in computer manufacture in the 1950s. At Ferranti, Owen developed plug-in hardware components which 'captured' engineering design expertise for reuse in other computer systems. After that he went to IBM, and worked on its System/360 family.
More important, in hindsight, are the software aspects of the Nicholas story. At that time users had to write all their own programs in binary machine code - manufacturers did not supply any software. But for the Nicholas machine, Elliott Automation developed a software program called 'Translation Input', which allowed users to prepare their programs in an alphabetic code that the software transformed into binary machine code. Such programs soon became known as 'assemblers'.
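The principle is easy to illustrate. The sketch below is a toy assembler in Python - the mnemonics, opcodes and 16-bit word format are invented for illustration, and are not the Nicholas machine's actual 30-instruction set:

```python
# Illustrative sketch of what an early 'assembler' does: translate
# alphabetic mnemonics into binary machine words. The instruction set
# here is hypothetical, not the Nicholas machine's.
OPCODES = {"LDA": 0b00001, "ADD": 0b00010, "STA": 0b00011, "JMP": 0b00100}

def assemble(source):
    """Translate 'MNEMONIC address' lines into binary machine words."""
    words = []
    for line in source.splitlines():
        line = line.split("#")[0].strip()   # drop comments and blank lines
        if not line:
            continue
        mnemonic, operand = line.split()
        # Pack a 5-bit opcode and an 11-bit address into a 16-bit word
        word = (OPCODES[mnemonic] << 11) | (int(operand) & 0x7FF)
        words.append(word)
    return words

program = """
LDA 100   # load accumulator from address 100
ADD 101   # add contents of address 101
STA 102   # store result at address 102
"""
for w in assemble(program):
    print(f"{w:016b}")
```

A real assembler of the period would also resolve symbolic addresses and punch the resulting words to paper tape.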
The Translation Input software was developed by Ruth Felton. One of her colleagues, Ed Hersom, recalls that "a visiting IBM executive who was touring computer installations worldwide told us ours was the first site he had seen where such a program was actually in use". Ruth's husband, George Felton, also worked on the project, developing a series of software sub-routines for users.
"Nicholas was relatively easy to program, and we developed things for it including floating-point routines," George Felton recalled. "The result was that we built a library of routines such as floating point. The following year we also had a matrix interpretive scheme so that we could do operations on modest-sized matrices."
Another female programmer at Elliott, joining the company in 1954, was Dina Vaughan. In 1958 she left to found Vaughan Programming Services, the first British software house. The Nicholas machine found a novel application in 1955 - predicting the result of the General Election for the BBC's election night coverage. A special computer program was written allowing individual constituency results to be entered into the machine as they were announced. The program would then calculate the likely overall result from the accumulating results. Only one glitch occurred when some duff data got in, and it took some minutes to reset and get back on track.
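How the 1955 election program worked internally is not recorded here, but the core idea - scaling up a running tally of declared results to a whole-house forecast as constituencies report - can be sketched as follows (the function name and all figures are illustrative):

```python
def project_totals(declared, total_seats):
    """Scale party seat counts from declared constituencies up to a
    whole-house projection, assuming declared seats are representative.
    (Illustrative only - not the 1955 BBC program's actual method.)"""
    declared_count = sum(declared.values())
    if declared_count == 0:
        return {}
    scale = total_seats / declared_count
    return {party: round(seats * scale) for party, seats in declared.items()}

# After 100 of 630 declarations:
print(project_totals({"Conservative": 55, "Labour": 42, "Liberal": 3}, 630))
```

A single mis-keyed constituency result would corrupt the running tally - which is consistent with the reported glitch, where 'duff data' forced a reset of the machine.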
The Nicholas machine was based at Elliott's Borehamwood site, and communication of results and forecasts had to be handled over the telephone. The idea of using computers on election night gained much attention, and for the next election in 1959 an Elliott 402 computer was set up in the BBC's election night studio. Such performances did as much as anything else to bring these new machines to public attention.
First steps in academic computing
Another British computer to be announced to the world in 1952 was developed at Birkbeck College, London. APE(X)C's full name is the All-Purpose Electronic Computer, the 'X' standing in for a machine identifier - in this case X-ray, for the mathematical work the machine did on X-ray crystallography. The machine's creator was Andrew Booth, a mathematician with special interests in both mathematical theory and machinery. Booth was appointed to Birkbeck in 1945, where he initially developed an electro-mechanical calculator, but then began to conceive of building an electronic computer.
Again, this was to be a small machine. It would use limited resources and be built quickly. It was first used successfully to run a program on 2 May 1952. As the college's annual report noted: "It has shown the expected speed of about several hundred times as fast as mechanical methods, but has exceeded expectation in its reliability and freedom from breakdown." An improved model, APE(R)C, was soon manufactured and delivered to the British Rayon Company, the 'R' standing for 'Rayon'. Like the Nicholas, the APEC machines came with a combination of hardware and software advances. The machine deployed the first magnetic rotating drum memory technology - a forerunner of standard magnetic disk storage. Booth also made important advances in designing the logic for the multipliers employed in computer arithmetic units, logic which still forms the basis of modern designs.
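Booth is best remembered for the multiplication algorithm that bears his name, which recodes the multiplier so that a run of 1-bits costs only one addition and one subtraction. A sketch in Python, following the textbook register layout rather than any particular machine:

```python
def booth_multiply(m, r, bits=8):
    """Multiply two signed integers using Booth's algorithm (a sketch).
    Scans the multiplier's bit pairs, adding or subtracting the
    multiplicand only at the boundaries of runs of 1s."""
    mask = (1 << (2 * bits + 1)) - 1
    # A holds the multiplicand in the high half, S its two's-complement
    # negation; P holds the multiplier with a spare low bit for inspection.
    A = (m & ((1 << bits) - 1)) << (bits + 1)
    S = ((-m) & ((1 << bits) - 1)) << (bits + 1)
    P = (r & ((1 << bits) - 1)) << 1
    for _ in range(bits):
        pair = P & 0b11
        if pair == 0b01:      # end of a run of 1s: add the multiplicand
            P = (P + A) & mask
        elif pair == 0b10:    # start of a run of 1s: subtract it
            P = (P + S) & mask
        # Arithmetic right shift, preserving the sign bit
        sign = P & (1 << (2 * bits))
        P = (P >> 1) | sign
    product = P >> 1          # drop the spare low bit
    # Reinterpret the 2*bits result as a signed value
    if product & (1 << (2 * bits - 1)):
        product -= 1 << (2 * bits)
    return product
```

Modified forms of this recoding are still used in hardware multipliers, since they roughly halve the number of partial products to be summed.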
Kathleen Britten, who became Booth's wife, was also a mathematician. She did much of the programming for the APEC machines and later wrote a book on programming. In the late 1950s she worked on programming neural networks to help understand animal pattern recognition, and then went on to work on character-recognition software. The APEC machines became the basis for the computers developed by another company, BTM, for commercial applications. BTM - the British Tabulating Machine Company - was the main supplier of commercial punched-card data storage and processing systems in Britain. It saw the potential for electronic computers in supplementing - and supplanting - its products, and adapted Booth's designs into its HEC (Hollerith Electronic Computer) range.
The HEC became probably the best-selling British computer by the end of the 1950s, and formed the basis of the ICT 1200 range when BTM merged its computer operations with Powers-Samas in 1959. The computers of the early 1950s were 'first-generation' machines using electronic valves for switching and delay lines for memory. 1954 and 1955 saw the arrival of the first transistor switches, the introduction of core memories and the development of 'second-generation' computers; but no sooner had those technologies gone into commercial use than they were being declared old technology. The subsequent 'third generation' of computers would depend on integrated circuit transistor technology for both switching and storage; however, this technology would not become commercial until the early 1960s.
In fact, the concept of the integrated circuit was not new even in the 1960s. British radar engineer Geoffrey Dummer made a revolutionary claim at a technical conference in the US in May 1952. "With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires," he said. "The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers."
With these few words, Dummer introduced the world to the idea of the integrated circuit; unfortunately he lacked the backing to turn it into a reality. He did, however, create a small project at defence and electronics company Plessey to construct a model formed from a block of semi-conductor material that was 'doped' and shaped as four transistors. The concept resurfaced in the US in 1958, when Jack Kilby of Texas Instruments built the first working integrated circuit; Robert Noyce, later a co-founder of Intel, turned the idea into commercial reality at Fairchild Semiconductor.
Dummer later recalled: "It seemed so logical. We had been working on smaller and smaller components, improving reliability as well as size reduction. I thought the only way we could ever attain our aim was in the form of a solid block. You then do away with all your contact problems, and you have a small circuit with high reliability."
Dummer did his best to promote the integrated circuit as an important future technology which British industry and government should follow up, but was unsuccessful. "I have attributed it to war-weariness, but that is perhaps an excuse," he said. "The plain fact is that nobody would take the risk. This Ministry wouldn't place a contract because they hadn't an application. The applications people wouldn't say we want it because they had no experience. The Americans took financial gambles, whereas this was very slow in this country."
No patents were registered for what was to prove a world-shattering invention, so when Kilby moved on to the same technological territory it was in effect up for grabs. Not for the last time did the lack of commercial acumen in the British technology industry give away the technological lead - and immensely valuable intellectual property.
Could the telex network have become a proto-Internet?
Data communications were unheard of in the early 1950s. There were only a few computers, each running its own applications. They had no need to communicate. It would be another decade before concepts like packet switching were to be developed at the National Physical Laboratory. Similarly, digitisation of voice communications remained but a theoretical possibility in 1952. British telecomms and radar engineer, (and IEE member) Alec Reeves (1902-1971) had developed the idea of 'pulse code modulation' in 1938, but it had to wait until the late 1950s/early 1960s, and the introduction of transistors, before it could be used in operational networks.
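Reeves's pulse code modulation represents an analogue waveform as a stream of quantised samples. A minimal sketch, assuming 8-bit linear quantisation (real telephony PCM later used logarithmic companding rather than the linear scale shown here):

```python
import math

def pcm_encode(signal, sample_rate, duration, levels=256):
    """Sketch of pulse code modulation: sample an analogue signal at
    regular intervals and quantise each sample to one of 'levels'
    discrete values (8-bit PCM when levels=256)."""
    samples = []
    for n in range(int(sample_rate * duration)):
        t = n / sample_rate
        value = signal(t)                      # amplitude in [-1.0, 1.0]
        code = round((value + 1.0) / 2.0 * (levels - 1))
        samples.append(code)
    return samples

# Encode 1ms of a 1kHz tone at the telephony-standard 8kHz sample rate
tone = lambda t: math.sin(2 * math.pi * 1000 * t)
print(pcm_encode(tone, 8000, 0.001))
```

Each sample becomes a short burst of binary pulses - which is why PCM had to wait for fast, cheap transistor switching before it was practical in the network.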
It would, however, be a mistake to think that companies, governments, and international institutions did not need to communicate appreciable quantities of data. A bigger misapprehension at the time was to assume that this requirement would not grow rapidly with global post-war economic recovery.
That mistake was made by the British Post Office. It was the teleprinter (or 'telex') that was becoming ever-more important for corporate communications. The teleprinter was an inter-war invention, and became a key part of the war effort; without it the great armed forces of the Second World War would have been impossible to organise. The teleprinter's value was firmly established, and after the war firms increased their use of the technology.
Teleprinter (and, to an extent, facsimile) transmissions were also vital for international commercial and governmental communications - the first transatlantic telephone cable did not come into operation until 1958, while a network of submarine telegraph cables had been laid since the late 19th century, focused on London.
Initially, teleprinter transmissions had been carried over the Public Switched Telephone Network (PSTN), but traffic growth suggested that it would be more economical and manageable to build a dedicated network. Work started in the early 1950s, and the London part of the network came into operation in 1954. Connections were made by manual operators until automatic switching was fully introduced in 1960.
Transmission also evolved from 12-channel lines using amplitude modulation, via 'frequency-shift keying' systems, to 'time-division multiplexing' carrying hundreds of channels. Teleprinter communications used a binary digital coding system that developed into later computer coding standards, so is arguably the forerunner of modern data communications.
Growth in use of the service was spectacular - a 50-fold increase in the 25 years from 1955 to 1980. Yet Post Office strategists failed to spot anything like the full extent of the trend. The Post Office decided on a long-term plan of expanding its teleprinter network slowly, devoting minimal resources to building an extensive corporate data network. Its planners looked decades into the future and decided that Britain would need a network capable of handling no more than 10,000 teleprinter terminals by 1980.
In fact, the actual number of connections at that date exceeded 80,000; and the number of teleprinter 'calls' increased from just over one million in 1958 to 91 million inland and 77 million international calls in 1980. Only then did 'data communications', in the modern sense of the term, begin seriously to displace the teleprinter.
Other computer landmarks of 1952
The first computer in the Netherlands was constructed by WL van der Poel.
Heinz Nixdorf founded Nixdorf Computer Corp in Germany; it remained independent until merging with Siemens in 1990.
IBM introduced the 701 computer, the company's first fully electronic model. This computer also had the ability to read/write magnetic tape, but at this stage it still relied mainly on punched cards for I/O.
Announced on 21 May, the IBM 726 was one of the first practical high-speed magnetic tape systems for electronic digital computers. It used a unique 'vacuum channel' method of keeping a loop of tape circulating between two points, allowing the tape drive to start and stop the tape in a split-second.
John von Neumann's IAS computer became operational at the Institute for Advanced Study in Princeton, NJ, USA.
Mathematician Grace Hopper completed the first compiler, a program that allows a computer user to use English-like words instead of numbers.
On US Election night, 4 November, CBS News borrowed a UNIVAC to make a scientific prediction of the outcome of the race for the presidency between Dwight D Eisenhower and Adlai Stevenson.
Sources: Computer History Museum, CED Magic, History of Computing Project