1952 was the year that Princess Elizabeth inherited the throne - and when several innovations fuelled technological leaps for the UK's nascent computer industry.
When Princess Elizabeth became Queen Elizabeth II there were four computers in the UK. But whilst only the most optimistic of monarchists might have envisaged the Queen still on the throne in 2012, no one could have imagined the changes that were being designed in UK computer labs that year. The electronic computer was still very new; discounting wartime electronic machines, the first electronic 'stored program control' computers had only come into operation in 1948. During 1952 the total of UK computers rose by 50 per cent as two new British machines were brought into operation - Elliott Automation's 'Nicholas', and Birkbeck College's 'APE(X)C'.
These early electronic machines used what we now call 'first-generation' computer technology - 'delay lines' or cathode ray tubes for memory, and electronic valves for switching. In 1952 there were crucial developments that brought into play small memory devices and printed circuit boards, and also major advances in software. Perhaps as important, given that early computers were expensive at a time of fiscal austerity, was that they began to be used for commercial applications.
Elliott Automation was for a while one of the most influential - and certainly the most prolific - of the early British computer manufacturers. Its background in military technologies such as shell trajectory calculations ('fire control'), radar, and automation led it directly into the exciting new world of electronic computers, and it made machines for all these uses.
Later, it also tackled the new commercial market for electronic data processing machines, moving computer applications out of the scientific and military spheres and into everyday businesses. Elliott Automation's Nicholas machine was a one-off, designed and built quickly for a 'guided bomb project'. It was both physically and 'logically' a small machine, developed by the same team that had previously designed the Elliott Type 152 naval gunnery control computer. The machines shared design features such as the structure of the 'word' length they used. The 152 went out of service in 1952 as the Elliott Theory Department's attention turned to the Nicholas. The latter's use of hardware was economical, so the instruction set was limited to just 30 instructions.
This instance of 1952 design philosophy anticipated the 'RISC' (reduced instruction set computing) machines that emerged in the 1970s, based on the idea that a processor supporting a small set of simple machine instructions can be made simpler and faster - and thus more efficient in running application software - than one supporting a complex instruction set.
By the standards of 2012 Nicholas was most certainly small in capacity - its memory was 1kB of 32-bit words and it had no secondary memory. It did, however, use quite small nickel delay lines rather than the large, unwieldy and temperamental mercury delay lines used on other early computers.
The hardware designer of Nicholas was Charles Owen, who later left Elliott and joined Ferranti, another British defence and electronics company active in computer manufacture in the 1950s. At Ferranti Owen developed plug-in hardware components which 'captured' engineering design expertise for reuse in other computer systems. After that he went to IBM, and worked on its System/360 family.
More important, in hindsight, are the software aspects of the Nicholas story. At that time users had to write all their own programs in binary machine code - computer suppliers did not provide software. For the Nicholas machine, however, Elliott Automation developed a program called 'Translation Input', which allowed users to prepare their programs in an alphabetic code that it transformed into binary machine code. Such software soon became known as 'assemblers'.
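No details of Translation Input's actual notation are given here, so the mnemonics, opcode numbers and word layout below are invented for illustration; this is only a minimal sketch of what such an assembler did - turning alphabetic codes into binary machine words:

```python
# Toy illustration of an early assembler: translate 'MNEMONIC address'
# lines into binary machine words. The mnemonics, opcode values and
# 32-bit word layout are invented, not Nicholas's actual instruction set.

OPCODES = {"LOAD": 0b00001, "ADD": 0b00010, "STORE": 0b00011, "JUMP": 0b00100}

def assemble(line: str) -> int:
    """Pack one instruction into a 32-bit word: a 5-bit opcode in the
    top bits and the operand address in the low 27 bits."""
    mnemonic, addr = line.split()
    return (OPCODES[mnemonic] << 27) | (int(addr) & 0x7FFFFFF)

program = ["LOAD 100", "ADD 101", "STORE 102"]
for line in program:
    print(f"{line:<10} -> {assemble(line):032b}")
```

The point is the translation step itself: the programmer writes the alphabetic form on the left, and the machine stores only the binary word on the right.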
The Translation Input software was developed by Ruth Felton. One of her colleagues, Ed Hersom, recalls that "a visiting IBM executive who was touring computer installations worldwide told us ours was the first site he had seen where such a program was actually in use". Ruth's husband, George Felton, also worked on the project, developing a series of software sub-routines for users.
"Nicholas was relatively easy to program, and we developed things for it including floating-point routines," George Felton recalled. "The result was that we built a library of routines such as floating point. The following year we also had a matrix interpretive scheme so that we could do operations on modest-sized matrices."
Another female programmer at Elliott was Dina Vaughan, who joined the company in 1954. In 1958 she left to found Vaughan Programming Services, the first British software house.

The Nicholas machine found a novel application in 1955 - predicting the result of the General Election for the BBC's election night coverage. A special computer program was written allowing individual constituency results to be entered into the machine as they were announced. The program would then calculate the likely overall result from the accumulating results. Only one glitch occurred, when some duff data got in and it took some minutes to reset and get back on track.
The Nicholas machine was based at Elliott's Borehamwood site, and communication of results and forecast had to be handled over the telephone. The idea of using computers on election night gained much attention, and for the next election in 1959 an Elliott 402 computer was set up in the BBC's election night studio. Such performances did as much as anything else to bring these new machines to public attention.
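The article does not describe the 1955 program's actual method, but a projection of this kind can be sketched with a simple uniform-swing calculation; the figures and the method below are illustrative assumptions, not details of the Nicholas program:

```python
# Toy election-night projection: as constituency results arrive, measure
# the average swing to the leading party against the previous election,
# then apply that swing uniformly to the seats not yet declared.
# Vote shares and the uniform-swing assumption are illustrative only.

def project_seats(declared, undeclared_prev):
    """declared: list of (previous_share, new_share) for one party in
    seats already counted. undeclared_prev: that party's previous-election
    shares in seats still to come. Returns its projected seat total."""
    swing = sum(new - prev for prev, new in declared) / len(declared)
    won = sum(1 for _, new in declared if new > 0.5)
    projected = sum(1 for prev in undeclared_prev if prev + swing > 0.5)
    return won + projected

declared = [(0.52, 0.55), (0.48, 0.50)]   # invented example results
undeclared_prev = [0.49, 0.40, 0.51]
print(project_seats(declared, undeclared_prev))
```

Each incoming constituency result refines the swing estimate, which is why the forecast improved as the night went on.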
First steps in academic computing
Another British computer to be announced to the world in 1952 was developed at Birkbeck College, London. APE(X)C's full name is the All-Purpose Electronic Computer, with the 'X' standing in for a machine identifier, in this case meaning X-ray for the mathematical work it did on X-ray crystallography. The machine's creator was Andrew Booth, a mathematician with special interests in both mathematical theory and machinery. Booth was appointed to Birkbeck in 1945 where he initially developed an electro-mechanical calculator, but then began to conceive of developing an electronic computer.
Again, this was to be a small machine, built quickly with limited resources. It first successfully ran a program on 2 May 1952. As the college's annual report noted: "It has shown the expected speed of about several hundred times as fast as mechanical methods, but has exceeded expectation in its reliability and freedom from breakdown." An improved model, APE(R)C, was soon manufactured and delivered to the British Rayon Company, the 'R' standing for 'Rayon'. Like the Nicholas, the APEC machines came with a combination of hardware and software advances. The machine deployed one of the first magnetic rotating drum memories - a forerunner of standard disk storage. Booth also made important advances in designing the logic for the multipliers used in computer arithmetic units, logic which still forms the basis of modern design.
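Booth's multiplier logic survives today as 'Booth's algorithm', which multiplies signed numbers by inspecting successive bit pairs of the multiplier and adding or subtracting the multiplicand before each shift. A minimal Python sketch of the radix-2 form follows; the register widths and two's-complement handling are illustrative, since hardware implements the same steps directly in logic gates:

```python
# Booth's multiplication algorithm (radix-2), as devised by Andrew Booth.
# Registers: A (accumulator), Q (multiplier) and an extra bit Q_1 form one
# combined register that is arithmetically shifted right each cycle.

def booth_multiply(m: int, r: int, bits: int = 8) -> int:
    """Multiply signed integers m and r, each representable in 'bits' bits."""
    mask = (1 << bits) - 1
    sign = 1 << (bits - 1)
    A, Q, Q_1 = 0, r & mask, 0
    M, neg_M = m & mask, (-m) & mask
    for _ in range(bits):
        pair = ((Q & 1) << 1) | Q_1
        if pair == 0b01:                 # 01: add the multiplicand
            A = (A + M) & mask
        elif pair == 0b10:               # 10: subtract the multiplicand
            A = (A + neg_M) & mask
        # arithmetic right shift of the combined A:Q:Q_1 register
        Q_1 = Q & 1
        Q = ((Q >> 1) | ((A & 1) << (bits - 1))) & mask
        A = (A >> 1) | (A & sign)        # shift A, preserving its sign bit
    result = (A << bits) | Q
    if result & (1 << (2 * bits - 1)):   # reinterpret as two's complement
        result -= 1 << (2 * bits)
    return result
```

The trick is that a run of 1-bits in the multiplier costs only one subtraction and one addition rather than an addition per bit, which is why the recoding remains the basis of fast multiplier design.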
Kathleen Britten, who became Booth's wife, was also a mathematician. She did a lot of the programming for the APEC machines and later wrote a book on programming. In the late 1950s she worked on programming neural networks to help understand animal pattern recognition, and then went on to work on character-recognition software. The APEC machines became the basis for the computers developed by another company, BTM, for commercial applications. BTM, which stands for British Tabulating Machinery, was the main supplier of commercial punched-card data storage and processing systems in Britain. It saw the potential for electronic computers in supplementing - and supplanting - its products and adapted Booth's designs into its HEC (Hollerith Electronic Computer) range.
The HEC became probably the best-selling British computer by the end of the 1950s, and formed the basis of the ICT 1200 range when BTM merged its computer operations with Powers Samas in 1959.

The computers of the early 1950s were 'first-generation' machines using electronic valves for switching and delay lines for memory. 1954 and 1955 saw the arrival of the first transistor switches, the introduction of core memories and the development of 'second-generation' computers; but no sooner had those technologies gone into commercial use than they were being declared old technology. The subsequent 'third generation' of computers would depend on integrated circuit transistor technology for both switching and storage; however, this technology would not actually become commercial until the early 1960s.
In fact, the concept of the integrated circuit was not new even to the 1960s. British radar engineer Geoffrey Dummer made a revolutionary claim at a technical conference in the US in May 1952. "With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires," he said. "The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers."
In this short passage, Dummer introduced the world to the idea of the integrated circuit; unfortunately he lacked the backing to turn it into a reality. He did, however, create a small project at defence and electronics company Plessey to construct a model formed from a block of semi-conductor material that was 'doped' and shaped as four transistors. The concept resurfaced in the US two years later when Jack Kilby patented his ideas for integrated circuits, which were turned into commercial reality by Robert Noyce, a co-founder of Intel.
Dummer later recalled: "It seemed so logical. We had been working on smaller and smaller components, improving reliability as well as size reduction. I thought the only way we could ever attain our aim was in the form of a solid block. You then do away with all your contact problems, and you have a small circuit with high reliability."
Dummer did his best to promote the integrated circuit as an important future technology which British industry and government should follow up, but was unsuccessful. "I have attributed it to war-weariness, but that is perhaps an excuse," he said. "The plain fact is that nobody would take the risk. This Ministry wouldn't place a contract because they hadn't an application. The applications people wouldn't say we want it because they had no experience. The Americans took financial gambles, whereas this was very slow in this country."
No patents were registered for what was to prove a world-shattering invention, so when Kilby moved on to the same technological territory it was in effect up for grabs. Not for the last time did the lack of commercial acumen in the British technology industry give away the technological lead - and immensely valuable intellectual property.