Meet the one and only professor of the art of computer programming: algorithmic artist Don Knuth
“Some of the things people have to do now are pretty boring,” Knuth said at this year’s prestigious IET/BCS Turing Lecture at Savoy Place last Tuesday (1 February). “If 99 per cent of your job is just to figure out how to make library calls and find out what order the parameters come in, I would not want my grandchildren to have to do that.”
Knuth wants programmers to be more literate, and invented a style in which the code is written in such a way that it documents itself for the benefit of anyone who reads it. With “literate programming”, the algorithms captured in each program tell their own stories, becoming pieces of art that can be studied as much as executed. In his multi-volume and ongoing work ‘The Art of Computer Programming’, Knuth continues to collect and document algorithms for study and reworking into new programs. Literate programming, in principle, makes it possible for any piece of source code to become part of a canon of computer programming.
“I am not so much a fan of reusable software as I am of re-editable software,” Knuth says. “Algorithms are not made to be packaged into black boxes, but kept readable so that anyone using the code can understand how it is meant to work.”
Knuth explained in an exclusive interview with E&T before his public session, “You don't realise how a non-literate program affects the form of the program that comes out. When you write a subroutine one thing you have to do is check whether the parameters make sense. If not you have to have some form of error recovery.” The trouble with most subroutine code, he explains, is that there is “more code devoted to recovering from errors than real code. That feels wrong”.
Literate programming does not do away with the parameter checks entirely; instead, they are moved off to a block that is out of the way. “With literate programming, you write a routine and then write a line that says ‘check the parameters for accuracy’,” Knuth explains. “The idea of literate programming is to present a program in a form that a human being can understand. It is a bunch of modules with maybe a dozen lines of code in each. Each has a point that can be described in English, and formally. Tools can then extract a hypertext document that a human can read or something that a compiler can read.”
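The structure Knuth describes can be loosely sketched in a few lines of Python. This is not his actual WEB/CWEB system — the chunk names, and the toy ‘tangle’ and ‘weave’ functions below, are purely illustrative — but it shows the two views he mentions: named modules of a dozen lines or so, each with an English description, from which one tool extracts readable documentation and another extracts code a machine can run, with the parameter checks kept out of the main routine.

```python
import re

# A program as a set of small named chunks, each pairing an English
# description with a few lines of code. References written <<like this>>
# point at other chunks.
chunks = {
    "compute average": (
        "Average a list of numbers, with the checks kept out of the way.",
        "def average(xs):\n"
        "    <<check the parameters for accuracy>>\n"
        "    return sum(xs) / len(xs)\n",
    ),
    "check the parameters for accuracy": (
        "Parameter checking lives in its own chunk, so the main routine "
        "stays readable.",
        "if not xs:\n"
        "    raise ValueError('average() needs a non-empty list')\n",
    ),
}

def tangle(name, indent=""):
    """Expand <<references>> recursively: the view for the machine."""
    out = []
    for line in chunks[name][1].splitlines():
        m = re.match(r"(\s*)<<(.+)>>", line)
        if m:
            out.append(tangle(m.group(2), indent + m.group(1)))
        else:
            out.append(indent + line)
    return "\n".join(out)

def weave(name):
    """Lay a chunk out as prose followed by code: the view for people."""
    prose, code = chunks[name]
    return f"§ {name}\n{prose}\n\n{code}"

exec(tangle("compute average"))  # run the tangled, machine-readable code
print(average([3, 4, 5]))        # -> 4.0
```

In Knuth’s own tools the equivalents are called ‘tangle’ and ‘weave’; the point of the sketch is only that the error-recovery code he complains about still exists, but no longer obscures the algorithm it guards.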
With literate programming, the actual language chosen to write code matters less. Knuth explains he has just one criterion: “I will write in any language that is supported by a good debugger.”
But, despite being introduced many years ago, literate programming has not attracted the esteem to match that of Knuth’s books. “Not enough people have caught onto the idea of literate programming yet,” he admits. “I introduced literate programming when I lectured in London in 1982. I specifically chose England because I didn’t think Americans knew how to write a literate document. So, I decided I would go to England and talk about literate programming.”
He continues: “Today, tens of thousands of people have got it, but it's still a minority interest. The alternatives are good enough to get by with and there is a lot of peer pressure to stick with those methods. There is one program that I wrote that I would not have been able to do with another style of programming. But most programs are not of that intellectual complexity. Ninety per cent of code can be written without literate programming.”
With algorithms packaged into black-box modules, programmers do not have to wade through the parameter-checking code to find the algorithm inside, and can build ever-larger systems. “Algorithms are not as much of the total action as they used to be,” Knuth reflects. “I believe there is more need for algorithms than ever; but everything else is growing at a much faster rate.”
In recent years, because of the shift in focus in programming, Knuth feels more alignment with mathematics than computer science: his ‘Art of Computer Programming’ is published by the American Mathematical Society: “I read maths journals, and see a lot of stuff that I have to put in my computer programming book.”
“In the 1960s, I found myself attending a mathematics lecture, and saying ‘so what, so what, so what’. Mathematics was becoming so detached. It was so involved in generalisations that the mathematicians lost their way. That has now corrected itself. They are making mathematics more constructive and I find that enriching,” Knuth says, adding that he feels that computer science has now headed into more abstract realms itself.
Separating algorithms from computer science may be no bad thing. “An algorithm is a good way to learn something,” he maintains. “If you think about how to write a program for something, it is a really good exercise. Programming is a great discipline for figuring out what you don’t understand.”
A keen musician, Knuth sees a clear connection between algorithms and the constraints that composers embrace. “I am a big believer in the use of constraints in art,” he says. “Musicians embrace constraints.”
Classical counterpoint is often expressed as a set of rules that govern how different melody lines relate to each other. A sonata applies a further set of rules that govern how different parts of the music are organised. “When you have constraints, such as those in sonata form, it brings out your creative side,” says Knuth, pointing to movements such as the loose French collective of artists and mathematicians, Ouvroir de Littérature Potentielle (Oulipo). “Georges Perec’s La Vie Mode d’Emploi has constraints for every chapter.”
Says Knuth, “I could also use a computer and algorithms to help me understand the constraints for a piece of music, such as a sonata in 7/4 time.”
The constraint of time led to Knuth’s creation of the TeX software for typesetting. Realising that it would take him years to complete ‘The Art of Computer Programming’, he wanted to be sure that the look and feel of the last volume would match the first. And so he developed a typesetting system that would provide him with that control. Because it contains formatting instructions designed to ease the display of mathematical formulae, the system and derivatives such as LaTeX are now used widely by scientists.
Although he believes algorithms can help elucidate the world in which we live, Knuth believes that the ability to formulate them is not common in people. Asked whether computer scientists are born rather than made, he responds: “I believe that is true. I don’t know whether the next generation will have more geeks per hundred than now – it’s possible that things happen in the early years that greatly affect that – but throughout my life I’ve noticed that the number has been one out of fifty; amazingly so at one point.
“It was in the late 1970s that I was talking to someone at the University of Illinois,” Knuth continues. “I asked, how many grad students are there in total? The answer was 11,000. Computing science majors? 220. The other 49 are also important. They have their own contribution to make. We will get people who are the bridges between the different approaches.”
According to Knuth, to design good software, “you have to understand how people will use it. To build a good tool for programmers, you have to be sure a programmer will understand it. When it came to typography, we had people who were good at art and letterforms and those who were good at programming. Each group was sympathetic to the needs of the other.”
However, he sees his multivolume opus as being for the dedicated programmer rather than as a cookbook of algorithms for the dabbler. “Someone has to write a book that isn’t for dummies,” says Knuth.
Although the open source movement is preserving a large body of software that can be analysed by future students of computer science, Knuth is concerned at the industry’s lost heritage. “Code is one of the greatest artefacts. People have been pretty good at preserving parts of computers, but bad at saving the source code.”
Knuth points to efforts by organisations such as the Computer History Museum in Mountain View, California to redress the balance. “The museum has been digitising a lot of the early materials. They got the code from Mr [Edsger] Dijkstra for the THE operating system, something he wrote before he considered ‘goto’ statements harmful. The downside is that the optical character recognition software isn’t good enough to recognise the code accurately. It will take dedicated people to fix that.”
What is important to him is that people preserve the art of the algorithm in the face of moves to make software more of a cut-and-paste exercise. Knuth has some sympathy with another band of artists who learned to deal with the constraints of the technology of their time: “When sound came to movies, the people who were making silent films felt it was terrible that their art was going away.”
The 2011 BCS/IET Turing Lecture, ‘An Evening With Don Knuth’, continues on 8 February 2011 in Manchester and 10 February 2011 in Glasgow.