Computer science education looks at its principles

The IT industry wants students to learn computer science, not spreadsheet skills, at school. But is it possible to teach long-lasting principles?

Education and computing do not sit well together. Unlike most school subjects, which can proceed more or less unchanged for decades, computing moves fast. In his MacTaggart lecture at the Edinburgh TV Festival last summer, in which he lambasted the British education system for failing to teach children proper computer science, Google executive chairman Eric Schmidt recalled: "To relay information to my first computer I had to punch holes in cards."

It was a similar story for Professor Steve Furber of the University of Manchester, who led a Royal Society team that put together a report calling for a radical overhaul of the computer-related teaching UK schoolchildren receive. Furber was a member of the Acorn group that built the BBC Micro, used by many children of the 1980s in what now looks like a heyday for computer science in schools, before the subject regressed into teaching basic spreadsheet skills. Happy experiences with these machines, which were designed to interface with the real world through custom I/O ports, provided the inspiration for today's barebones personal micro, the Raspberry Pi.

"It's interesting how the Raspberry Pi has generated so much buzz recently. And there a lot of mentions of the BBC Micro around it," says Furber.

Educational policy is struggling to keep up, and there is no shortage of groups demanding change. Attempting to influence policy and day-to-day education are efforts such as e-Skills' Behind the Screen; Next Gen Skills, chaired by games industry veteran Ian Livingstone; Computing at School, which is backed by Google and Microsoft Research; and the Royal Society's 'Computing in Schools' report. To varying degrees, these groups want computer science to be made a central subject in schools.

The Next Gen Skills report written by Livingstone and Alex Hope explained why the games industry wants more graduates suitable for its needs. But there is a warning in the Livingstone report: when trying to meet the needs of business, education can get it badly wrong. In the competition for students and their associated fees, colleges and universities have developed apparently attractive vocational courses that turn out not to be fit for purpose.

"There are already many university courses purporting to provide specialist training for video games and visual effects. But most of these courses are flawed, leaving those graduating from them with poor job prospects," wrote the authors of the Next Gen Skills report.

Although video games production, like any other industry, relies on a mixture of specific skills, core subjects play a vital role, and it is in these that education seems to be failing most. "We have found a worrying lack of understanding of the importance of maths, physics and art," the report added.

A review of international computer science courses and qualifications taught in schools, conducted by Simon Peyton-Jones of Microsoft Research and colleagues, found that many concentrated on specific languages and practice rather than principles.

Changes on the horizon

It is a fairly safe bet that programs written in C++ will still be widespread, if not in the majority, by the time today's secondary-school intake graduates. But the coming decades are likely to see massive changes in the way computers behave, changes that could have dramatic effects on programming at the language level.

The Pi, based on the same kind of ARM processor as that found in an entry-level smartphone, is, like the BBC Micro, a product of its era. By the time the age group at which it is aimed could conceivably have graduated from university, the Pi of the day could look very different. The first-release Pi has one ARM processor core. The Pi of 2022 is likely to sport several hundred, and might not even use the Boolean logic to which today's computer programmers are accustomed.

At the Design Automation and Test in Europe conference in March, Professor Leon Chua of the University of California at Berkeley explained how the axons and synapses of the brain could be represented as memristors, devices that he proposed 40 years ago but which only became practical to make artificially within the past decade.

"The memristor can learn by itself. That's why I say you will see smart machines coming out in the next ten years or so that are truly intelligent and they will be so small," Chua claimed.

Speaking at a conference organised by the Computing at School group last year, Peyton-Jones said: "We are trying to prepare people for a world that doesn't yet exist, requiring technologies that have not yet been invented to solve problems of which we are not yet aware."

The key, said Peyton-Jones, is not to teach programming skills per se but "long-lasting, slow-dating disciplines. Skills date far more quickly".

Peyton-Jones described a conversation with a Department for Education official who asked whether there was really a core body of knowledge in computer science that does not change from year to year. "It was a surprise to him that there is," said Peyton-Jones.

At the same meeting, veteran computer science researcher Tony Hoare, who provided the inspiration behind the parallel-programming model of the Inmos Transputer, explained how concepts that date back to the work of Aristotle and Euclid still apply today.

"I think Aristotle was a computer scientist before computers were invented, although I wouldn't recommend teaching Aristotelian logic today.

Hoare added: "Euclid was the designer of the world's first programming language. I'm very interested in Euclid because I have made the idea of writing programs together with their proofs my research goal throughout my life. I have pursued this ideal without realising that Euclid had already achieved it. So, you begin to see the advantage of teaching principles and ideas that have stood the test of time."
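Hoare's ideal of a program written together with its proof can be glimpsed in a modern proof assistant. The sketch below, in Lean 4, is purely illustrative: the names are invented for the example, and it assumes a recent toolchain in which the 'omega' tactic for linear arithmetic is available.

    -- A tiny program: double a natural number.
    def double (n : Nat) : Nat := n + n

    -- Its proof, shipped alongside the program: double n really is 2 * n.
    theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
      unfold double   -- the goal becomes: n + n = 2 * n
      omega           -- linear arithmetic over Nat closes it

Once the theorem is accepted, the machine has checked the program against its specification, which is one reading of what Euclid's propositions-with-constructions were doing.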

Stan Williams, who led the team at HP Labs that built the first deliberate demonstration of a memristor and who is now a vice president of the company, has set his sights not so much on artificial brains as on replacing transistor-based logic. The HP Labs researchers have found themselves going back to older forms of logic to try to build a framework in which memristors can operate.

The logic that suits memristors is not Boolean, Williams says. Its roots lie in material implication, a form of logic developed by Bertrand Russell, and it is potentially simpler to use, although it behaves differently from the digital logic today's programmers handle.

"Material implication is computationally complete: you can perform universal computation using memristors," Williams claims, adding that he expects memristors to start replacing transistors for logic later this decade.

Groups such as Computing at School have put together proposed skeleton curricula that put more emphasis on principles, although not necessarily reintroducing the Euclidean geometry that disappeared from schools in recent decades. There remains a question of whether computer science should be taught separately or subsumed into regular learning.

Computational thinking

Backed by companies such as Google and Microsoft, Jeannette Wing of Carnegie Mellon University has proposed the idea of teaching 'computational thinking', in which students attack real-world problems using algorithms, although Furber warns that more research is needed to determine whether it will work in practical education.
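As a flavour of what such an exercise might look like (our illustration, not Wing's), a class could turn the guessing game 'pick a number between one and 1,000' into an algorithm, and discover that halving the range on every guess finds any number within ten attempts:

    # Find a secret number by repeatedly halving the search range.
    def guesses_needed(secret: int, lo: int = 1, hi: int = 1000) -> int:
        count = 0
        while lo <= hi:
            count += 1
            mid = (lo + hi) // 2
            if mid == secret:
                return count
            if mid < secret:
                lo = mid + 1
            else:
                hi = mid - 1
        raise ValueError("secret lies outside the range")

    # The worst case over all 1,000 secrets is 10 guesses, about log2(1000).
    print(max(guesses_needed(s) for s in range(1, 1001)))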

Conrad Wolfram, cofounder of the maths-software company Wolfram Research, has argued that maths could be taught far better by making more use of computers - students would program functions and graphs to explore how concepts such as calculus work.

"The problem we have got in maths education is not that computers dumb it down. It's that it's dumbed-down already," Wolfram says, claiming the use of computers'make it possible for younger students to attack more practical, relevant problems.

Furber argues there is a need for computer science to remain separate: "Computer science is an academic discipline in its own right. If you diffuse that message and say you don't need computer science as a standalone subject then I think you lose that message. That's not to say you can't do useful things across the curriculum. And it may be that the maths teachers are best placed to talk about algorithms and programming.

"But teaching should take into account that there are other computing systems out there, such as biological ones. It's important that computer science isn't just about the machines you can buy today, it's about where it's going and what might have to say about other systems, such as biology."

Even Chua's thinking memristors take a conventional approach to computation compared with the mathematical leaps required to make use of quantum algorithms. Defining the principles that will get students on the road to developing those systems will take some work.
