Opinion - First person
Martyn Thomas, consultant and expert witness and James Hayes, IT editor, offer their views on current issues.
We have the technology
IT problems are in the news again. This month it's the cancellation of C-Nomis, the system at the heart of the UK Ministry of Justice's National Offender Management Information System, after costs had nearly doubled.
According to the Guardian, justice secretary Jack Straw is frustrated that so many people in both public and private sectors were taken in by "snake oil salesmen" and not very competent IT contractors. Meanwhile, Joe Harley, CIO at the Department for Work and Pensions, said: "Today, only 30 per cent of our [IT] projects are successful. We need to up the quality of what we do at a reduced cost of doing so."
This is not just a UK problem - such failures are reported around the world - but the constant stream of bad news is damaging and frustrating. Damaging because the failures waste money and delay the introduction of improved services; frustrating because the stories obscure the successes, and because the failures are largely avoidable. The IT Sector Panel (ITSP) of the IET advocates a three-pronged approach to the introduction of new IT, the first rule of which is to keep it simple. Every engineer knows that trying to build the first or the biggest of anything is a risk, and that reliable large systems have usually evolved from reliable small systems.
This means being modest and breaking the application into useful, freestanding components that can be implemented independently. If you want a nationwide or enterprise-wide information system, build it out of local or departmental systems that provide a useful service locally and can share data to give the broader service when required. Focus on the essentials; if you get the architecture right, you will be able to add the 'nice to haves' later.

The sector panel's second recommendation is to take a holistic view, adopting the mantra "every IT project is really a business change project that happens to need IT". You soon realise that the biggest costs and risks lie in getting the business changes right, agreed and implemented, not in the technology.
This realisation leads you to budget properly for the changes - for the planning and negotiation, the interfaces, the prototyping, the staff changes, relocation and training - as well as for the technology. And it leads you to manage the whole implementation project. Above all, it makes it more likely that you will get the IT requirements right before you lock yourself into an implementation contract or select an inappropriate technical solution.

Finally, we believe you should use an architect. Turning user requirements into an unambiguous technical specification is a specialist task, and one that is likely to involve making some high-level design decisions. Civil engineers have a similar problem, which they have solved through a two-stage procurement process. First the customer employs an architect, who works with the users to reveal all the requirements. When this is agreed, the architect helps the customer select the implementation team and manage the costs and timescales. This approach has been recommended for complex IT systems by the Royal Academy of Engineering.

So don't let the headlines get you down. IT has transformed the world in the past 50 years, and this is only the beginning. We have the technology!
Martyn Thomas CBE FREng FIET is a member of the IET's IT Sector Panel
Short, sharp cure for data anxiety
In this era of ubiquitous computing, you could be forgiven for thinking that technophobia - scourge of workplace digitalisation in the 1980s and 1990s - has become a thing of the past. With everyone from cab drivers to silver surfers clamouring for cool web tools and PDAs, a fear of computers appears to have been a transitory aberration. If anything, technophilia seems mandatory as we lug our lappies from hotspot to hotspot.
But although most people seem comfortable with computer technology (within their own comfort level of proficiency), a new affliction has emerged in the form of 'data anxiety disorder' (DAD). This is a condition in which users of enterprise IT systems become irrationally fixated with the notion that, no matter how many copies and backups they have of their critical (and non-critical) data, they won't be able to find it again when they need it. As a result, they habitually make multiple 'spare' copies in order to douse the pangs of DAD.

We've all been there. At various stages of drafting even relatively unimportant documents we'll make interim copies 'just in case'. How many of us bother to delete these copies once the document has been completed and submitted? Hundreds of these useless copies will be included in the backup of the central server that takes place overnight. If the documents in question happen to be big PDFs or PowerPoint files, then terabytes of rubbish data end up getting interred in storage and backup systems. Worse, they hamper the rapid restoration of truly critical data when it is needed.

Data anxiety is, arguably, a condition born of a more overarching überzeitgeist - 'data democratisation'. Employees in many organisations are now able to generate as much data as they want - text and images - and are under no obligation to then 'own' that data or take responsibility for its effects on the enterprise IT systems. Data democratisation means that users of enterprise systems expect the freedom to generate volumes of data as and when they wish. Frantic IT departments may make the occasional swoop on obvious transgressors - staffers who put 100 holiday photos (at 2MB per snap) on the shared drive for colleagues to enjoy, and then forget to delete them afterwards - but it's a futile gesture.
IT departments are overstretched and have more important things to do with their time than go trawling through thousands of volumes in search of superfluous data to purge. An alternative course of action - and one that is being increasingly mooted among IT professionals - is that, just as email inboxes are capped, rank-and-file users should have limits placed on the amount of standard data they are allowed to generate over a given period or project. The Web 2.0 generation of knowledge workers now being recruited will wail and whinge at the prospect of such constraints on their work patterns, while the words 'Welcome to the real world' pop up on their screen savers.