Unlocking life's secrets
Can computers model the functions of the human body?
The pharmaceutical industry is in a bind. It costs close to $1bn to get a single treatment to market, more than double the level of the 1980s. Yet even after expensive trials, drugs such as Vioxx can reach the marketplace with dangerous side-effects undetected. Only once Vioxx, a treatment for osteoarthritis, was in widespread use did doctors discover that it could trigger fatal heart attacks.
The problem is unforeseen side-effects. Drug designers focus on producing compounds that target specific molecules in a cell. But the same drugs can interfere with other parts of the body. At present there is no way to work out all the effects a single drug can have. By building a virtual model of the human body, computer researchers hope to overcome the problem and start to develop drugs in the virtual environment. At a conference on systems biology in Tokyo this year, scientists put their names to a declaration in which they agreed to try to build a 'virtual human' as a computer model.
The declaration read: "The time is now ripe to initiate a grand challenge project to create over the next 30 years a comprehensive, molecules-based, multi-scale, computational model of the human ('the virtual human'), capable of simulating and predicting, with a reasonable degree of accuracy, the consequences of most of the perturbations that are relevant to healthcare."
Even for a grand-challenge project such as this, 30 years sounds like a long time. But many of those involved believe it is an ambitious timescale: originally, they were aiming for 2050 as a realistic completion date. Part of the problem with medicine is that the underlying biology is not well understood. Biological textbooks are packed with detail and diagrams that document mind-bogglingly complex cycles of chemical reactions. Yet even that is not enough to begin modelling the body in any detail, because the interactions between all the different systems are not well understood.
Professor Jaroslav Stark, director of the Centre for Integrative Systems Biology at Imperial College (CISBIC), says: "Biology is generating vast amounts of data. What it is finding incredibly difficult is to integrate the data into any understanding of how the elements work together, to understand the biological features.

"We want to know why the cell reacts in a certain way to a pathogen. The standard approach is that you measure all the components and then build a model. The problem is that [with experiments] you often can't generate the data that you want."
Very often the experiments proceed by selectively turning genes off. The biologist then looks to see what happened with a gene knocked out of action. But the practice has severe limitations. "Very few genes work in isolation. If you want to proceed down that line, you need to start knocking out pairs, triples, ten genes at a time. There are so many combinations that you can't be comprehensive at all," explains Stark.
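The scale of that combinatorial problem is easy to check with a few lines of arithmetic. The sketch below assumes the standard estimate of roughly 20,000 protein-coding genes in the human genome (the figure is not from the article):

```python
# How quickly exhaustive knockout experiments blow up: the number of
# ways to knock out k genes at once is a simple binomial coefficient.
from math import comb

GENES = 20_000  # approximate number of human protein-coding genes

for k in (1, 2, 3):
    print(f"knockouts of {k} gene(s): {comb(GENES, k):,} experiments")
```

Single knockouts are feasible; pairs already run to about 200 million experiments, and triples to over a trillion, which is why Stark calls the approach incapable of being comprehensive.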
The danger with the model-centric approach is that it could take decades to build an effective model of any of the big diseases, simply because they involve so many different elements of the body. Even with the volumes of data from biological studies available today, there simply is not enough information to build the components that will go into the eventual virtual human.
"I think the answer is to develop models that overcome the limitations of the data. Build models that allow you to discover hidden effects - the things that you don't measure because it isn't practical or you can't measure them," Stark says.
In this approach, the computer acts as a testbed for theories of how a system inside a cell or organ works. Stark reckons that the act of building a model can point research towards the interactions worth probing, refining the model rather than trying to uncover every secret of an individual reaction. Stark uses the discovery of Neptune as an analogy. In the early 19th century, astronomers were puzzled by the orbit of Uranus, which did not move as expected. "The quality of data got so good that they suddenly realised that the data didn't fit the model. You could say that the model is just wrong - and they thought about modifying the inverse square law of gravity - or hypothesise that there is another planet pulling on it," explains Stark.
By following the model, it was possible to predict the orbit of the mysterious planet tugging at Uranus, and Neptune was uncovered. "It is an example of a model going wrong then allowing you to predict something that hadn't been observed. Sometimes, and I would say often, the best model is not the one that fits the data but one that has discrepancies."
Not all of the models have to run in silico. Researchers are re-engineering cells to try out theories of how biological systems depend on control loops and feedback. This is what Professor Hans Westerhoff of the Manchester Centre for Integrative Systems Biology calls synthetic systems biology, invoking a phrase borrowed from the late physicist Richard Feynman: "What I cannot create, I do not understand."
Scientists such as Lingchong You at Duke University and Alexander Ninfa of the University of Michigan are trying to replicate the behaviour of feedback and clock circuits found in living tissues by combining genes in new ways and transplanting them into simple bacterial cells. They are, in effect, building genetic circuits from scratch in an attempt to reverse-engineer life. "We are using our designs as platforms to look at the behaviour of biological systems," You says. In trying to transplant new functions into existing cells, the scientists find unexpected interactions that sometimes prevent the circuits from operating. And sometimes the circuits function even though part of the new circuit does not actually work, as You found when trying to put an oscillating circuit into cells of the common laboratory bacterium Escherichia coli.
But each of these experiments sheds new light on the behaviour of the cell that informs the design of new computer models. They will also help in designing biological processes that will produce drugs in volume.
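The sort of clock circuit being rebuilt in cells can be caricatured in software. The toy below follows the repressilator design published by Michael Elowitz and Stanislas Leibler - three genes in a ring, each repressing the next - rather than You's actual construct, and all the parameters are invented for illustration:

```python
# Toy repressilator: three genes in a ring, each repressing the next
# (z -| x -| y -| z). Invented parameters; Euler integration.

def step(x, y, z, dt=0.005, alpha=20.0, n=4, decay=1.0):
    """One Euler step: Hill-function repression plus first-order decay."""
    dx = alpha / (1 + z**n) - decay * x
    dy = alpha / (1 + x**n) - decay * y
    dz = alpha / (1 + y**n) - decay * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def simulate(steps=40000):
    x, y, z = 0.5, 1.0, 2.0   # asymmetric start so the symmetry breaks
    trace = []
    for _ in range(steps):
        x, y, z = step(x, y, z)
        trace.append(x)
    return trace

trace = simulate()
late = trace[len(trace) // 2:]   # discard the initial transient
print("protein x swings between", round(min(late), 2), "and", round(max(late), 2))
```

With sufficiently cooperative repression (the Hill exponent `n`), the ring has no stable resting state and the protein levels oscillate indefinitely - the same qualitative behaviour the synthetic biologists try to coax out of real cells.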
However, it may not be necessary for computers to simulate biological behaviour all the way down to the level of individual molecules reacting in a cell. Biologists often view the thousands of chemical reactions mediated by enzymes as forming a large network. The product of one reaction feeds another enzyme, and there is often a choice of destinations. This is what gives biological networks their redundancy and living cells their robustness.
If you block one enzyme, for example, there is often an alternative path that a chemical can take. But all networks have weak points. Westerhoff has worked out a way to identify the parts of a network that are badly affected by a change. He reckons this kind of analysis can identify candidates for drug therapies. Westerhoff tried the technique on the trypanosome parasite that causes sleeping sickness in humans. Although there are drugs that can treat the disease, they are highly toxic. An alternative is to attack the weakest link in the metabolism of the parasite, which in the robustness model turned out to be a set of proteins that carry glucose through the parasite's cell wall.
The team found that blocking the glucose transporter stopped the parasite in its tracks. Although the parasite's biological network shifted in response, the glucose transporter turned out to be vulnerable for a reason: it forms a key part of the trypanosome's lifecycle.
"The trypanosome is fooled into thinking about its life at that point. When the transport inhibitor hits, the trypanosome feels as though it is in a low-glucose environment when it is actually sitting in high glucose. That is the way it feels when the trypanosome goes into the tsetse fly," Westerhoff explains. Westerhoff's next step is to determine whether the drugs identified by his model will work on living cells. That, at least, is how the analysis has to be performed today. As models of the body improve, even that work will move over to the computer, with live tests only being performed once the software has determined how effective and how toxic the new drug might be.
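Westerhoff's analysis works on detailed kinetic models, but the underlying idea can be sketched far more crudely: treat the metabolism as a graph, delete one enzyme at a time, and ask whether the nutrient can still reach the product. The network below is invented for illustration - it is not trypanosome metabolism - though like the parasite it has a single transporter as its only way in:

```python
# Finding a network's weak points: each enzyme converts one metabolite
# into another; a weak point is an enzyme whose loss alone cuts the
# nutrient off from the product. (Invented toy network.)
ENZYMES = {
    "transporter": ("glucose_out", "glucose_in"),     # the only way in
    "enzyme_a":    ("glucose_in", "intermediate_1"),
    "enzyme_b":    ("glucose_in", "intermediate_2"),  # redundant branch
    "enzyme_c":    ("intermediate_1", "pyruvate"),
    "enzyme_d":    ("intermediate_2", "pyruvate"),
}

def reachable(start, goal, enzymes):
    """Simple graph search over metabolites via the remaining enzymes."""
    frontier, seen = [start], {start}
    while frontier:
        node = frontier.pop()
        if node == goal:
            return True
        for src, dst in enzymes.values():
            if src == node and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return False

# Delete each enzyme in turn; keep those whose loss is fatal on its own.
weak = [name for name in ENZYMES
        if not reachable("glucose_out", "pyruvate",
                         {n: r for n, r in ENZYMES.items() if n != name})]
print(weak)  # → ['transporter']
```

The two parallel branches protect every internal enzyme, so the only single point of failure is the transporter at the cell wall - the same structural logic that made the glucose transporter the trypanosome's weakest link.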
Westerhoff believes that these systems-level analyses, which do not need complex models to be run on supercomputers, could provide valuable clues for drug designers. "Drug engineers would engineer systems at the most fragile step, you would think," he says, but that is not necessarily the case, especially in diseases such as cancer, where research has focused on genes found to trigger tumours.
The problem is that such genes are needed for healthy cells as well. A robustness analysis of one of these situations shows that although the tumour results from the gene becoming too active, that overactive step is actually one of the most robust parts of the tumour system. "The tumour is less fragile with respect to this step," says Westerhoff. However, if that step is more robust in the tumour, the chances are that other parts of the cell have become more fragile.
"The oncogene itself is not the right target. But, using systems biology, you should be able to figure out what the target should be," Westerhoff claims. He cautions, though, that robustness analysis is far from being a general-purpose tool for predicting cell behaviour. You cannot simply calculate a robustness score for a changed system by adding together the effects of a set of individual changes in his model.
The danger is that knocking out a bunch of genes will change the whole system radically enough for it to be much more robust than expected. But for drugs that target a single point in the system, it means a lot more work can be done in the computer to pinpoint steps that can be targeted by drug designers long before any experiments have to be done in the lab.
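Why knockout effects refuse to add up can be seen in a toy flux model (all numbers invented): two redundant branches each cause little harm when lost alone, but losing both collapses the system, so the combined effect far exceeds the sum of the individual ones:

```python
# Non-additivity of knockouts: two parallel branches supply a demand of
# 10 units; either branch alone can carry at most 8. (Invented numbers.)

def flux(branch_a, branch_b):
    """Total flux delivered, given which branches are still working."""
    CAPACITY = 8.0   # max flux either branch can carry alone
    DEMAND = 10.0
    available = (CAPACITY if branch_a else 0.0) + (CAPACITY if branch_b else 0.0)
    return min(DEMAND, available)

full    = flux(True, True)    # both branches working
only_a  = flux(True, False)   # branch b knocked out
only_b  = flux(False, True)   # branch a knocked out
neither = flux(False, False)  # double knockout

loss_a = full - only_a        # small individual effect
loss_b = full - only_b        # small individual effect
loss_both = full - neither    # total collapse
print(loss_a + loss_b, "vs", loss_both)  # → 4.0 vs 10.0
```

Each single knockout costs only 2 units because the surviving branch picks up most of the load; the double knockout costs 10, far more than the 4 that naive addition predicts. The same redundancy that makes cells robust makes their responses to combined changes impossible to extrapolate from single ones.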
As science proceeds towards the virtual human over the next 30 years, we are likely to see more of these theories spin off and help drive the engineering of better medicines.