In biology labs around the world, the graduate student or 'lab rat' has long been a byword for drudgery. But the machines have started taking over: biologists are now putting technology's mechanical doggedness to work.
"We are now in the parallel age," says Urban Liebel, a researcher at the Karlsruhe Institute of Technology and one of the founders of Acquifer, a company set up to make robotic microscopes to improve the throughput of experiments that need to perform imaging. The microscope makes it possible to perform hundreds of drug tests on species such as zebrafish, which are widely used in biological research, and process the results automatically.
Imaging is a particularly useful tool for biology for a number of reasons. One is the way that genetic changes can affect cell and organism growth and development. Some changes may be indistinguishable from the normal or 'wild type' organism. Others can cause dramatic alterations in size or shape, if they do not kill the cell entirely. And if fluorescent tagging chemicals are used, image-processing algorithms can quickly pick out the results as bright blooms wherever the right protein is present.
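As a toy illustration of that kind of automated readout (a sketch only; real pipelines use far more robust segmentation and calibration), a program can threshold a grayscale image and count connected bright clusters as fluorescent 'blooms':

```python
def count_fluorescent_blooms(image, threshold=200, min_pixels=4):
    """Count connected bright regions in a grayscale image (nested lists).

    Any 4-connected cluster of pixels at or above `threshold` with at
    least `min_pixels` members counts as one bloom, i.e. one site where
    the tagged protein shows up. A deliberately naive sketch.
    """
    h, w = len(image), len(image[0])
    visited = [[False] * w for _ in range(h)]
    blooms = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not visited[y][x]:
                visited[y][x] = True
                stack, size = [(y, x)], 0
                while stack:  # flood-fill the whole cluster
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and not visited[ny][nx]):
                            visited[ny][nx] = True
                            stack.append((ny, nx))
                if size >= min_pixels:
                    blooms += 1
    return blooms

# Two bright 2x2 squares on a dark 16x16 background -> two blooms
img = [[0] * 16 for _ in range(16)]
for y, x in [(2, 2), (2, 3), (3, 2), (3, 3),
             (10, 10), (10, 11), (11, 10), (11, 11)]:
    img[y][x] = 255
print(count_fluorescent_blooms(img))  # 2
```

In production systems this counting step would typically be done with library routines for connected-component labelling rather than hand-rolled flood fill, but the principle is the same.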
The shift to robotic assistance started at the turn of the millennium – the mapping of the first complete human genome coupled with advances in genetic manipulation changed attitudes to biological research. Sequencing itself has become highly automated, using robots to manipulate samples and apply chemicals that aid the laborious process of breaking up the DNA into measurable segments, analysing the results and then feeding them to a computer to try to piece the jigsaw back together.
For many researchers, cell-level biology is changing from being a qualitative science to a quantitative one. But, to get at the data, biologists need to conduct thousands of individual experiments. Doing as many as they can at once is not just a massive time-saver, it's essential.
Professor Ross King of the Manchester Institute of Biotechnology says: "There is nothing specific about biology that favours the use of robots except there is often a lot more to do in biology. Living organisms are unbelievably complicated machines. They have thousands of genes and thousands of different molecules, all interacting. The experiments you have to do are not that hard to work out but there are lots of experiments to be done."
Chris Bakal, team leader at the London-based Institute of Cancer Research, adds: "DNA sequencing is allowing us to identify many genes and mutations but we still don't know what many of those genes do. We knock out genes in cells one at a time and see the effect of each change on cell shape, but it means imaging millions of different cells."
ICR uses a pair of automated microscopes to study the cells, sending their data to a supercomputer for feature extraction and analysis, together with robots to process the cells and deliver the chemicals used to selectively turn off a selected gene in each one. The result is a massive database that links genes and their protein products to features found in cancers, potentially extending the number of possible targets for future drugs, and combinations of them.
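The shape of such a database can be sketched in a few lines. All names and numbers below are hypothetical, invented for illustration; the real ICR pipeline extracts far richer feature sets on a supercomputer:

```python
# Toy gene->phenotype store: each knockout experiment yields a set of
# measured cell-shape features, which can be compared with wild type.
WILD_TYPE = {"area": 100.0, "roundness": 0.80, "protrusions": 2.0}

knockout_features = {
    "GENE_A": {"area": 101.0, "roundness": 0.79, "protrusions": 2.1},  # near wild type
    "GENE_B": {"area": 180.0, "roundness": 0.35, "protrusions": 7.0},  # strong phenotype
}

def genes_affecting(feature, tolerance=0.2):
    """Return genes whose knockout shifts `feature` by more than
    `tolerance` (as a fraction of the wild-type value)."""
    base = WILD_TYPE[feature]
    return sorted(
        gene for gene, feats in knockout_features.items()
        if abs(feats[feature] - base) / base > tolerance
    )

print(genes_affecting("roundness"))  # ['GENE_B']
```

Queries like this, run across millions of imaged cells, are what turn the raw screening data into candidate drug targets.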
In many of these experiments, robots pick and place specimens under the lens of a microscope or deposit precisely measured amounts of chemical across an array of well plates or test tubes that may be analysed up to a hundred at a time.
Liebel sees clear advantages in repeatability, not just to provide more consistent results between labs scattered around the world but to bring advances in microscopy to a wider audience. The problem is that the microscopes themselves are complex to operate. He says: "Conventional instruments are not designed for reproducibility across different labs. Manufacturers have not focused on creating light sources that are stable over thousands of hours because traditionally we've had experts using them. But in Europe alone we have 6,000 labs and just 100 microscope experts. That blocks our [ability to glean] knowledge.
"We are working on something that works more like an iPhone. It requires both biological understanding and robotics understanding. So, it's a team sport. To build machines like this takes not just one discipline but five," Liebel argues.
As well as automating experiments, robots are being deployed to make the biological parts needed for them, and even to provide a different perspective on experiment design. Along the way they have exposed a number of issues with robotic system design.
At the 2008 International Conference on Systems Biology in Gothenburg, Professor Ehud Shapiro of the Weizmann Institute of Science in Rehovot argued that the combination of robotics and a couple of key reactions that have underpinned the world of genetic engineering could yield what he called a 'DNA word processor'. Many experiments call for libraries of DNA sequences that involve small variations but which are substantially the same.
To a large extent, biologists have been able to rely on splicing existing DNA sequences together using naturally occurring enzymes to make many of them. But for experiments that involve larger changes to DNA, they need artificially synthesised sequences, which are expensive, especially if a large number of variants are needed. Editing smaller sequences makes more sense. So, the concept for Shapiro's DNA word processor turned into a European Union research project called Cadmad, which is refining the system into a machine able to produce DNA libraries on demand in situations where synthesis is not economically viable.
Sequences can be glued together and pieces chopped out, all coordinated by a computer program. Each step takes time but it is possible to have the robots perform many of the operations in parallel so that completed sequences can be produced in less than a day. Systems like these expose issues of repeatability and noise even with machinery that should be able to perform the same procedure the same way day in, day out.
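The word-processor metaphor can be made concrete with a sketch. The operation names, sequences and edit plans below are invented for illustration; real systems drive enzyme reactions, not string operations:

```python
def splice(seq, pos, insert):
    """Insert `insert` into `seq` at index `pos` (a stand-in for an
    enzyme-mediated ligation step)."""
    return seq[:pos] + insert + seq[pos:]

def chop(seq, start, end):
    """Remove seq[start:end] (a stand-in for a restriction-digest step)."""
    return seq[:start] + seq[end:]

def apply_plan(seq, plan):
    """Apply an ordered list of edit operations to one sequence."""
    for op, *args in plan:
        seq = splice(seq, *args) if op == "splice" else chop(seq, *args)
    return seq

# Each variant is an independent list of editing steps, so a robot
# could run the plans side by side in separate wells -- the 'parallel'
# execution that gets a library finished in under a day.
base = "ATGGCCTAA"
edit_plans = {
    "variant_1": [("splice", 3, "GGG")],
    "variant_2": [("chop", 3, 6), ("splice", 3, "TTT")],
}
library = {name: apply_plan(base, plan) for name, plan in edit_plans.items()}
print(library)
```

Because each plan touches only its own well, the scheduler is free to interleave the physical steps of many variants at once.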
King says: "Mostly the noise is the problem. Biological systems are incredibly noisy to start with. Even with experiments done by a robot, which are very accurate and very uniform, we would find experiments not working for a week or so at a time. It was hard to figure out why."
The source of the noise is not necessarily the robot itself but the supply of key chemicals: subtle changes in composition can easily throw the results out of line. Mechanical lab assistants can also be brittle and easily confused compared with their biological counterparts. For example, Ginkgo Bioworks, a biotech offshoot of the Massachusetts Institute of Technology, found its chemical-handling robot could be easily fooled by air bubbles when liquid levels ran low. This led to the robot failing to flush pipes correctly, causing errors downstream until new maintenance protocols were introduced.
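One hypothetical software guard against this failure mode (the function names, tolerances and readings here are invented; they assume an independent level sensor that many current robots lack) is to compare each commanded transfer with the sensed volume and flag probable air draws:

```python
def dispense_ok(commanded_ul, sensed_ul, tolerance_ul=0.5):
    """Return True if the sensed liquid transfer matches the command
    to within `tolerance_ul` microlitres."""
    return abs(commanded_ul - sensed_ul) <= tolerance_ul

def check_run(transfers):
    """Return indices of transfers that look like bubbles or short
    draws, so the run can be paused before errors cascade downstream."""
    return [i for i, (cmd, sensed) in enumerate(transfers)
            if not dispense_ok(cmd, sensed)]

# Third transfer drew mostly air: 10 uL commanded, only 3.2 uL sensed.
log = [(10.0, 9.8), (10.0, 10.1), (10.0, 3.2)]
print(check_run(log))  # [2]
```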
Tuval Ben Yehezkel, visiting scientist at the Weizmann Institute of Science, says: "In my experience robots are accurate if you optimise and tune them in advance to the exact application you intend to run on them. If you change something, even seemingly harmless things like small changes to chemicals, it may interfere with robot accuracy. Every robotic script has to be tested before it's run and very little, if anything, should be changed compared to the test run in order to achieve the best results."
The lack of internal sensors to monitor the transfer of liquids can be an issue with today's robot designs, Ben Yehezkel says, adding that he is surprised by "how little traditional, macro-volume – that is, above 1µl – liquid-handling robot technology in the lab has advanced during the past decade".
The need for repeatability and consistent results drove a number of design decisions in the Acquifer microscope. The most important is the use of internal sensors to monitor its performance: "The machine is self-aware," Liebel says.
The team opted to use a linear motor, which Liebel says is unusual for this type of instrument, because of its consistency under computer control. LEDs change colour subtly with rises and falls in temperature, so the microscope uses a Peltier-effect device in concert with the sensors to maintain a consistent level of heat and light output.
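The compensation loop can be sketched as simple proportional feedback. This is an illustrative model only, with invented gains and a one-line thermal 'plant'; a real instrument controller would add integral and derivative terms and a proper thermal model of the LED:

```python
def peltier_drive(temp_reading, setpoint=25.0, gain=0.5, max_drive=1.0):
    """Proportional control signal for a Peltier element: positive
    drive cools, negative drive heats (sign convention assumed here).
    The drive is clipped to the element's maximum."""
    error = temp_reading - setpoint
    return max(-max_drive, min(max_drive, gain * error))

# Simulate the loop: each step the drive removes heat in proportion
# to the control signal (a crude stand-in for the real thermal plant).
temp = 30.0
for _ in range(20):
    temp -= 2.0 * peltier_drive(temp)
print(round(temp, 2))  # settles at the 25 degC setpoint
```

Holding the LED at a fixed temperature this way keeps its colour and intensity stable, which is what makes images from different runs, and different labs, comparable.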
Liebel's team found some novel approaches to improving repeatability in experiments involving zebrafish. These are small with a reasonably consistent size and shape, but if they appear in random orientations under the microscope someone either has to indicate that to the computer or hope that the image-processing algorithms can cope.
Three-dimensional printing provided a way of making customised well-plate arrays that let the fish slot into the holes neatly. "You can print these tools for €4 each and create shapes that orient something like a zebrafish. It massively reduces the cost of drug screening," Liebel claims.
Integration is one of the biggest issues in building more sophisticated systems. "There is an increasing amount of robotics but it tends to be isolated to one part, with very little integration between robots," says King. "There is a lack of low-level standards for integration. The manufacturers are shooting themselves in the foot a bit here. With better integration they would have more success."
Ben Yehezkel says the control software needs to improve. At Cadmad they have developed much of their own to take recipes for DNA sequences through to robot-control programs. So far they have focused on using MATLAB to build the high-level software and using that to interface to the low-level robot programs. "I would say that user-friendly high-level interfaces for biologists to program robots on a daily basis are the major thing that is lacking," he says. "Typically, nowadays you still need an expert to program an experiment on a liquid-handling robot. This should change. It would make robots much more accessible for biologists."