Modelling technology transforms everything from design to education
From elections to medicine, networks to roads, computer modelling and simulation are changing how we analyse and how we innovate
The Solar Energy Research Institute uses live data to predict how much energy photovoltaic panels will capture
Float an idea: Olin College’s Sailbot project was developed using simulation tools to hone scientific techniques
Statistical forecaster: Nate Silver’s presidential election forecast got closer than most
Apposite’s Linktropy can emulate bandwidth, latency, congestion
Computer modelling, simulation and emulation techniques are changing the way we understand the world, as well as the control-strategy innovations that underpin the systems we use.
Hours after the polls closed in the US, as results came in from key swing states, it became clear that Barack Obama would claim a second term as President. Although billed as a neck-and-neck race, the outcome closely matched the predictions of a computer model built by statistician Nathaniel 'Nate' Silver.
The model, updated with each new round of polls, was designed to predict how public opinion would translate into the popular vote, and ultimately into the Electoral College votes that determine who becomes the next American leader.
Even after Barack Obama's nervous performance in the first debate, and against the overwhelming narrative of a dead heat with Republican Mitt Romney, Silver's model consistently predicted a comfortable victory for the incumbent. It demonstrated that the computer model, which had accurately predicted previous election outcomes, was still on form, some 60 years after computer technology was first used to foretell an electoral outcome in 1952.
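The core idea behind such a forecast can be illustrated with a toy Monte Carlo simulation. This is not Silver's methodology, and the states, poll leads and error figures below are invented for illustration only: each run draws a plausible result per swing state from its poll average, then counts electoral votes.

```python
import random

# Hypothetical polling data: electoral votes, incumbent's poll lead in
# points, and poll standard error. All figures are illustrative.
STATES = {
    "Ohio":     (18, 2.0, 3.0),
    "Florida":  (29, 0.5, 3.5),
    "Virginia": (13, 1.5, 3.0),
}
SAFE_INCUMBENT = 237  # electoral votes assumed safe before the swing states

def simulate_election(rng: random.Random) -> int:
    """Draw one plausible outcome; return the incumbent's electoral votes."""
    votes = SAFE_INCUMBENT
    for ev, lead, stderr in STATES.values():
        if rng.gauss(lead, stderr) > 0:  # incumbent carries the state
            votes += ev
    return votes

def win_probability(n: int = 10_000, seed: int = 1) -> float:
    """Fraction of simulated elections the incumbent wins (270+ votes)."""
    rng = random.Random(seed)
    wins = sum(simulate_election(rng) >= 270 for _ in range(n))
    return wins / n

print(f"P(incumbent win) = {win_probability():.2f}")
```

Running thousands of such draws turns a set of noisy polls into a single probability, which is why a model can call a "neck-and-neck" race comfortably for one side.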
As Romney was conceding late on the first Tuesday in November, the results of a much larger software model were kicking into action. Trucks were rolling out of Tesco's huge warehouses to deliver tonnes of foodstuffs to the supermarket chain's 3,000 outlets around the UK, their contents determined by the output of a statistical model that takes into account the local weather, local buying habits and the day of the week.
"If you can't buy strawberries in your local store on a given day, it is probably our fault," admits Duncan Apthorpe, manager of the supply chain systems development programme at the giant retailer. "Our job is to improve the systems that figure out how much stock to deliver to each Tesco branch overnight."
To work out how to ship stock from its warehouses to close to 3,000 stores across the UK, Tesco employs two IBM mainframes and a Teradata data warehouse system that stores billions of transactions. "It pretty much contains a complete record of what we have done," says Apthorpe. His team has mined years of accumulated shopper data, combining it with historic weather records to work out how weather changes purchasing behaviour. Some of the conclusions are not unexpected: as temperatures rise, so do sales of barbecue steaks and salad items.
"If you just take a 10°C rise in temperature, you are looking at a 300 per cent rise in sales of barbecue meat, but a quarter fewer Brussels sprouts," Apthorpe reckons. But hidden within the data are more subtle phenomena, such as the "first hot weekend effect": given two consecutive warm weekends peaking at the same temperature, salad sales tend to be higher on the first. "It is only in the past year that we've got onto this effect," reports Apthorpe.
The core of the system is a regression model, built in MathWorks' Matlab, that relates weather to sales and works out which changes make consumers change behaviour. "Temperature and sunshine give the strongest effects," Apthorpe explains.
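The shape of such a model can be sketched in a few lines. Tesco's production system is built in Matlab with far richer inputs; the figures below are invented, and this is only an ordinary least-squares fit of daily sales against temperature and sunshine, used to forecast demand for tomorrow's weather.

```python
import numpy as np

# Illustrative historical data: daily temperature (°C), sunshine (hours)
# and units sold of a weather-sensitive product.
temperature = np.array([12.0, 15.0, 18.0, 21.0, 24.0, 27.0])
sunshine    = np.array([2.0, 4.0, 5.0, 7.0, 9.0, 10.0])
sales       = np.array([100.0, 150.0, 190.0, 260.0, 320.0, 380.0])

# Design matrix with an intercept column, fitted by least squares.
X = np.column_stack([np.ones_like(temperature), temperature, sunshine])
coeffs, *_ = np.linalg.lstsq(X, sales, rcond=None)
intercept, per_degree, per_hour = coeffs

def predict(temp_c: float, sun_hours: float) -> float:
    """Predicted demand given tomorrow's forecast weather."""
    return intercept + per_degree * temp_c + per_hour * sun_hours

print(f"Forecast for 25°C, 8h sun: {predict(25.0, 8.0):.0f} units")
```

The regression coefficients quantify exactly the kind of sensitivity Apthorpe describes: how many extra units each degree of warmth or hour of sunshine is worth.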
Radar weather prediction
While Tesco is restocking its stores overnight based on the weather forecast, halfway around the world Andre Nobre and colleagues at the Solar Energy Research Institute of Singapore (SERIS) are using live data collected from weather stations to drive a model that predicts how much energy photovoltaic panels across the island will be able to capture over the coming hours. The results are visualised using National Instruments' DIAdem data management and processing software.
Doppler radar can capture clouds as they head towards the island. The model uses that information to work out how solar yields will fall as the clouds pass over different PV installations on the island. In 2013, the radar data will be backed up with real-time measurements by more than 70 tiny embedded computers scattered across Singapore, which phone in their readings using the 3G wireless network.
Nobre explains that the model can be used to inform the utilities of what they need to do: "We can say, 'In two hours there will be no solar irradiance on the other side of the island. Be ready: you can't count on solar in two hours'."

Models provide a way of analysing what is happening inside complex systems, particularly where experiments against control groups are impractical because the systems are too big, or where such experiments would be unethical.
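The essence of that forecast can be sketched as cloud advection. This is an illustration only, not the SERIS model: a single cloud tracked by radar is moved along its measured velocity, and the clear-sky yield of a PV site is derated while the cloud covers it.

```python
from dataclasses import dataclass
import math

# Illustrative sketch: the real system ingests live Doppler radar and a
# network of 3G-connected sensors. Here one cloud moves in a straight line.

@dataclass
class Cloud:
    x_km: float      # current position from radar
    y_km: float
    vx_kmh: float    # velocity estimated from successive radar frames
    vy_kmh: float
    radius_km: float
    opacity: float   # fraction of irradiance blocked (0..1)

def yield_factor(cloud: Cloud, site_x: float, site_y: float,
                 hours_ahead: float) -> float:
    """Fraction of clear-sky PV yield expected at a site after `hours_ahead`."""
    cx = cloud.x_km + cloud.vx_kmh * hours_ahead
    cy = cloud.y_km + cloud.vy_kmh * hours_ahead
    covered = math.hypot(site_x - cx, site_y - cy) <= cloud.radius_km
    return 1.0 - cloud.opacity if covered else 1.0

# A dense cloud 20 km west of a PV site, drifting east at 10 km/h.
cloud = Cloud(x_km=-20.0, y_km=0.0, vx_kmh=10.0, vy_kmh=0.0,
              radius_km=5.0, opacity=0.7)
for h in (0.0, 1.0, 2.0):
    print(f"t+{h:.0f}h: {yield_factor(cloud, 0.0, 0.0, h):.0%} of clear-sky yield")
```

With these numbers the site is clear now but shaded in two hours, exactly the kind of advance warning Nobre describes giving the utilities.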
"Biological systems are complex but typically we can't measure what's going on inside them. You can't open someone up and put a sensor in their heart to see what is happening. So we have to build models," explains Professor David Gavaghan of the University of Oxford, a member of a group pushing the use of mathematical modelling into biological and medical research.
"The real problem with biological systems is that processes at different scales are all intertwined. A single discrete event might have a really dramatic effect," continues Gavaghan, pointing to the way that the human heart relies on coordination between many different muscle cells synchronised by electrochemical signals. Disrupt those signals momentarily and things can suddenly go very wrong.
"If you get hit by an ice hockey puck at the wrong point of the cycle, it will kill you. It's why they have defibrillators at the side of ice hockey rinks," Gavaghan adds. "Pharmaceutical companies are interested in this work because cardiotoxicity is responsible for the majority of drug failures in clinical trials." The aim is to use computational models of the heart running on high-speed computers to assess, based on cell-level experiments, how the organ will react to a drug long before it becomes a candidate for in vivo trials.
Because of this type of work, mathematical modelling is likely to have a dramatic effect on the way biology is conducted, Gavaghan argues: "At school, you used to do biology because you didn't want to do physics or maths. That's going to change."
Uncovering complex secrets
In mainstream engineering, modelling is becoming a central part of the development process. Although engineers have more latitude to peer deep into electrical and mechanical systems, some are so complex that computer models are needed to make sense of them. Even the combustion engine, which is based on a conceptually simple cycle of events, still holds many secrets.
For decades, engineers have been able to work around the hidden details because they were comparatively unimportant to reliable operation. Now the quest for fuel efficiency demands that the causes of once-rare problems be uncovered. Manufacturers are working to 'downsize' engines in a bid to make them more efficient in everyday use: smaller engines can be kept in their region of highest efficiency for a greater proportion of their time on the road. But smaller engines lack power when higher performance is needed, which is not popular with drivers.
One answer is to move to greater levels of turbocharging; but, in diesel engines, the result can be premature ignition, sometimes leading to catastrophic 'mega-knock', where the pressure inside the cylinder suddenly soars to 300 bar.
Research establishments such as Brunel University have invested in 'glass' single-cylinder engines that use lasers to probe the chemical reactions in real time, with the aim of building better mathematical models of the processes inside the engine. From those they will develop control strategies that prevent the conditions that lead to mega-knock from forming.
Testing of control strategies involves the extensive use of hardware-in-the-loop systems driven by technical computing software, such as MathWorks' Simulink or LabVIEW from National Instruments (NI), to ensure that the models reflect reality. It is easy for hunches about the way a system operates to go badly awry when presented with unexpected real-world data. Chris Washington, senior product manager at National Instruments, says researchers are using programmable hardware, not just microprocessors, to provide faster response times. Experimental vehicles fitted with racks of computing equipment and programmable logic are being driven in different environments to gauge the impact of different control strategies.
"You can measure the piston with microsecond accuracy and use those measurements to initiate fuelling tasks – I/O modules allow us to control the fuel injectors directly," Washington explains. "You can switch strategies for individual cylinders and see the changes."
Simulating the network
Hardware-in-the-loop simulations can be used to troubleshoot existing systems as well as design them. Apposite Technologies, for example, sells hardware that emulates long-distance connections, such as those that pass through satellite networks.
"What we are doing is replicating the conditions of the actual network so that users can test application performance in the lab. If a customer calls up a vendor and says their product isn't working properly over the customer's satellite network, the vendor needs a way to replicate a satellite network to troubleshoot the issue," explains DC Palter, president of Apposite. "They'll use our device to simulate the connection between the different locations, insert that between a live client and server, and see how the actual application runs. This sort of testing can be set up and run in a matter of minutes."
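The effect such an emulator exposes can be shown with simple arithmetic. This sketch is an illustration, not Apposite's implementation: it assumes a 'chatty' application making many small sequential requests, each costing one round trip plus serialisation time, and shows why latency rather than bandwidth often dominates over a satellite link.

```python
# Illustrative model of a request/response application over a WAN link.
# Figures (request count, sizes, RTTs) are invented for the example.

def transfer_time_s(requests: int, bytes_per_request: int,
                    rtt_ms: float, bandwidth_mbps: float) -> float:
    """Rough completion time: one round trip plus serialisation per request."""
    serialise_s = (bytes_per_request * 8) / (bandwidth_mbps * 1e6)
    return requests * (rtt_ms / 1000.0 + serialise_s)

# The same application on a LAN and over a geostationary satellite
# path (~550 ms RTT), with identical 100 Mbit/s bandwidth.
lan = transfer_time_s(requests=200, bytes_per_request=10_000,
                      rtt_ms=1.0, bandwidth_mbps=100.0)
sat = transfer_time_s(requests=200, bytes_per_request=10_000,
                      rtt_ms=550.0, bandwidth_mbps=100.0)
print(f"LAN: {lan:.1f} s, satellite: {sat:.1f} s")
```

An application that feels instant in the vendor's lab can take minutes over the customer's satellite network, which is exactly what inserting an emulated link between a live client and server makes visible before deployment.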
One benefit of model-based design, argues James Truchard, head of National Instruments, is that it relieves developers of having to use low-level languages such as C++ or Java to build functional software when the main aim is simply to get an algorithm working as intended.
"Complexity is putting tremendous demands on designers and testers. We need a higher level of abstraction, and to integrate those abstractions into the design process," declared Truchard at the recent NI Week conference in the company's home city of Austin, Texas.
Jos Martin, principal software engineer at MathWorks, speaking at the company's UK conference in November 2012, said tools such as Matlab "serve as the experimenter's interface. They can continually develop new behaviours and conduct more complex experiments".
Martin adds: "For product development, taking some of these innovations and bringing them into a product, you are no longer talking about individuals but teams. A good example of this is the Chevrolet Volt." This is Chevrolet's electric car, with a claimed extended range of up to 300 miles.
At the same time, educators are beginning to take the view that traditional science and engineering syllabi are inadequate. Ray Almgren, vice president of academic relations at National Instruments, said at NI Week: "Somewhere between dropping our kids at kindergarten and sending them to college we manage to extinguish their interest in science and engineering."
Simulation provides a way to engage students in what scientific techniques can do rather than focusing too much on abstract constructs. Dave Barrett, an associate professor at Massachusetts-based Olin College, says traditional teaching methods for concepts such as calculus boil down to: "Trust us, one day it will be useful. Students sit there thinking 'why am I learning this?'. This approach does not work well."
A lot of the work revolves around model-based design. One project on which Barrett worked was an autonomous sailing boat, or sailbot, developed within just a few months and entered for a championship held in Vancouver earlier in 2012. Olin student Jaime McCandless says: "We based our strategy around parallelising teamwork as much as possible."
Jason Curtis, another student on the team, adds: "We divided into three separate teams: a mechanical team, an electrical team and a coding team."
Using models of the mechanical and electrical system, the coding team was able to test control software for the sailbot before any hardware was ready, says McCandless. "Thanks to this, the very first time the physical boat set sail, she could sail autonomously."
Although some university departments, such as electrical engineering at the University of Manchester and mechanical engineering at the University of Leeds, have embraced similar approaches, this is a slow-moving trend.
Tom Lee, chief education officer at Quanser, a company that develops hardware-in-the-loop tools for education, says: "There are a lot of professors who believe that rigorous theory is a must for engineering."
Modelling does not conflict with the idea of teaching rigorous theory. People such as Mathematica chief designer Stephen Wolfram believe software modelling can bring even abstract concepts to life. In a decade's time, not using a computer to model a problem may seem as bizarre as clinging to the slide rule did in the last century.
New car design drives ahead with modelling
Graphical models provide a way of communicating design intent more easily than raw software code. Chevrolet's parent company General Motors has taken models developed for the electric car and reused them in other vehicles and applications, such as software that uses multiple cameras positioned around the vehicle to detect potential hazards.
"At GM, the aim is to create a global software product line. At the heart of that is a library of very large Simulink models. The library was created by hundreds of engineers and consumed by hundreds more," says Jos Martin, principal software engineer at MathWorks, speaking at the company's UK conference last November.
Reuse is not the only driver for high-level software models in automotive design. Andy Richardson, head of simulation at UK car engineering and manufacturing firm Jaguar Land Rover, says: "Business efficiency is a really key factor. Getting it right first time is crucial to this. We have to reduce the amount of physical testing that we do. When you look at the costs that can be involved, the costs of late changes can be disastrous. Very late changes can run to tens of millions of pounds. A single prototype vehicle costs half a million pounds; we don't want to build too many of those."
Software-intensive modelling avoids the need to build a complete prototype until the design is very close to production and full road tests become vital. The issue for companies such as Jaguar Land Rover now is to find people who are experienced in model-based design. "We have linked with education to build training modules so we could ensure that we have the people to take systems development forward," says Richardson. "Systems engineering underpins the development of new products and underpins how we do simulation."
Predictive modelling of simulated WAN traffic
Back in the 1990s, Sun Microsystems' favourite slogan was "the network is the computer" (credited to John Burdette Gage, employee #21). Even today that's only partially true, but highly distributed processing is set to become the norm.
The European Commission sees much greater use of real-time data from smart sensors sitting alongside roads or in the home, feeding back into smart grids. Crunched by computers distributed around the network, the data will make it possible to advise drivers on how to avoid congestion and to tell home appliances when energy is cheaper. Rolf Riemenschneider, research programme officer for embedded systems and control at the European Commission, says: "The world becomes a global system of systems." The result will be not just an increase in data traffic: time-sensitive data on traffic movements could itself be badly affected by network congestion.
For example, the CLEVER project team, comprising people from the University of Bristol and EDF's R&D centre in Clamart, has run smart-grid simulations on a massively parallel computer to look at how small changes in protocols affect huge networks. The smart-grid model uses a combination of 'actors' – synthetic models of individual units in the network that can be replicated many times in the memory of the parallel machine. The individual models often use techniques such as queueing theory to control how data moves through the network. Similar approaches have been taken by widely used research simulators such as GTNetS, developed at the Georgia Institute of Technology, and the commercial tool OPNET, whose developer is in the process of being acquired by WAN optimisation firm Riverbed.
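The queueing-theory building block behind such actors can be sketched directly. This is an illustration in the actor style, not the CLEVER project's code: each network node is modelled as a single-server M/M/1 queue, and many such units could be replicated to study protocol changes at scale.

```python
import random

def mm1_mean_wait(arrival_rate: float, service_rate: float,
                  n_packets: int = 50_000, seed: int = 0) -> float:
    """Simulate an M/M/1 queue; return mean time a packet spends in the system."""
    rng = random.Random(seed)
    clock = 0.0   # arrival time of the current packet
    depart = 0.0  # time at which the server next becomes free
    total = 0.0
    for _ in range(n_packets):
        clock += rng.expovariate(arrival_rate)   # Poisson arrivals
        start = max(clock, depart)               # queue if the server is busy
        depart = start + rng.expovariate(service_rate)
        total += depart - clock                  # sojourn time in the system
    return total / n_packets

# Queueing theory predicts mean system time 1/(mu - lambda); here 1/(10-8) = 0.5.
print(f"simulated: {mm1_mean_wait(8.0, 10.0):.2f}, theory: {1 / (10 - 8):.2f}")
```

The payoff of the actor approach is that the simulated mean can be checked against the closed-form result for one node, then thousands of such nodes can be wired together to explore behaviour no formula covers.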
Meanwhile, the need to plan for moving big data sets across the wide area creates opportunities for companies like Apposite Technologies: its network emulation appliances reproduce network attributes such as bandwidth, latency, jitter, packet loss, congestion and other link impairments. Such tools are used by carriers provisioning network build-outs, and by large users (big automotive makers, say) that need to plan for large files being moved between R&D sites around the world.
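A back-of-envelope sketch shows why latency, not just line rate, drives that planning. The window size, round-trip time and file size below are illustrative: for a single TCP flow, throughput is capped at window/RTT regardless of bandwidth.

```python
# Illustrative planning arithmetic for a bulk transfer between R&D sites.

def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Single-flow TCP throughput ceiling: window size divided by round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000.0) / 1e6

def transfer_hours(file_gb: float, throughput_mbps: float) -> float:
    """Hours to move a file at the given sustained throughput (1 GB = 8,000 Mbit)."""
    return (file_gb * 8_000) / throughput_mbps / 3600.0

# A default 64 KiB window over a 150 ms intercontinental path:
cap = max_tcp_throughput_mbps(64 * 1024, 150.0)
print(f"throughput ceiling: {cap:.1f} Mbit/s, "
      f"500 GB transfer: {transfer_hours(500.0, cap):.0f} h")
```

The ceiling here is a few megabits per second however fast the link, which is why testing against an emulated high-latency path, rather than the LAN, is essential before committing to a transfer schedule.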