Time to market
The same technology that drives many of the most advanced computerised trading systems is on its way to a computer near you.
When the open-outcry system disappeared from the stock and commodities trading halls around London, it opened the door to a new generation of computer systems.
Instead of relying on the ability of traders in brightly coloured blazers to hear shouts of good and bad deals on the floor, it fell to computers to pull in the many offers, and work out when it was time to buy and time to sell. The traders started to become overseers of the technology, deciding on the rules, rather than doing the deals themselves.
The key to the trading systems is complex event processing (CEP): a technique that provides a computer with the ability to look at hundreds of near-simultaneous actions and work out what to do. Because it is derived from a style of design called event-driven architecture (EDA), some believe that CEP is a technology that can be used well beyond finance.
Matt Davies, director of product marketing at BEA Systems in Europe, the Middle East and Africa, says the two often come together, "but there are cases where you need just one of them".
It is typically the EDA piece that people need, not necessarily the CEP part. The difference is one of volume. "CEP is as much about dealing with the volume of requests as applying the rules," Davies adds.
Petri Arola, senior manager at Detica - a company that implements financial systems for banks - explains why CEP tools are needed. "The event itself is not necessarily so complex," he says. "It may be an order or a trade. They are not in themselves complex things. But each event happens in the context of other events. This event happens after this event, and within the timeframe of another event. What you need to do is entirely dependent on the context."
"What you want to be able to do is take that stream, and maybe multiple streams, and look for time-based patterns within them," explains Giles Nelson, director of technology at Progress Software. "Maybe three credit-card transactions within a period of time; maybe that is more than the historical norm, so you send an alert, maybe to the owner, and ask what is going on."
Why is different software needed for this type of task?
"Traditionally, it was hard to deal with this kind of moving window, especially where you have a large number of events moving through," Arola says. "The complexity and amount of data depend on the type of algorithm you are running. But some of the algorithms are very complex. You need full order-book depth from all the trading venues and you need to be able to process data from a lot of different pricing feeds."
A CEP package performs pattern recognition on the incoming events and filters them so that only aggregate events demand further processing. And it copes with high volumes of events, such as the many deals made in a stock-trading system.
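The filter-and-aggregate step can be illustrated with a minimal sketch. The field names, threshold and trade format below are assumptions for illustration only: many raw trade events collapse into one aggregate event per symbol, and only aggregates that clear a volume threshold pass downstream.

```python
from collections import defaultdict

MIN_VOLUME = 1000  # assumed threshold for "demands further processing"

def aggregate(trades):
    """Collapse raw trade events into per-symbol aggregate events,
    emitting only those large enough to be worth acting on."""
    totals = defaultdict(lambda: {"volume": 0, "value": 0.0})
    for symbol, qty, price in trades:
        totals[symbol]["volume"] += qty
        totals[symbol]["value"] += qty * price
    # Filter: only significant aggregates continue through the pipeline.
    return {s: t for s, t in totals.items() if t["volume"] >= MIN_VOLUME}

trades = [("ABC", 600, 10.0), ("ABC", 500, 10.2), ("XYZ", 200, 5.0)]
print(aggregate(trades))  # only ABC (1,100 shares) survives the filter
```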
"In something like foreign-exchange trading, the event processing is not rocket science, but the volumes are extremely high, and you need to respond extremely quickly," Arola points out. As Nelson says, one of the benefits of CEP is that you can combine streams of events. An example comes from the world of telecommunications, another base of users of the technology. "If you are operating a network, you want to see bottlenecks before customers get affected. Look at the low-level network traffic: is the bandwidth what you are expecting? You can combine that now with other information. Are we delivering the number of short messages that we were expecting? Are we delivering MMS messages? Are they all working together?" These are all vital questions.
The ability to take in many different sources is something that Detica aims to take advantage of in a major project for the Financial Services Authority (FSA).
Charged with regulating financial institutions, the FSA wants a system to watch for signs of abuse in the money markets.
"The FSA project is a big, big win for us where we are using CEP technology," says Arola. The problem with the FSA is the amount of incoming data that it has to process. "The volumes are very high: the FSA needs order-book data to detect certain types of abuse. The approach we are using is to take algorithmic trading technology and turn it on its side."
Arola adds: "We already use the platform to create trading signals. We use the same system to create detection signals. However, it is a bigger problem than a more focused algorithm would be because we have to look across products and at the underlying derivatives. You have to have a whole view. That entails some complexity."
But CEP is the only technology that currently has the answer to the analytical problems that need to be solved, "although we are also using data mining and other tools", Arola reveals.
There are now a number of CEP products on the market, some using proprietary declarative languages to express the rules.
BEA Systems has gone with a Java-based approach. "There is even an Open Source initiative looking at a CEP engine," says Nelson.
"There is a set of tools that goes with this," Davies says. These tools are needed to capture and express the rules to the runtime engine and do so in a way that does not demand absolute specificity. "You can capture catch-all situations: say 'this is outside the normal boundaries by 15- to 20 per cent', and pass it onto a node where there is an extra check on the data, for example."
"What is common to all the offerings is that you have a CEP engine. It takes in event streams of data and uses analytics to correlate within those streams," explains Nelson.
CEP and SOA
Financial institutions are "pushing the envelope and require more processing capability, but without requiring an exponential increase in hardware costs", writes David Chappell, VP and chief technologist for SOA at Oracle. "The growth of 'extreme transaction processing' (XTP) in areas such as fraud detection, risk computation, and stock trade resolution are pushing current solutions such as those based on the mainframe to the limit," Chappell says in his blog. "These new applications require a new computing paradigm. We're seeing that SOA coupled with XTP is the future for financial services infrastructure as the means to achieve these goals that are often perceived as conflicting."
XTP pertains to a class of applications that must absorb, correlate, and act upon large volumes of data. Chappell adds: "Typically, the data that is processed by XTP applications comes in the form of large numbers of events, and represents data that changes frequently. XTP-style applications require that transactions and computations occur in memory and do not incur a heavy reliance on conventional back-end systems due to the need for extremely fast response rates while still maintaining transactional integrity."
The high volume of events triggered by trades introduces a level of complexity in itself that demands a specialised engine. Arola explains: "When you increase volume and you want to decrease latency, something always gets more complex and harder to do. The tools now help you do more complex processing and help you get the same experience as you would get in conventional message-oriented architectures.
"However, you do have to be diligent about how you manage the events and where they go. To some degree, it is a question of the choice of platform and how well built and robust the platform is, and how skilled the developers are in making use of that platform."
Arola continues: "The platforms are quite proprietary. You can develop in Java. But it is often better working in their own language. However, there is a cost to developer productivity because you have to learn some new development language to make use of these tools."
Although finance has embraced CEP as a technology, it is not used everywhere.
"There is one area of financial services where I have yet to come across a requirement for CEP. That is in insurance. I have yet to see a use-case that is compelling for CEP there, although that may change," Nelson reckons. "The industry has changed and consumer aggregation sites have changed how insurance is priced. The speed of that market has only increased, so I would say that in five years time there will be deployments. But, right now, you don't really need that because you can do what you need to do in a batch-oriented way."
With an event-driven architecture, there are elements within it that are applicable to everybody, Nelson continues, but with CEP "we will have to wait and see. It is not applicable to every situation".
Technologies such as radio-frequency identification (RFID) are likely to drive CEP in distribution and logistics. "There might not be a lot of it, but RFID can provide very timely data," says Nelson. "If you don't process it and use the value now, what is the point in capturing it?"
There are hazards in trying to process large volumes of events. One issue lies in fraud detection.
"If you look at anti-money laundering, the big problem for departments that look after that is that the systems generate too many false positives," according to Arola. However, using the right tools should help cut down on the false positives and identify the right ones more often, or at least significantly reduce the amount of manual interaction needed.
Davies says banks will need to make greater use of the technology to deal with problems such as criminal activity. "Everything will happen a lot quicker. Banks will need to automate things such as anti-money laundering to let it run more quickly."
Is CEP useful everywhere? It has spread out from the financial sphere into telecom and even virtual worlds. The US student-loan provider Sallie Mae, for instance, has adopted the technology for spotting potentially fraudulent loan applications. Games maker BioWare bought into CEP to look out for unwanted game behaviour in massively-multiplayer online games the company is developing. One problem is veteran gamers triggering so-called 'Easter eggs' - surprises buried in the game by the developers - and disrupting the play for other, less experienced players.
CEP can be used to analyse who triggered the Easter eggs to see if there is a pattern: are they doing it repeatedly to get an unfair advantage over others? Linden Lab, the creator of Second Life, has bought a CEP package to help its operations team deal with the huge number of virtual-world events that its grid of computers generates.
As Progress Software's Giles Nelson observes, "the market is nascent. We are just seeing the beginnings. At the moment, it is being driven by specific areas: RFID; finance; telecom. But in a few years, CEP will be much more mainstream".
Oracle's Chappell sees the typical use cases for an SOA grid in building next-generation applications for fraud detection, trade resolution, and risk-management calculations, moved off the mainframe onto low-cost commodity hardware. "When talking about this concept, I often get the question, or the observation, that CEP is positioned to solve some of the same thorny problems of tracking business exceptions such as what is required for fraud detection," Chappell says.
"Actually, XTP is tied to CEP in that they are both about consuming and correlating large amounts of event data and doing something meaningful with it."
However, often the amount of event data that needs to be captured and processed far exceeds the capacity for conventional storage mechanisms, warns Chappell: "In the words of some of our customers, sometimes 'there just isn't a disk that can spin fast enough' for the volumes of data that needs to be processed."