Saturday, July 31, 2010

Processing Complex Events (During these, oh well, Complex Times)

The worn-out saying about how we learn new things every day applies to this blog topic too. Namely, my interest in Progress Software Corporation has long been due to its renowned OpenEdge development platform. Indeed, many enterprise resource planning (ERP) and other applications providers leverage (embed) OpenEdge as Progress Software partners. Sure, I also follow and have recently written about the company's forays into the service-oriented architecture (SOA) space with its two offerings there: Actional for web services management and Sonic for enterprise service bus (ESB) and messaging.

But in late 2007, out of mere courtesy, I accepted a briefing about Progress Apama, the company’s platform for complex event processing (CEP), algorithmic trading, and whatnot. Given the overwhelming nature (“rocket science” of a sort) of the offering’s concept, I now admit that I could not wait for the briefing to end.

Actually, I felt bamboozled like those ordinary mortal FBI agents in CBS’ primetime hit show “Numb3rs.” In that show, time and again the whiz kid math genius (the brother of the FBI team leader) tries to explain to these action-rather-than-theory agents how some complex and arcane math theory can be applied to make sense out of seemingly chaotic and unrelated events. Eventually, complex math solves some important crimes, often by detecting patterns that are not obvious to the naked eye.

Well, fast forward to early 2009, when at Progress' Analyst Summit (a traditional Boston winter fixture) we could all find out that Progress Apama is possibly the best-performing and fastest-growing part of the company. OpenEdge, while still contributing over 60 percent of Progress' total revenues, is a mature business that is now sold mostly to independent software vendors (ISVs). In addition, the recent financial markets crisis (and the consequent overall economic one) and related cases of high-profile fraud ("white-collar crime") have prompted me to conduct my own study of Apama and become familiar with its underlying concept.

Frankly, I no longer grapple as much with the concept of CEP per se (Progress Software refers to CEP as "The Brains of the High Velocity Business"). Where I still get lost, though, is when it comes to CEP's relationships with other similar technologies and concepts "du jour."

CEP: Complex…What?!

In fact, IT-Director’s article from two years ago confirms that there are various issues that confront the event processing community. There now seems to be common agreement that event-processing as a term should best be used to encompass both CEP and event stream processing (ESP), the latter term arguably applying to events that are not necessarily complex in themselves.

In addition to just what exactly event processing is, and whether and how it differs from operational business intelligence (BI), another vexing issue is how big the event processing market is. For those of you who might want to delve into philosophical discussions about which concept is broader (and which came first) within the alphabet soup of CEP, ESP, SOA, event-driven architecture (EDA), and business activity monitoring (BAM), here is one of ZDNet's blog posts. There is also an excellent blog post on a practical combination of SOA, EDA, and CEP, plus, you can always peruse the Event Processing blog from the horse's mouth (it is maintained by Progress Apama staffers).

Principles of CEP-based Systems

In plain English, CEP lends itself well to any environment that treats each business update as an "event." Such organizations want to enable users to rapidly define event-based business rules to identify patterns indicating opportunities and threats to the business. These encapsulated rules (either as "if-then" statements or Structured Query Language [SQL] statements) are loaded into a real-time computing (RTC) CEP engine.

The correlating engine is permanently connected to multiple event sources and destinations (with volumes of events and related data points) and offers analysis and response within an extremely low latency period. Events can be captured and preserved in time-order for a historical pattern analysis and root-cause analysis (RCA).
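To make this concrete, here is a minimal Python sketch of that pattern: rules registered as if-then (predicate/action) pairs, evaluated against every incoming event, with the events preserved in time order for later analysis. All names here are illustrative; this is not Apama's actual API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Event:
    source: str             # e.g. a market data feed or news feed
    payload: dict           # arbitrary event attributes
    timestamp: float = field(default_factory=time.time)

class SimpleCEPEngine:
    """Toy correlating engine: rules are (predicate, action) pairs
    evaluated against every incoming event."""
    def __init__(self):
        self.rules = []     # encapsulated "if-then" business rules
        self.history = []   # time-ordered event log for root-cause analysis

    def add_rule(self, predicate, action):
        self.rules.append((predicate, action))

    def on_event(self, event: Event):
        self.history.append(event)        # preserve events in time order
        for predicate, action in self.rules:
            if predicate(event):          # the "if" part of the rule
                action(event)             # the "then" part of the rule

# Usage: flag any price update that breaches a threshold.
engine = SimpleCEPEngine()
engine.add_rule(
    lambda e: e.payload.get("price", 0) > 100.0,
    lambda e: print(f"ALERT: {e.source} quoted {e.payload['price']}"),
)
engine.on_event(Event("NYSE", {"symbol": "X", "price": 101.5}))
```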

Given that algorithmic trading in capital markets was one of the first real-life applications of CEP, let’s translate the above general CEP principles into trading terms. The continuing digitization of financial market data and the advancement of electronic market access has created a market environment in which competitive differentiation amongst financial service firms rests with split-second algorithmic execution that can exploit minuscule and momentary advantages in price, time, and available liquidity.

To that end, a trading company will treat any market update as an “event” and will enable users to rapidly build quantitative algorithms (based on their vast experience and know-how) to identify trading opportunities and risk breaches. Germane trading rules are then loaded into a trading system that offers real-time analysis and response with a latency measured in milliseconds.

The trading system is permanently connected to a number of relevant market data sources, news-feeds, and trading venues (exchanges). Finally, events can be captured and preserved in time-order for backtesting and digital forensics analysis.
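And because the captured stream is time-ordered, the very same rule set can later be replayed over it. A hypothetical helper, reusing the SimpleCEPEngine sketched earlier, might look like this:

```python
def backtest(engine: SimpleCEPEngine, recorded_events: list) -> None:
    """Replay a captured, time-ordered event stream through the same
    rule set, e.g. to backtest a strategy or reconstruct an incident."""
    for event in sorted(recorded_events, key=lambda e: e.timestamp):
        engine.on_event(event)
```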

In summary, the drivers for CEP adoption are the following:

* Applications with high throughput and latency requirements. Such requirements stem from market trends such as higher-velocity business event flows, more voluminous (and yet shorter-lived) transactions, and rapidly changing market conditions. These trends in turn pose challenges to customers in terms of how to detect opportunities and threats in real time, and how to show the health of their business; and
* The need for rapid software development and customization, and increasing application complexity (temporal and/or spatial logic, real-time analytics, etc.). The customers’ challenge in this regard is how to accelerate the deployment of new capabilities.

Differing from BI

CEP differs from traditional computing in the requirement for continuous execution of logic scenarios against a huge and continuous stream of information. This sharply contrasts with the traditional query processing model where data must be retrieved from a database based on indexed values. An example of static data processing would be the ability to answer the question: “What were the best performing stocks last week?”

Conversely, CEP is aimed at providing event-driven query and analytic processing (e.g., providing algorithmic financial trading solutions in the financial arena) in real time. Another IT-Director article explains that the conventional query-and-report approaches are only suitable for environments of smaller scale or those in which limited numbers of data feeds are being monitored.

In particular, these reporting solutions cannot handle environments where large numbers of data feeds need to be combined and correlated in a complex and dynamic (on-the-fly) manner. Typical alert engines are only really suited for monitoring individual threshold (trigger) events (and they cannot predict the probability that the process will cross the threshold), while more comprehensive solutions such as conventional database approaches simply lack the capability for real-time processing (RTP).

Even speedy in-memory databases require the data to be committed to the database prior to query processing, and indexes to be updated, both of which inherently mean some time delay. Rather than committing data to a database and then processing it, CEP platforms process data directly as it is fed into the system, without resorting to the use of a database. In principle, all of the resulting database overhead activities are therefore omitted, resulting in better performance and significantly improved scalability.
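One way to picture the difference is a query that runs continuously in memory, producing an updated answer per event rather than per request. The sketch below (an illustration, not any vendor's engine) maintains a running volume-weighted average price entirely in-stream, with no commit or index step between event and answer:

```python
from collections import deque

class ContinuousVWAP:
    """A continuously evaluated query: volume-weighted average price
    over the most recent N ticks, updated as each tick arrives.
    No database commit or index update sits between event and answer."""
    def __init__(self, window: int = 100):
        self.ticks = deque(maxlen=window)   # sliding window of (price, volume)

    def update(self, price: float, volume: float) -> float:
        self.ticks.append((price, volume))
        total_volume = sum(v for _, v in self.ticks)
        return sum(p * v for p, v in self.ticks) / total_volume

vwap = ContinuousVWAP(window=3)
for price, volume in [(100.0, 10), (101.0, 20), (99.5, 15)]:
    print(f"tick {price}: running VWAP = {vwap.update(price, volume):.2f}")
```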

To illustrate with some examples of high-frequency trading rules, one rule could be defined like this: "When the stock X's price moves two percent outside its five-minute moving average margin, buy it now!" For a slightly more complex example: "IF the stock X's price moves outside two percent of its moving average margin, followed by my basket moving up by half a percent AND the stock Y's price moves up by five percent OR the stock X's price moves down by two percent, ALL WITHIN any two-minute time period, THEN buy the stock X and sell the stock Y!"
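The first of those rules is simple enough to sketch in a few lines of Python. This version keeps a sliding five-minute price window and signals when the latest price deviates more than two percent from the window's average; the class and its interface are hypothetical, purely for illustration:

```python
import time
from collections import deque

class MovingAverageRule:
    """Hypothetical version of the simple rule above: signal when the
    latest price deviates more than 2% from its 5-minute moving average."""
    def __init__(self, symbol: str, window_seconds: int = 300,
                 threshold: float = 0.02):
        self.symbol = symbol
        self.window_seconds = window_seconds
        self.threshold = threshold
        self.prices = deque()   # (timestamp, price) pairs inside the window

    def on_price(self, price: float, now: float = None):
        now = time.time() if now is None else now
        self.prices.append((now, price))
        # Evict prices that have aged out of the moving-average window.
        while now - self.prices[0][0] > self.window_seconds:
            self.prices.popleft()
        average = sum(p for _, p in self.prices) / len(self.prices)
        if abs(price - average) / average > self.threshold:
            return f"BUY {self.symbol} at {price} (5-min avg {average:.2f})"
        return None

rule = MovingAverageRule("X")
print(rule.on_price(100.0, now=0))    # None -- no deviation yet
print(rule.on_price(105.0, now=60))   # fires: ~2.4% above the running average
```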

Again, the singular trait of CEP is its ability to handle complex event sequences coming from multiple data streams within real-time constraints. CEP accomplishes this via automated actions and built-in temporal logic and sequencing.
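Temporal sequencing is the part that ordinary queries handle poorly: a rule may only fire if event A is followed by event B within some window. A toy version of that idea (again illustrative, not Apama's event processing language):

```python
import time

class SequenceWithin:
    """Toy temporal rule: fire only when `first` is followed by `then`
    within `window` seconds."""
    def __init__(self, first: str, then: str, window: float, action):
        self.first, self.then = first, then
        self.window, self.action = window, action
        self.pending = None   # timestamp of the latest `first` occurrence

    def on_event(self, name: str, now: float = None):
        now = time.time() if now is None else now
        if name == self.first:
            self.pending = now                  # start (or restart) the clock
        elif name == self.then and self.pending is not None:
            if now - self.pending <= self.window:
                self.action()                   # sequence completed in time
            self.pending = None                 # either way, reset

# Usage: "X breaks out, then the basket ticks up, all within two minutes."
rule = SequenceWithin("x_breakout", "basket_up", window=120,
                      action=lambda: print("BUY X / SELL Y"))
rule.on_event("x_breakout", now=0)
rule.on_event("basket_up", now=60)   # within the window, so the rule fires
```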

Back to Apama (not Panama or Obama, Bozo!)

Progress Apama became part of Progress Software via the acquisition of the former Apama Ltd in April 2005. Apama is the core technology foundation for Progress' initiatives in CEP and the company's go-to-market initiatives that leverage that CEP platform in capital markets for the following "daily bread" activities: algorithmic trading, market aggregation, real-time pricing, smart order routing, and market surveillance.

Prior to its acquisition by Progress Software, Apama had a few dozen customers in London, New York, and Boston. Today, however, after leveraging the global parent’s infrastructure, Apama is marketed and sold in all the major financial centers in the world.

Apama was founded in 1999 in Cambridge (UK) by John Bates and Giles Nelson. Fellow Cantabrigians and CEP visionaries Bates and Nelson are co-holders of the patents on Apama's core technology, which is a commercially productized expression of their efforts to create a platform for the unique characteristics of "event-based" applications.

Originally, Apama had set out to try and resolve a number of telecommunications-based real-time mobility issues, but had then realized that there were additional commercial opportunities in a wide range of environments. As a result, the company has historically focused on financial markets and specifically financial trading systems where real-time event-based trading systems are in high demand.

The capital markets segment has indeed proven to be an early proof point for the Apama CEP platform. Apama’s design philosophy and architecture were intended to provide a platform that allows traders to quickly develop and deploy distinctive proprietary strategies that exploit these opportunities and mitigate risks.

In addition to the above-mentioned CEP applications in capital markets, other current (or future) uses in the segment are the following: commodities trading, bonds trading and pricing, foreign exchange (Forex) aggregation and algorithms, futures exchange and options algorithms, equities trading, cross-asset trading, real-time risk management, broker algorithms, news-driven algorithms, and so on.


SOURCE:
http://blog.technologyevaluation.com/blog/2009/03/25/processing-complex-events-during-these-oh-well-complex-times-%E2%80%93-part-i/
