This article is the second installment in a three-part series on the business issues faced by large trading operations & infrastructures in the Capital Markets space. This post discusses a real-world reference architecture using Big Data techniques and is more technical in nature. The final part of this series will focus on business recommendations for disruptive innovation in this area.
For part 1, please visit – http://www.vamsitalkstech.com/?p=303
Introduction
With globalization driving the capital markets and an increasing number of issuers, there is growing complexity across a range of financial instruments and assets (stocks, bonds, derivatives, commodities etc.) and venues (NYSE, NASDAQ, CME, dark pools etc.). Added to that, shrinking margins & regulatory pressures are driving buy-side players to examine how existing business processes & systems integration currently work, with a view to making these more transparent, efficient and agile.
The business drivers (as noted in the first post in this three-part series) from a Capital Markets perspective are:
1. Re-tool existing trading infrastructures so that they are more integrated yet loosely coupled and efficient
2. Automate complex trading strategies that are quantitative in nature across a range of asset classes like equities, forex, ETFs and commodities
3. Incorporate newer & faster sources of data (social media, sensor data, clickstream data) and not just the conventional sources (market data, position data, M&A data, transaction data etc.). Pure speed can only get a firm so far
4. Retrofit existing trade systems to accommodate a range of mobile clients who have a vested interest in deriving analytics, e.g. marry tick data with market structure information to understand why certain securities dip or spike at certain points and the reasons for the same (e.g. institutional selling or equity-linked trades with derivatives)
5. Help traders create algorithms as well as customize these to generate constant competitive advantage
The need of the hour is enterprise architecture capabilities for designing flexible trading platforms built around efficient use of data, speed, agility and a service-oriented architecture. The choice of open source is key, as it allows for a modular and flexible architecture that can be modified and adopted in a phased manner – as you will shortly see.
Business Requirements
Trading platforms are concerned with executing orders coming in from portfolio managers on the buy side, order management & monitoring through the execution process, and providing electronic access to a wide variety of venues. On the sell side, one needs to provide support for handling customer orders and managing trading positions.
The following business requirements must be met by systems that offer buy/sell trading capabilities –
- Architecture must support front-, mid- & back-office trading capabilities, with support for both simple and complex rule-based and algorithmic trade strategies
- Support the development lifecycle & seamless cutover in terms of backtesting and live implementation of the above strategies – in short, support an iterative and DevOps-based methodology. The goal is to ensure that those developing strategies can test their models across the widest spectrum of asset classes in the most productive manner possible
- Display well designed & highly intuitive trade and blotter UIs for trade management with support for mobile technologies. This is critical in ensuring a smooth user experience
- Support a trading-as-a-service (TaaS) business model that can potentially be sold as a utility over open APIs
- Support a hybrid & scale-out deployment model. Services that provide the core business functionality should be deployable all the way from bare metal to VMs to Docker containers, on a private or a public cloud as required. A core requirement is to use open source software and commodity hardware
- Support a rule based trading model (declarative) that will evolve to supporting predictive analytics with ingrained support for both complex event processing (CEP) as well as business workflow (ideally support for the BPMN standard notation)
- Support integration with a wide variety of external participants across the globe. The platform must be truly global in terms of supporting exchanges & products (e.g. FOREX, which trades across different hours)
- Support a wide variety of financial products and formats with FIX being the primary
- Provide support for order capture, trading & crossing
- Provide the ability to cross buy and sell side market orders (when both side orders are detected in the system)
- Auto route and execute orders based on accounts, quantity and real time market data
- Support other complex order routing requirements as applicable
- Finally, support a high degree of scalability: as volumes grow, the system should be able to autoscale to accommodate a high volume of trades/sec, with desirable latency in milliseconds & well-defined SLAs for Order Entry & Disaster Recovery at a minimum
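One of the requirements above – crossing buy- and sell-side orders when both sides are detected in the system – can be sketched as a simple price-time priority match. This is a minimal illustration only; the class and field names are my own, not those of any particular OMS, and a real matching engine adds order types, partial-fill reporting, and much more.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Order:
    # sort_key gives price-time priority; all other fields are excluded from comparison
    sort_key: tuple = field(init=False, repr=False)
    side: str = field(compare=False)      # "BUY" or "SELL"
    symbol: str = field(compare=False)
    price: float = field(compare=False)
    qty: int = field(compare=False)
    seq: int = field(compare=False)       # arrival sequence, for time priority

    def __post_init__(self):
        # Buys: highest price first; sells: lowest price first; ties broken by arrival
        px = -self.price if self.side == "BUY" else self.price
        self.sort_key = (px, self.seq)

def cross(buys, sells):
    """Match resting buy and sell orders at the midpoint whenever prices overlap."""
    heapq.heapify(buys)
    heapq.heapify(sells)
    fills = []
    while buys and sells and buys[0].price >= sells[0].price:
        b, s = buys[0], sells[0]
        qty = min(b.qty, s.qty)
        fills.append((b.symbol, qty, round((b.price + s.price) / 2, 4)))
        b.qty -= qty
        s.qty -= qty
        if b.qty == 0:
            heapq.heappop(buys)
        if s.qty == 0:
            heapq.heappop(sells)
    return fills
```

A buy at 100.10 for 100 shares against a sell at 100.00 for 60 shares would fill 60 shares near the midpoint and leave 40 shares of the buy resting.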
Design Tenets –
- At the application tier – a SOA based approach is key – all core business functions are modeled as SOA services or even microservices
- The choice of an ESB/message tier to interconnect all market participants
- Open messaging standard – AMQP (Advanced Message Queuing Protocol) is chosen as the transport protocol for performance and industry reasons. Legacy architectures at financial firms and stock exchanges have been hobbled by proprietary vendor and legacy protocols, which drive up the cost of their IT infrastructures. AMQP was developed by a consortium of banks and vendors (JP Morgan, Red Hat and VMware, among others) and functions as the lingua franca for financial services backbone messaging. It is now deployed across verticals, in industries ranging from healthcare to manufacturing to IoT. Using AMQP avoids lock-in and costly bridging technology. Further, organizations like NYSE have been leading the development of technologies like OpenMAMA, which intends to provide a vendor-agnostic middleware API that supports event-driven messaging. One example use is to allow market data vendors to publish messages such as quotes & trades over an industry-standard platform while building value-added services on top of it. Our intention is to future-proof the architecture by basing it on open standards
- FIX (Financial Information Exchange) run over AMQP will be the primary business interchange protocol
- Apache Kafka or Fuse ESB chosen as the messaging tier or service bus
- A BRMS (Business Rules Management System) provides Rules, CEP and BPM under a single umbrella. This tier contains the definitions and the runtime for the rules governing order management, routing, crossing and matching
- In memory analytics provided by an in memory data grid or even using a Spark in memory layer
- The data layer is based on an Apache Hadoop platform and follows the lambda architecture (developed by Nathan Marz). More on this below
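Since FIX is named above as the primary business interchange protocol, it is worth seeing its shape concretely. FIX messages are flat tag=value pairs separated by the SOH character, framed by a BodyLength field (tag 9) and a CheckSum field (tag 10, the byte sum modulo 256). The sketch below encodes and decodes that framing in plain Python; it is illustrative only and omits everything a real FIX engine handles (sessions, sequence numbers, repeating groups, validation).

```python
SOH = "\x01"  # standard FIX field delimiter

def fix_encode(fields, version="FIX.4.4"):
    """Encode (tag, value) pairs into a FIX tag=value message with length & checksum."""
    body = "".join(f"{tag}={value}{SOH}" for tag, value in fields)
    head = f"8={version}{SOH}9={len(body)}{SOH}"          # tag 9 = body length in bytes
    checksum = sum((head + body).encode()) % 256           # tag 10 = byte sum mod 256
    return f"{head}{body}10={checksum:03d}{SOH}"

def fix_decode(raw):
    """Parse a FIX message back into a tag -> value dict (repeating groups ignored)."""
    return dict(part.split("=", 1) for part in raw.split(SOH) if part)
```

For example, a NewOrderSingle (35=D) for a symbol and quantity would be built as `fix_encode([("35", "D"), ("55", "IBM"), ("54", "1"), ("38", "100")])` and carried over the AMQP backbone as an opaque payload.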
Figure 1 – Reference Architecture for Trading Platform
The key components of the Trading Platform Architecture as depicted above are –
- Order Management System (OMS) – displays a rich, interactive portal with a user interface; clients call brokers by telephone or place orders electronically. These orders are routed to the OMS, which receives them, performs proper matching, decides the best avenue and price based on business rules/complex events, and then routes them to the appropriate market venue to get them filled
- A Market Data distribution service connects to market data providers (e.g. Bloomberg, Thomson Reuters etc.) and sends regular updates to the OMS. Rules determine which market data becomes the reference point for the OMS, i.e. if the same market data is available from multiple sources, which one takes priority
- Connectivity is also established via FIX gateways to the distribution service.
- The business rules approach adds another dimension to BPM by enabling one to leverage declarative logic to build compact, fast and easy-to-understand trading logic. This matters in sectors (e.g. trading platforms, mortgage underwriting applications) where changing market conditions alter both the business rules and the business processes involved in satisfying buy/sell requests
- Complex Event Processing (CEP) – The term "event" by itself is frequently overloaded and can refer to several different things depending on the context in which it is used. In our trading platform, when a sell operation is executed, it causes a change of state in the domain that can be observed by several actors: the price of the securities changes to match the value of the operation, ownership of the individual traded assets passes from the seller to the buyer, the accounts of both seller and buyer are credited and debited, etc. Intercepting a cloud of these events and having a business process adapt and react to them is key to an agile trading platform
- The data management layer spans information stores like the Security Master, Customer Master, Holdings and Account/Product Master etc. This layer also needs to deal with Data Governance.
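The rule-driven routing described above – the OMS choosing a venue based on business rules – would live in the BRMS (e.g. as Drools DRL), but the declarative idea can be sketched in a few lines: rules are data, evaluated in order, with the first matching predicate deciding the venue. The venue names and thresholds below are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    qty: int
    order_type: str   # "MARKET" or "LIMIT"

# Each rule is (predicate, venue); first match wins. This rule table is a tiny
# stand-in for what a BRMS expresses declaratively, so rules can change without
# touching the routing engine itself.
ROUTING_RULES = [
    (lambda o: o.qty >= 10_000,          "DARK_POOL"),  # route large blocks off-exchange
    (lambda o: o.order_type == "MARKET", "NYSE"),
    (lambda o: o.order_type == "LIMIT",  "NASDAQ"),
]

def route(order, rules=ROUTING_RULES, default="SMART"):
    """Return the venue of the first rule whose predicate matches the order."""
    for predicate, venue in rules:
        if predicate(order):
            return venue
    return default
```

Because the rule table is ordinary data, it can be reloaded at runtime when market conditions change – which is precisely the appeal of the declarative, BRMS-based approach over hard-coded routing logic.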
Figure 2 – Trade Rules Modeling
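The CEP notion described above – a sell execution fanning out to several observers (price update, ownership transfer, account credit/debit) – can be reduced to a minimal event bus sketch. A real CEP engine adds temporal windows, joins and pattern matching; this toy only shows the intercept-and-react shape, and all event and handler names are my own.

```python
from collections import defaultdict

class EventBus:
    """Minimal stand-in for a CEP engine: handlers subscribe to event types
    and react as trade events arrive."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Fan the event out to every subscribed handler, in subscription order
        for handler in self.handlers[event_type]:
            handler(payload)

# Wire up the observers a sell execution touches: price feed and settlement
bus = EventBus()
audit_log = []
bus.subscribe("trade.executed",
              lambda e: audit_log.append(("price", e["symbol"], e["price"])))
bus.subscribe("trade.executed",
              lambda e: audit_log.append(("settle", e["buyer"], e["seller"])))
```

Publishing one `trade.executed` event then drives both reactions, which is the essence of having the business process adapt to a cloud of events rather than polling state.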
The flow of data in the system can be depicted as shown below –
Figure 3 – Overall Trading Process flow
The intention in adopting a SOA (or even a microservices) architecture is to be able to incrementally plug in lightweight business services like performance measurement, trade surveillance, risk analytics, option pricing etc.
The data architecture is based on the lambda architecture developed by Nathan Marz. The lambda architecture solves the problem of computing arbitrary functions on arbitrary data in real time by decomposing the problem into three layers: the batch layer, the serving layer, and the speed layer.
Data Architecture –
Figure 4 – Data process flow (source VoltDB)
At a high level, the data architecture has three components:
- The Batch layer – constantly ingests, stores and processes market data, social media data, reference data, position data etc., and constantly precomputes batch views
- The Speed layer – processes real-time feeds & produces tactical views of the same
- The Serving layer – holds the batch views relevant for the queries needed by predictive analytics
The Lambda Architecture is aimed at applications built around complex asynchronous transformations that need to run with low latency (say, a few seconds to a few hours) which is perfectly suited to our business case.
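The essence of the lambda pattern is that a query merges a complete-but-stale batch view with a small real-time speed view. The sketch below illustrates that merge for a single metric (traded volume per symbol); in the architecture above the batch view would be precomputed on Hadoop and the speed view maintained by the in-memory/Spark tier, with function names here being illustrative only.

```python
def batch_view(events):
    """Batch layer: precompute total traded volume per symbol from the
    immutable master dataset (complete, but recomputed only periodically)."""
    view = {}
    for symbol, qty in events:
        view[symbol] = view.get(symbol, 0) + qty
    return view

def merged_query(symbol, batch, speed):
    """Serving-layer query: batch view plus the speed view's recent increments.
    The speed view covers only events since the last batch recompute."""
    return batch.get(symbol, 0) + speed.get(symbol, 0)
```

When the batch layer finishes its next recompute, the corresponding entries in the speed view are discarded – the batch view now covers those events – which is what keeps the real-time layer small and the overall system simple to reason about.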
Advantages of an open architecture –
- Cost-effective – An open source stack can reduce cost substantially (by some estimates nearly 50%) compared to a legacy system built on mainframe or proprietary technology
- Data Governance – Effectively provided by the Hadoop stack
- Scalable – Provides a high degree of scalability in the architecture
- Innovative – Built on the most robust architecture and state-of-the-art technology
- Deployment – Supports a variety of deployment architectures, on premises or on the cloud
- Load balancing support is built in to handle increasing volumes
- Visibility into business rules as well as support for monitoring workflows
New-age trading platforms built on open source can not only be deployed across physical, virtual, mobile, and cloud environments but also include complementary paradigms – integration, business rules and complex event processing (CEP) capabilities that add to operational efficiency, enable new business models, improve risk management & ultimately drive a higher degree of profitability.
References –
http://voltdb.com/blog/simplifying-complex-lambda-architecture