
Framework to Process High Frequency Trading Using Complex Event Processing


Authors

A. Acharya
KLS GIT, Gogte Institute of Technology, Belagavi, Karnataka, India
N. S. Sidnal
KLE MSSCET, Belagavi, Karnataka, India
     



Abstract

The financial services industry has always been a data intensive industry. From insurance to capital markets, data has been pivotal to applications such as financial modeling, portfolio optimization, asset/liability matching, fraud detection and risk modeling. The big data revolution has opened up many opportunities for innovation and improved efficiency in this domain. At the same time, a new set of challenges has emerged that must be overcome for future growth and sustainability in the financial services industry. In recent times the securities trading market has undergone dramatic changes, resulting in the growth of high velocity data. Velocity, one of the Vs of big data, presents a unique set of challenges to the capital markets. The traditional approach of using Business Intelligence (BI) no longer scales, especially with respect to the velocity of data. Over the previous decade most firms in the capital markets made significant investments in their ability to collect, store, manage and, to some extent, analyze large amounts of data. Building on the benefits offered by big data analytics, financial services firms are now able to provide highly personalized, real time, location based services rather than only the product-based services that were possible earlier. The rise of electronic trading and the availability of real time stock prices and real time currency trading make real time risk analysis necessary. Market participants who can analyze data in real time will be able to garner a disproportionate share of the available profit pool. The sheer volume of financial data, the high rate at which it is generated, and its heterogeneity make it difficult to capture, process and analyze in a timely manner. Traditional financial systems are not designed to cope with a wide variety of data, especially unstructured data from Twitter, news, social media, blogs, etc., which affect market dynamics in real time. Traditional data warehousing and BI techniques such as extract, transform and load (ETL) take a huge amount of time (often days) to process large amounts of data and are thus not suited to real time analytics.

This paper discusses the implications of the rise of big data, and especially of high velocity data, in the domain of High Frequency Trading (HFT), a growing niche of securities trading. We first take a brief look at the intricacies of HFT, including some of the strategies commonly used by HFT traders. The technological challenges in processing HFT data and responding to real time changes in market conditions are also discussed. Some of the potential technological solutions to the issues raised by HFT are analyzed for their effectiveness in addressing the real time performance requirements of HFT. We identify Complex Event Processing (CEP) as a candidate to address the HFT problem. The paper is divided into three parts: Part A deals with understanding HFT and the challenges it poses for technology; Part B looks at Complex Event Processing (CEP) and the types of problems it can be applied to; Part C presents a framework to process HFT using techniques derived from CEP.
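As an illustration of the style of processing CEP refers to, and not of the specific framework proposed in the paper, the sketch below shows a minimal sliding-window rule over a stream of tick events in Python: it flags any symbol whose price moves by more than a threshold within a short time window. The class names, tick format and threshold values are illustrative assumptions, not taken from the paper.

    from collections import deque, defaultdict
    from dataclasses import dataclass

    @dataclass
    class Tick:
        symbol: str   # instrument identifier
        price: float  # last traded price
        ts: float     # event timestamp in seconds

    class PriceMoveDetector:
        """Toy CEP-style rule: alert when a symbol's price moves more than
        `threshold` (as a fraction) within a `window_s`-second sliding window."""
        def __init__(self, window_s: float = 1.0, threshold: float = 0.005):
            self.window_s = window_s
            self.threshold = threshold
            self.windows = defaultdict(deque)  # symbol -> recent ticks

        def on_tick(self, tick: Tick):
            w = self.windows[tick.symbol]
            w.append(tick)
            # evict ticks that have fallen out of the sliding time window
            while w and tick.ts - w[0].ts > self.window_s:
                w.popleft()
            lo = min(t.price for t in w)
            hi = max(t.price for t in w)
            if lo > 0 and (hi - lo) / lo >= self.threshold:
                return f"ALERT {tick.symbol}: {100 * (hi - lo) / lo:.2f}% move in {self.window_s}s"
            return None

    # Usage: feed ticks as they arrive from the market-data feed.
    detector = PriceMoveDetector(window_s=1.0, threshold=0.005)
    for t in [Tick("INFY", 100.0, 0.0), Tick("INFY", 100.2, 0.4), Tick("INFY", 100.8, 0.9)]:
        alert = detector.on_tick(t)
        if alert:
            print(alert)

A production CEP engine would express such a rule declaratively and evaluate it over partitioned, time-ordered event streams; the point of the sketch is only the pattern-over-a-window idea that the abstract alludes to.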


Keywords

High Frequency Trading, Complex Event Processing, Big Data Processing.





