
Deep Order Flow Imbalance: Extracting Alpha at Multiple Horizons from the Limit Order Book

Written by Petter N. Kolm, Jeremy Turiel and Nicholas Westray

Abstract:

We employ deep learning in forecasting high-frequency returns at multiple horizons for 115 stocks traded on Nasdaq using order book information at the most granular level. While raw order book states can be used as input to the forecasting models, we achieve state-of-the-art predictive accuracy by training simpler "off-the-shelf" artificial neural networks on stationary inputs derived from the order book. 

Specifically, models trained on order flow significantly outperform most models trained directly on order books. 

Using cross-sectional regressions, we link the forecasting performance of a long short-term memory network to stock characteristics at the market microstructure level, suggesting that "information-rich" stocks can be predicted more accurately. Finally, we demonstrate that the effective horizon of stock-specific forecasts is approximately two average price changes.
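
The abstract does not spell out how the stationary inputs are constructed, but a standard way to turn raw book states into such features is the multi-level order flow / order flow imbalance transformation common in the market microstructure literature. The sketch below is only an illustration of that idea, not the paper's exact pipeline; the function names and array layout are our own.

```python
import numpy as np

def order_flow(p, q, p_prev, q_prev, side):
    """Per-level order flow between two consecutive book snapshots.

    p, q, p_prev, q_prev: arrays of prices and sizes on one side of the
    book at times t and t-1; side is "bid" or "ask".
    """
    if side == "bid":
        improved, worsened = p > p_prev, p < p_prev
    else:  # ask side: a *lower* price is an improvement
        improved, worsened = p < p_prev, p > p_prev
    return np.where(improved, q,            # fresh liquidity at a better price
           np.where(worsened, -q_prev,      # old liquidity traded or cancelled away
                    q - q_prev))            # same price: net change in queue size

def order_flow_imbalance(bid_p, bid_q, ask_p, ask_q):
    """Stationary per-level features from price/size arrays of shape (T, levels)."""
    b = order_flow(bid_p[1:], bid_q[1:], bid_p[:-1], bid_q[:-1], side="bid")
    a = order_flow(ask_p[1:], ask_q[1:], ask_p[:-1], ask_q[:-1], side="ask")
    return b - a                             # shape (T-1, levels)
```

Differencing the book in this way removes the non-stationary price level, so the resulting features have a comparable scale across stocks and over time, which is what makes simpler "off-the-shelf" networks viable as inputs models.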


Introduction:

In this article we employ deep learning (DL) in forecasting high-frequency returns at multiple horizons for 115 stocks traded on Nasdaq using order book information at the most granular level.

In the last decade, DL has experienced enormous success, outperforming more traditional approaches in areas such as image classification, computer vision and natural language processing (Krizhevsky et al., 2012; LeCun et al., 2015; Schmidhuber, 2015; Goodfellow et al., 2016; Devlin et al., 2018).

A key reason for this success is that DL learns suitable representations directly from the raw data, unlike conventional machine learning (ML) approaches where features are designed by hand and frequently involve domain expertise.


Artificial neural networks (ANNs) have proven to be particularly good at extracting intricate relationships in complex and high-dimensional settings without human input, especially when trained on large amounts of raw data. Although this is counterintuitive from the perspective of traditional statistical and ML techniques, where the researcher uses handcrafted features and progresses from simpler to more complex models, the essence of DL and related approaches is eloquently summarized in the following quote by Rich Sutton:

“The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin.” (emphasis added) (Sutton, 2019).

While results have been mixed, owing to the lack of sufficiently large datasets, ANNs have been applied to a number of problems in finance (see, for example, the surveys by Wong et al. (1998), Li et al. (2010), Elmsili et al. (2018), Ozbayoglu et al. (2020), and Sezer et al. (2020)). Recently, by leveraging large datasets extracted from limit order books (LOB) in equity markets, ANNs have been successful at forecasting high-frequency returns (see, for example, Tsantekidis et al. (2017), Tran et al. (2019), Zhang et al. (2018), Sirignano (2019), Sirignano et al. (2019), Zhang et al. (2019a), Zhang et al. (2019b), Luo et al. (2019), Tsantekidis et al. (2020), Briola et al. (2020), Zhang et al. (2021a), and Zhang et al. (2021b)).


The literature to date has focused on two key empirical questions. 

First, do deep learning models outperform classical statistical and machine learning approaches, including penalized linear models, decision trees and kernel-based models?

Second, which ANN architecture is best suited for return classification? While the first question has been answered in favor of ANNs, the choice of the optimal architecture remains less clear.
