There is a vast literature on the investment decision making process and associated assessment of expected returns on

investments. Traditionally, historical performance, economic theories, and forward-looking indicators were usually put forward for investors to judge expected returns. However, modern finance theory, including quantitative models

and econometric techniques, provided the foundation that has revolutionised the investment management industry over

the last 20 years. Technical analysis has initiated a broad current of literature in economics and statistical physics

refining and expanding the underlying concepts and models. It is remarkable to note that some of the features of

financial data were general enough to have spawned the interest of several fields in sciences, from economics and

econometrics, to mathematics and physics, to further explore the behaviour of this data and develop models explaining

these characteristics. As a result, some theories found by a group of scientists were rediscovered at a later stage

by another group, or simply observed and mentioned in studies but not formalised. Financial text books presenting

academic and practitioners' findings tend to be too vague and too restrictive, while published articles tend to be too

technical and too specialised. This guide tries to bridge the gap by presenting the necessary tools for performing

quantitative portfolio selection and allocation in a simple, yet robust way. We present in chronological order the

necessary steps to identify trading signals, build quantitative strategies, assess expected returns, measure and score

strategies, and allocate portfolios. This is done with the help of various published articles referenced throughout this guide, as well as financial and economic text books. In the spirit of Alfred North Whitehead, we aim to seek the simplest

explanations of complex facts, which is achieved by structuring this book from the simple to the complex. This

pedagogic approach inevitably leads to some repetition of material. We first introduce some simple ideas and concepts used to describe financial data, and then show how empirical evidence led to the introduction of complexity which modified the existing market consensus. This book is divided into five parts. We first present

and describe quantitative trading in classical economics, and provide the paramount statistical tools. We then discuss

quantitative trading in inefficient markets before detailing quantitative trading in multifractal markets. Finally, we present a few numerical tools to carry out the necessary computations when implementing quantitative trading strategies.

As the decision-making process and portfolio allocation form a vast subject, this is not an exhaustive guide, and some

fields and techniques have not been covered. However, we intend to fill the gap over time by reviewing and updating

this book.

0.1.2 An overview of quantitative trading

Following the spirit of Focardi et al. [2004], who detailed how the growing use of quantitative methods changed

finance and investment theory, we are going to present an overview of quantitative portfolio trading. Just as automation

and mechanisation were the cornerstones of the Industrial Revolution at the turn of the 19th century, modern finance

theory, quantitative models, and econometric techniques provide the foundation that has revolutionised the investment

management industry over the last 20 years. Quantitative models and scientific techniques are playing an increasingly

important role in the financial industry affecting all steps in the investment management process, such as

• defining the policy statement

• setting the investment objectives

• selecting investment strategies

• implementing the investment plan

• constructing the portfolio

• monitoring, measuring, and evaluating investment performance


Quantitative Analytics

The most significant benefit is the power of automation, which enforces a systematic investment approach within a structured and unified framework. Not only do completely automated risk models and marking-to-market processes provide a powerful tool for analysing and tracking portfolio performance in real time, but they also provide the foundation for complete process and system backtests. Quantifying the chain of decisions allows a portfolio manager to more fully understand, compare, and calibrate investment strategies, underlying investment objectives, and policies.

Since the pioneering work of Pareto [1896] at the end of the 19th century and the work of Von Neumann et al.

[1944], decision making has been modelled using both

1. a utility function to order choices, and

2. probabilities to identify choices.
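The two ingredients above combine into the standard expected-utility criterion; the notation below (action a, states s with probabilities p_s, and wealth outcomes W_{a,s}) is introduced here purely for illustration:

```latex
a^{*} \;=\; \arg\max_{a} \, \mathbb{E}\left[u(W_a)\right] \;=\; \arg\max_{a} \sum_{s} p_s \, u(W_{a,s})
```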

As a result, in order to complete the investment management process, market participants, or agents, can rely either on subjective information, on a forecasting model, or on a combination of both. This heavy dependence of financial asset

management on the ability to forecast risk and returns led academics to develop a theory of market prices, resulting in

the general equilibrium theories (GET). In the classical approach, the Efficient Market Hypothesis (EMH) states that

current prices reflect all available or public information, so that future price changes can be determined only by new

information. That is, the markets follow a random walk (see Bachelier [1900] and Fama [1970]). Hence, agents are

coordinated by a central price signal, and as such, do not interact so that they can be aggregated to form a representative

agent whose optimising behaviour sets the optimal price process. Classical economics is based on the principles that

1. the agent's decision-making process can be represented as the maximisation of expected utility, and,

2. that agents have a perfect knowledge of the future (the stochastic processes on which they optimise are exactly

the true stochastic processes).

The essence of general equilibrium theories (GET) states that the instantaneous and continuous interaction among

agents, taking advantage of arbitrage opportunities (AO) in the market, is the process that will force asset prices toward

equilibrium. Markowitz [1952] first introduced portfolio selection using a quantitative optimisation technique that

balances the trade-off between risk and return. His work laid the ground for the capital asset pricing model (CAPM),

the most fundamental general equilibrium theory in modern finance. The CAPM states that the expected value of

the excess return of any asset is proportional to the excess return of the total investible market, where the constant of proportionality, the asset's beta, is the covariance between the asset return and the market return divided by the variance of the market return. Many criticisms of the mean-variance optimisation framework have been formulated, such as its oversimplification and unrealistic assumptions about the distribution of asset returns, and the high sensitivity of the optimisation to its inputs (the expected returns of each asset and their covariance matrix). Extensions to classical mean-variance optimisation were proposed to make the portfolio allocation process more robust to different sources of risk, such as Bayesian approaches and Robust Portfolio Allocation. In addition,

higher moments were introduced in the optimisation process. Nonetheless, the question of whether general equilibrium

theories are appropriate representations of economic systems cannot be answered empirically.
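The mean-variance trade-off described above can be sketched numerically. As a minimal illustration (the function name and covariance figures are our own, not taken from this guide), the snippet below computes the global minimum-variance portfolio in closed form; this special case requires only the covariance matrix and so sidesteps the noisy expected-return inputs to which the optimisation is notoriously sensitive:

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form minimum-variance weights under the full-investment
    constraint sum(w) = 1 (short sales allowed): w = C^{-1}1 / (1' C^{-1} 1)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / (ones @ w)

# Illustrative annualised covariance matrix for three hypothetical assets.
cov = np.array([[0.040, 0.006, 0.012],
                [0.006, 0.090, 0.018],
                [0.012, 0.018, 0.160]])
w = min_variance_weights(cov)
# The weights sum to one, and the resulting portfolio variance cannot
# exceed that of any single asset (a single asset is a feasible portfolio).
print(w, w @ cov @ w)
```

With a target expected return added as a second constraint, the same Lagrangian approach yields the full mean-variance efficient frontier.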

Classical economics is founded on the concept of equilibrium. On one hand, econometric analysis assumes that, if

there are no outside, or exogenous, influences, then a system is at rest. The system reacts to external perturbation by

reverting to equilibrium in a linear fashion. On the other hand, it ignores time, or treats time as a simple variable by

assuming the market has no memory, or only limited memory of the past. These two points might explain why classical

economists had trouble forecasting our economic future. Clearly, the qualitative aspect coming from the human decision-making process is missing. Over the last 30 years, econometric analysis has shown that asset prices present some

level of predictability contradicting models such as the CAPM or the APT, which are based on constant trends. As a

result, a different view on financial markets emerged postulating that markets are populated by interacting agents, that

is, agents making only imperfect forecasts and directly influencing each other, leading to feedback in financial markets

and potential asset prices predictability. In consequence, factor models and other econometric techniques developed


to forecast price processes in view of capturing these financial patterns at some level. However, until recently, asset

price predictability seemed to be greater at the portfolio level than at the individual asset level. Since in most cases

it is not possible to measure the agent's utility function and its ability to forecast returns, GET are considered abstract mathematical constructs which are either difficult or impossible to validate empirically. On the other hand,

econometrics has a strong data-mining component since it attempts to fit generalised models to the market with

free parameters. As such, it has a strong empirical basis but a relatively simple theoretical foundation. Recently, with

the increased availability of data, econophysics emerged as a mix of physical sciences and economics to get the best of both worlds, in view of analysing asset predictability more deeply.

Since the EMH implicitly assumes that all investors immediately react to new information, so that the future is

unrelated to the past or the present, the Central Limit Theorem (CLT) could be applied to capital market

analysis. The CLT was necessary to justify the use of probability calculus and linear models. However, in practice,

the decision process does not follow the general equilibrium theories (GET), as some agents may react to information

as it is received, while most agents wait for confirming information and do not react until a trend is established. The

uneven assimilation of information may cause a biased random walk (called fractional Brownian motion), which was extensively studied by Hurst in the 1940s, and by Mandelbrot in the 1960s and 1970s. A large number of studies

showed that market returns were persistent time series with an underlying fractal probability distribution, following

a biased random walk. Stocks having Hurst exponents, H, greater than 1/2 are fractal, and the application of standard statistical analysis becomes of questionable value. In that case, variances are undefined, or infinite, making volatility a useless and misleading estimate of risk. Since high H values mean less noise, more persistence, and clearer trends than lower values of H, we can assume that higher values of H mean less risk. However, stocks with high H values do have a higher risk of abrupt changes. The fractal nature of the capital markets contradicts the EMH and all the

quantitative models derived from it, such as the Capital Asset Pricing Model (CAPM), the Arbitrage Pricing Theory

(APT), and the Black-Scholes option pricing model, and other models depending on the normal distribution and/or

finite variance. This is because they simplify reality by assuming random behaviour, and they ignore the influence

of time on decision making. By assuming randomness, the models can be optimised for a single optimal solution.

That is, we can find optimal portfolios, intrinsic value, and fair price. On the other hand, fractal structure recognises

complexity and provides cycles, trends, and a range of fair values.
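The Hurst exponent H discussed above can be estimated from data in several ways (rescaled-range analysis, aggregated variance, detrended fluctuation analysis). The sketch below is our own illustrative implementation of the aggregated-variance method, with function name and block sizes chosen for the example: it regresses the log-variance of block means against log block size, using Var ~ m^(2H-2), and should give H near 1/2 for uncorrelated increments:

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
    """Estimate the Hurst exponent of the increment series x by the
    aggregated-variance method: Var(block mean of size m) ~ m^(2H - 2)."""
    x = np.asarray(x, dtype=float)
    logs_m, logs_v = [], []
    for m in block_sizes:
        n = len(x) // m
        block_means = x[:n * m].reshape(n, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(block_means.var()))
    slope, _ = np.polyfit(logs_m, logs_v, 1)  # slope = 2H - 2
    return 1.0 + slope / 2.0

# Sanity check on i.i.d. Gaussian noise, whose increments have H = 1/2.
rng = np.random.default_rng(42)
h = hurst_aggvar(rng.standard_normal(100_000))
print(h)  # close to 0.5 for uncorrelated increments
```

Persistent series (H > 1/2) would show block-mean variances decaying more slowly than 1/m, pushing the estimate above one half.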

New theories to explain financial markets are gaining ground, among which is the view of markets as a multitude of interacting agents forming a complex system characterised by a high level of uncertainty. Complexity theory deals with processes where

a large number of seemingly independent agents act coherently. Multiple interacting agent systems are subject to

contagion and propagation phenomena generating feedbacks and producing fat tails. Real feedback systems involve

long-term correlations and trends since memories of long-past events can still affect the decisions made in the present.

Most complex natural systems can be modelled by nonlinear differential, or difference, equations. These systems are

characterised by a high level of uncertainty which is embedded in the probabilistic structure of models. As a result,

econometrics can now supply the empirical foundation of economics. For instance, science being highly stratified, one

can build complex theories on the foundation of simpler theories. That is, starting with a collection of econometric

data, we model it and analyse it, obtaining statistical facts of an empirical nature that provide us with the building

blocks of future theoretical development. For instance, assuming that economic agents are heterogeneous, make

mistakes, and mutually interact leads to more freedom to devise economic theory (see Aoki [2004]).
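As a toy illustration of the claim that simple nonlinear difference equations can generate complex behaviour, the classic logistic map x_{t+1} = r x_t (1 - x_t) moves from a stable fixed point to a period-2 cycle to chaos as r increases (the example and helper function below are ours, not the book's):

```python
def logistic_orbit(r, x0=0.2, n=1000, keep=4):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t) from x0,
    returning the last `keep` values once transients have died out."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs[-keep:]

print(logistic_orbit(2.8))  # settles to the fixed point 1 - 1/r (about 0.643)
print(logistic_orbit(3.2))  # oscillates between two values (period-2 cycle)
print(logistic_orbit(3.9))  # aperiodic, chaotic orbit
```

The same deterministic rule thus yields equilibrium, cycles, or apparent randomness depending on a single parameter, which is precisely the kind of behaviour equilibrium models rule out.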

With the growing quantity of data available, machine-learning methods that have been successfully applied in sci-

ence are now applied to mining the markets. Data mining and more recent machine-learning methodologies provide

a range of general techniques for the classification, prediction, and optimisation of structured and unstructured data.

Neural networks, classification and decision trees, k-nearest neighbour methods, and support vector machines (SVM)

are some of the more common classification and prediction techniques used in machine learning. Further, combinatorial optimisation, genetic algorithms, and reinforcement learning are now widespread. Using these techniques, one

can describe financial markets through degrees of freedom which may be both qualitative and quantitative in nature,

each node being the seat of a complicated mathematical entity. One could use a matrix form to represent interactions
