The Alpha Engine: Designing an Automated Trading Algorithm
Anton Golub¹, James B. Glattfelder², and Richard B. Olsen¹
¹ Lykke Corp, Baarerstrasse 2, 6300 Zug, Switzerland
² Department of Banking and Finance, University of Zurich, Switzerland
April 5, 2017
Abstract
We introduce a new approach to algorithmic investment management that
yields profitable automated trading strategies. This trading model design is the
result of a path of investigation that was chosen nearly three decades ago. Back
then, a paradigm change was proposed for the way time is defined in financial
markets, based on intrinsic events. This definition led to the uncovering of a
large set of scaling laws. An additional guiding principle was found by embedding
the trading model construction in an agent-based framework, inspired by
the study of complex systems. This new approach to designing automated trad-
ing algorithms is a parsimonious method for building a new type of investment
strategy that not only generates profits, but also provides liquidity to financial
markets and does not have a priori restrictions on the amount of assets that are
managed.
1 Introduction
The asset management industry is one of the largest industries in modern so-
ciety. Its relevance is documented by the astonishing amount of assets that
are managed. It is estimated that globally there are 64 trillion USD under
management [6]. This is nearly as big as the gross world product of 77 trillion
USD [39].
A drawback of all such methodologies is, however, the absence of a consistent
and overarching framework. What appears as a systematic approach to asset
management often boils down to gut feeling, as the manager chooses from a
broad blend of theories with different interpretations. For instance, the choice
and configuration of indicators is subject to the specific preference of the analyst
or trader. In effect, practitioners mostly apply ad hoc rules which are not
embedded in a broader context. Complex phenomena such as changing liquidity
levels as a function of time go unattended.
This lack of consensus, or intellectual coherence, in such a dominant and
relevant industry underpinning our whole society is striking, especially in a day
and age where computational power and digital storage capacities are growing
exponentially, at shrinking costs, and where there exists an abundance of
machine learning algorithms and big data techniques. To illustrate, consider
the recent unexpected success of Google's AlphaGo algorithm beating the best
human players [11]. This is a remarkable feat for a computer, as the game of Go
is notoriously complex and players often report that they select moves based
solely on intuition.
There is, however, one exception in the asset management and trading in-
dustry that relies fully on algorithmic trade generation and automated execu-
tion. Referred to under the umbrella of term high-frequency trading, this
approach has witnessed substantial growth. These strategies take advantage of
short-term arbitrage opportunities and typically analyse the limit order books
to jump the queue, whenever there are large orders pending [10]. While high-
frequency trading results in high trade volumes the assets managed with these
type of strategies are around 140 billion [34]. This is microscopic compared to
the size of the global assets under management.
environment for the research and development of fully automated and algorith-
mic trading strategies. Indeed, any profitable trading algorithm for this market
should, in theory, also be applicable to other markets.
some approaches rely more on number crunching than others. Ideally, any trad-
ing model algorithm should be implementable with reasonable resources to make
it useful and applicable in the real world.
To summarize, our aim is to develop trading models based on parsimo-
nious, self-similar, modular, and agent-based behavior, designed for multiple
time horizons and not purely driven by trend-following action. The intellectual
framework unifying these angles of attack is outlined in Section 3. The results
of this endeavor are interacting systems that are highly dynamic, robust, and
adaptive. In other words, a type of trading model that mirrors the dynamic
and complex nature of financial markets. The performance of this automated
trading algorithm is outlined in the next section.
In closing, it should be mentioned that transaction costs can represent real-
world stumbling blocks for trading models. Investment strategies that take
advantage of short-term price movements in order to achieve good performance
have higher transaction volumes than longer-term strategies. This obviously
increases the impact of transaction costs on the profitability. As far as possible,
it is advisable to use limit orders to initiate trades. They have the advantage
that the trader does not have to cross the spread to get his order executed, thus
reducing or eliminating transaction costs. The disadvantage of limit orders is,
however, that execution is uncertain and depends on buy and sell interest.
Figure 1: Daily Profit & Loss of the Alpha Engine, across 23 currency pairs, for
eight years. One panel shows the total P&L over time, reaching approximately 20%;
the other shows the P&L per currency pair, of up to roughly 3%, for NZD/JPY,
AUD/NZD, NZD/CAD, EUR/NZD, NZD/USD, AUD/JPY, GBP/AUD, CAD/JPY,
EUR/AUD, CHF/JPY, GBP/CAD, GBP/JPY, EUR/CAD, USD/CAD, AUD/USD,
USD/CHF, EUR/USD, USD/JPY, EUR/GBP, EUR/CHF, GBP/USD, EUR/JPY,
and GBP/CHF. See details in the main text of this section and Section 4.
have a greater number of intrinsic events and hence more opportunities for the
model to extract profits from the market. This behavior can be witnessed during
the financial crisis, where its deleterious effects are somewhat counterbalanced
by an overall increase in profitable trading behavior of the model, fueled by the
increase in volatility.
The variability in performance of the individual currency pairs can be ad-
dressed by calibrating the aggressiveness of the model with respect to the
volatility of the exchange rate. In other words, the model trades more fre-
quently when the volatility is low, and vice versa. For the sake of simplicity,
and to avoid potential over-fitting, we have excluded these adjustments to the
model. In addition, we also refrained from implementing cross-correlation mea-
sures. By assessing the behavior of the model for one currency pair, information
can be gained that could be utilized as an indicator which affects the model's
behaviour for other exchange rates. Finally, we have also not implemented any
risk management tools.
In essence, what we present here is a proof of concept. We refrained from
tweaking the model to yield better performance, in order to clearly establish
and outline the model's building blocks and fundamental behavior. We strongly
believe there is great potential for obvious and straightforward improvements,
which would give rise to far better models. Nevertheless, the bare-bones model
we present here already has the capability of being implemented as a robust and
profitable trading model that can be run in real-time. With a leverage factor
of 10, the model experiences a drawdown of 7.08% while yielding an average
yearly profit of 10.05% for the last four years. This is still far from realizing
the coastline's potential, but, in our opinion, a crucial first step in the right
direction.
Finally, we conclude this section by noting that, despite conventional wis-
dom, it is in fact possible to beat a random walk. The Alpha Engine produces
profitable results even on time series generated by a random walk, as seen in
Figure 9 in Appendix B. This unexpected feature results from the fact that the
model is dissecting Brownian motion into intrinsic time events. These directional
changes and overshoots yield a novel context, where a cascading event
is more likely to be followed by a de-cascading event than by another cascading
one. In detail, the probability of reaching the profitable de-cascading event after
a cascade is 1 − e⁻¹ ≈ 0.63, while the probability of an additional cascade
is about 0.37. In effect, the procedure of translating a tick-by-tick time series
into intrinsic time events skews the odds in one's favour, for empirical as well
as synthetic time series. For details, see [19].
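To make this tangible, here is a minimal Monte Carlo sketch in Java (our own illustration with our own names; it is not the reference code [35]) that generates a geometric random walk, dissects it into intrinsic events as defined in Section 3, and counts how often an event extends the overshoot (triggering a further cascade) versus reverses the direction (allowing a de-cascade):

```java
import java.util.Random;

// Monte Carlo sketch (ours, not the authors' code [35]): dissect a
// geometric random walk into intrinsic events and measure how often an
// event extends the overshoot versus reverses direction.
// Theory for Brownian motion: e^-1 ~ 0.37 versus 1 - e^-1 ~ 0.63.
public class CascadeOdds {
    public static void main(String[] args) {
        final double DELTA = 0.0025;   // 0.25% intrinsic-event threshold
        final double STEP = 1e-4;      // per-tick log-return volatility
        final long TICKS = 10_000_000L;
        Random rng = new Random(42);

        double price = 1.0;
        double extreme = price;        // running extreme of the current run
        double lastEvent = price;      // price at the last intrinsic event
        int mode = 1;                  // +1: up run, -1: down run
        long overshoots = 0, reversals = 0;

        for (long t = 0; t < TICKS; t++) {
            price *= Math.exp(STEP * rng.nextGaussian());
            if (mode == 1) {
                if (price > extreme) extreme = price;
                if (price >= lastEvent * (1 + DELTA)) {        // overshoot extends by delta
                    overshoots++; lastEvent = price;
                } else if (price <= extreme * (1 - DELTA)) {   // directional change
                    reversals++; mode = -1; extreme = price; lastEvent = price;
                }
            } else {
                if (price < extreme) extreme = price;
                if (price <= lastEvent * (1 - DELTA)) {
                    overshoots++; lastEvent = price;
                } else if (price >= extreme * (1 + DELTA)) {
                    reversals++; mode = 1; extreme = price; lastEvent = price;
                }
            }
        }
        double total = overshoots + reversals;
        System.out.printf("P(overshoot extension, next cascade) = %.3f%n", overshoots / total);
        System.out.printf("P(directional change, de-cascade)    = %.3f%n", reversals / total);
    }
}
```

On a long enough walk, the two printed frequencies approach the 0.37 and 0.63 quoted above.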
In the next section, we will embark on the journey that would ultimately
result in the trading model described above. For a prehistory of events, see
Appendix A.
framework of time, this voyage set out to chart new terrain. The whole history
of this endeavor is described in Appendix A. In the following, the key elements
of this new paradigm are highlighted.
Figure 2: (Left) Directional-change and overshoot events. (Right) A coastline
representation of the EUR/USD price curve (2008-12-14 22:10:56 to 2008-12-16 21:58:20),
defined by a directional-change threshold δ = 0.25%. The blue triangles represent
directional-change events and the green bullets overshoot events.
investigations [16]. In particular, this price curve will be used as input for the
trading model, as described in Section 3.4. With the publication [15], the first
decade came to a close.
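To make the dissection concrete, the following sketch (our illustration, not the reference implementation [35]; class and method names are ours) translates a tick series into the stream of directional-change and overshoot events depicted in Figure 2, for a given threshold δ:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch (ours, not the reference code [35]): dissect a tick series into
// intrinsic time events for a directional-change threshold delta,
// e.g. delta = 0.0025 for the 0.25% used in Figure 2.
public class IntrinsicEvents {
    public enum Event { DC_UP, DC_DOWN, OVERSHOOT_UP, OVERSHOOT_DOWN }

    public static List<Event> dissect(double[] prices, double delta) {
        List<Event> events = new ArrayList<>();
        int mode = 1;                  // +1: in an up run (initial choice is arbitrary)
        double extreme = prices[0];    // running extreme since the last directional change
        double lastEvent = prices[0];  // price at the last intrinsic event

        for (double p : prices) {
            if (mode == 1) {
                if (p > extreme) {
                    extreme = p;
                    if (p >= lastEvent * (1 + delta)) {  // overshoot extends by delta
                        events.add(Event.OVERSHOOT_UP);
                        lastEvent = p;
                    }
                } else if (p <= extreme * (1 - delta)) { // reversal by delta
                    events.add(Event.DC_DOWN);
                    mode = -1; extreme = p; lastEvent = p;
                }
            } else {
                if (p < extreme) {
                    extreme = p;
                    if (p <= lastEvent * (1 - delta)) {
                        events.add(Event.OVERSHOOT_DOWN);
                        lastEvent = p;
                    }
                } else if (p >= extreme * (1 + delta)) {
                    events.add(Event.DC_UP);
                    mode = 1; extreme = p; lastEvent = p;
                }
            }
        }
        return events;
    }
}
```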
Figure 3: Coastline representation of a price curve for various directional-change
thresholds δ.
On average, the overshoot ω associated with a directional-change threshold δ is as
long as the threshold itself:

$$\langle \omega \rangle \approx \delta. \qquad (1)$$
This justifies the procedure of dissecting the price curve into directional-change
and overshoot segments of the same size, as seen in Figures 2 and 3. In other
words, the notion of the coastline is statistically validated.
Scaling laws are a hallmark of complexity and complex systems. They can
be viewed as a universal law of nature underlying complex behavior in all its
domains.
gives rise to a stochastic and highly dynamic price evolution. In this vein, a
long or short position in the market can be understood as an agent. In detail,
a position p_i comprises the set {x_i, g_i}, where x_i is the current mid (or
entry) price and g_i represents the position size and direction.
Figure 4: Simple rules: the elements of coastline trading. Cascading and de-
cascading trades increase or decrease existing positions, respectively.
In a next step, we combined the event-based price curve with simple rules of
interactions. This means that the agents interact with the coastline according
to a set of trading rules, yielding coastline traders [18, 2, 13]. In a nutshell,
the initialization of new positions and the management of existing positions
in the market are clocked according to the occurrence of directional change
or overshoot events. The essential elements of coastline trading are cascading
and de-cascading trades. For the former, an existing position is increased by
some increment in a loss, bringing the average closer to the current price. For
a de-cascading event, an existing position is decreased, realizing a profit. It
is important to note that, because position sizes are only ever increased by
the same fixed increments, coastline trading does not represent a Martingale
strategy. In Figures 4 and 5, examples of such trading rules are shown.
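A minimal sketch of this bookkeeping, for a long agent and with our own naming (the reference code is at [35]), reads:

```java
// Sketch of coastline-trading bookkeeping for a long agent (our names
// and simplifications; see [35] for the reference code).
public class CoastlineAgent {
    private double inventory = 0.0;   // units currently held
    private double avgEntry = 0.0;    // volume-weighted average entry price
    private double realizedPnl = 0.0; // profit locked in by de-cascading

    // Cascading: buy one fixed increment at an adverse (lower) price,
    // pulling the average entry closer to the current price.
    public void cascade(double price, double unit) {
        avgEntry = (avgEntry * inventory + price * unit) / (inventory + unit);
        inventory += unit;
    }

    // De-cascading: sell one increment at a favorable (higher) price,
    // realizing the difference to the average entry.
    public void deCascade(double price, double unit) {
        double sold = Math.min(unit, inventory);
        realizedPnl += (price - avgEntry) * sold;
        inventory -= sold;
    }

    public double unrealizedPnl(double price) {
        return (price - avgEntry) * inventory;
    }
}
```

Since every cascade adds the same fixed increment, the exposure grows only linearly in the number of adverse events, which is the reason the approach is not a Martingale strategy.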
With these developments, the second decade drew to a close. Guided by the
introduction of event-based time and the uncovering of scaling-law relations, the
novel framework could be embedded in the larger paradigm related to the study
of complex systems. The resulting trading models were, by construction,
automated, agent-based, contrarian, parsimonious, adaptive, self-similar, and
modular. However, there was one crucial ingredient missing to render the models
robust, and hence profitable, in the long term. And so the journey continued.
Figure 5: Real-world example of coastline trading.
Figure 6: The transition network of states in the event-based representation of the
price trajectories. Directional changes and overshoots are the building blocks of
the discretized price curve, defining intrinsic time.
stylized network of states seen in Figure 6. The evolution of intrinsic time can
progress from a directional change to another directional change or to an
overshoot, which, in turn, can transition to another overshoot event or back
to a directional change.
We define the surprise of the transition from state s_i to state s_j as

$$\gamma_{ij} = -\log P(s_i \to s_j), \qquad (2)$$

which, as mentioned, is the point-wise entropy that is large when the probability
of transitioning from state s_i to state s_j is small, and vice versa. Consequently,
the surprise of a price trajectory within a time interval [0, T] that has experienced
K transitions is

$$\gamma_K^{[0,T]} = \sum_{k=1}^{K} -\log P(s_{i_k} \to s_{i_{k+1}}). \qquad (3)$$
by virtue of the central limit theorem [33]. In other words, for large K, the
surprise γ_K converges to a normal distribution. Equation (4) now allows for the
introduction of our probability indicator L, defined as

$$L = 1 - \Phi\left( \frac{\gamma_K^{[0,T]} - K H^{(1)}}{\sqrt{K H^{(2)}}} \right), \qquad (6)$$
where Φ is the cumulative distribution function of the normal distribution. Thus,
an unlikely price trajectory, strongly deviating from a Brownian motion, leads
to a large surprise and hence L → 0. We can now quantify when markets show
normal behavior, where L → 1. Again, the reader is referred to [19] for more
details.
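The following sketch expresses equations (2), (3), and (6) in code. It rests on our reading of [19], namely that H^(1) is the expected surprise of a single transition and H^(2) its variance, so that the accumulated surprise is approximately normal with mean K·H^(1) and variance K·H^(2); the normal CDF Φ is approximated with a standard erf series. The unit-size reduction rule described next is included for completeness:

```java
// Sketch of the probability indicator L of Eq. (6), under our reading of
// [19]: H1 is the expected surprise of a single transition and H2 its
// variance, so that gamma_K is approximately N(K*H1, K*H2) for large K.
public class ProbabilityIndicator {
    /** Standard normal CDF via the Abramowitz-Stegun erf approximation. */
    static double normCdf(double x) {
        double z = Math.abs(x) / Math.sqrt(2.0);
        double t = 1.0 / (1.0 + 0.3275911 * z);
        double poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741
                + t * (-1.453152027 + t * 1.061405429))));
        double erf = 1.0 - poly * Math.exp(-z * z);
        return x >= 0 ? 0.5 * (1.0 + erf) : 0.5 * (1.0 - erf);
    }

    /** Eq. (2): surprise of one transition, -log P(s_i -> s_j). */
    static double surprise(double transitionProb) {
        return -Math.log(transitionProb);
    }

    /**
     * Eqs. (3) and (6): accumulate the surprise of the K observed
     * transitions (probs holds the benchmark probability of each) and
     * normalize it against the Brownian benchmark.
     */
    static double indicator(double[] probs, double h1, double h2) {
        double gamma = 0.0;
        for (double p : probs) gamma += surprise(p);
        int k = probs.length;
        return 1.0 - normCdf((gamma - k * h1) / Math.sqrt(k * h2));
    }

    /** Unit-size rule used by the model (cf. Section 3.4). */
    static double unitSize(double L) {
        if (L < 0.1) return 0.1;
        if (L < 0.5) return 0.5;
        return 1.0;
    }
}
```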
We now assess how the overshoot event should be chosen. The standard
framework for coastline trading dictates that an overshoot event occurs in the
price trajectory when the price moves by δ in the overshoot's direction after
a directional change. In the context of the probability indicator, we depart
from this procedure and define the overshoots to occur when the price moves by
2.525729 δ. This value comes from maximizing the second-order informativeness
H^(2) and guarantees maximal variability of the probability indicator L. For
details, see [19].
The probability indicator L can now be used to navigate the trading model
through times of severe market stress. In detail, by slowing down the increase
of the inventory of agents during price overshoots, the overall trading model's
exposure experiences smaller drawdowns and better risk-adjusted performance.
As a simple example, when an agent cascades, i.e., increases its inventory, the
unit size is reduced in times where L starts to approach zero.
For the trading model, the probability indicator is utilized as follows. The
default size for cascading is one unit (lot). If L is smaller than 0.5, this sizing
is reduced to 0.5, and finally if L is smaller than 0.1, then the size is set to 0.1.
Implementing the above-mentioned measures allowed the trading model to
safely navigate treacherous terrain where it had derailed in the past. However,
there was still one crucial insight missing before a successful version of the
Alpha Engine could be designed. This last insight revolves around a subtle
recasting of thresholds, which has profound effects on the resulting trading
model performance.
Figure 7: Contour lines of the number of directional changes as a function of the
asymmetric thresholds δ_up (horizontal axes) and δ_down (vertical axes), both ranging
from 0.1% to 0.5%. (Left) No trend; (middle) positive trend; (right) negative trend.
In Figure 7 the result of a Monte Carlo simulation is shown. For the situation
with no trend (left-hand panel) we see the contour lines being perfect circles.
In other words, by following any defined circle, the same number of directional
changes are found for the corresponding asymmetric thresholds. Details about
the analytical expressions and the Monte Carlo simulation regarding the number
of directional changes can be found in [20].
This opens up the space of possibilities, as up to now only the 45-degree line
in all panels of Figure 7 was considered, corresponding to symmetric thresholds
δ = δ_up = δ_down. For trending markets, one can observe a shift in the contour
lines, away from the circles. In a nutshell, for a positive trend the expected
number of directional changes is larger if δ_up > δ_down. This reflects the fact
that an upward trend is naturally comprised of longer up-move segments. The
contrary is true for down moves.
Now it is possible to introduce the notion of invariance as a guiding principle.
By rotating the 45-degree line in the correct manner for trending markets, the
number of directional changes will stay constant. In other words, if the trend
is known, the thresholds can be skewed accordingly to compensate. However,
it is not trivial to construct a trend indicator that is predictive and not only
reactive.
A workaround is found by taking the inventory as a proxy for the trend. In
detail, the expected inventory size I for all agents in normal market conditions
can be used to gauge the trend: E[I(δ_up, δ_down)] is now a measure of trendiness
and hence triggers threshold skewing. In other words, by taking the inventory as
an invariant indicator, the 45-degree line can be rotated via the asymmetric
thresholds, counteracting the trend.
A more mathematical justification can be found in the approach of what is
known as indifference prices in market making. This method can be translated
into the context of intrinsic time and agents' inventories. It then mandates
that the utility (or preference) of the whole inventory should stay the same for
skewed thresholds and inventory changes. In other words, how can the thresholds
be changed in a way that feels the same as if the inventory increased or
decreased by one unit? Expressed as equations,

$$U(\Delta_\downarrow, \Delta_\uparrow, I) = U(\tilde{\Delta}_\downarrow, \tilde{\Delta}_\uparrow, I + 1), \qquad (10)$$

and

$$U(\Delta_\downarrow, \Delta_\uparrow, I) = U(\hat{\Delta}_\downarrow, \hat{\Delta}_\uparrow, I - 1), \qquad (11)$$

where U represents a utility function. The thresholds $\tilde{\Delta}_\uparrow$, $\tilde{\Delta}_\downarrow$, $\hat{\Delta}_\uparrow$, and $\hat{\Delta}_\downarrow$
are indifference thresholds.
A pragmatic implementation of such an inventory-driven skewing of thresholds
is given by the following equation, corresponding to a long position:

$$\frac{\Delta_\uparrow}{\Delta_\downarrow} = \begin{cases} 2, & \text{if } I \geq 15; \\ 4, & \text{if } I \geq 30. \end{cases} \qquad (12)$$
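In code, a sketch of this rule (our naming; the concrete stretch factors of 1.5/0.75 and 2.0/0.5 are those specified for the model in Section 3.4, mirrored here for short positions) could read:

```java
// Sketch (our naming): skewed thresholds as a function of the signed
// inventory I, per Eq. (12) and Section 3.4. For a long inventory the
// upward threshold is stretched and the downward one compressed
// (ratio 2 beyond 15 units, ratio 4 beyond 30); short inventories mirror this.
public class ThresholdSkewing {
    public static double[] skewedThresholds(double delta, double inventory) {
        double up = delta, down = delta;           // symmetric by default
        if (inventory > 30) {
            up = 2.0 * delta; down = 0.5 * delta;  // ratio up/down = 4
        } else if (inventory > 15) {
            up = 1.5 * delta; down = 0.75 * delta; // ratio up/down = 2
        } else if (inventory < -30) {
            down = 2.0 * delta; up = 0.5 * delta;
        } else if (inventory < -15) {
            down = 1.5 * delta; up = 0.75 * delta;
        }
        return new double[] { up, down };          // {delta_up, delta_down}
    }
}
```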
Figure 8: Cascading with asymmetric thresholds. A stylized price curve is shown
in both panels. (Left) The original (symmetric) setup with an upward directional
change event (continuous line) and two overshoots (dashed lines). Short position size
increments are shown as downward arrows. (Right) The situation corresponding to
asymmetric thresholds, where intrinsic time accelerates and smaller position size
increments are utilized for coastline trading. See details in text.
will trigger a cascading event. In other words, one (negative) unit of exposure
(symbolized by the large arrows) is added to the existing short position. The
two overshoot events in the left-hand panel trigger identical cascading events. In
the right-hand panel, the same events are augmented by asymmetric thresholds.
Now Δ_up = Δ_down/4. As a result, each overshoot length is divided into four
segments. The new cascading regime is as follows: increase the position by one-
fourth of a (negative) unit (small arrow) at the directional change and another
fourth at the first, second, and third asymmetric overshoots each. In effect, the
cascading event is smeared out and happens in smaller unit sizes over a longer
period. For the cascading events at the first and second original overshoot, this
procedure is repeated.
This concludes the final chapter in the long history of the trading model
development. Many insights from diverse fields were consolidated and a unified
modelling framework emerged.
curve with occurrences of intrinsic time events triggers an increase or decrease
in position sizes.
In detail, an intrinsic event is either a directional change or a move of size δ
in the direction of the overshoot. For each exchange rate, we assign four
coastline traders CT_i[δ_up/down(i)], i = 1, 2, 3, 4, that operate at various scales, with
upward and downward directional change thresholds equaling δ_up/down(1) =
0.25%, δ_up/down(2) = 0.5%, δ_up/down(3) = 1.0%, and δ_up/down(4) = 1.5%.
The default size for cascading and de-cascading a position is one unit (lot).
The probability indicator L_i, assigned to each coastline trader, is evaluated on
the fixed scale δ(i) = δ_up/down(i). As a result, its states are directional changes
of size δ(i) or overshoot moves of size 2.525729 δ(i). The default unit size for
cascading is reduced to 0.5 if L_i is smaller than 0.5. Additionally, if L_i is smaller
than 0.1, then the size is further reduced to 0.1.
In case a coastline trader accumulates an inventory with a long position
greater than 15 units, the upward directional change threshold δ_up(i) is
increased to 1.5 of its original size, while the downward directional change
threshold δ_down(i) is decreased to 0.75 of its original size. In effect, the ratio for the
skewed thresholds is δ_up(i)/δ_down(i) = 2. The agent with the skewed thresholds
will cascade when the overshoot reaches 0.5 of the skewed threshold, i.e.,
half of the original threshold size. In case the inventory with long position
is greater than 30, then the upward directional change threshold δ_up(i) is
increased to 2.0 of its original size and the downward directional change threshold
δ_down(i) is decreased to 0.5. The ratio of the skewed thresholds now equals
δ_up(i)/δ_down(i) = 4. The agent with these skewed thresholds will cascade when
the overshoot extends by 0.25 of the original threshold, with one-fourth of the
specified unit size. This was illustrated in the right-hand panel of Figure 8. The
changes in threshold lengths and sizing are analogous for short inventories.
This concludes the description of the trading model algorithm and the mo-
tivation of the chosen modeling framework. Recall that the interested reader
can download the code from GitHub [35].
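For orientation, a condensed sketch of the assembly described above, with our own naming (again, the reference implementation is found at [35]):

```java
import java.util.ArrayList;
import java.util.List;

// Condensed sketch of the model assembly (ours; see [35] for the
// reference code): four coastline traders per exchange rate, one per
// directional-change threshold, with the two sizing mechanisms.
public class AlphaEngineSketch {
    static final double[] DELTAS = { 0.0025, 0.005, 0.010, 0.015 };

    static class CoastlineTrader {
        final double delta;      // threshold delta(i) of this trader
        double inventory = 0.0;  // signed position in units

        CoastlineTrader(double delta) { this.delta = delta; }

        // Default cascade size is one unit; reduced when the probability
        // indicator L flags an unlikely (stressed) price trajectory.
        double unitSize(double L) {
            if (L < 0.1) return 0.1;
            if (L < 0.5) return 0.5;
            return 1.0;
        }

        // With skewed thresholds, the trader cascades at fractions of the
        // original threshold, using correspondingly smaller increments
        // (0.5 beyond 15 units, 0.25 beyond 30; cf. Figure 8).
        double cascadeFraction() {
            double size = Math.abs(inventory);
            if (size > 30) return 0.25;
            if (size > 15) return 0.5;
            return 1.0;
        }
    }

    public static void main(String[] args) {
        List<CoastlineTrader> traders = new ArrayList<>();
        for (double d : DELTAS) traders.add(new CoastlineTrader(d));
        // Event loop (omitted): feed each tick to every trader's
        // intrinsic-event detector; cascade on directional-change and
        // overshoot events, de-cascade on events in the profitable direction.
    }
}
```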
beneficial to the markets as a whole. The more such strategies are implemented,
the fewer runaway markets we expect to see, and the healthier overall market
conditions should become. By construction, the trading model only ceases to
perform in low-volatility markets.
It should be noted that the model framework presented here can be real-
ized with reasonable computational resources. The basic agent-based algorithm
shows profitable behavior for four directional change thresholds, on which the
positions (agents) live. However, by adding more thresholds the model behav-
ior is expected to become more robust, as more information coming from the
market can be processed by the trading model. In other words, by increasing
the model complexity the need for performant computing becomes relevant for
efficient prototyping and back-testing. In this sense, we expect the advance-
ments in high-performance computing in finance to positively impact the Alpha
Engine's evolution.
Nevertheless, with all the merits of the trading algorithm presented here,
we are only at the beginning. The Alpha Engine should be understood as
a prototype. It is, so to speak, a proof of concept. For one, the parameter
space can be explored in greater detail. Then, the model can be improved by
calibrating the various exchange rates by volatility, or by excluding illiquid ones.
Furthermore, the model treats all the currency pairs in isolation. There should
be a large window of opportunity for increasing the performance of the trading
model by introducing correlation across currency pairs. This is a unique and
invaluable source of information not yet exploited. Finally, a whole layer of risk
management can be implemented on top of the models.
We hope to have presented a convincing set of tools motivated by a consistent
philosophy. If so, we invite the reader to take what is outlined here and improve
upon it.
A A History of Ideas
This section is a personal account of the historical events that would ultimately
lead to the development of the trading model algorithm outlined in this chapter,
told by Richard B. Olsen:
The development of the trading algorithm and model framework dates back
to my studies in the mid-70s and 80s. From the very start, my interest in
economics was influenced by my admiration of the scientific rigor of the natural
sciences and their successful implementations in the real world. I argued that
the resilience of economic and political systems depends on the underlying
economic and political models. Motivated to contribute to the well-being of
society, I wanted to work on enhancing economic theory and on applying
the models.
I first studied law at the University of Zurich and then, in 1979, moved
to Oxford to study philosophy, politics, and economics. In 1980 I attended a
course on growth models by James Mirrlees, who, in 1996, would receive a Nobel
prize in economics. In his first lecture, he discussed the shortcomings of
models such as [25]. He explained that the models are successful in explaining
growth as long as there are no large exogenous shocks. But unanticipated events
are inherent to our lives and the economy at large. I thus started to search
for a model framework that can both explain growth and handle unexpected
exogenous shocks. I spent one year studying the Encyclopedia Britannica and
found my inspiration in relativity theory.
In my 1981 PhD thesis, titled "Interaction between Law and Society", at
the University of Zurich, I developed a new model framework that describes in
an abstract language, how interactions in the economy occur. At the core of
the new approach are the concepts of object, system, environment, and event-
based intrinsic time. Every object has its system that comprises all the forces
that impact and influence the object. Outside the system is its environment
with all the forces that do not impact the object. Every object and system
has its own frame of reference with an event-based intrinsic time scale. Events
are interactions between different objects and their systems. I concluded that
there is no abstract and universal time scale applicable to every object. This
motivated me to think about the nature of time and how we use time in our
everyday economic models.
After finishing my studies, I joined a bank, working first in the legal
department, then in the research group, and finally on the foreign exchange trading
desk. My goal was to combine empirical work with academic research, but I was
disappointed with the pace of research at the bank. In the mid-80s, there was
the first buzz about start-ups in the United States. I came up with a business
idea: banks have a need for quality information to increase profitability, so there
should be a market for quality real-time information.
I launched a start-up with the name of Olsen & Associates. The goal was
to build an information system for financial markets with real-time forecasts
and trading recommendations using tick-by-tick market data. The product
idea combined my research interest with an information service, which would
both improve the quality of decision-making in financial markets and generate
revenue to fund further research. The collection of tick market data began in
January 1986 from Reuters. We faced many business and technical obstacles,
where data storage cost was just one of the many issues. After many setbacks
we successfully launched our information service and eventually acquired 60 big
to mid-sized banks across Europe as customers.
In 1990, we published our first scientific paper [29] revealing the first scaling
law. The study showed that intraday prices have the same scaling law exponent
as longer-term price movements. We had expected two different exponents: one
for intraday price movements, where technical factors dictate price discovery,
and another for longer-term price movements that are influenced by fundamen-
tals. The result took us by surprise and was evidence that there are universal
laws that dictate price discovery at all scales. In 1995 we organized the first
high frequency data conference in Zurich, where we made a large sample of tick
data available to the academic community. The conference was a big success
and boosted market microstructure research, which was in its infancy at that
time. In the following years we conducted exhaustive research testing all possi-
ble model approaches to build a reliable forecasting service and trading models.
Our research work is described in the book [15]. The book covers data collection
and filtering, basic stylized facts of financial market time series, the modelling
of 24 hour seasonal volatility, realized volatility dynamics, volatility processes,
forecasting return and risks, correlation, and trading models. For many years
the book was a standard text for major hedge funds. The actual performance of
our forecasting and trading models was, however, spurious and disappointing.
Our models were best in class, but we had not achieved a breakthrough.
Back in 1995, we were selling tick-by-tick market data to top banks and
created a spinoff under the name of OANDA to market a currency converter on
the emergent Internet and eventually build a foreign exchange market making
business. The OANDA currency converter was an instant success. At the start
of 2001, we were completing the first release of our trading platform. At the
same time, Olsen & Associates was a treasure store of information and risk
services, but did not have cash to market the products and was struggling for
funding. When the Internet bubble burst and markets froze, we could not pay
our bills and the company went into default. I was able to organize a bailout
with a new investor. He helped to salvage the core of Olsen & Associates with
the aim of building a hedge fund under the name of Olsen Ltd and buying up
the OANDA shares.
In 2001, the OANDA trading platform was a novelty in the financial indus-
try: straight through processing, one price for everyone, and second-by-second
interest payments. At the time, these were true firsts. At OANDA, a trader
could buy literally 1 EUR against USD at the same low spread as a buyer of
1 million EUR against USD. The business was an instant success. Moreover,
the OANDA trading platform was a research laboratory to analyse the trades
of tens of thousands of traders, all buying and selling at the same terms and
conditions, and to observe their behaviour patterns in different market environments. I
learned hands-on how financial markets really work and discovered that basic
assumptions of market efficiency that we had taken for granted at Olsen & As-
sociates were inappropriate. I was determined to make a fresh start in model
development.
At Olsen Ltd I made a strategic decision to focus exclusively on trading
model research. Trading models have a big advantage over forecasting models:
the profit and losses of a trading model are an unambiguous success criterion of
the quality of a model. We started with the forensics of the old model algorithms
and discovered that the success and failure of a model depends critically on the
definition of time and how data is sampled. Already at Olsen & Associates we
were sensitive to the issue of how to define time and had rescaled price data
to account for the 24 hour seasonality of volatility, but did not succeed with a
more sophisticated rescaling of time. There was one operator that we had failed
to explore. We had developed a directional change indicator and had observed
that the indicator follows a scaling law behaviour similar to the absolute price
change scaling law [21]. This scaling law was somehow forgotten and was not
mentioned in our book [15]. I had incidental evidence that this operator would
be successful in redefining time, because traders use such an operator to analyse
markets. The so-called point-and-figure chart replaces the x-axis of physical
time with an event scale. As long as a market price moves up, the price stays
frozen in the same column. When the price moves down by a threshold bigger
than the box size, the plot moves to the next column: a new column is started
when the price reverses its direction.
Then I also had another key insight, about the path dependence of market prices,
from watching OANDA traders. There was empirical evidence that a margin
call of one trader from anywhere in the world could trigger a whole cascade
of margin calls in the global foreign exchange markets in periods of herding
behaviour. Cascades of margin calls wipe out whole cohorts of traders and
tilt the market composition of buyers and sellers and skew the long-term price
trajectory. Traditional time series models cannot adequately model these phe-
nomena. We decided to move to agent-based models to better incorporate the
emergent market dynamics and use the scaling laws as a framework to calibrate
the algorithmic behaviour of the agents. This seemed attractive, because we
could configure self-similar agents at different scales.
I was adamant about building bare-bones agents and not cluttering our model
algorithms with tools of spurious quality. In 2008, we were rewarded with a
major breakthrough: we discovered a large set of scaling laws [16]. I expected
that model development would be plain sailing from there on. I was wrong. The
road of discovery was much longer than anticipated. Our hedge fund had several
significant drawdowns that forced me to close the fund in 2013. At OANDA,
things had also deteriorated. After raising 100 million USD for 20% of the
company in 2007, I had become chairman without executive powers. OANDA
turned into a conservative company and lost its competitive edge. In 2012, I
left the board.
In July 2015, I raised the first seed round for Lykke, a new startup. Lykke
builds a global marketplace for all asset classes and instruments on the blockchain.
The marketplace is open source and a public utility. We will earn money by
providing liquidity with our funds and our customers' funds, with algorithms as
described in this paper.
B Supplementary Material
Figure 9: Profit & Loss for a time series, generated by a geometric random walk
of 10 million ticks with annualized volatility of 25%. The average of 60 Monte
Carlo simulations is shown. In the limiting case, the P&L curve becomes a smooth
increasing line.
%       Jan    Feb    Mar    Apr    May   June   July    Aug    Sep    Oct    Nov    Dec   Year
2006   0.16   0.15   0.07   0.12   0.22   0.17   0.19   0.20   0.18   0.08  -0.00   0.04   1.58
2007   0.08   0.22   0.14   0.02  -0.05  -0.03   0.32   0.59   0.07   0.11   0.47   0.20   2.03
2008   0.24   0.07   0.05   0.50   0.26   0.09   0.26   0.16   0.66   2.22   1.27   0.98   6.03
2009   1.14   1.41   1.17   1.00   0.75   0.59   0.22   0.19  -0.13   0.28   0.06   0.25   7.70
2010   0.15  -0.34   0.24   0.14   0.30   0.17   0.27  -0.02   0.03   0.06   0.14  -0.31   1.42
2011   0.45   0.13   0.11  -0.16   0.04  -0.06  -0.40   0.43   0.45  -0.03   0.32  -0.03   0.97
2012  -0.08   0.19   0.29   0.08  -0.12   0.15  -0.20   0.23   0.10   0.13   0.12   0.11   0.86
2013  -0.17  -0.01  -0.10  -0.08   0.32   0.52   0.04   0.24  -0.10   0.01  -0.01  -0.16   0.77

Table 1: Monthly performance of the unleveraged trading model. The P&L is given in percentages. All 23 currency
pairs are aggregated.
References

[1] R. Albert and A.-L. Barabási. "Statistical Mechanics of Complex Networks". In: Reviews of Modern Physics 74.1 (2002), pp. 47–97.

[2] Monira Aloud, Edward Tsang, Alexandre Dupuis, and Richard Olsen. "Minimal agent-based model for the origin of trading activity in foreign exchange market". In: 2011 IEEE Symposium on Computational Intelligence for Financial Engineering and Economics (CIFEr). IEEE, 2011, pp. 1–8.

[3] Monira Aloud, Edward Tsang, Richard B. Olsen, and Alexandre Dupuis. "A directional-change events approach for studying financial time series". In: Economics Discussion Papers 2011-28 (2011).

[4] Jørgen Vitting Andersen and Didier Sornette. "A mechanism for pockets of predictability in complex adaptive systems". In: EPL (Europhysics Letters) 70.5 (2005), p. 697.

[5] Han Ao and Edward Tsang. "Capturing Market Movements with Directional Changes". Working paper, Centre for Computational Finance and Economic Agents, University of Essex (2013).

[6] Pooneh Baghai, Onur Erzan, and Ju-Hon Kwek. The $64 Trillion Question: Convergence in Asset Management. 2015.

[7] Amer Bakhach, Edward P. K. Tsang, and Wing Lon Ng. "Forecasting Directional Changes in Financial Markets". Working paper, Centre for Computational Finance and Economic Agents, University of Essex (2015).

[8] Bank for International Settlements. Triennial Central Bank Survey of Foreign Exchange and OTC Derivatives Markets in 2016. Monetary and Economic Department, 2016.

[9] A.-L. Barabási and R. Albert. "Emergence of Scaling in Random Networks". In: Science (1999), p. 509.

[10] Antoine Bouveret, Cyrille Guillaumie, Carlos Aparicio Roqueiro, Christian Winkler, and Steffen Nauhaus. High Frequency Trading Activity in EU Equity Markets. 2014.

[11] Jim X. Chen. "The Evolution of Computing: AlphaGo". In: Computing in Science & Engineering 18.4 (2016), pp. 4–7.

[12] T. M. Cover and J. A. Thomas. Elements of Information Theory. New York, NY, USA: John Wiley & Sons, 1991.

[13] Alexandre Dupuis and Richard B. Olsen. "High Frequency Finance: Using Scaling Laws to Build Trading Models". In: Handbook of Exchange Rates. Ed. by Jessica James, Ian W. Marsh, and Lucio Sarno. John Wiley & Sons, Inc., 2012, pp. 563–584. isbn: 9781118445785. doi: 10.1002/9781118445785.ch20. url: http://dx.doi.org/10.1002/9781118445785.ch20.

[14] J. Doyne Farmer and Duncan Foley. "The economy needs agent-based modelling". In: Nature 460.7256 (2009), pp. 685–686.

[15] Ramazan Gençay, Michel Dacorogna, Ulrich A. Müller, Olivier Pictet, and Richard Olsen. An Introduction to High-Frequency Finance. Academic Press, 2001.

[16] J. B. Glattfelder, A. Dupuis, and R. B. Olsen. "Patterns in high-frequency FX data: Discovery of 12 empirical scaling laws". In: Quantitative Finance 11.4 (2011), pp. 599–614.

[17] James B. Glattfelder. Decoding Complexity. Springer, Berlin, 2013.

[18] James B. Glattfelder, Thomas Bisig, and Richard B. Olsen. R&D Strategy Document. Tech. rep. A Paper by the Olsen Ltd. Research Group, 2010. url: https://arxiv.org/abs/1405.6027.

[19] Anton Golub, Gregor Chliamovitch, Alexandre Dupuis, and Bastien Chopard. "Multiscale representation of high frequency market liquidity". In: Algorithmic Finance 5.1 (2016).

[20] Anton Golub, James B. Glattfelder, Vladimir Petrov, and Richard B. Olsen. Waiting Times and Number of Directional Changes in Intrinsic Time Framework. Lykke Corp & University of Zurich Working Paper, 2017.

[21] Dominique M. Guillaume, Michel M. Dacorogna, Rakhal R. Davé, Ulrich A. Müller, Richard B. Olsen, and Olivier V. Pictet. "From the bird's eye to the microscope: A survey of new stylized facts of the intra-daily foreign exchange markets". In: Finance and Stochastics 1.2 (1997), pp. 95–129.

[22] Dirk Helbing. "Agent-based modeling". In: Social Self-Organization. Springer, 2012, pp. 25–70.

[23] John C. Hull. Options, Futures and Other Derivative Securities. 9th edition. Pearson, London, 2014.

[24] ISDA. Central Clearing in the Equity Derivatives Market. 2014.

[25] Nicholas Kaldor and James A. Mirrlees. "A new model of economic growth". In: The Review of Economic Studies 29.3 (1962), pp. 174–192.

[26] Thomas Lux and Michele Marchesi. "Volatility clustering in financial markets: a microsimulation of interacting agents". In: International Journal of Theoretical and Applied Finance 3.4 (2000), pp. 675–702.

[27] B. Mandelbrot. "The Variation of Certain Speculative Prices". In: Journal of Business 36.4 (1963).

[28] Ulrich A. Müller, Michel M. Dacorogna, Rakhal D. Davé, Olivier V. Pictet, Richard B. Olsen, and J. Robert Ward. "Fractals and intrinsic time: A challenge to econometricians". Presentation at the XXXIXth International AEA Conference on Real Time Econometrics, 14–15 Oct. 1993, Luxembourg (1993).

[29] Ulrich A. Müller, Michel M. Dacorogna, Richard B. Olsen, Olivier V. Pictet, Matthias Schwarz, and Claude Morgenegg. "Statistical study of foreign exchange rates, empirical evidence of a price change scaling law, and intraday analysis". In: Journal of Banking & Finance 14.6 (1990), pp. 1189–1208.

[30] Mark E. J. Newman. "The structure and function of complex networks". In: SIAM Review 45.2 (2003), pp. 167–256.

[31] M. E. J. Newman. "Power Laws, Pareto Distributions and Zipf's Law". In: Contemporary Physics 46.5 (2005), pp. 323–351.

[32] Vilfredo Pareto. Cours d'Économie Politique. 1897.

[33] H. D. Pfister, J. B. Soriaga, and P. H. Siegel. "On the Achievable Information Rates of Finite State ISI Channels". In: Proc. IEEE Globecom. Ed. by David Kurlander, Marc Brown, and Ramana Rao. ACM Press, Nov. 2001, pp. 41–50.

[34] Tom Roseen. Are Quant Funds Worth Another Look? 2016.

[35] The Alpha Engine: Designing an Automated Trading Algorithm (Code). https://github.com/AntonVonGolub/Code/blob/master/code.java. Accessed: 2017-01-04. 2017.

[36] Johannes Voit. The Statistical Mechanics of Financial Markets. 3rd edition. Springer, Berlin, 2005.

[37] G. B. West, J. H. Brown, and B. J. Enquist. "A General Model for the Origin of Allometric Scaling Laws in Biology". In: Science 276.5309 (1997), p. 122.

[38] Stephen Wolfram. A New Kind of Science. Wolfram Media, Champaign, 2002.

[39] World Bank. World Development Indicators database. 2015.

[40] George Kingsley Zipf. Human Behavior and the Principle of Least Effort. Addison-Wesley, Reading, MA, 1949.