We are about to experience a revolution in portfolio management, triggered by machine intelligence as we enter the 'AI economy'. The investment industry has started to amplify investment intelligence in Wealth & Asset Management with machines.
Many financial products today are still based on linear models and the Markowitz framework, which is lauded as Nobel-prize-winning and scientific. However, this math of choice may be inadequate, as several recent articles have pointed out, e.g. in the Journal of Portfolio Management (Focardi and Fabozzi, or de Prado). They argue that economics has become a theory that describes an idealised economic and financial reality without any strong connection to data. Mean-Variance (MV) optimisation, for example, was created in the 1950s, when computation was expensive and data was scarce. It does not address the high level of uncertainty in financial markets, leading to instability, concentration and underperformance. The consequences of using overly simplistic and outdated statistical methods are palpable.
The reason is that inverting a covariance matrix often introduces errors of such magnitude that they entirely offset the benefits of diversification. Small estimation errors from short samples lead to grossly incorrect inversions. That is a problem for Risk Parity and MV portfolios, which are optimal in-sample but tend to perform poorly out-of-sample. Markowitz's curse is that optimisation is most likely to fail precisely when a diversified portfolio is needed most. This is not a good omen for linear finance, factor investing and smart-beta funds.
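This instability is easy to demonstrate. The following minimal sketch (a toy universe with made-up numbers, not any strategy discussed in this article) builds a highly correlated covariance matrix, perturbs it with small estimation noise as a short sample would, and compares the resulting minimum-variance weights:

```python
import numpy as np

rng = np.random.default_rng(42)

def min_variance_weights(cov):
    # Unconstrained minimum-variance weights: w ∝ Σ⁻¹·1
    inv = np.linalg.inv(cov)
    w = inv @ np.ones(cov.shape[0])
    return w / w.sum()

# Ten assets, all 95% correlated -> a near-singular covariance matrix
n = 10
corr = np.full((n, n), 0.95)
np.fill_diagonal(corr, 1.0)
vols = np.full(n, 0.2)
cov = np.outer(vols, vols) * corr

# Small symmetric estimation noise, as from a short return sample
noise = rng.normal(scale=1e-3, size=(n, n))
noisy_cov = cov + (noise + noise.T) / 2

w_true = min_variance_weights(cov)
w_noisy = min_variance_weights(noisy_cov)

print("condition number:", np.linalg.cond(cov))
print("max weight shift:", np.abs(w_true - w_noisy).max())
```

The condition number of the matrix grows with the average correlation, and the weight shift caused by the tiny perturbation is disproportionately large: the inversion amplifies the estimation error rather than averaging it away.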
As an example, consider the FinTech movement, where robo-advisory has risen by focusing on improved user experience and smooth processing. However, the most important user experience is performance, and the robustness thereof. Most robo strategies are still based on the Markowitz paradigm, and since they started operating after the Global Financial Crisis, they have no track record during a financial crisis. It will be important to evaluate these new solutions through such periods, and our guess is that the performance of some robos will be lacking. This will be bad news for new and younger investors such as millennials, with low financial literacy and no experience of dramatic losses. Put simply, the 'robo is not robust', at least for those vendors relying on traditional optimisation.
Portfolio optimisation is not the only possible way of building a portfolio. Several modern financial technologies, such as Machine Intelligence, Data Science and Advanced Analytics, offer a new kind of math for working with this level of uncertainty. In contrast to financial economists, who tend to argue on a normative ('should happen') level, these technologies allow us to address crucial financial questions on a more descriptive ('is happening') level.
One such technology is Network Analysis and Graph Theory. It is better suited to capturing the complexity inherent in financial systems. The reason is that, unlike the geometry underlying econometrics, Graph Theory embraces the notion of 'hierarchy' that pervades complex data sets such as those financial markets produce. Hierarchical relations are key to understanding complex phenomena, such as contagion within a system. Network Theory has made significant progress in representing interactions among economic agents, e.g. in financial markets.
In an epochal paper entitled 'The Architecture of Complexity', Nobel prize laureate Herbert A. Simon (whose father, an electrical engineer, had come to the US from Germany after earning his degree at the Technische Hochschule of Darmstadt) drew together what he had learned about the structure of complex systems: 'complexity frequently takes the form of hierarchy. Hierarchy, I shall argue, is one of the central structural schemes that the architect of complexity uses.' Simon was among the pioneers of several of today's important scientific domains, including artificial intelligence, information processing, decision-making, organisation theory, complex systems, and computer simulation of scientific discovery. His ideas have never been more relevant than today.
Someone smart once said that all the world is a graph, a 'network of networks'. A graph can be understood as a relational map between pairs of items. Graph Theory can answer questions about the logical structure, architecture and dynamics of a complex system. It is applied by Google to rank hyperlinks (the 'PageRank' algorithm), by GPS systems to find your shortest path home, and by LinkedIn to suggest connections. One reason consumer web giants like Google, Amazon and Apple are so successful is that network analysis is in their DNA. They have demonstrated impressively that the relationships between data points are often more valuable than the Big Data itself. In finance and banking, however, network analysis remains one of the most underrated financial technologies today.
In a recent Journal of Portfolio Management paper, de Prado presents a new portfolio construction method that substantially improves the out-of-sample performance of diversified portfolios. Paradoxically, even though Markowitz-style allocation is optimal in-sample, it loses out-of-sample to his method. Correlation matrices lack the notion of hierarchy, because every investment is treated as a potential substitute for every other. One reason for the instability of optimisers is that the vector space is modelled as a complete (fully connected) graph, where every node is a candidate to substitute another (the full correlation matrix). His approach filters out unimportant links in order to reveal a higher-order, more robust efficient frontier with improved tail risk. The method, called Hierarchical Risk Parity (HRP), delivers lower out-of-sample variance than Mean-Variance (MV) approaches in Monte Carlo experiments, and also produces less risky portfolios out-of-sample than traditional Risk Parity methods. It uses modern mathematics (Graph Theory and Machine Learning techniques) to build a diversified portfolio based only on the information contained in the covariance matrix, i.e. without explicit forecasts.
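The first step of HRP, extracting a hierarchy from the correlation matrix, can be sketched in a few lines. The toy correlation matrix below is made up for illustration (two equity-like and two bond-like assets); the distance metric sqrt((1 - ρ)/2) is the one de Prado uses, and the clustering is done with SciPy:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import squareform

# Toy correlation matrix: assets 0 and 1 move together,
# assets 2 and 3 move together, the two pairs barely interact
corr = np.array([
    [1.0, 0.8, 0.1, 0.1],
    [0.8, 1.0, 0.1, 0.1],
    [0.1, 0.1, 1.0, 0.7],
    [0.1, 0.1, 0.7, 1.0],
])

# Correlation distance: d_ij = sqrt((1 - rho_ij) / 2)
dist = np.sqrt((1.0 - corr) / 2.0)

# SciPy's linkage expects the condensed (upper-triangular) form
link = linkage(squareform(dist, checks=False), method="single")

# The leaf order of the resulting tree places similar assets
# next to each other ('quasi-diagonalisation' of the matrix)
order = leaves_list(link)
print("leaf order:", order)
```

The resulting tree groups the two correlated pairs into separate clusters, so the subsequent weight allocation sees two blocks rather than a fully connected graph of six pairwise links.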
Building on the fundamental notion of hierarchy, HRP distributes weights top-down, consistent with how many asset managers build their portfolios, e.g. from asset classes to sectors to individual securities. Since 2011 we have been developing a similar approach, the 'Cluster-Based Waterfall Approach'. It extracts a hierarchical tree structure from the correlation matrix and splits the weights at each tree bisection into equal parts:
The more nested and deeper an instrument sits in the tree, the less weight it is assigned. This weighting scheme avoids the strongly interconnected, contagious parts of the tree. HRP has a similar mechanism while also accounting for the risk of the portfolio instruments: the weights are adjusted at each split according to the inverse variance of the lower-level clusters, thereby extending the Risk Parity approach with the notion of hierarchy.
We have implemented a web-based application using a simplified and generalised version of HRP to benchmark it against other well-known risk-based portfolios. Our test investment universe consists of some of the most traded ETFs (equities and bonds). Here are the application's results for a long-only, monthly rebalanced portfolio in a walk-forward test starting just before the Global Financial Crisis (best values are highlighted in green):
The most striking observation is that the HRP portfolio exhibits the lowest variance, even though the minimum-variance strategy was supposed to reach that goal. HRP is also characterised by low and short drawdowns in all stress periods of the market, underpinning its robust, almost anti-fragile behaviour. The area below the underwater ('drawdown') curve was also computed and set in relation to realised returns: this is the so-called 'Pain Ratio', where HRP is again the winner. In other words, the investor's journey is the least painful one, creating the best 'user experience', especially in robo advice.
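For readers who want to reproduce the metric, a minimal sketch of the Pain Ratio is given below. It assumes the common definition (annualised return divided by the average depth of the underwater curve); the return stream is made up for illustration:

```python
import numpy as np

def pain_ratio(returns, periods_per_year=12):
    """Annualised return divided by the 'pain index', i.e. the
    average depth of the underwater (drawdown) curve."""
    nav = np.cumprod(1.0 + np.asarray(returns))
    peak = np.maximum.accumulate(nav)     # running high-water mark
    underwater = nav / peak - 1.0         # <= 0 at every point
    pain_index = -underwater.mean()       # average drawdown depth
    years = len(returns) / periods_per_year
    ann_return = nav[-1] ** (1.0 / years) - 1.0
    return ann_return / pain_index if pain_index > 0 else np.inf

# Toy monthly return stream with a drawdown in the middle
rets = [0.02, 0.01, -0.05, -0.03, 0.04, 0.03,
        0.02, 0.01, 0.02, 0.01, 0.02, 0.01]
print(round(pain_ratio(rets), 2))
```

Unlike the Sharpe ratio, which penalises upside and downside volatility alike, this measure only penalises time spent below the high-water mark, which is why it tracks the investor's felt 'pain' more closely.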
Further analyses and extensions of the HRP approach have been carried out by researchers such as Wiecki and Raffinot. Over the last couple of years I have also dedicated my research to numerous aspects of graph-based machine intelligence, e.g. in the context of Parametric Portfolio Policies and in the context of Interconnectedness Risk and Active Portfolio Management. Further examples are applications to active ETF portfolio management and to multi-asset futures.
No one has a reliable way of predicting the future. That is why we must diversify: it is the cheapest and most effective way of dealing with uncertainty. In our computerised world, Markowitz-style approaches from the 1950s are not the answer. But how can we establish a reliable and objective way of diversifying? How can machines amplify our investment results? Computers, algorithms and data are available at almost zero cost. Many industries today rely on machine intelligence, and we have shown in this article how it can help diversify our investments.
Further examples cover stress testing and the identification of asset shifts and regime switches in markets, such as the network-based contagion analysis of weekly influences across European government bonds by Schwendner et al., and the handling of risk-on/risk-off dynamics with correlation regimes and correlation networks.
This is an article, not investment advice.
This blog is part of a series of posts that build on each other: