Amplifying Investment Intelligence in Wealth & Asset Management by Machines

We are currently observing a renaissance of AI. Self-driving cars, personal assistants for business activities, web-navigating bots, natural language processing, pattern recognition, emotion recognition, and mastering complicated games like Go or Jeopardy! are the latest spectacular developments in which machines beat human grandmasters or solve complex tasks. However, the term AI is heavily hyped and often misunderstood. Machines are far from performing many ‘human’ tasks, and many problems remain unresolved. Most AI algorithms are based on rather simple mathematical procedures that have been around for decades or even centuries. Neural networks, for example, had disappointed many; now they are re-emerging with some improvements, rebranded as ‘deep learning.’

However, the combination of available data sets, high-performance computing power, virtually unlimited storage and more intelligent algorithms has lately fueled these developments. All of this is available to anyone at low cost, so we can also speak of a democratization of AI. Are we entering the ‘AI economy’? All tech giants have already positioned themselves in the AI field, or are trying to do so. What about Asset & Wealth Management? Will AI be a major trend on the road to complete digitalisation? How will the fintech movement come in?

These and related topics will be discussed in this monthly blog, along with hands-on AI use cases in Asset & Wealth Management. A range of useful applications will be covered: aggregation of models/signals, allocation across assets/models, risk management, capturing dynamics like cycles and regime switches, diversification, dynamic leverage, long/short signals, tail risk hedging, scenario generation, stress testing, robo portfolios, portfolio health checks, new investment products like smart beta, research tools, portfolio construction tools, fund-of-funds construction/monitoring, ESG portfolio management, regulatory technology and many more. In this first blog entry, we discuss a graph-based machine learning approach to create alternative indices with robust performance and lower risk.

We believe we are now reaching a tipping point where it suddenly pays off to use AI in Asset & Wealth Management on a wider scale, simply because the results are now convincing enough. There will not be another AI winter, because this time the technology seems to have reached the necessary maturity.

However, we believe that the true intelligence of AI will also lie in how smartly it is integrated into existing and established processes, and how well it is accepted by its human pilots or users. For example, in an investment process, it has to be clarified how much performance or robustness can be attributed to a specific AI step. Also, the functioning, mechanics, and interpretation of results have to be intuitive and transparent, especially for less technical portfolio managers, advisors and investors. Other questions are:

  1. Which task within a systematic investment/advisory/reporting process is suitable to be operated by AI? Could it be outsourced?
  2. How will portfolio managers and advisors use AI in their everyday work?
  3. Could processes be transformed and centered around some AI tasks in order to create new products and investment services?

It is clear that we cannot discuss all aspects and algorithms of AI and its combination with advanced analytics and data science. Therefore, we restrict ourselves mostly to graph-based machine intelligence, aware that there are many other interesting approaches such as the already mentioned deep learning, fuzzy logic, evolutionary algorithms, support vector machines, decision trees, ensemble learning, Bayesian learning, random forests, etc.

The choice for graph-based machine intelligence was made for several reasons:

  • The data requirements are low, as the asset graph of a portfolio can be learned from widely available asset return series. No special or hard-to-obtain data sets are needed. Each portfolio with time series can be analyzed individually.
  • It is very close to traditional approaches like correlation analysis and Markowitz thinking, but it is still very innovative.
  • It is a non-linear approach that rationalizes market complexity and portfolio dynamics in a way that supports the transformation to more systematic, data-driven and evidence-based investment & advisory processes, and more robust performance through the cycles.
  • Comprehending an investment portfolio as an asset graph with interconnections and influences is very natural, as the model IS the real-world problem. People are used to thinking in relations, so visualizing and exploring asset relations is a very natural thing to do. The machine can provide automated interpretations, analyses, diagnoses and reports on top of the asset graph. These valuable additional pieces of information are displayed together with the asset graph layer, creating maximum intuition and transparency about the AI procedure and its results.
  • Graph databases are the fastest-growing category in database management systems right now. This reflects the worldwide success of graph-based methods in many disciplines: internet and social media giants, commercial banks, insurance companies, the health care sector and many natural scientists use network analysis as their standard tool to understand interconnected data and complex systems. Think about Google’s PageRank algorithm. Or think about the financial crisis, where hidden systemic risk uncovered the ‘too connected to fail’ status of Lehman Brothers, later leading to a much wider use of network models at central banks and supervisors worldwide. It seems very obvious to also utilize the graph approach to solve Asset & Wealth Management problems.

In the announced first example of graph-based machine intelligence, we discuss how an alternative equity index can be constructed with this technology. Many index providers start with a construction step/rule in which they filter the index universe to get a subset of components that are supposed to be diversified. In additional steps, other features like factor exposures or risks are filtered or weighted to arrive at a new index with specific characteristics, styles or themes.

In our example, we look at this first diversification step only. It is often done with correlation analysis: in the simplest form, the average correlation of each stock to all others is measured, and stocks with high average correlations are excluded. The problem is that the correlation matrix is subject to estimation noise. Also, the valuable network/graph information contained in the correlation matrix is left untouched by simply averaging all correlations. The method is too inefficient for such an important first index construction step.
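As a reference point, the naive average-correlation filter described above can be sketched in a few lines. This is an illustrative sketch, not any index provider's actual code; `returns` is assumed to be a pandas DataFrame of daily returns with one column per stock.

```python
import numpy as np
import pandas as pd

def low_correlation_filter(returns: pd.DataFrame, n_select: int = 50) -> list:
    """Keep the n_select stocks with the lowest average correlation to all others."""
    corr = returns.corr()
    # Exclude the diagonal (self-correlation of 1) from the average
    np.fill_diagonal(corr.values, np.nan)
    avg_corr = corr.mean(axis=1, skipna=True)
    return avg_corr.nsmallest(n_select).index.tolist()
```

Note how all off-diagonal information is collapsed into one number per stock; this is exactly the averaging step the text criticizes.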

This is a reasonable scenario for utilizing graph-based diversification. The graph approach is more robust when processing noisy data. Also, the graph structure can be directly interpreted with respect to the objective: diversification. Finally, the graph approach captures the dynamics of asset relations over time, yielding adaptive diversification. The screenshots shown here are from the graph-based index construction tool we use.

The asset graph delivers a very valuable piece of information: where is the core of the asset network, where systemic risk, spillover, contagion and other erratic things happen? And where is the network periphery, where assets are rather decoupled from systemic risk? The latter are highly interesting candidates for a diversified portfolio.
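To make the core/periphery idea concrete, here is a hedged sketch of one common way to operationalise it: build a minimum spanning tree (MST) on the correlation distance d = sqrt(2·(1 − ρ)) and treat low-degree, high-distance nodes as the periphery. This is an assumed illustration of the general technique, not necessarily the exact graph model used in our tool.

```python
import numpy as np
import pandas as pd

def peripheral_assets(returns: pd.DataFrame, n_select: int) -> list:
    """Select the n_select stocks on the periphery of the MST asset graph."""
    names = list(returns.columns)
    corr = returns.corr().values
    dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))  # correlation distance
    n = len(names)
    # Prim's algorithm on the complete distance graph, tracking MST node degrees
    degree = np.zeros(n, dtype=int)
    visited = np.zeros(n, dtype=bool)
    visited[0] = True
    best = dist[0].copy()               # cheapest edge into the tree, per node
    best_from = np.zeros(n, dtype=int)  # tree-side endpoint of that edge
    for _ in range(n - 1):
        j = int(np.argmin(np.where(visited, np.inf, best)))
        degree[j] += 1
        degree[best_from[j]] += 1
        visited[j] = True
        closer = dist[j] < best
        best = np.where(closer, dist[j], best)
        best_from = np.where(closer, j, best_from)
    # Periphery: low MST degree first; ties broken by larger average distance
    order = sorted(range(n), key=lambda k: (degree[k], -dist[k].mean()))
    return [names[k] for k in order[:n_select]]
```

A stock that is strongly correlated with many others ends up as a hub (high degree) in the tree and is avoided; a decoupled stock becomes a leaf and is selected.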

We now apply our approach to 363 stocks from the S&P with almost 20 years of historical data, yielding a long backtest that covers several cycles and crises. We test only the first rule of the index construction, the diversification step, once with the standard average-correlation approach and once with the graph-based filter approach, each selecting 50 of the 363 stocks for an equally weighted portfolio.

And here are the results: as expected, the graph-based approach clearly outperforms the standard correlation approach across all parameterisations and set-ups, and also on different, unbiased data sets. This holds in both absolute and risk-adjusted terms. These are the other parameters of our specific example:

  • Yearly rebalancing based on the daily return series of the past 12 months
  • Benchmark 1: equally weighted portfolio of all 363 stocks (‘naïve’)
  • Benchmark 2: randomly choosing 50 stocks from the sectors health care and consumer (discretionary and staples)
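The backtest protocol above can be sketched as follows. This is an assumed minimal implementation, with `select_fn` standing in for any of the selection rules (average-correlation filter, graph-based filter, or random sector sampling):

```python
import pandas as pd

def yearly_rebalanced_backtest(returns: pd.DataFrame, select_fn,
                               n_select: int = 50,
                               days_per_year: int = 252) -> pd.Series:
    """Each year: select stocks on the past 12 months of daily returns,
    then hold them equally weighted for the following year."""
    period_returns = []
    n_years = len(returns) // days_per_year
    for y in range(1, n_years):
        lookback = returns.iloc[(y - 1) * days_per_year : y * days_per_year]
        holding = returns.iloc[y * days_per_year : (y + 1) * days_per_year]
        picks = select_fn(lookback, n_select)          # signal generation
        period_returns.append(holding[picks].mean(axis=1))  # equal weights
    return pd.concat(period_returns)
```

Selection always uses only past data (the lookback window), so the simulation is free of look-ahead bias by construction.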

Here are the results:

  Metric                             Graph-based    Low correlation    Naive    Consumer/Health
  Annualized return                  0.22           0.20               0.15     0.17
  Worst drawdown                     0.46           0.48               0.51     0.42
  Annualized Sharpe ratio (Rf=0%)    1.24           1.07               0.75     0.91

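The three reported metrics can be computed from a daily return series as follows (assumed standard definitions; the exact formulas are not given in the text):

```python
import numpy as np
import pandas as pd

def performance_metrics(daily_returns: pd.Series, days_per_year: int = 252) -> dict:
    """Annualized return, worst drawdown and annualized Sharpe ratio (Rf = 0%)."""
    equity = (1.0 + daily_returns).cumprod()             # compounded equity curve
    ann_return = equity.iloc[-1] ** (days_per_year / len(daily_returns)) - 1.0
    worst_drawdown = (1.0 - equity / equity.cummax()).max()
    sharpe = daily_returns.mean() / daily_returns.std() * np.sqrt(days_per_year)
    return {"annualized_return": ann_return,
            "worst_drawdown": worst_drawdown,
            "sharpe_ratio": sharpe}
```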
We interpret the results as follows:
The highest absolute and risk-adjusted returns were achieved by the graph approach, followed by the standard correlation approach. After the first signal generations in the simulation, around the year 2000, we observed that these two approaches often chose stocks from the health care and consumer (discretionary and staples) sectors. An index constructor in the year 2000 might have had the idea to invest directly in these sectors, perhaps because they are less cyclical and there is constant demand in the economy for their services and products. So every year we randomly sample 50 stocks from these sectors and equal-weight them; that is Benchmark 2. Neither benchmark strategy 1 (‘naïve’) nor 2 can beat the correlation- and graph-based strategies.

And this is our explanation for the outperformance:
The diversification filter rules find attractive stocks in terms of absolute and relative performance. That is one source of the performance. It can be explained by the systemic risk view: the core stocks of the system are exposed to cyclical shocks, stress, contagion and risk spillovers, so investing in the system’s periphery mitigates the problems that core stocks exhibit. Stocks in the periphery therefore show lower risk and higher returns on average.
Also, the risk of the periphery stocks is mainly idiosyncratic, as they are rather decoupled from systemic influences. Idiosyncratic risk, however, can be diversified away. That is the other source of performance, based on a diversification return.
This is in line with similar structural findings by ourselves and by other practitioners and researchers publishing in respected journals.


This blog is part of a series of blogs building on each other:

  1. Amplifying Investment Intelligence in Wealth & Asset Management by Machines
  2. Using AI to establish a reliable and objective way of diversification
  3. Machine intelligence for robo strategies, factor investing, smart-beta and impact investing

About the author

  • Dr. Jochen Papenbrock


    CEO and founder of Firamis

    Dr. Jochen Papenbrock is CEO and founder of Firamis. He has more than 10 years’ experience in management and technology consulting as well as quantitative modeling in the financial industry. He has also invented, developed and operationalised several innovative financial technologies. He is an author, consultant, data scientist, entrepreneur, financial engineer, fintech enthusiast, inventor, programmer, quant, researcher, risk management expert, and trainer/coach. He earned his doctorate and degree in business engineering at KIT (Karlsruhe Institute of Technology), Germany.
