AI and Deep Learning in Risk and Investment Management

Deep learning, or the ability of machines to learn using artificial neural networks, has been around since the mid-1960s. In recent years, however, advances in computational performance and storage technology have made it possible to rapidly process the large quantities of data that effective neural networks require. As a result, the use of deep learning is increasingly being explored across multiple industries and geographies. In this article, Mike O’Hara and Joel Clark of The Realization Group discuss the possible use cases for deep learning within financial services, particularly in risk and investment management, as well as the challenges that arise from an infrastructure and data storage perspective. They talk to Verne Global’s Stef Weegels, Ulrich Noegel of big xyt, Oliver Maspfuhl of Commerzbank, Terence Chabe of Colt Technology Services, Holger Boschke of TME and Invesco’s Bernhard Langer.

Artificial intelligence (AI) and deep learning may eventually replace the need for human beings in certain parts of the financial services sector, bringing unprecedented innovation, efficiency and cost savings to bear. But the successful implementation of such technologies demands considerable innovation from those same humans, and it may yet be a long time coming.

This irony at the heart of deep learning was articulately expressed by Haruhiko Kuroda, governor of the Bank of Japan, in a speech in April 2017. “If there is any risk that the role of human beings is overwhelmingly replaced by AI, that would be when human beings stop thinking independently and autonomously,” he said.

“From a financial perspective, it is most important for us to think independently and positively on how to make an efficient and effective use of new technologies such as AI and big data analytics to further develop and improve financial markets and services.”

Deep learning, an offshoot of the broader family of machine learning and AI methods, is not a new concept and has existed in some form since the 1960s. It is based on the notion of learning tasks using artificial neural networks inspired by the biological nervous system. The technology is highly advanced, and requires vast volumes of data and compute power.

Machine learning methods, like deep learning, construct predictive models from sample input extracted from large data sets. These models can provide data-driven algorithms which perform better, or are more easily constructed, than traditional modelling techniques. Example applications include classification problems such as e-mail spam detection or image recognition, as well as predictive analytics such as credit score computation or the optimisation of investment decisions for improved financial performance.
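To make the classification idea concrete, a spam filter can be reduced to scoring a message by how strongly its words are associated with previously labelled spam. The following is a minimal sketch, not a production approach; the training messages and the simple word-vote scoring scheme are invented for illustration.

```python
from collections import Counter

# Tiny labelled sample of (message, is_spam) pairs. Invented for illustration.
TRAINING = [
    ("win a free prize now", True),
    ("claim your free reward", True),
    ("meeting moved to friday", False),
    ("quarterly risk report attached", False),
]

def word_counts(label):
    """Count word frequencies across all training messages with this label."""
    counts = Counter()
    for text, is_spam_label in TRAINING:
        if is_spam_label == label:
            counts.update(text.split())
    return counts

SPAM, HAM = word_counts(True), word_counts(False)

def spam_score(message):
    """Sum of per-word votes: +1 if a word is more frequent in spam, -1 if in ham."""
    score = 0
    for word in message.split():
        score += (SPAM[word] > HAM[word]) - (HAM[word] > SPAM[word])
    return score

def is_spam(message):
    return spam_score(message) > 0
```

Real systems use probabilistic models such as naive Bayes trained on far larger corpora; the point here is only the shape of the problem — labelled examples in, a predictive rule out.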

With the rising complexity of financial markets and the increased technological sophistication of securities trading and processing, it should come as little surprise that there is now so much focus on possible use cases for deep learning. If such methods have the potential to help navigate complex markets and regulations, boost profits, cut costs or bring other efficiencies to bear, they naturally merit exploration. But it remains very much a work in progress and a big challenge.

“There are lots of new providers offering these types of services and partly using artificial intelligence, but my take is that in essence we are really just talking about intelligent systems, rather than something that can develop and support itself. As far as digital wealth management is concerned, these solutions apply to three main areas – customer relationship management, portfolio management, and communication and reporting. Over the next few years or so, we will see many new applications in these areas that use artificial intelligence,” says Holger Boschke, chairman of the supervisory board of TME AG.

There are lots of new providers offering these types of services and partly using artificial intelligence, but my take is that in essence we are really just talking about intelligent systems, rather than something that can develop and support itself.
Holger Boschke, TME AG

He also highlights potential use cases for deep learning beyond these three areas, especially in marketing and risk management. Indeed, cyber security and market surveillance are two functions occupying the hearts and minds of many market practitioners and regulators, as the need to protect against hackers and monitor internal staff activity increases. Both require the real-time monitoring of vast volumes of data and information that could theoretically be aided by the use of artificial neural networks.

But this is not an area for experimentation. So grave is the threat of cyber attack that no one wants to leave its detection to chance, and deep learning would need to be fully tried and tested before it could be implemented as a first line of defence against cyber attack at a large financial institution. In the meantime, the testing of potential use cases extends from the back office to the front office, where deep learning is being considered by some quantitative hedge funds as a tool for the generation of alpha.

On the one hand, investment managers can try to generate alpha by discovering hidden dependencies in large and potentially alternative data sets; on the other, at the far end of the financial value chain, risk management and compliance can also depend on the analysis of very big data sets.
Ulrich Noegel, big xyt

“The applications for artificial intelligence and machine learning are very diverse. On the one hand, investment managers can try to generate alpha by discovering hidden dependencies in large and potentially alternative data sets; on the other, at the far end of the financial value chain, risk management and compliance can also depend on the analysis of very big data sets. To detect market abuse, for example, one needs to identify illegal patterns across a lot of big data sets from different markets,” says Ulrich Noegel, co-founder of big xyt, a provider of interactive analytics for large data sets.

Meanwhile the complex regulatory environment that has evolved since the financial crisis adds further opportunities for the deployment of deep learning techniques. While some regulations are principally focused on market practices and surveillance, many of the new Basel Committee capital requirements require much more intensive analysis of data than in the past.

The Fundamental Review of the Trading Book (FRTB), which overhauls the market risk capital framework, is a case in point. The use of internal capital calculation models, which is critical for many banks to ensure a sensible level of capital is held, becomes much more complex under FRTB and requires vast troves of previously uncollected data to be analysed.

“Deep learning offers a highly effective way to deal with changes in regulation, particularly those that involve monitoring and maintaining large quantities of historical data. FRTB, for example, requires a huge amount of historical market data, sometimes going back an entire decade. The data must be properly cleaned and analysed, and deep learning can be used to spot discrepancies in the data and issues to be investigated,” says Stef Weegels, global sales director for financial services and capital markets at Verne Global.

Deep learning offers a highly effective way to deal with changes in regulation, particularly those that involve monitoring and maintaining large quantities of historical data.
Stef Weegels, Verne Global
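The kind of data-cleaning pass Weegels describes can be illustrated with a simple statistical filter over a historical price series: flag any daily return that sits unusually far from the rest of the distribution as a candidate for investigation. This is a hedged sketch rather than a production approach; the price series and threshold are invented for the example.

```python
from statistics import mean, stdev

def flag_outliers(prices, z_threshold=2.0):
    """Return the indices of days whose return lies more than z_threshold
    sample standard deviations from the mean return -- candidates for
    manual investigation rather than automatic correction."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    mu, sigma = mean(returns), stdev(returns)
    return [i + 1 for i, r in enumerate(returns)
            if abs(r - mu) > z_threshold * sigma]

# A synthetic 10-day price series with one suspicious jump into day five.
prices = [100.0, 100.5, 101.0, 100.8, 101.2, 140.0, 101.5, 101.9, 102.1, 102.4]
```

Here `flag_outliers(prices)` returns `[5]`, singling out the jump for review. With short series a large outlier inflates the sample deviation, which is why the default threshold is a modest two standard deviations; production pipelines would typically use more robust statistics over much longer histories.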

The potential to deploy deep learning as a tool for risk management, compliance and supervision could be a real game changer for the financial services industry. The financial crisis of 2008 highlighted the failure of supervisors to detect and deal with potential sources of systemic risk, but given machine learning is typically used to spot patterns and anomalies in large data sets, its effective deployment offers the potential to better protect the system in the future.

“Risk management will profit greatly from the opportunity to use more data sources than in the past, because the complex dependencies between events have always been notoriously difficult to quantify. But with the advent of big data and the ability to link much more data than we could have years ago, it should now be possible to know in advance when risks are emerging from high dependencies in the system and then mitigate such risks,” says Oliver Maspfuhl, group credit risk and capital management at Commerzbank.
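Maspfuhl’s point about detecting emerging dependencies can be sketched with a rolling correlation over two return series: when previously unrelated series begin to move in lockstep, the dependency between them is rising. The series, window length and threshold below are all invented for illustration.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def rising_dependency(a, b, window=4, threshold=0.9):
    """Return the first window start at which the rolling correlation of two
    return series exceeds the threshold, or None if it never does."""
    for start in range(len(a) - window + 1):
        if pearson(a[start:start + window], b[start:start + window]) > threshold:
            return start
    return None

# Two synthetic daily return series: unrelated at first, then in lockstep.
a = [0.01, -0.02, 0.015, -0.01, 0.02, 0.03, -0.01, 0.025]
b = [-0.01, 0.01, -0.005, 0.012, 0.021, 0.029, -0.011, 0.024]
```

On these synthetic series, `rising_dependency(a, b)` returns 4: the two assets are uncorrelated over the early windows, then move almost in lockstep over the final one. Real systemic-risk monitoring links far more series and uses far more sophisticated dependency measures, but the underlying idea is the same.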

Despite the multitude of possible use cases for deep learning, from alpha-seeking hedge funds to cyber security, market surveillance and FRTB data mining, the associated infrastructure requirements remain a major barrier to entry. The hurdle may not be insurmountable, but it certainly threatens to delay progress in the near term. The technology requires a level of infrastructure build and computing power that is simply not widely available today.

The infrastructure requirement falls broadly into two categories. Firstly, deep learning requires a vast quantity of data, which in turn often requires external storage. Many firms already use data centres to house parts of their infrastructure, but the storage requirements are likely to rise steeply if deep learning techniques are being used. Secondly, the compute power required to train systems in deep learning is an order of magnitude higher than traditional requirements.

With the advent of big data and the ability to link much more data than we could have years ago, it should now be possible to know in advance when risks are emerging.
Oliver Maspfuhl, Commerzbank

“Training of systems in deep learning is an intensive process,” says Weegels. “It requires huge data sets that in turn need industrial-scale data storage solutions, and it also requires a lot of compute power to train the systems. Premium co-location data centres are not necessary. Unlike trading systems, which need to be within a few metres of an exchange system, the data to support deep learning can literally be held anywhere in the world.”

Firms that are serious about implementing deep learning in some form will need to consider whether they can afford to hold all of the necessary data within their own infrastructure, or alternatively if some type of cloud-based repository may be more realistic in the long term.

Despite the proliferation of cloud-based services, some firms remain uncomfortable with storing sensitive client and transaction data beyond their own firewalls. This may turn out to be a stumbling block in the effective adoption of deep learning, as some practitioners believe it would be impossible to store all of the data that is required without outsourcing to the cloud.

“The vendors operating these deep learning platforms obviously want to make it as easy as possible for their clients to connect, and the cloud is the natural place for the data to be hosted. People have been using the cloud for a while now, and they expect to see the same kind of flexibility across the network,” says Terence Chabe, business development manager at Colt Technology Services.

Reservations about cloud-based data storage are not confined to data security or sensitivity, however. There is also a concern about the accessibility of data when it is needed, and while cloud services have clearly advanced a great deal over the past decade, firms cannot afford to lose control over their data if they choose to store it in the cloud. Moreover, as the data requirements of advanced machine learning applications expand, firms may soon be forced to make an “all or nothing” decision, committing to housing both compute and storage in the same location, with significant cost, accessibility and security implications.

The vendors operating these deep learning platforms obviously want to make it as easy as possible for their clients to connect.
Terence Chabe, Colt Technology Services

While the concepts of artificial intelligence and machine learning are well established, what is new is the advent of big data and the ability to organise, store and maintain much larger quantities of data than in the past. However firms choose to manage their data, if they can do so effectively then they stand a much greater chance of exploiting deep learning opportunities.

“Many of the relevant algorithms have been around for a while, but we now have much greater capabilities in terms of computational power, storage, and the ability to handle big data in a much more powerful way. This is what makes the combination of big data technology and deep learning algorithms so powerful right now,” says Noegel.

Commerzbank’s Maspfuhl concurs, adding that the real challenge is not in the deep learning technology itself, which is likely to be commoditised and used on an off-the-shelf basis initially, but rather in extracting the correct data in the correct format so that it can be effectively run through the algorithms to produce beneficial results.

“The first challenge is not to get the data but to integrate the data – to take it from different sources, find the links between it and then make it all fit together. That is the hard part and there is no standard way to do it. The second challenge is to know which data you are allowed to use under data protection regulations. There is a strong tendency to build products using customer data, but one has to be aware of privacy rules,” Maspfuhl explains.
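Maspfuhl’s first challenge – linking data from different sources – can be sketched in a few lines. The example below outer-joins customer records from two hypothetical internal systems on a shared identifier; the system names, field names and records are all invented for illustration, and real integration work is dominated by resolving mismatched keys and formats rather than the merge itself.

```python
# Two hypothetical internal sources keyed by customer id. Invented data.
crm = {
    "C001": {"name": "Alpha GmbH", "segment": "corporate"},
    "C002": {"name": "Beta AG", "segment": "institutional"},
}
exposures = {
    "C001": {"exposure_eur": 1_200_000},
    "C003": {"exposure_eur": 90_000},  # present in only one source
}

def integrate(*sources):
    """Outer-join record sets on their shared key, merging field dicts.
    Conflicting field values would silently overwrite here -- a real
    pipeline must detect and reconcile such conflicts instead."""
    merged = {}
    for source in sources:
        for key, fields in source.items():
            merged.setdefault(key, {}).update(fields)
    return merged

combined = integrate(crm, exposures)
```

After the merge, `combined["C001"]` carries both the CRM fields and the exposure figure, while `"C003"` survives with exposure data only – the gaps themselves are useful, since they show exactly where the links between sources are missing.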

It seems clear that financial market participants must continue to explore the opportunities associated with deep learning, but it remains to be seen how quickly progress will be made. Bernhard Langer, chief investment officer of Invesco Quantitative Strategies, believes proper exploration of deep learning cannot be achieved in isolation, and firms need to source relevant expertise from across the industry if they are to achieve anything meaningful in this area.

“Experimentation in the use of neural networks in the 1990s didn’t work out but this time we have much better data, machines and applications. At Invesco, we have an internal research team that works in this field, and we are also working with academia and several start-ups in Silicon Valley to develop machine learning,” says Langer.

But without widely accepted use cases for deep learning, it is more difficult to concentrate industry attention around specific problems. There is clearly no shortage of functions that have the potential to benefit from this kind of technology, but not everyone agrees on where it could be most effectively applied.

Experimentation in the use of neural networks in the 1990s didn’t work out but this time we have much better data, machines and applications.
Bernhard Langer, Invesco

“Deep learning won’t work for everything,” warns Verne Global’s Weegels. “It is based on large quantities of data and systems must be trained to ensure they come up with the same responses every time. This could be very powerful in the conventional support systems we used to use, but it’s much harder to see how it might be applied to front-office functions such as trading.”

Perhaps the most important thing is not to let the demand for deep learning solutions fade, even if there are currently more questions than answers. It is important to take both a long-term and a short-term view of where the quest for deep learning might ultimately lead.

For more information on the companies mentioned in this article visit:

www.colt.net
www.invesco.co.uk
www.commerzbank.com
www.prmia.org
www.big-xyt.com
www.tme-ag.de
www.verneglobal.com