Author: Paul McNelis
Format: English PDF
Publisher: Elsevier
As regular readers of this column will know, I always look forward to the season of spring. Living in London, the start of spring represents the end of the long, dark, cold winter and the start of some warmer and sunnier weather. Not only does this make life more bearable, but it also means that it’s time for me to get my sports car out of hibernation and take it for a drive. True to form, last week saw me going through the annual ritual, and while the roads aren’t yet dry, I’m pleased to report that I’ve been for my first proper drive of the year, and I have a smile on my face again!
Spring is also a time for taking stock, and in particular, I’m sure there are plenty of risk managers considering whether there are alternative methodologies that could be used to build better models. Despite the complexities of finance, there are surprisingly few mathematical techniques in use across the whole industry of risk management:
Within the world of credit risk, simple models are developed using regression techniques (often nothing more complex than linear regression); more complex models are built on some form of calculus, based on insights into the movement of prices.
In market risk, while the math can get difficult, the underlying methods are simple: backward-looking models simply observe historic price changes and attempt to fit distributions or measure a confidence level of price changes; more advanced forward-looking models can be built using insights around predicting future values based on past performance.
Finally, within operational risk, practitioners attempt to fit data points to distributions and define levels of correlation. The difficulties in this area are largely around the lack of data, rather than mathematical complexities.
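The backward-looking market-risk approach mentioned above is simple enough to sketch in a few lines: historical-simulation VaR observes past price changes and reads a loss threshold straight off the empirical distribution at a chosen confidence level. The sketch below is a minimal illustration, not taken from the book; the return series is made-up data, and `historical_var` is a hypothetical helper name.

```python
# Historical-simulation VaR: read a loss quantile straight off past returns.
# The return series here is illustrative, not real market data.

def historical_var(returns, confidence=0.99):
    """Loss threshold not exceeded with the given confidence level."""
    ordered = sorted(returns)                   # worst losses first
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]                      # report the loss as a positive number

# Daily returns of a hypothetical position (fractions, e.g. -0.031 = -3.1%)
returns = [-0.031, 0.012, -0.005, 0.020, -0.017, 0.004,
           -0.009, 0.015, -0.024, 0.007, 0.011, -0.002]

print(f"95% one-day VaR: {historical_var(returns, 0.95):.1%}")
```

With only twelve observations the percentile is crude, which is exactly the point the column makes about operational risk as well: the difficulty is usually the scarcity of data, not the mathematics.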
This month’s book, Neural Networks in Finance, provides an overview of the often-discussed (albeit rarely used in finance) technique of neural networks. Neural networks came to prominence in the 1980s, on a wave of publicity suggesting that they were models built the way the brain operates, and that they would therefore allow computer programmers to write code emulating the human reasoning process. While, inevitably, much of the media interest has died away, neural networks have developed over the years and are used in a wide variety of applications, including most voice- and pattern-recognition software.
It is easy to hear the name “neural networks” and assume that the underlying math is complex, but the math behind neural networks is relatively simple and analogous to curve fitting. To fit a curve through a set of points, you first decide the functional form of the curve, then take a set of input data and run an optimization routine to calculate the set of (previously unknown) coefficients that define it. To fit a neural network, you again start with a set of data points (although, rather than fitting a curve through a small number of points, the data set for developing a neural network may be much larger), and these data points define the transformation from input to output. Whereas most curve fitting is a one-step process with a well-defined, easy-to-write functional form, a neural network consists of a number of internal steps and has no closed form. The name suggests that there is a network of neural nodes involved in the process, and that is exactly right. The network starts with a number of input nodes, from which the first layer of calculation nodes is computed. Further calculation nodes can be built from these, and so on, until the final layer of calculations provides the output of the network. In this way, the calculation may be a relatively simple single layer, or a more complex multi-layer affair. Each node is a function of the nodes in the previous layer, and a number of different functions may be used to calculate nodes.
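The layered calculation described above can be made concrete with a toy example. The sketch below (my own minimal illustration, not drawn from the book) builds a one-hidden-layer network in plain Python and fits it to the curve y = x² by gradient descent: one input node, a layer of tanh calculation nodes, and a linear output node, with the weights playing the role of the unknown coefficients in the curve-fitting analogy.

```python
import math
import random

random.seed(0)

# One input node, HIDDEN tanh calculation nodes, one linear output node.
HIDDEN = 8
w1 = [random.uniform(-1, 1) for _ in range(HIDDEN)]  # input -> hidden weights
b1 = [random.uniform(-1, 1) for _ in range(HIDDEN)]  # hidden biases
w2 = [random.uniform(-1, 1) for _ in range(HIDDEN)]  # hidden -> output weights
b2 = 0.0

def forward(x):
    """Compute the output node from the layer of hidden calculation nodes."""
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(HIDDEN)]
    return sum(w2[j] * hidden[j] for j in range(HIDDEN)) + b2, hidden

# Training data: points on the curve y = x^2, sampled on [-1, 1].
data = [(i / 10.0, (i / 10.0) ** 2) for i in range(-10, 11)]

# Plain stochastic gradient descent on squared error.
lr = 0.05
for epoch in range(2000):
    for x, y in data:
        out, hidden = forward(x)
        err = out - y                              # d/d(out) of 0.5 * (out - y)^2
        for j in range(HIDDEN):
            grad_hidden = err * w2[j] * (1 - hidden[j] ** 2)  # tanh derivative
            w2[j] -= lr * err * hidden[j]
            w1[j] -= lr * grad_hidden * x
            b1[j] -= lr * grad_hidden
        b2 -= lr * err

worst = max(abs(forward(x)[0] - y) for x, y in data)
print(f"worst fit error on y = x^2: {worst:.4f}")
```

The point of the sketch is the structure, not the performance: each hidden node is a function of the input node, the output node is a function of the hidden nodes, and no closed-form expression for the fitted curve ever appears.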
What the author does in the first half of this book is to provide an excellent, easy-to-read introduction to the math behind neural networks. As well as providing a good introduction, the author goes on to explain the different functional forms, although there isn’t a rigorous coverage of which form to use in different applications, suggesting that there is an element of trial and error when developing a neural network (or at least, a number of methods, each of which may or may not work for a given problem).
Having provided this mathematical background, the author goes on to build a number of example networks for financial problems. The first two examples are market-risk-type problems, providing forward-looking estimates of inflation for two Asian countries, and the later examples are credit risk problems around credit card default rates and bank default rates. While the subject area of each example is very interesting, the author doesn’t go into the detail of the precise methods used, nor is a full analysis of the outputs provided. And I think this is one of the drawbacks with neural networks: while the models can be demonstrated to work well, it is hard to explain exactly how a model works. For example, under the latest Basel II regulations, a bank is expected to provide full and complete documentation of any model used: such documentation should include an explanation of the parameters within the model, as well as a subjective review that each of the factors in the model is intuitive. With a neural network, it is very difficult to explain how the model actually works, and so this intuitive explanation is not possible. What the author also shows is that neural networks generally perform at around the same level as existing methods, suggesting that there is no huge improvement to be had from investment in a new modeling type.
The author has aimed to put together a text that gives readers an understanding of what neural networks are and how they work, allowing novices to develop simple neural networks of their own, and he has clearly succeeded. The book provides a clear introduction and would be a great starting point for the subject. However, financial engineers looking for a new modeling solution may find that neural networks still don’t live up to their own hype – I wonder whether, had the models originally been given a less emotive name, they would be so well known today . . . (Review from Financial Engineering News)