Andreas Dombret: Trying to see in the dark - the challenge of financial regulation

Speech by Dr Andreas Dombret, Member of the Executive Board of the Deutsche Bundesbank, at the Centre for Financial Studies, Frankfurt am Main, 28 November 2014.

1. Introduction

Professor Krahnen,

Ladies and gentlemen,

Thank you for the opportunity to speak at the Centre for Financial Studies. It is a pleasure to be here today. As you know, the Bundesbank and the Centre for Financial Studies are closely linked through various channels. The exchange between our two institutions has proved very fruitful for both sides. Thus, I am glad to be able to continue this dialogue today.

Henry Kissinger once allegedly said: "There cannot be a crisis next week. My schedule is already full." With that remark he captured the true nature of crises: their unpredictability. Crises usually do not adhere to people's schedules - not even to Henry Kissinger's. Instead, they occur unexpectedly - they come out of the blue, so to speak. The financial crisis was no exception in this respect.

Thus, in order to prevent crises, banking supervisors as well as risk managers within banks must try to feel their way in the dark and to anticipate potential risks. This requires two things: the will to see in the dark and the ability to do so. Let us take a closer look at both issues.

2. The will to see in the dark

Dealing with risk is not something that is confined to the financial sector. It is something we all have to do every day. Crossing a street entails risk, as does eating sushi or investing in a new business venture. Yet, while we all have to deal with risks in our daily lives, it seems we are not very good at it.

Much research has been conducted on how people deal with risk under different circumstances. With regard to economic decisions, this research shows that under certain circumstances people's decisions are driven less by potential gains than by potential losses. Thus, they take on less risk than would be rational.

Nobel Laureate Daniel Kahneman and his co-author Amos Tversky, for instance, find that most people will reject a gamble with even odds to win or lose, unless the possible win is at least twice the size of a possible loss.1 So, it seems that people are not only aware of risks, but even try to avoid them more than is strictly necessary.
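
To make the size of this asymmetry concrete, here is a small illustrative sketch (not part of the original argument) using the value function and the median parameter estimates from the cited Tversky and Kahneman paper; probability weighting is omitted for simplicity.

```python
# Illustrative sketch: the Tversky-Kahneman (1992) value function with their
# median parameter estimates (alpha = beta = 0.88, lambda = 2.25).
# Probability weighting is omitted for simplicity.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a single gain or loss x."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def prospect_value(gain, loss, p_gain=0.5):
    """Value of a 50/50 gamble under the (unweighted) value function."""
    return p_gain * value(gain) + (1 - p_gain) * value(-loss)

for gain in (100, 150, 200, 250, 300):
    v = prospect_value(gain, loss=100)
    print(f"win {gain} / lose 100: prospect value = {v:+.1f} "
          f"({'accept' if v > 0 else 'reject'})")
# Under these parameters, the gamble only becomes attractive once the
# possible win is roughly two to two and a half times the possible loss.
```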

Against this backdrop, one question comes to mind. If people are naturally risk-averse, then what was wrong with the banks in the run-up to the financial crisis? Well, in the experiments I mentioned, it was always the participants' own money that was at stake. As soon as other people's money is at stake, things change.

Joseph Stiglitz and his co-authors, for instance, find that financial crises have become more frequent since deposit insurance systems were introduced and central banks were established as the lenders of last resort.2

From this, we can conclude that moral hazard plays an important role in the way banks deal with risk. The crisis has shown that we are faced with a situation in which "heads, the bank wins; tails, the taxpayers lose". Liability and control are off-kilter and this reduces banks' incentives to manage risks prudently - hence their will to see in the dark is diminished. Thus, if we want to make banks care about risks, we have to restore the balance between liability and control.

Here, the most prominent example is the "too big to fail" problem. A fundamental lesson learned from the financial crisis is that the failure of very large or interconnected banks can destabilise the entire financial system. Just think of 15 September 2008, when the collapse of Lehman Brothers triggered the global financial crisis.

Banks that are too big to fail are in a situation that is comfortable for them, but uncomfortable for society. Why is that? These banks benefit from an implicit insurance policy at no cost to themselves. They know that whenever they get into trouble, the government is likely to step in and bail them out in order to prevent a systemic meltdown.

There is no doubt that this implicit insurance policy creates all the wrong incentives. If a bank can rely on state support when it gets into difficulties, it will no longer see risk and return as two sides of the same coin. Profits from risky business remain with the bank, while the taxpayer is left to bear any potential losses. Thus, to promote a sound risk culture within banks, we have to mitigate the "too big to fail" problem.

In order to achieve this objective, we need mechanisms that allow even large banks to fail without destabilising the entire financial system. The possibility of failure is fundamental to any market economy, and it must apply to banks as well. As the economist Allan Meltzer put it: "capitalism without failure is like religion without sin - it doesn't work". Only if banks see a realistic threat of failure will they have an incentive to manage risks in a prudent manner.

In Europe, we have recently taken some steps in the right direction. From 2016 onwards, we will have a European resolution mechanism that allows us to resolve failing banks without destabilising the financial system and without burdening the taxpayers. In my view, this Single Resolution Mechanism has the potential to restore the balance between liability and control. Thus, it will have a fundamental and positive effect on banks' risk culture.

At the global level, the G20 recently agreed upon a proposal that requires global systemically important banks to improve their capital structure. In particular, these banks will need to ensure a minimum amount of total loss-absorbing capacity, which may amount to as much as 20% of risk-weighted assets, including the minimum capital requirements. This will ensure that, in the case of a bank failure, it is the owners and creditors who will have to bear the costs, not the taxpayers. This will also strengthen banks' incentives to engage in prudent risk management.
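
As a purely illustrative piece of arithmetic, with entirely hypothetical balance-sheet figures, the following sketch shows how a loss-absorbing cushion of 20% of risk-weighted assets would work: equity is written down first, bail-in-able debt second, and taxpayers are reached only once both are exhausted.

```python
# Purely illustrative, hypothetical numbers: how total loss-absorbing capacity
# (TLAC) of 20% of risk-weighted assets absorbs a large loss before taxpayers.

risk_weighted_assets = 500.0   # bn EUR, hypothetical
tlac_ratio = 0.20              # illustrative TLAC requirement
equity = 60.0                  # loss-absorbing equity, hypothetical
bail_in_debt = 40.0            # bail-in-able debt, hypothetical

assert equity + bail_in_debt >= tlac_ratio * risk_weighted_assets

loss = 80.0                    # a severe, unexpected loss
absorbed_by_equity = min(loss, equity)
absorbed_by_debt = min(loss - absorbed_by_equity, bail_in_debt)
taxpayer_share = loss - absorbed_by_equity - absorbed_by_debt

print(f"equity absorbs   {absorbed_by_equity:5.1f}")
print(f"creditors absorb {absorbed_by_debt:5.1f}")
print(f"taxpayers absorb {taxpayer_share:5.1f}")  # 0.0 as long as loss <= TLAC
```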

However, we have to look beyond incentives for banks. We also have to look at incentives for those who work in banks. According to the Financial Stability Board, a crucial element of a sound risk culture is the alignment of compensation with prudent risk-taking. In the past, compensation schemes were often skewed toward rewarding short-term success without taking into account related risks.

Consequently, the European Union has taken action and implemented new rules. According to these rules, the variable components of bank managers' remuneration are restricted to 100% of the fixed components. If shareholders agree, this can be increased to 200%. Furthermore, banks have to establish a remuneration committee and meet several new disclosure requirements on remuneration.

Nevertheless, we certainly have to be wary of regulatory arbitrage and unintended consequences. Capping the variable components of remuneration has led banks to increase the fixed components. This is likely to increase the fixed costs of banks and thus influence their competitiveness in a negative way.

At the same time, the new EU rules are stricter than the rules which apply at the global level. Thus, we have to be aware of the danger of regulatory arbitrage; working toward a global level playing field for remuneration is therefore essential.

Despite these caveats, I am confident that the regulatory measures will help to strengthen incentives for prudent behaviour. However, in the end it is the banks themselves that have to adopt a new culture which is focused more on sustainable returns and less on short-term gains. Here, we still have some way to go.

3. The ability to see in the dark

Ladies and gentlemen, the examples of the "too big to fail" problem and of compensation schemes show that regulation has done a great deal to increase the willingness of banks and of bankers to see in the dark. The incentives for banks to prudently manage risks have certainly been strengthened. But are banks actually able to do so?

Research in this field suggests that people are naturally disadvantaged when it comes to dealing with risks and probabilities. It is a well-established fact that people are prone to a number of systematic biases when it comes to decision-making and judgments under uncertainty. Among those biases are, for example, the neglect of base-rate information, over-confidence or incorrect perceptions regarding the frequency of certain events.

But shouldn't professional risk managers be aware of these biases and be able to mitigate them? Well, it seems that even professionals are not immune to systematic biases, a view that is underpinned by at least two studies.

A recent study addressed the question of whether professional managers in finance were aware of the housing bubble of 2004 to 2006. To answer this question, the researchers analysed the private investment decisions of financial managers. The results indicate that financial managers did not behave very cautiously when investing privately in the housing market. This suggests that even professionals were not aware of the bubble that had formed.3

In another study, stock market professionals and laypeople provided forecasts for the prices of various stocks and estimated the size of their own and the other group's errors in doing so. Both groups expected the professionals' forecasting errors to be smaller than those of the laymen. In fact, however, the errors of both groups in predicting stock prices were roughly the same. In a subsequent set-up, both groups were asked to pick the best-performing stock from two options. Here, both the professionals and the laymen proved to be overconfident with regard to their ability to pick the best-performing stock. Eventually, the professionals picked the correct stock in only 40% of cases. Had they randomly picked one of the two options that were offered they would have been right about 50% of the time.4

Studies such as these led Nassim Taleb to state, provocatively: "certain professionals, while believing they are experts, are in fact not. Based on their empirical record, they do not know more about their subject matter than the general population, but they are much better at narrating - or, worse, at smoking you with complicated mathematical models." This is certainly an exaggeration, but it seems justified to conclude that our ability to see in the dark is somewhat limited.

The quote points to another issue that is relevant in this regard. Can statistical models mitigate people's biases when it comes to handling risk? To some degree, they certainly can raise our awareness of potential risks, but they are by no means perfect. With regard to the financial crisis, some banks claimed to have experienced a "25 sigma" event - a point in the probability distribution that is 25 standard deviations away from the mean. Statistically speaking, such an event would not be expected to occur even once in the entire lifetime of the universe.
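
A quick back-of-the-envelope calculation, purely illustrative and assuming daily returns really were normally distributed, shows just how implausible such an event is.

```python
# Illustrative only: how rare a "25 sigma" daily loss would be if returns
# were truly normally distributed.
import math

sigma = 25
p_daily = 0.5 * math.erfc(sigma / math.sqrt(2))   # P(X > 25 standard deviations)
age_of_universe_days = 13.8e9 * 365.25            # ~13.8 billion years, in days

expected_wait_days = 1 / p_daily
print(f"P(25-sigma day)       ~ {p_daily:.2e}")
print(f"expected waiting time ~ {expected_wait_days:.2e} days")
print(f"... or about {expected_wait_days / age_of_universe_days:.1e} "
      f"times the age of the universe")
```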

Statistical models are often criticised for systematically neglecting such "high-severity-low-probability" risks.5 One reason for this is that many models use the past as the basis for the future when assessing the probability of certain events. This weakness could be referred to as the chicken fallacy. A chicken that is fed by a farmer every morning may expect this pattern to continue in the future. However, as the philosopher Bertrand Russell concludes: "The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to the uniformity of nature would have been useful to the chicken".

A second criticism with regard to risk management models is that, for the sake of practicality, they are often based on the normal distribution. The drawback of this is that these models underestimate the probability of events at the far ends of the distribution. It seems that "25 sigma" events happen far more often than might be expected. Consequently, we have to be cautious and acknowledge the limits of the statistical models used in risk management and banking supervision.
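
To illustrate this point, the following sketch compares tail probabilities under the normal distribution with those under a fat-tailed Student-t distribution with three degrees of freedom, a common (and here purely illustrative) stand-in for equity-return tails; it assumes SciPy is available.

```python
# Illustrative comparison: tail probabilities under the normal distribution
# versus a fat-tailed Student-t with 3 degrees of freedom, rescaled to unit
# variance so both are compared in "standard deviations".
from math import sqrt
from scipy.stats import norm, t

df = 3                          # degrees of freedom (fat tails)
scale = sqrt(df / (df - 2))     # rescale so the t-distribution has unit variance

for k in (3, 5, 10):
    p_normal = norm.sf(k)               # P(X > k std devs), normal
    p_fat = t.sf(k * scale, df)         # same number of std devs, Student-t(3)
    print(f"{k:>2} sigma:  normal {p_normal:.2e}   t({df}) {p_fat:.2e}")
# The fat-tailed model assigns vastly higher probability to extreme moves,
# which is exactly what normal-based risk models tend to miss.
```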

In the field of banking supervision, statistical models are used to calculate risk weights for assets in order to determine capital requirements. Here, studies by the Basel Committee on Banking Supervision indicate that these risk weights vary across banks to a somewhat worrying degree and that not all deviations can be properly explained.6

Consequently, there has been some discussion about adapting the regulatory approach to these models. Among other things, it is envisaged that a capital floor will be set based on the revised standardised approaches. Consideration is also being given to introducing fixed loss-given-default parameters for portfolios of unsecured loans with few observed defaults. Finally, a risk-invariant leverage ratio will serve as a backstop to the risk-weighted capital requirements.
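
A minimal sketch of the mechanics, not of the rules text, may be helpful here: the binding requirement is simply the largest of the model-based requirement, the floored requirement and the leverage-ratio requirement. The 8% and 3% figures below are the Basel headline minimums for total capital and the leverage ratio; the floor factor is a hypothetical placeholder, since its calibration is still under discussion.

```python
# Sketch only: how a capital floor and a leverage ratio act as backstops to
# model-based, risk-weighted requirements. The floor factor is hypothetical.

def required_capital(model_rwa, standardised_rwa, total_exposure,
                     rw_requirement=0.08,   # Basel total capital minimum
                     floor_factor=0.75,     # hypothetical floor calibration
                     leverage_ratio=0.03):  # Basel leverage ratio minimum
    from_model = rw_requirement * model_rwa                        # internal models
    from_floor = rw_requirement * floor_factor * standardised_rwa  # capital floor
    from_leverage = leverage_ratio * total_exposure                # risk-invariant backstop
    return max(from_model, from_floor, from_leverage)

# Hypothetical bank whose internal models produce much lower risk weights than
# the standardised approach, so the floor or the leverage ratio becomes binding.
print(required_capital(model_rwa=200, standardised_rwa=400, total_exposure=1000))
```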

In addition to "high-severity-low-probability" events, there will always be true "black swans" that appear completely unexpectedly and with dire consequences. Such events are the real challenges for risk management and financial regulation. They are intractable and cannot be managed within the scope of statistical models because they are beyond all expectations. They lurk in the dark, and we are simply unable to see them.

What lessons can we learn from these observations? I would like to emphasise three points in this regard. First, we have to be aware that our knowledge is limited and our decisions are probably biased. Second, we have to acknowledge the limitations of statistical models. Third, we have to accept that black swans exist. Banks have to be prepared to handle such extreme events. In other words, risk management has to be prepared for anything at all times.

What role can regulation play? If we accept that the next crisis may come unexpectedly, out of the blue, there is one key strategy - and that is to draw a line of defence which is independent of the nature of the crisis.

The central building block for such a line of defence is capital, or equity, to be precise. Capital buffers are the most general means of protecting a bank from shocks, regardless of whether they are expected or come out of the blue. Consequently, the reform of financial regulation has focused on strengthening the capital rules for banks. The new Basel III framework requires banks to hold more capital of a better quality. This will make banks more resilient and better prepared to survive future crises.
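
By way of illustration only, and leaving aside the additional buffers for systemically important institutions, the following sketch checks a hypothetical bank's common equity tier 1 ratio against the Basel III minimum of 4.5% plus the 2.5% capital conservation buffer.

```python
# Illustrative only: checking a hypothetical bank's CET1 ratio against the
# Basel III minimum (4.5%) plus the capital conservation buffer (2.5%).
# Surcharges for systemically important banks are ignored here.

cet1_capital = 45.0            # bn EUR, hypothetical common equity tier 1
risk_weighted_assets = 520.0   # bn EUR, hypothetical

cet1_ratio = cet1_capital / risk_weighted_assets
required = 0.045 + 0.025       # minimum plus conservation buffer

print(f"CET1 ratio: {cet1_ratio:.1%}  (required: {required:.1%})")
print("meets requirement" if cet1_ratio >= required else "shortfall")
```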

4. Conclusion

Ladies and gentlemen, the financial crisis came out of the blue and has left a lot of damage in its wake. Hence the starting point of my speech today - that we have to attempt to feel our way in the dark in order to better anticipate future crises. This requires two things: the willingness to see in the dark and the ability to do so.

With regard to the willingness to see in the dark, I talked about setting the right incentives for banks to engage in prudent risk management. The "too big to fail" problem and compensation schemes were two examples of how financial regulation can strengthen incentives.

With regard to the ability to see in the dark, the outlook is less optimistic. Research shows that people are naturally disadvantaged when it comes to handling risks and probabilities - a deficiency that statistical models can mitigate only to a certain extent.

Consequently, banks need universal buffers to prepare for shocks, regardless of where they come from. And, in this context, equity is the most general buffer. That is why the reform of financial regulation focused on improving the capital rules for banks. However, we still have to feel our way in the dark when it comes to regulation. We have increased capital buffers, but have we reached the right level? This is an issue we must endeavour to shed more light on.

To sum up, crises will never adhere to our schedules, and we will never be able to eliminate the threat of future crises altogether. All we can do is mitigate their impact. Pursuing this objective is worth all our effort, and the progress that has been made in the area of financial regulation is rather encouraging.

Thank you.


1 Tversky, A; Kahneman, D (1992), Advances in prospect theory: Cumulative representation of uncertainty. In Journal of Risk and Uncertainty, Vol 5, No 4, pp 297-323.

2 Hellmann, T F; Murdock, K C; Stiglitz, J E (2000), Liberalization, Moral Hazard in Banking, and Prudential Regulation: Are Capital Requirements Enough? In The American Economic Review, Vol 90, No 1, pp 147-165.

3 Cheng, I; Raina, S; Xiong, W (2014), Wall Street and the Housing Bubble. In American Economic Review, Vol 104, No 9, pp 2797-2829.

4 Törngren, G; Montgomery, H (2004), Worse Than Chance? Performance and Confidence Among Professionals and Laypeople in the Stock Market. In The Journal of Behavioral Finance, Vol 5, No 3, pp 148-153.

5 Taleb, N (2007), The Black Swan: The Impact of the Highly Improbable. London.

6 For example, Basel Committee on Banking Supervision (2013), Regulatory Consistency Assessment Programme (RCAP) - Analyses of risk-weighted assets for credit risk in the trading book, July 2013. Basel.