Searching for price and financial stability: challenges for central bank statistical services
Speech by William R White, Economic Adviser and Head of the Monetary and Economic Department, Bank for International Settlements, at the Irving Fisher Committee conference on 'Challenges to central bank statistical activities', Basel, 21 August 2002
As the Economic Adviser and Head of the Monetary and Economic Department of the Bank for International Settlements, I would like to extend a very warm welcome to participants at this conference organised by the Irving Fisher Committee. When Paul Van den Bergh and Marius van Nieuwkerk approached me about the possibility of the BIS contributing to such a conference, I was very enthusiastic for at least three reasons.
The first of these is my respect for the work of Irving Fisher; in effect, friends of Irving Fisher are friends of mine. Indeed, I have just been rereading his classic Econometrica article of 1933 entitled “The Debt Deflation theory of Great Depressions” and found it full of illuminating if sometimes disquieting insights. One of these, which might be thought less than amusing in light of the recent Enron affair and other corporate governance scandals, has to do with the various phases through which people are encouraged to take on heavy debt levels, which eventually become unsustainable as the deflationary process unfolds. The first phase starts with “the lure of big prospective dividends and gains in the remote future”, and the last one involves “the development of downright fraud, imposing on a public which had grown credulous and gullible”. Does it not all sound very familiar?
My second reason for wanting to welcome you here reflects the fact that I now have over 30 years of experience in this business of central banking. Over those years I have become steadily more convinced of the need for good data and for good statistical analysis. We must, of course, begin with some theoretical construct as to which hypotheses we wish to test. But the choice of the particular data that might shed the greatest light on the validity of those hypotheses is also extremely important. In effect, as statisticians and economists, we should always ask ourselves “What are these data for?” And by the same token, we should also be prepared to change the data we collect in response to changing requirements. The simple logic of “rubbish in, rubbish out” surely applies with respect to the empirical testing of hypotheses. Unfortunately this insight continues to be missed by many economists and econometricians. They often seem fearful of questioning their data, perhaps because it might throw doubts on the validity of their results and, of course, the likelihood of publication.
My third reason for wanting to support this conference is that data issues are a major preoccupation for many central banks. As an institution set up explicitly to support central bank cooperation at the international level, the BIS has a natural interest in such endeavours. While we have in fact already done a great deal in this area, a subject to which I will return at the end of my presentation, I feel strongly that we could do more. So again I welcome you all today and hope that I will have the opportunity to do so again in the future.
In my presentation today, I will try to provide a broad overview of the challenges inherent in the statistical activities of central banks. As I do so, I am conscious of how much I have forgotten since the eight years I spent during the 1980s on the National Income Advisory Committee to Statistics Canada. In contrast, I hope that I have learned some new things at the BIS. The one constant we all face in this area has been our ultimate objective. What we are interested in as central bankers, whether statisticians or economists or policymakers, are better policies to support sustainable growth and living standards over time.
What has changed almost continuously, however, has been our perception of the principal threat to achieving this constant objective. In the 1960s, policymakers first became aware of the dangers posed by inflation produced by excessive demand. In the 1970s, the problems of dealing with inflation were exacerbated by supply side shocks, largely resulting from increases in oil prices. In the 1980s, the debt crisis in the emerging market economies focused the attention of policymakers on the extent to which creditor banks in the industrial countries could themselves be hurt by debtor defaults. Since then, there have been successive financial crises affecting Mexico (1994), East Asia (1997), Russia (1998), LTCM (1998), the Nasdaq (2000 onwards) and Argentina (2001). Still more recently, there have been financial pressures in Brazil and other emerging market economies and the ongoing collapse of stock prices around the world. Looking forward, there seems to be a growing recognition that these crises may not have independent origins but may rather be manifestations of some underlying global processes that might not yet be fully played out.
In the time I have available, I will deal in turn with the statistical challenges faced by those concerned with price stability, those concerned with financial stability and then briefly with the nexus between the two. I will finish with a “paid political announcement” about the actual and potential contribution of the BIS in this area.
2. Challenges related to price stability
This area is what I would call “traditional” postwar macroeconomics. Even so, there have been enormous changes in the way in which monetary policy in particular has been conducted over the last 30 years. Most of these changes have had to do, not with the day-to-day implementation of monetary policy, but rather with altered views about the framework conditioning those day-to-day decisions. In my view, there are three essential elements to this framework: the institutional, the philosophical and the empirical. Before getting to the third of these, likely to be of the greatest interest to this audience, let me say just a few words about the other two.
The institutional framework comprises the mandate, powers and accountability of central banks. In every respect, enormous changes have taken place in recent decades. As to the mandate, there has been a growing preference to give central banks a mandate to establish and maintain price stability. Sometimes this mandate has been very explicit, particularly in the case of countries with poor track records in this regard, but in other cases an implicit mandate has been no less influential. As to powers, more and more central banks have been given sole authority over the application of the instruments of monetary control. Indeed, this is what most central banks now recognise to be the essence of their so-called independence. Finally, most central banks are now behaving in a much more transparent way so that their performance can be judged by both the political authorities and the financial markets. A crucial requirement for this accountability is the provision of relevant information, both statistical and of other sorts.
The philosophical framework refers to how policymakers approach the uncertainties and the trade-offs inherent in the process of making policy decisions. While inflation targeting now has greater importance as an objective, concerns about other aspects of economic performance must continue to carry at least some weight; the level and volatility of output and unemployment, the movement of the exchange rate and so on. The particular complication caused by concerns about financial stability is one I will return to. Moreover, decisions have to be taken in the face of conflicting statistical and anecdotal evidence about underlying economic processes. Finally, there is the issue of whether policymakers should be following a maximising strategy, to squeeze out all the benefits possible from the economic system, or a minimaxing strategy of avoiding really bad outcomes. Looking at the actual conduct of monetary policy in various jurisdictions, differences of view can be observed as easily as changes in view over time.
Returning more directly to the interests of this audience, the conduct of monetary policy in the pursuit of price stability also requires an empirical framework. That is, we must have some idea of the transmission mechanism that relates what central banks can control, the various aspects of the central bank balance sheet, to the ultimate inflation objective. This was not always recognised. Montagu Norman, Governor of the Bank of England for much of the prewar period, once told his Economic Adviser “your job is not to tell me what to do but to explain to me why I have done it”. Today, whether the authorities use a one-equation model in the governor’s head, or a 500-equation macromodel with satellites of various sorts, some idea of cause and effect is deemed necessary.
A schematic diagram for first-year undergraduates might have the following form. Bank reserves set daily by the central bank affect interest rates, the exchange rate and other financial variables over some longer horizon. In turn, over some still longer period, these affect the level of spending on domestically produced goods and services. In turn, this affects the “gap” between aggregate demand and the productive potential (supply) of the economy. Finally, “gaps” affect the rate of increase of wages and prices, which is what the policymaker is ultimately aiming at. At first glance this all seems quite straightforward. This is true even when we pass on to the practical question of how this process might be operationalised. We choose a target level for the inflation rate. We rely on the National Income Accounts (NIA) and other sources for our data. We estimate econometrically our macro models and use them to forecast whether the current policy settings are consistent with our objectives. And finally, we constantly update our forecasts and our policies in the light of incoming information. So what is the problem?
In fact, the potential problems were flagged as far back as the 1940s in a series of exchanges between Jan Tinbergen, the Dutch inventor of econometric modelling, and John Maynard Keynes, along with their respective disciples. These issues were subsequently highlighted in a book published in 1968 entitled “On Keynesian economics and the economics of Keynes”. In that book, Axel Leijonhufvud asserts that Keynes agreed that the IS-LM model, devised by Sir John Hicks, did capture the interaction between the various functional forms referred to in Keynes’ General Theory. This model subsequently became the basis for “Keynesian economics”. However, Keynes, or at least many of his disciples at Cambridge, England, denied that these functional forms could ever be estimated empirically. The fundamental problem with each, and also with the transmission mechanism that relies on such estimates, is that expectations about the future are central to all economic decisions. This is the essence of “the economics of Keynes”. These expectations are both hard to measure and essentially ephemeral. This fundamental problem, along with a number of others, continues to bedevil the application of the empirical framework today.
Let me be more specific about some of the other statistical challenges facing those pursuing the goal of price stability. Perhaps the first issue is that of which index to target. In principle, we should be trying to measure the costs associated with aggregate price movements, and should then try to stabilise the index which best represents those costs. In practice, this is never done. Indeed, in most jurisdictions it has simply been assumed that stabilising some variant of the CPI is appropriate. This, of course, still leaves us with many choices, not least that of how the index is to be created (the weights issue), the choice of caveats and exemptions (commonly food and energy prices) and the distinction between first-round and subsequent effects of shocks (like tax increases) on measured price movements. I note, in passing, that stabilising the CPI implicitly assumes that productivity gains over time will show up as increases in nominal wages. Going back to prewar days, there was in fact a lively debate as to whether this was sensible or not.
When we turn to the transmission mechanism, perhaps the most important empirical questions have to do with estimating the supply side potential of the economy. This in turn impinges on our capacity to measure “gaps”. The recent experience of the United States is instructive in that measures of potential growth were first revised sharply upwards, but have more recently been revised partially back down. In a sense, there is nothing new in this. Measures of potential based on factor inputs into production functions have always revealed that the “unexplained residual” was by far the biggest contributor to growth. In the same vein, statistical measures of potential based on detrending have always been highly dependent on the precise methodology used. But a more novel element in the recent US experience is the role played by the choice of hedonic price indices to deflate nominal expenditures in the IT area. To some degree, this has affected the perceptions of productivity growth differentials with Europe, where such indices are not in widespread use. The use of hedonic indices also invites the question of how gains in measured productivity can increase profits (and support stock markets) when the vehicle for this is lower prices. As an aside, it is also worth noting that estimates of potential are crucial for distinguishing cyclical from secular changes in the government’s fiscal balances. As governments increasingly worry about the medium term sustainability of their fiscal stance, this issue gains in importance.
Closely related to this particular measurement problem is that of estimating the “natural rate” of unemployment or the NAIRU. Not only are there questions about the conceptual basis for these calculations, but there are also many practical difficulties. One approach is to back out these numbers from structural estimates of the equations determining wages and prices. I recall thinking about this when I was Chief of the Research Department at the Bank of Canada and being horrified by some of the underlying statistical assumptions. The dependent variable in the wage equation was not contract data at a given moment in time, but changes in average earnings, the cumulative effect of years of contract negotiations in the distant past. Among the explanatory variables were potential growth and inflationary expectations, both of which suffered from severe measurement problems. Finally, an estimate of the actual level of unemployment was assumed to condition wage pressures, in spite of the fact that participation rates were also subject to very wide swings in response to market conditions. Given these circumstances, which still exist today, it is not surprising that estimates of the NAIRU are subject to wide margins of error.
A more general problem facing policymakers is that of structural change. In transition and emerging market economies, this can be the product of deregulation and technological changes, among other factors. Another issue is the so-called Lucas critique, which notes that a change in the behaviour of policymakers will induce changes in private sector behaviour. A good example here has to do with the reaction of exchange rates to domestic inflation. If the policymaker is not expected to resist the inflation, the exchange rate should weaken. Conversely, if the policymaker is expected to tighten policy to resist inflation, the exchange rate might well strengthen. The bottom line here is that measures of tensions will change over time, as might the estimated effects (even the signs) of those tensions. These possibilities do not make the policymakers’ use of the empirical framework any easier.
Finally, the application of an empirical framework in the pursuit of price stability often suffers from significant data revisions. As someone once said: “What with the revisions to the National Income Accounts, the past has now become as uncertain as the future”. Recent revisions in the United States have revealed that the recession was three quarters rather than one quarter long. More importantly, they have implied that productivity growth and NIA profits were in fact somewhat lower than earlier thought. Of course, one could question the importance of this revision since, until very recently, no one was looking at the NIA profits measures in the first place. Rather, the measure receiving all the attention in the markets was the profits measure produced by Standard and Poor’s, which had the great attraction of continuing to rise while other measures were falling. Indeed, as a share of GDP, NIA profits have been falling since 1998. In sum, empirical measures can be revised. But revisions to how we freely choose to focus on the data may be more important still.
Faced with these kinds of empirical problems, which were as endemic in the 1970s as today, policymakers have often turned to other indicators to guide their conduct of monetary policy. Financial indicators such as exchange rate developments and growth rates for monetary and credit aggregates have traditionally been popular. Unfortunately, they also pose empirical problems. A monetary authority choosing a fixed exchange rate regime must still decide: fixed against which currency? And if a basket of currencies is to be chosen, which ones should be in it and with what weights? As for monetary and credit aggregates, choices must be made between alternative aggregates since they cannot all be controlled simultaneously given the limited range of central bank control instruments. Moreover, the criteria for making such choices are still not very clear. Finally, with structural change in the financial system, the demand functions for such aggregates can swing wildly and provide policymakers with all sorts of inappropriate signals for action. It was just such developments that prompted Gerry Bouey, Governor of the Bank of Canada, to declare that “the Bank did not abandon monetary targets, they abandoned us”.
I have made these comments to this particular audience to underline two points. First, a great deal of work still needs to be done to better refine the data and the models we use in the conduct of monetary policy. This applies even when we are pursuing such a traditional objective as price stability. My second point relates to the first one. We need to be modest about what we know and the limitations of our metier. Overpromising and underachieving is the easiest way to lose credibility, though it must be admitted that being unambitious in the pursuit of policy objectives can also lead to the same outcome. Being transparent about the shortcomings in our data, and our capacity to analyse the data, may incline the public to be generally more forgiving about perceived shortfalls in our performance. It may also lead to a greater willingness to provide resources to improve both our data and our analytical capacities. Both are sorely needed.
And as if this were not enough for statisticians to worry about, a number of challenges have arisen in recent years as regards the pursuit of financial stability. Let me now turn to this issue.
3. Challenges related to financial stability
I will turn in a moment to the issue of what kinds of information we need to make assessments of the expected costs (in the statistical sense) of financial crises. Clearly, we need some combination of measures of the changing probabilities of financial crises and the costs of those crises should they occur. But before turning to this issue, a few words seem warranted about the dynamic economic processes which seem to give rise to financial instability. Note that none of this is in your standard IS-LM model or its many descendants. Rather, the analytic framework has more in common with mid-European analysis of business cycles prior to World War I, abetted on the down side of the cycle by the work of Irving Fisher to which I referred just a few moments ago.
The cycle begins in a wave of optimism, often associated with some innovation such as new technology, the opening up of some new markets or desirable changes in economic policy. These events provide the basis for expectations of profit growth well above prevailing rates of interest on fixed-term instruments. While initially based on sound foundations, this justified optimism turns imperceptibly into unjustified optimism. Not uncommonly, this process unfolds with inflation maintained at low levels, as heavy levels of investment ensure that supply potential increases pari passu with demand. However, at a certain point, profits begin to decline, either in the light of overexpansion (and an inability to raise prices) or because costs begin to mount. This is followed by an investment “bust” which, in the limit, can lead to financial distress, subsequent headwinds affecting the real economy and even deflation. The financial distress arises from the fact that the investment expansion is normally financed with excessively cheap credit expansion, which rebounds on the lender. The tendency to deflation arises from the fact that inflation was already low when the bubble burst.
This characterisation of the financial cycle was first based on observations in Europe and the western hemisphere in the period prior to World War I. However, more recent crises also seem to have been rather similar in character. The Great Depression in the United States, the experience of Japan in the 1990s and the East Asian crisis all are reminiscent of this story. Recent experience with the Nasdaq bubble in the United States, and the associated IT investment cycle, also seems similar in some respects although the reliance on market-based rather than bank financing is a material difference from earlier episodes.
Identifying potential problems arising from financial instability has been rendered more difficult in the current world by the major structural changes that have taken place in the world financial system over the last three decades. Three major changes can be identified: securitisation, globalisation and consolidation. Each has aspects that could make financial stability more likely or less likely.
Securitisation refers to the growing tendency, most marked in the English-speaking countries, for credit to be provided directly through market processes as opposed to indirectly through financial intermediaries. On the one hand, this might help reduce financial instability since exposures are much more widely spread. Moreover, since the banking system is less involved, potential threats to the functioning of the payment system would also be reduced. On the other hand, markets are much less interpersonal than is “relationship banking” and could be much more volatile, particularly when under stress.
Globalisation refers to the process through which all markets, both national and international, are becoming much more tightly linked. On the one hand, this might reduce financial instability since the impact of shocks can be much more widely dispersed. On the other hand, there might also be new dangers. The capacity to finance “excesses” may now be greater, with an associated danger of sudden changes of view about debt sustainability. An associated possibility is that individual countries may now be more subject to “runs” if (and this is a big assumption) foreigners prove more skittish than local investors.
Consolidation refers to the growing degree of concentration in many financial markets. On the one hand, financial instability may be reduced since bigger firms tend to be better diversified and might be presumed to have better risk management capabilities. On the other hand, concentration could have systemic implications should a big firm fail, and the possibility of herding in markets might also be increased.
The bottom line is that all of these structural changes make it harder to identify where there might be emerging vulnerabilities in the operations of the financial system.
Given how hard it is to identify emerging problems in the financial system, there has been a growing recognition of the need to approach this problem in a multifaceted way. For this reason, the internal governance of financial institutions, public sector oversight and market discipline are all expected to provide incentives to encourage prudent behaviour on the part of individual participants in the financial system. Note, however, that for each of these incentive systems to function properly, there must be adequate information to allow judgments to be made about the various risks being taken and the prices being charged for doing so. Note, moreover, that information about the health and exposure of financial institutions must also be based on adequate information about the health of those to whom they have lent.
All three of these incentive systems to encourage prudent financial behaviour require good data both from non-financial corporations and from financial institutions. That is why the recent corporate accounting scandals are so worrisome. They threaten the integrity of all of the more aggregated information systems which rely on such data as inputs. At the level of the financial institutions themselves, we also need more accurate measures of risk exposure, in particular of problem loans and non-performing loans. One shortcoming in the measurement of credit risk in particular is that credits extended during periods of rapid economic expansion are generally treated as “low risk”. This is largely due to the fact that, in such periods, recorded loan losses are low and this recent good performance tends to be simply extrapolated into the future. In reality, given that the business cycle has not (and will never) disappear, this is precisely the time when bad credits are actually building up to materialise during the subsequent downturn. Since both the internal governance process and market discipline might be subject to assessments of exposure which are too low in upturns, and potentially too high in downturns, the role of public sector oversight becomes all the more important.
This raises more directly an issue only alluded to above. What should be the objective of public sector oversight? Traditionally, the focus has been on the financial viability of individual institutions. This is the normal preoccupation of financial supervisors. The reporting requirements to satisfy the needs of such microprudential oversight are essentially those I have just referred to above. What is being increasingly recognised, however, is that oversight should also be conducted with a view to ensuring that the system as a whole is stable; what the BIS has for years referred to as macroprudential oversight. The reason for this is rather obvious in the light of numerous recent financial crises which, ex post, have generally had costs amounting to many percentage points of GDP.
Measuring the expected costs of financial crises, ex ante, requires evidence pertaining to the changing probability of a crisis occurring. A number of suggestions can be made as to the kinds of data that might prove illuminating in this regard. Nevertheless, we are far from having reliable guides in this respect. A great deal of work has been done on macroprudential indicators, with some by my colleagues Claudio Borio and Philip Lowe showing particular promise. They find that a combination of indices pertaining to credit, asset prices and investment (generally focused on sustained deviations from trends) can generally predict crises while avoiding too many “false positives”. This type of work has been held back, however, by a serious data deficiency; namely, in many countries there are no long-term time series for property prices. This is astonishing when one considers how frequently banking crises have been triggered by booms in commercial property. This is an area where statisticians clearly have much useful work to do.
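As a stylised illustration of this kind of indicator work, the sketch below flags periods in which a credit/GDP gap and an asset price gap jointly exceed fixed thresholds. Everything here is a hypothetical simplification for exposition: the trend measure (a one-sided, backward-looking rolling mean), the threshold values and the data are not the actual Borio-Lowe calibration.

```python
def rolling_trend_gap(series, window=20):
    """Deviation of each observation from a one-sided (backward-looking) rolling mean.

    A crude stand-in for the one-sided trend filters used in the
    macroprudential indicator literature.
    """
    gaps = []
    for i, x in enumerate(series):
        lo = max(0, i - window + 1)
        trend = sum(series[lo:i + 1]) / (i + 1 - lo)
        gaps.append(x - trend)
    return gaps

def joint_signal(credit_gaps, price_gaps, credit_thresh=4.0, price_thresh=40.0):
    """Flag periods where the credit gap AND the asset price gap are both unusually large.

    Requiring the indicators to fire jointly is what keeps down the
    number of "false positives"; thresholds here are illustrative only.
    """
    return [c > credit_thresh and p > price_thresh
            for c, p in zip(credit_gaps, price_gaps)]

# Hypothetical data: a credit/GDP ratio (%) and a real property price index
credit_to_gdp = [100, 101, 102, 104, 108, 115, 125]
property_prices = [100, 103, 106, 112, 130, 160, 200]
signal = joint_signal(rolling_trend_gap(credit_to_gdp, 4),
                      rolling_trend_gap(property_prices, 4))
print(signal)  # only the final, joint boom period is flagged
```

The design choice worth noting is the one-sided trend: a policymaker watching the data in real time cannot use future observations, so two-sided filters, however elegant ex post, are of little use ex ante.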
Another area where there are serious data shortcomings in many countries is that of sectoral balance sheets, in particular the level of corporate debts denominated in foreign currency. It is clear from the Mexican (1994), East Asian (1997), Turkish (2000) and Argentine (2001) crises that currency mismatches grievously aggravated the economic downturn once the domestic currency had depreciated in value. The current difficulties facing Brazil, where many debts have been indexed either to inflation or to the exchange rate, have a similar character. While off-balance sheet transactions can seriously complicate the task of assessing exposures of this sort, the domestic banks should be the principal agents pressing for full disclosure. They should recognise that it is their own survival that is on the line should a large enough number of their clients be thrown into default because of an exchange rate change.
In recent years, the international banking statistics collected by the BIS have been improved in many ways. In particular, the consolidated banking statistics now give a fairly clear picture of the exposure of individual countries to international bank debt, as well as the exposure of the national banking systems that have given such credits. Yet some shortcomings still remain. We have relatively little information about the joint exposures of financial conglomerates that include banks, investment banking and insurance. Moreover, there are a whole host of new instruments and financial structures that could conceivably contribute to financial instability. Asset-backed securities, special purpose vehicles, credit derivatives, the reinsurance industry and hedge funds all seem to have been growing very rapidly in recent years. Yet we do not have a great deal of information about these developments and their possible implications.
If there is enhanced demand for data that could indicate growing financial vulnerability, it is fortunate that there is also an element of enhanced supply. It is now recognised that there is significant information about exposures in the financial market variables themselves. For example, measures of implied volatility derived from option prices tell us something about the market’s own assessments of market risk going forward. Various measures (like bid-ask spreads, turnover, and spreads between new and seasoned issues) can indicate emerging problems with respect to liquidity. And there are a whole host of indicators (spreads, Merton type estimates of the probability of default, and credit default swaps) about how credit risk is evolving. While these indicators do not always point in the same direction, the growing efficiency of arbitrage indicates that this should become less of a problem in the future. Supervisory data, commercial databases, and data based on the workings of payment systems and custody arrangements may also provide insights into whether, and in what ways, the financial system is showing increasing signs of vulnerability.
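To make the notion of implied volatility concrete: under the standard Black-Scholes model, the option pricing formula can be inverted numerically to recover the volatility consistent with an observed market price. The sketch below does this by bisection; the prices and parameters are hypothetical, and real markets require adjustments (dividends, American exercise, volatility smiles) that are omitted here.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (spot S, strike K, maturity T, rate r)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Back out the volatility the market 'implies' from an observed call price.

    Bisection works because the Black-Scholes call price is increasing in sigma.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round-trip check with a hypothetical at-the-money option
market_price = bs_call(100.0, 100.0, 1.0, 0.02, 0.25)
print(round(implied_vol(market_price, 100.0, 100.0, 1.0, 0.02), 4))  # prints 0.25
```

The same inversion applied to quoted option prices across strikes and maturities is what produces the forward-looking volatility measures referred to in the text.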
4. The nexus between price stability and financial stability
There are certainly great statistical challenges in assessing whether price stability is being threatened. There are perhaps even greater challenges in assessing whether financial stability is being threatened. Unfortunately, the possibilities for interaction between price stability and financial stability make the resulting challenge even greater than the sum of the parts. “Chaotic” outcomes, multiple equilibria and the sudden transformation of “good” states of affairs to “bad” ones become all the more likely. This is not a world in which formal modelling based on high-frequency data is likely to provide much insight about the future. Rather, the best guide might be the study of history and the pathology of low-frequency events.
It is a fact that movements in the aggregate price level (for goods and services) can have implications for financial stability. When aggregate prices are rising, as we saw in the 1970s and 1980s, speculation and imprudent lending for the purchase of assets which keep their real value (in particular real estate) lead to an overextension of credit which then feeds back on the health of the financial system. When prices are falling, as we have seen recently in Japan, the burden of debt becomes increasingly onerous. Real interest rates can start to rise, and monetary policy becomes increasingly impotent as nominal rates fall to zero. Unserviced debts eventually threaten the health of the lenders. As noted above, Irving Fisher probably said as much about this phenomenon as needs to be said. But it is also the case that the health of the financial system can have important implications for price stability. A financial system burdened with bad debts, and unsure about both its own solvency and that of its customers, is unlikely to make credit easily available. In a downturn, such lending behaviour would almost certainly make the downturn worse. In turn, this would exacerbate any existing trends for inflation to fall and eventually lead to deflation.
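The arithmetic behind the zero-bound problem described above is worth making explicit. The following sketch uses the exact Fisher relation with illustrative numbers; it is not drawn from any official calculation.

```python
def ex_post_real_rate(nominal, inflation):
    """Exact Fisher relation: (1 + real) = (1 + nominal) / (1 + inflation)."""
    return (1 + nominal) / (1 + inflation) - 1

# Illustrative: with the nominal policy rate pinned at zero and prices
# falling 2% a year, the real rate borrowers face is about +2% -- and
# the central bank cannot push the nominal rate lower to offset it.
real_rate_in_deflation = ex_post_real_rate(nominal=0.0, inflation=-0.02)
```

This is the sense in which monetary policy becomes "increasingly impotent": deflation mechanically raises the real burden of debt even when nominal rates can fall no further.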
Reflections of this sort indicate why central banks, even those without microprudential supervisory responsibilities, must retain a concern for macroprudential oversight. The interactions between the behaviour of the financial system and overall macroeconomic performance are crucial and of natural interest to central bankers. This said, if both independent supervisory authorities and central banks have an interest in financial stability, there needs to be some agreement as to how they should interact. These agreements need to cover both normal circumstances and times of crisis. The objective must be to exploit the comparative advantage of those who approach problems from the “bottom up”, and those whose approach is more “top-down”. It should be self-evident that both approaches are valid.
5. Central bank cooperation on statistical issues and the role of the BIS
Since I now have a captive audience, let me spell out quite briefly the contribution being made by the BIS to international financial cooperation with respect to statistical issues. We have done a great deal on this front in recent years, but the list of remaining challenges and open statistical questions is still a daunting one.
First, there is the issue of traditional macroeconomic monitoring. In this area, many of you will be familiar with the work of the BIS Data Bank. The original purpose of the Data Bank was to collect the principal macroeconomic time series data used by the central banks of the G10 in the conduct of their own monetary policy. Each central bank was (and is) charged with ensuring the quality of its own data contributions, with a view to getting access in exchange to similarly high-quality data from others. The BIS provides the technical platforms for the exchange of such data, and our staff also provide an additional layer of quality control. In recent years, two major trends have become noticeable. The first is that the number of countries participating in the Data Bank has increased significantly, and this expansion seems set to continue. One possible vehicle for this may be regional central banking groups (like CEMLA and SEACEN), which might piggyback off our efforts in Basel. The hope would be to encourage much enhanced statistical reporting from all of their regional participants. The second trend has seen BIS staff move beyond quality control towards providing a consulting role to those raising questions about the nature of the data provided by other countries. In this regard, clarifications about the comparability of definitions across countries come quite high on the list of services requested.
Second, there is the issue of providing data which is useful for macroprudential monitoring. The BIS now collects and compiles a very large number of financial market indicators. While these are currently largely drawn from the financial sectors in the more advanced industrial countries, data from the larger emerging market and transition economies are being added on a regular basis. In addition to the principal statistics indicating the state of the macroeconomic background, indicators of market risk, credit risk and liquidity risk figure importantly. Capital flows and movements in credit aggregates are also presented. So too are many statistics pertaining to the process of financial intermediation and to the health of both corporate and household balance sheets. As noted above, however, there continue to be many areas, for many countries, where important statistical series are simply not available.
Third, and closely related to the second, the BIS collects and compiles a wide variety of international financial statistics. The international banking statistics are well known and have been extended and improved in many ways in recent years. Moreover, a whole set of further improvements is in the works and should be fully implemented within the next year or two. These data were originally intended to provide evidence about the exposure of creditor banks to international loans, but they have also proved useful as the basis for calculations of debt exposures by borrowers. The joint BIS-IMF-OECD debt statistics incorporate inputs from all three institutions in a coherent way. In addition, the BIS has recently conducted research to identify why such “creditor-side” statistics on short-term debt exposure sometimes differ from the “debtor-side” statistics collected by the countries themselves.
The International Financial Statistics group at the BIS also collects a wide range of other statistics. This reflects the fact that international finance is no longer only (or even primarily) provided by banks, and that there is much more to international finance than traditional loans. Accordingly, the BIS provides statistics (largely based on commercial sources) on the international issuance of securities.2 Moreover, in addition to the quarterly collection of data about exchange-traded derivatives, the BIS also performs a triennial survey of activity in over-the-counter derivatives markets. This latter survey complements the triennial survey of the world’s significant foreign exchange markets (over 40 at last count), which has now been repeated five times.
While not primarily statistical, it is important to underline two related and important contributions by the BIS with respect to these numbers. First, they are either made publicly available (see www.bis.org) or are available to participating central banks through our new extranet facility called eBIS. If you are not aware of eBIS, and the wealth of facilities and information it provides to central banks, then you should be. Second, these numbers are regularly evaluated by the various groups of national experts which meet here at the BIS, whether they are primarily concerned with price stability, financial stability or both. These groups include the regular meetings of Governors, the Basel Committee, the Committee on the Global Financial System, the Markets Committee and the Financial Stability Forum, among many others.
The final issue I would like to deal with is future statistical challenges. The first of these must be to respond to past challenges, described above, which have not yet been adequately addressed. But in addition, a number of trade-offs can be identified where we need a clearer understanding of what precisely it is that we want. One such trade-off is that between timeliness and accuracy: more of the former means less of the latter. Another trade-off is between the desire for international harmonisation and the need to reflect accurately national idiosyncrasies of various sorts. A third is the choice between stability and change in the adoption of new technologies and reporting formats. In this context, the most pressing issue must surely be the urgency with which we all strive to adopt newly emerging web-based information exchange standards. The new initiative, strongly supported by the BIS, on Statistical Data and Metadata Exchange (SDMX) should in time provide us with increasing insight into the costs and benefits of the alternative paths to follow.
To conclude, a lot has been done to foster international cooperation between central banks on statistical issues, but a lot remains to be done. This is the first meeting of the Irving Fisher Committee in which the BIS has played an important supportive role. We have enjoyed doing so, and I would add my hope that it will not be the last.
1 For example, this was the primary motivation for the decision by the Basel Committee on Banking Supervision to rely on the three pillar approach (capital standards, supervisory oversight and market discipline) in setting out the New Basel Capital Accord.
2 These numbers would be usefully complemented by information about the issuance of securities in domestic markets, and by data on the purchase by foreigners of domestically issued securities as well as the purchase by domestic entities of securities issued in international financial markets. Such data is not yet commonly available.