Anita Angelovska Bezhoska: Statistical implications of the new financial landscape

Remarks by Ms Anita Angelovska Bezhoska, Vice-Governor of the National Bank of the Republic of Macedonia, at a panel discussion at the 8th IFC Conference "Statistical implications of the new financial landscape", Bank for International Settlements, Basel, 8-9 September 2016.

The views expressed in this speech are those of the speaker and not the view of the BIS.

Central bank speech | 22 September 2016

Contributors to the note: Ana Mitreska and Maja Andreevska.

Since the outbreak of the crisis, the central banking community has been providing unprecedented monetary stimulus with a view to influencing financial conditions and thus reviving growth. The measures have gone far beyond the pre-crisis mode of operation. If conducting monetary policy in the pre-crisis period seemed like steering a ship with a rudder powerful enough to withstand the waves, the waves that emerged in 2007-2008 were too strong, and reliance on the traditional instrument alone posed a risk of sinking. Thus, when central banks approached the zero lower bound, they were compelled to start experimenting with new policy tools such as balance sheet policies, forward guidance and, more recently and more radically, negative policy interest rates. Has this monetary accommodation served the purpose of influencing financial and economic conditions? While there is ample evidence that the measures have succeeded in influencing financing conditions, so far there is no strong empirical evidence of their impact on output and inflation. In fact, despite ample liquidity and ultra-low interest rates, output growth remains weak and inflation stubbornly low, pointing to a weak monetary transmission mechanism.

This new puzzling reality has triggered a more profound discussion on monetary policy frameworks within academia and the policymaking community. Are the main traditional monetary policy principles - price stability as the main objective, some form of flexible inflation targeting (IT) as the most adequate strategy, the policy rate as the main instrument - still valid? As the crisis clearly demonstrated that price stability is no guarantee of financial stability, and that financial instability can have long-lasting effects on output and inflation, the discussions focus on the need to take financial stability into account in monetary policy frameworks. Hence the question: is the so-called "flexible IT" flexible enough to deal with current and future challenges, or should thought be given, as some observers have suggested, to alternative strategies such as price-level or nominal GDP targeting? The discussions even touch upon the question of adding an explicit financial stability objective. For example, as Borio (2016) argues, "if mandates (of the central banks) are seen as overly constraining the room for maneuver, revisiting them should not be taboo. After all, mandates are a means to an end." Concerning the instruments, the current debates focus on whether the unconventional toolkit is temporary, as initially considered, or will become a conventional tool supplementing the traditional monetary instrument - in particular in the context of the ultra-low interest rate environment and the still anemic recovery. Some of the instruments (balance sheet policies) have even blurred the lines between fiscal and monetary policy and challenged the notion of central bank independence.

Given the importance of good statistics for monetary policy decision-making, what does this new financial landscape imply for statistics? Crises usually tend to reshape our thinking about the statistical concepts and frameworks needed for proper policymaking. Thus, the Great Depression shifted the focus to the development of national accounts; the growth of the euro-dollar markets during the 1960s and 1970s led to the development of international banking statistics; and the Asian financial crisis shifted the focus to the strong linkages between financial and external stability, conducive to the development of data standards initiatives (GDDS, SDDS and ROSCs). The latest crisis again put the spotlight on the financial sector, revealing the devastating impact that lax regulation of the financial sector in the more advanced economies had on the global economy, and once again accentuated the need for tighter surveillance of financial stability. The cross-border linkages, internationalization and integration of financial markets and institutions were underestimated, and so were their global economic consequences. This time it was the international dimension of the financial system that was emphasized and, very importantly, the macro-financial linkages. Both have major implications for the production of statistics and for the regular set of data required for informed decisions in monetary policy and in the micro- and macroprudential area.

While the crisis was not a result of a lack of proper data, and the quality of statistics in the pre-crisis period was deemed satisfactory, the crisis still brought to the surface some areas that were rather poorly covered by statistics. A highly globalized world, marked by high and rising trade and especially financial interconnectedness among economies, as well as strong interlinkages between sectors within economies - in particular between the financial sector (including its non-regulated segment) and non-financial corporations, owing to strong macro-financial linkages - calls for enhanced data sets. Broader data sets will help in better understanding developments in the economy and the financial sector, cross-sectoral and cross-border linkages and, subsequently, in adopting timely measures to prevent the build-up of imbalances, as well as in dealing more effectively with the unfolding effects of a crisis.

This time, international efforts to reexamine statistical frameworks and data gaps are reflected in the G20 Data Gaps Initiative. It contains 20 recommendations in four key segments: the build-up of risk in the financial sector, cross-border financial linkages, the vulnerability of domestic economies to shocks, and improving the communication of official statistics. The initiative has served as a guideline for most of the progress in the area of statistics ever since. Filling the statistical gaps, set as a medium- to long-term objective, is concentrated along several dimensions: availability of data - especially granular data and data related to unconventional instruments - international comparability, further quality improvements and timeliness of the data.

At the current juncture, the need for data granularity is pertinent to all aspects of policymaking, including monetary policymaking, monetary analysis and research in this area. The recent crisis did not question the need for and usefulness of the traditional aggregate economic and financial statistics (GDP and its components, aggregate credit growth, employment, total external and public debt, budget deficit etc.), but it surely revealed that they are not enough. If we want to increase the effectiveness of monetary policy measures, we should embrace more granular, micro-level data as a regular analytical underpinning for monetary policy decisions. Of course, the idea is not to act selectively with monetary policy measures; I still believe that monetary policy tackles the aggregates. It is more about gaining a wider understanding of why the aggregates behave as they do and, more importantly, a better understanding of the heterogeneous behavior of different economic agents in the economy. The ultimate outcome should be a better understanding of how monetary policy transmission functions, what the main constraints on more effective transmission are, and what can be done to overcome the obstacles. Hence, micro and granular data should supplement the aggregate statistics, allowing for timely detection and better understanding of "financial innovation, regulatory changes or a behavioral reaction to a changing economic environment". Granular data also give statisticians greater flexibility in responding to urgent and ad hoc requests from policymakers.

At this point, I would like to highlight several examples that clearly demonstrate the value added of data granularity from a policy perspective.

How can we explain the phenomenon of slow credit growth despite massive monetary policy accommodation? Obviously, we need to dive into individual, more granular data to understand the causes of the impaired transmission: is it the demand or the supply side? The aggregate data were not informative enough; micro, bank-by-bank data were needed to detect the constraints properly. In our case, scrutinizing individual balance sheets revealed that the main obstacles varied across banks, which raised the question of the effectiveness of monetary policy transmission and of the factors that hinder it. For some banks, limited capital was the main constraint on expanding their activities: given that at that point in time the scope for increasing capital through profit retention or injection of additional capital was limited - in the context of deleveraging and the new Basel regulation - those banks could not expand further without jeopardizing their capital adequacy ratios. For others, the quality of the existing credit portfolio and the relatively high level of NPLs were an obstacle to providing more credit to the economy. For yet others, limited sources of financing acted as a drag on their credit activity, and in some cases liquidity was the issue. Hence, the exploration of individual bank data indicated that there are various reasons why monetary measures do not yield the expected results, and that different solutions are relevant for different banks.

Lending surveys have been widely recognized as an important source of relevant data on the credit market. In the midst of the crisis, in particular, not only central banks but also some international institutions started to conduct lending surveys to tackle the potential structural rigidities precluding more rapid credit growth. In the case of Macedonia, we have been running lending surveys since 2006. Ever since, these alternative "soft" data have been an auxiliary, yet very important, information source allowing us to detect the paths of credit demand and supply, the main determinants driving the two, and the banks' views on future developments. Very often, as policymakers, we go beyond the aggregate results and screen the survey results either by observing individual banks' data or data by group of banks (large, medium, small), as their behavior might differ significantly and can "be lost under the mask of the average". This helps in better understanding how monetary policy shocks affect different types of financial institutions (small/large, foreign/domestic, across different business models).

The credit registry, established in 1998, which provides loan-by-loan data, has proved a very useful tool for financial stability and monetary policy purposes. The recent crisis emphasized that, despite the vast amount of credit data available, still more granular, frequent and flexible credit data are needed. Though this is mainly for the purpose of credit risk assessment and financial stability, to which I will refer later, the availability of loan-by-loan data and their characteristics allows for more profound monetary policy research. In particular, individual loan-by-loan data can be utilized, for instance, to explore the risk-taking channel of monetary policy, which has been at the margin of research. Further, data from the credit registry have allowed us to investigate where credit flows are channeled: to what extent they support the SME segment, a key segment for reviving growth and boosting employment, and to what extent they are channeled to the non-tradable sector, possibly creating bubbles down the road.

Of course, the banking system is only one link in the monetary policy transmission mechanism. Corporates and households are the other very important part, whose behavior affects the outcome of the monetary policy measures undertaken. As a first line of policymaking, we surely still rely on stylized macro models, in which we strive to estimate or calibrate the average behavior of the economy, or the behavior of representative agents. Of course, reality can be quite different, and heterogeneity much more pronounced than envisaged. This is why, more recently, as a central bank we strive to gather as much micro information as possible, for the corporate sector in particular. One important step was the launch of the first Survey on Wage and Price Dynamics, similar to the survey conducted for the euro area, which provided one of the most comprehensive overviews of the behavior of corporates in terms of wage and price adjustments. The results of the survey, and the subsequent research, gave valuable information on nominal and real wage rigidities and on the way the corporate sector adjusts its balance sheet in response to different shocks. All of this supports the understanding of monetary policy transmission and the calibration of economic models that incorporate heterogeneous agents.

One of the lessons from the crisis was that not many central banks had sufficiently granular data on the corporate sector. As mentioned, granular data serve as an important analytical tool for assessing potential vulnerabilities in the economy, and thus as an important input into the monetary and financial analysis of the central bank. We recognized the importance of corporate sector data quite early, in particular in the context of detected financial stability risks. Aggregate balance sheets of the Macedonian corporate sector have been transmitted from Macedonia's Central Registry to the central bank for some time. Yet again, it was recognized that aggregation loses information, and that a more disaggregated approach should be employed. Since 2014, as a central bank, we have had access to the individual balance sheets of all corporate entities in the economy. The granularity of the data allows a closer look at the behavior of firms in different sectors, their performance, their leverage, the structure of their sources of financing, and the potential vulnerabilities that might accumulate in different sectors, with different implications for the economy in general. All of this is valuable information, not obtainable from the aggregates alone.

I would still like to emphasize that for small economies such as Macedonia's, the regular use of micro data for monetary policy purposes is not an unknown practice. As a small, open economy with a de facto fixed exchange rate strategy, in-depth scrutiny of external sector data has a long tradition here. The incoming data on the external sector are analyzed at the aggregate and sectoral levels, but some micro data on individual companies are also monitored on a monthly basis. This pertains mainly to some of the traditional sectors with a relatively large weight in total exports, such as the metal industry. Given its importance, and at the same time its large exposure to external shocks, regular monitoring of the individual data provides the first signals of potential vulnerabilities. Moreover, as the economy is undergoing major structural changes, particularly in the export sector, the regular monitoring of newcomers provides a very clear picture of their impact on the economy in general. And since some of the aggregate official statistics, in which the weight structure of the different sectors is revised at predefined intervals, cannot capture the impact of the structural reforms, having satellite individual data on new companies is profoundly beneficial for policy analysis and decision-making. The granularity of the available data helps greatly to improve the forecasting process in the Bank, which is at the core of forward-looking monetary policymaking. Given the heterogeneity, in the external sector in particular, the probability that different segments behave, and will behave, differently is large. For that reason, as a complement to the model-based forecasting process, we regularly consult "soft" survey data on the business plans of the main exporters and importers.
Firm-by-firm data analysis is also used as a tool when screening data on capital flows (FDI, external debt, and portfolio investments, where data are available on a security-by-security basis), as a complement to the analysis at the aggregate level.

When discussing monetary policy and data requirements after the crisis, one of the important areas is, of course, the implementation of unconventional monetary policy measures. Does unconventional monetary policy call for unconventional statistics? Unconventional measures have varied across countries depending on many country-specific factors, including the real and financial sector imbalances built up in the run-up to the crisis and the fiscal and monetary policy buffers. Still, given that many of the unconventional instruments were targeted at specific segments of the markets or the economy, the traditional data sets may not, in principle, be fully fit for purpose, entailing a need for additional information to make a good diagnosis, prescribe the proper medicine and finally evaluate the success of the treatment. This is especially the case if unconventional measures become conventional, or if new unconventional tools emerge on the horizon, which is not to be excluded. Thus, again, a move from traditional aggregate-based statistics to granular, multi-purpose statistics seems the way forward. Turning to our own experience with non-standard measures: given the shallow and underdeveloped financial market, we did not face the same data challenges as central banks in more advanced economies. Yet, in terms of unconventionality, efforts were made to affect the distribution of the already solid credit growth by supporting credit to the corporate sector, to net exporters and energy producers in particular. These were identified as segments that can support the external position of the economy over the longer term. In addition, measures were undertaken to support the financing of the corporate sector through the issuance of longer-term debt securities as an alternative source of financing. In general, innovations in monetary frameworks and instruments go hand in hand with innovations in statistics. Non-standard measures underline the need to devise a specific dataset, produced on a regular basis, for monitoring the effectiveness of the measures undertaken.

The financial crisis has underscored the importance of financial stability, forcing many central banks to place a much higher priority on financial stability concerns. The focus has shifted from the stability of, and risks pertaining to, individual financial institutions to the concept of systemic risk. In other words, what matters equally is the potential instability of the financial system as a whole, and the risk of adverse feedback loops with the real sector. More importantly, the crisis has stressed both the time dimension of systemic risk - that is, the procyclicality of financial behavior - and the structural dimension, which arises from the interconnectedness among financial institutions and the possible contagion effects. Hence, macroprudential analysis and measures are gaining more and more weight, and so are the new dataset requirements for pursuing the macroprudential role. I think that even prior to the global crisis, central banks were well equipped with a wide range of macroeconomic, monetary and financial data. What was probably missing was a well-established platform for pursuing strong macro-financial analysis, a higher frequency for some of the indicators, and uniformity in the way indicators were calculated. Moreover, the use of granular, micro data on individual financial institutions was again a weak link, precluding the identification of the potential interlinkages and contagion effects needed for macroprudential analysis. The exploration of micro data on borrowers was also a missing element, which at a certain point could be an important tool for assessing potential risks to financial stability and to the stability of the economy in general.

The G20 Data Gaps Initiative is probably one of the most systematic approaches, providing a general stocktaking and monitoring of progress on the "missing data" for financial stability purposes. The build-up of risks in the financial sector was an area where the crisis pinpointed the need to create or enhance data coverage. It was acknowledged that more effort is needed to capture a sufficiently wide spectrum of indicators for locating leverage or excessive risk-taking, in the unregulated part ("the shadow system") in particular, but also liquidity, credit and tail risks within the regulated sector. Attention was also drawn to the need for countries to produce soft data on lending standards by running lending surveys, and to factor into the analysis not only aggregate figures and averages, but also the ranges and distribution of the data.

To be more specific, a special focus was put on the Financial Soundness Indicators (FSIs), which encompass several dimensions of potential risk to financial stability. The IMF has managed to set up a template of core and encouraged indicators, and to dramatically increase the number of countries that report FSIs to the Fund. In some countries, this was probably a push to rethink the quality and availability of the indicators. As for our own experience, we compile and publish all of the core and most of the encouraged indicators, apart from those that relate to the real estate market, the distributional (geographic) aspects of loans, and data pertaining to more complex financial instruments, which do not exist in our economy. Special attention should be paid to the real estate market, which proved to be one of the triggers of the crisis in some countries and a missing aspect of vulnerability analysis. Though comprehensive official data are not published, the scrutiny of the real estate market, of the deviation of the compiled real estate price indices from equilibrium, and the prudential screening of collateralized loans are carried out within the central bank on a permanent basis. Yet more comprehensive data and statistics in this area, on both the housing and commercial property segments, remain a challenge for the forthcoming period.

One of the important financial stability considerations was the inclusion and monitoring of system-wide macroprudential risk, meaning the inclusion of all regulated financial institutions, as well as the shadow, unregulated ones. The lessons learned from the financial crisis point to the fact that systemic risk can arise from any part of the financial system. That is why we also need more data to monitor other financial institutions, both supervised and regulated, and unregulated as well. In this context, Macedonia started collecting detailed data from other financial institutions (OFIs) in 2012. Later on, the regulators expressed interest in sharing these data with the National Bank. Our experience, as worldwide, confirmed that this is where cooperation between the central bank and other regulatory bodies becomes a necessity; even more so, good statistics and inter-agency exchange of information and data sharing become a priority. Let me point out that Macedonia did not experience significant damage in the crisis period, since the structure of the financial system is very traditional (the MFI subsector holds almost 90% of total assets), as well as conservative in terms of the financial instruments employed (mainly deposits and credits). Nevertheless, the financial stability analysis scrutinizes the risks in all financial institutions and conducts inter-sectoral contagion analysis to assess possible spillover effects in case of a shock. The central bank cooperates very closely with other regulatory bodies through the Financial Stability Committee, and the cooperation has intensified in the last couple of years, spreading to statistical data collection as well.
In 2014, the bilateral agreements with all regulatory bodies for OFIs were amended to encompass securities statistics, arrange data sharing and streamline data collection for the needs of the National Bank through the regulators, enabling parallel access to the data, increased quality controls in the data collection process and, above all, the overcoming of confidentiality issues.

The other side of the coin is that enhanced data sets pose some challenges. More frequent, granular data increase the reporting burden on the industry, but also the burden on statisticians. This further underlines the need for a "merit and cost" procedure and for reinvestigating the potential of data sharing among domestic or international institutions, which brings us to the issue of comparability and the possibility of using similar data sets for different purposes. Better cooperation among regulatory agencies may help in this regard. The switch from aggregate statistics to granular, multi-purpose statistics requires adequate statistical resources, including the IT infrastructure and tools to process and present data in a form meaningful to policymakers, so that they can see the wood and not get lost among the trees. The extensive production and use of individual, micro data raises important questions of data security and data confidentiality. So far, we have seen that this can be an important constraint both for statistics and for analytical work. Knowing that the data are available in the system, but restricted from use, is very often a reality in our day-to-day work. I think a stronger international initiative and a joint effort by the statistics community as well as policymakers to reexamine the concept of confidentiality is warranted - without, of course, jeopardizing the protection of the confidentiality of individual data provided by reporters, but stimulating coherence and cooperation.