Data, technology and policy coordination

Keynote speech by Mr Agustín Carstens, General Manager of the BIS, at the 55th SEACEN Governors' Conference and High-level Seminar on "Data and technology: embracing innovation", Singapore, 14 November 2019.  

BIS speech | 14 November 2019

Introduction

It is a great honour to address this distinguished audience today. We meet against the backdrop of the Singapore Fintech Festival and the opening, here in Singapore, of one of the first three BIS Innovation Hub Centres.

Singapore has positioned itself as a centre of innovation, research and development at the heart of the world's most dynamic economic region.1 Its impressive achievements in fintech owe much to the work of the Monetary Authority of Singapore (MAS) and other Singaporean authorities in building a solid public infrastructure to foster innovation.

This morning, I will discuss the role of personal data in digital financial innovation. The use of new technology with such data holds great promise, but it also presents new and complex policy trade-offs, and a clear need for domestic and international policy coordination. I would also like to share some thoughts on how the work of the BIS can contribute to this debate.

The value of personal data

Personal data are often touted as the gold of the 21st century.2 Our transactions data, browsing histories, geolocation and broader digital footprint can all be highly valuable in assessing credit quality, pricing insurance policies or marketing financial services. For example, one recent study finds that a user's operating system (iOS versus Android) is informative about income, that the time of day at which purchases are made (morning versus night) is correlated with character, and that the use of lower case, or of a name, in the user's email address carries information about reputation.3 Another study finds that non-traditional information from mobile phone applications and e-commerce platforms can significantly improve the predictive power of credit scoring models.4 Very often, this results in greater efficiency and lower costs.5 In many cases, new applications bring benefits to consumers and society. For instance, the use of data can foster greater financial inclusion, greater convenience and more tailored, personalised products.

In credit, we are already seeing evidence that fintech and big tech credit, which draws on alternative data, has been a boon for borrowers who are unserved or underserved by banks.6 In China, the major platforms have facilitated credit for hundreds of millions of new personal and business borrowers.7 In many countries, including here in Southeast Asia, data on transactions, utility bill payments, platform reviews and the like are driving greater access to financial services. Leveraging their personal data, taxi drivers can borrow to buy their own cars, and students can finance their education. Even in the United States, research suggests that personal transaction data can help the 45-60 million "thin credit file" Americans, ie those with inadequate credit histories, to obtain loans.8

In insurance, the use of personal data can help extend coverage to clients who previously had no access - for example, crop insurance for small farmers, priced using geolocation and weather data. Similarly, big techs and large insurers are using data on everything from people's search histories to their driving behaviour to price insurance policies.

Yet as we all know, there are important questions about how best to organise access to personal data - in other words, rights to or control over data. If data are the new gold, what is the new gold standard? There are important questions about how the gains from the use of data are distributed among customers, financial institutions, big techs and others, and about the impact on competition. Finally, there are fundamental policy questions about data privacy. The answers will depend in part on the science - for instance, on the technological possibilities presented by machine learning and big data. However, they may also depend on social preferences, which have deep cultural roots.

The scope for gains from better tailoring of products will depend on the type of personal data shared. Some data are purely private or meant to be shared only with a restricted number of users - eg medical records. At the other extreme are data that people may want to share freely, and which can be shared without causing any harm. In between, there may be data that can be lent out temporarily and combined with other data, eg for credit assessments or insurance pricing. There may also be data that are of little value to users themselves (eg browsing histories) but valuable to private sector companies, as they help to better target both general and customer-specific services. As a user, I may want to sell such data to the highest bidder.

The complex trade-offs between stability, efficiency and privacy

All of this must take place within a carefully calibrated regulatory and policy framework. For public policy, broadly three objectives are at play: not only the well known areas of financial stability and fair competition, but also data protection. The growing importance of data protection and privacy introduces new problems that could alter the usual trade-offs between these three objectives.9

One problem is that ownership of personal data is rarely clearly defined. In many countries, the default outcome is that financial institutions or big techs have de facto ownership of customer data. As such, these firms often reap a large share of the profits from new uses of data. For instance, if companies can estimate more precisely how much customers are willing to pay, they can engage in price discrimination - charging different prices for the same service - and capture a greater share of the consumer surplus.10 Left to its own devices, the market will not necessarily deliver an increase in consumer welfare.

One solution is to assign property rights over data to consumers (the "Coasian solution", named after the Chicago economist Ronald Coase). But this brings legal, regulatory and conceptual challenges. For example, big techs in particular are able to obtain data from activities outside financial services. How should we assign the property rights for such mixed data? Another issue is the importance of network effects. Data can be used efficiently only in large amounts; in other words, there are returns to scale and scope in data. This gives incumbents that already hold extensive customer data an advantage over potential competitors, which might deter the entry of new firms. However, even if we could create a level playing field between providers of financial services, it is not clear that we should: fragmenting the data landscape might prevent potential benefits from being generated in the first place. Finally, data are non-rival, ie multiple parties can use the same data without diminishing their availability to others. As such, some argue that we should not be talking about data ownership at all; they prefer the term data rights.11

In the light of these challenges, solutions like data stacks can help. We will hear more later today about Aadhaar in India and MyInfo here in Singapore. Digital identity can be an important foundation for digital services; once these digital infrastructures are in place, payments, government services and a host of other solutions become possible. Making consumers data-rich, and giving them a greater ability to give informed consent over the use of their data, can bring important improvements.12 Recent research suggests that assigning control rights to consumers can generate outcomes that are close to optimal.13

Another issue is that the widespread sharing and use of data can entail costs. People value their privacy, and breaches of personal data are harmful.14 Arguably, data privacy also has attributes of a fundamental right that cannot be traded off against economic benefits. Even if data privacy is guaranteed by law, breaches of personal data can occur - and they can erode trust in the financial system. A number of recent large-scale breaches of consumer data underscore these risks. Think, for instance, of the theft of credit card information on over 106 million American and Canadian customers in the Capital One hack, or of personal data on 9.4 million Cathay Pacific passengers. So far, these breaches have not led to major changes in consumer behaviour or to effects on financial stability, but one can imagine cases where such breaches could have broader consequences. Research suggests that, given certain characteristics of data - especially non-rivalry - firms may have an incentive to underinvest in data security.15

Available evidence suggests that cultural views on data privacy differ across countries and across age cohorts. For example, in one recent survey, respondents were asked whether they would be open to their bank securely sharing their data with other organisations in exchange for better offers on financial services.16 In India, 65% of respondents said yes; in the Netherlands, only 13% did. At the country level, willingness to share data appears to decline as income per capita rises, which suggests that these preferences may change as economies develop. In the same survey, 38% of 25- to 34-year-olds globally were willing to share their data, compared with only 16% of those over 65.

Finally, there are important questions about how data are processed, and about the potential for discrimination, financial exclusion and even exploitation. Different algorithms applied to the same raw data can produce very different outcomes. This has led observers to say that "algorithms are opinions embedded in code".17 There is some evidence of discriminatory outcomes in credit. For instance, one recent study of the US mortgage market found that black and Hispanic borrowers were less likely than non-Hispanic white and Asian borrowers to benefit from lower interest rates under machine learning-based credit scoring models.18 Even more worrying is the potential for intentional harm. There is evidence of new methods by which actors can misuse personal data to manipulate consumer behaviour, exploiting their understanding of factors like emotional contagion and behavioural biases. For instance, one study based on about 689,000 Facebook users who were unaware of the experiment found that people's emotional states can be transferred to others through contagion, leading people to experience the same emotions without being aware of the cause. Beyond the scientific result, this experiment clearly raises economic, not to mention ethical, concerns about a firm's ability to manipulate consumer and investor sentiment.19 Could similar capabilities, in the wrong hands, be used to manipulate markets or cause financial instability?

Policy considerations

This brings us to policy considerations. Here, I would like to lay out three challenges.

First, at the domestic level, central banks and financial regulators may not yet be up to speed on personal data issues. They need to upgrade their understanding, and to coordinate with competition and data protection authorities. However, we must bear in mind that the mandates and practices of these different national bodies may not always be compatible. For example, financial regulators focus on the specifics of the financial sector, whereas competition and data privacy laws often impose general standards that apply to a wide range of industries. Moreover, financial regulation is often based on international standards, while data protection and competition policy are much more national in character - to the extent that not all countries even have a single competition or data protection authority.20

Second, at the international level, we are seeing wide divergence in regulations on the use of personal data. In the European Union, the General Data Protection Regulation (GDPR) assigns data rights to individuals and enshrines strict rules and sanctions for the misuse of data. In India, the India Stack generates large volumes of new data, over which users have control. One of the leading figures in India's technology sector, Nandan Nilekani, refers to this as "data democracy". In China and several other countries, data localisation rules prevent data from being shared across borders. These rules may be justified on national security grounds, but they can also be misused to facilitate protectionism.21 In the United States, there is a patchwork of sector-specific legislation on data use; in practice, companies have relatively free access to data. Some companies, most famously Apple, have resisted calls to share data with public authorities.22 Meanwhile, some countries have a national data or artificial intelligence strategy; many others do not.23 Especially in emerging market and developing economies, there may be tough questions about whether to invest scarce resources in a data strategy rather than in more fundamental policy goals like public health or physical infrastructure.

Third, it may be challenging for all the relevant international authorities to discuss and coordinate on issues of personal data in finance. The International Conference of Data Protection and Privacy Commissioners (ICDPPC) meets annually - most recently in late October in Albania - and is doing important work. Yet there is no formal standard-setting body responsible for personal data use, either in financial services or across industries. Some commentators have called for the development of international standards for the digital economy.24 As difficult as it may be for national authorities to define regulatory approaches to personal data use, it may be even more difficult to agree on minimum standards at the international level.

The role of the BIS

The BIS considers data central to many discussions on digital financial innovation. There are at least two ways in which the BIS can contribute to the international debate.

First, the BIS, in collaboration with the global standard-setting bodies, can convene discussions among international public sector authorities. Some of these discussions are already taking place. To name just one example, a BIS conference on stablecoins in September brought together central banks, regulators, ministries of finance, and competition and data protection authorities.25 It also involved representatives from the private sector, academia and civil society. This was an enlightening experience, and we hope to continue the dialogue. Such exchanges can help to map the relevant dimensions of these issues, build a body of knowledge and practice, and identify the most pressing questions for policy.

Second, the BIS Innovation Hub can help develop public goods of relevance to personal data. This can include foundational work on digital identity and the so-called global stack. In India, but also here in Singapore, digital ID has delivered impressive gains in financial inclusion, particularly by easing account opening and improving know-your-customer processes. Just imagine what we could do if we extended this experience to the international level and to the world of cross-border payments.

The BIS, together with its partners, is taking a leading role in central banks' innovation efforts through the Innovation BIS 2025 strategy. As part of this effort, we will build on our collaboration with central banks from around the world - including those in SEACEN - and with other public authorities and the private sector. We are all approaching the same challenges, from different directions and with different but complementary skills and experiences. The IT revolution knows no borders, but the goal remains the same: a stable monetary and financial system underpinning a healthy and resilient global economy.



1       Eighty of the top 100 tech firms in the world currently have a presence in Singapore. Moreover, a global ranking of startup ecosystems finds that Singapore is at the top of the class for qualities like connectedness and attracting talent. See Singapore Economic Development Board, "Singapore flexes its standing as Asia's technology capital", March 2018; and Startup Genome, Global startup ecosystem report 2019, May 2019.

2      See Diane Coyle, "How much is a data gold mine worth?" 14 July 2019, for a discussion of the value of personal data and economic reasons (eg non-rivalry, positive and negative externalities) why markets for data may not work well on their own.

3      See T Berg, V Burg, A Gombović and M Puri, "On the rise of fintechs - credit scoring using digital footprints", NBER Working Papers, no 24551, April 2018. The authors note that consumers may plausibly change their online behaviour if digital footprints are widely used for credit decisions. This may have further societal and regulatory consequences.

4      See L Gambacorta, Y Huang, H Qiu and J Wang, "How do machine learning and non-traditional data affect credit scoring? New evidence from a Chinese fintech firm", BIS Working Papers, forthcoming.

5      For a wealth of use cases, see R Menon, "Can the three musketeers click? Finance, technology, trust", Bank of France Lecture, Paris, 14 May 2019. For evidence on efficiency gains in the US mortgage market, see A Fuster, M Plosser, P Schnabl and J Vickery, "The role of technology in mortgage lending", The Review of Financial Studies, vol 32, no 5, 2019.

6      See J Jagtiani and C Lemieux, "Do fintech lenders penetrate areas that are underserved by banks?", Journal of Economics and Business, no 100, 2018; H Tang, "Peer-to-peer lenders versus banks: substitutes or complements?", The Review of Financial Studies, vol 32, no 5, 2019; H Hau, Y Huang, H Shan and Z Sheng, "Fintech credit, financial inclusion and entrepreneurial growth", mimeo, 2018; and C de Roure, L Pelizzon and P Tasca, "How does P2P lending fit into the consumer credit market?", Deutsche Bundesbank, Discussion Papers, no 30, 2016.

7      See J Frost, L Gambacorta, Y Huang, H S Shin and P Zbinden, "Big tech and the changing structure of financial intermediation", Economic Policy, forthcoming.

8      FinRegLab, "The use of cash-flow data in underwriting credit", July 2019.

9      See BIS, "Big tech in finance: opportunities and risks", Annual Economic Report 2019, Chapter III, June 2019.

10      O Bar-Gill, "Algorithmic price discrimination: when demand is a function of both preferences and (mis)perceptions", University of Chicago Law Review, vol 86, 2018.

11      See D Elliott, "Data rights in finance: key public policy questions and answers", May 2019.

12      See N Nilekani, "Data to the people: India's inclusive internet", Foreign Affairs, September/October 2018.

13      C Jones and C Tonetti, "Nonrivalry and the economics of data", Stanford Graduate School of Business Working Paper, no 3716, 2019.

14      For a further discussion, see A Acquisti, C Taylor and L Wagman, "The economics of privacy", Journal of Economic Literature, vol 54, no 2, 2016.

15      Y Carrière-Swallow and V Haksar, "The economics and implications of data: an integrated perspective", IMF Departmental Papers, vol 19/16, 2019.

16      See EY, Global FinTech Adoption Index 2019, June 2019. The exact prompt was: "I would be comfortable with my main bank securely sharing my financial data with other organisations if it meant that I received better offers from other traditional financial institutions".

17      C O'Neil, Weapons of math destruction: how big data increases inequality and threatens democracy, Broadway Books, 2016.

18      A Fuster, P Goldsmith-Pinkham, T Ramadorai and A Walther, "The effect of machine learning on credit markets", VoxEU, 11 January 2019.

19      A Kramer, J Guillory and J Hancock, "Experimental evidence of massive-scale emotional contagion through social networks", Proceedings of the National Academy of Sciences of the United States of America, March 2014.

20      For example, there is no single data protection authority in India or the United States. Competition policy in the United States falls under the remit of both the Department of Justice and the Federal Trade Commission.

21      See R Menon, "Singapore FinTech - innovation, inclusion, inspiration", speech at Singapore FinTech Festival, November 2018.

22      Apple, "A message to our customers", February 2016.

24      See eg M Girard, "Global Standards for Digital Cooperation", October 2019; and R Fay, "Digital platforms require a global governance framework", October 2019.