Moore's Law vs. Murphy's Law in the Financial System: Who's Winning?

Breakthroughs in computing hardware, software, telecommunications, and data analytics have transformed the financial industry, enabling a host of new products and services such as automated trading algorithms, crypto-currencies, mobile banking, crowdfunding, and robo-advisors. However, the unintended consequences of technology-leveraged finance include fire sales, flash crashes, botched initial public offerings, cybersecurity breaches, catastrophic algorithmic trading errors, and a technological arms race that has created new winners, losers, and systemic risk in the financial ecosystem. These challenges are an unavoidable aspect of the growing importance of finance in an increasingly digital society. Rather than fighting this trend or forswearing technology, the ultimate solution is to develop more robust technology capable of adapting to the foibles of human behavior so users can employ these tools safely, effectively, and effortlessly. Examples of such technology are provided.


Introduction
In 1965, three years before he co-founded Intel, now the largest semiconductor chip manufacturer in the world, Gordon Moore published an article in Electronics Magazine in which he observed that the number of transistors that could be placed on a chip seemed to double every year. This simple observation, implying a constant rate of growth, led Moore to extrapolate an increase in computing potential from sixty transistors per chip in 1965 to sixty thousand in 1975. This number seemed absurd at the time, but it was realized on schedule a decade later. Later revised by Moore to a doubling every two years, "Moore's Law" has been a remarkably prescient forecast of the growth of the semiconductor industry over the last 40 years, as Figure 1 confirms.

Technological change is often accompanied by unintended consequences. The Industrial Revolution of the 19th century greatly increased the standard of living, but it also increased air and water pollution. The introduction of chemical pesticides greatly increased the food supply, but it also caused birth defects before their properties were understood. And the emergence of an interconnected global financial system greatly lowered the cost and increased the availability of capital to businesses and consumers around the world, but those same interconnections also served as vectors of financial contagion that facilitated the Financial Crisis of 2007–2009. As a result, the financial industry must weigh Moore's Law against Murphy's Law, "whatever can go wrong, will go wrong," as well as Kirilenko and Lo's (2013) technology-specific corollary, "whatever can go wrong, will go wrong faster and bigger when computers are involved."
Some of the unintended consequences of financial technology include fire sales, flash crashes, botched initial public offerings, cybersecurity breaches, catastrophic algorithmic trading errors, and a technological arms race that has created new winners, losers, and systemic risk in the financial ecosystem. The inherent paradox of modern financial markets is that technology is both the problem and, ultimately, the solution.
Markets cannot forswear financial technology, since the competitive advantages of algorithmic trading and electronic markets are simply too great for any firm to forgo. Rather, they must demand better, more robust technology, technology so advanced it becomes foolproof and invisible to the human operator. Every successful technology has gone through such a process of maturation: the rotary telephone versus the iPhone, paper road maps versus the voice-controlled touchscreen GPS, and the kindly reference librarian versus Google and Wikipedia. Financial technology is no different. To resolve the paradox of Moore's Law versus Murphy's Law, we need version 2.0 of the financial system.

Moore's Law and Finance
Moore's Law now influences a broad spectrum of modern life. It affects everything from household appliances to biomedicine to national defense, and its impact is no less evident in the financial industry. As computing has become faster, cheaper, and better at automating a variety of tasks, financial institutions have been able to greatly increase the scale and sophistication of their services. The emergence of automated algorithmic trading, online trading, mobile banking, crypto-currencies like Bitcoin, crowdfunding, and robo-advisors are all consequences of Moore's Law.
At the same time, the combination of population growth and the complexity of modern society has greatly increased the demand for financial services. In 1900, the total human population was estimated to be 1.5 billion, but little more than a century later, a blink of an eye on the evolutionary timescale, the world's population has grown to 7 billion (see Figure 2). The vast majority of these 7 billion individuals are born into this world without savings, income, housing, food, education, or employment. All of these necessities today require financial transactions of one sort or another, well beyond the capacity of the financial industry in 1900. Therefore, it should come as no surprise that innovations in computer hardware, software, telecommunications, and storage continue to shape Wall Street as a necessary part of its growth. In fact, technological innovation has always been intimately interconnected with financial innovation. New stamping and printing processes, used to prevent clipping, counterfeiting, and other forms of financial fraud, directly led to the modern system of paper banknotes and token coinage. The invention of the telegraph sparked a continent-spanning communications revolution that led to the creation of the modern futures market in nineteenth-century Chicago. And improvements to the ticker tape machine, symbolic of Wall Street for over a century, made Thomas Edison his early fortune.

Technology and Derivatives
Not very long ago, most trades were made through traders and specialists on the floors of the exchanges. The first electronic exchange, NASDAQ, opened in 1971, but it was originally only a quotation system for the slow-moving over-the-counter market. Most trades were placed over the telephone and executed on the trading floor well into the 1980s. Today, however, nearly all trades on the major financial exchanges are consummated electronically. Moore's Law made the floor specialist obsolete, and trading volume increased exponentially to meet this increase in trading capacity. If the modern financial system had to rely on human specialists to manage even a fraction of this market volume, it would need a trading floor larger than a sports arena.
The symbiosis between technology and finance has accelerated the pace of the financial markets beyond mere human capacity at all levels of the financial system. The inexorable march of technological progress is part of a much broader trend of finance increasing its role in modern society. Figure 5 presents four illustrations of this trend. Figure 5(a) shows that aggregate employment in the finance and insurance sectors has been increasing steadily over time, unlike the manufacturing sector, which employs about the same number of workers today as it did in the 1940s. The manufacturing sector is able to produce a much greater GDP with the same amount of labor because of technological progress, especially automation. This explanation is confirmed in Figure 5(b), which shows an upward-sloping graph of the value-added per capita in the manufacturing sector, a clear measure of increasing productivity. However, Figure 5(b) also shows that the finance and insurance sectors have an even more steeply sloped productivity curve. This difference in value-added per capita should translate into higher wages for finance and insurance professionals. Finance is becoming more and more important. Therein lies the problem.

Moore's Law Meets Murphy's Law
Moore's Law has an unspoken corollary. The rapid growth of financial innovation also means that much of this innovation is adopted without understanding the full risks. It follows, then, that financial innovation is peculiarly susceptible to Murphy's Law: "Anything that can go wrong, will go wrong." Murphy's Law originally comes from the postwar aviation industry, a time when aerospace engineers were finding ways to break the sound barrier, then a new and untested frontier. Today, financial engineers are finding ways to move markets faster than the speed of thought. There is one important difference between the two industries, however. Aerospace engineers could test their designs through the efforts of brave test pilots before moving to production. Financial innovation must rely on simulation and past market statistics before it is deployed in the financial system.
As participants in the financial system, we ourselves are the test pilots for the accelerated pace of financial innovation.
From this perspective, perhaps the real surprise is that the financial system has not suffered more technological "prangs" in recent years, to borrow another term from the aerospace industry. But markets are resilient in a way that aircraft are not. Self-interest motivates the investor in a market to take advantage of any technological lapse in its functioning, and in doing so, the investor incorporates that information into market activity. It is only when the market innovation causes a system-wide malfunction that the market fails to compensate. Unfortunately, these breakdowns, although temporary, seem to be occurring at an accelerating rate. Moore's Law has apparently increased the risk of Murphy's Law in the financial system, and the following are some sobering examples. 6

The Quant Meltdown of August 2007
Beginning on Monday, August 6, 2007, and continuing through Thursday, August 9, some of the most successful hedge funds in the industry suffered record losses. Despite the secretive nature of hedge funds and proprietary trading desks, the Wall Street Journal was able to report that some had lost 10% to 30% of their value in a single week. 7 What made these losses even more extraordinary was the fact that they seemed to be concentrated almost exclusively among quantitatively managed equity market-neutral or "statistical arbitrage" hedge funds, giving rise to the event's nickname of the "Quant Quake" or "Quant Meltdown." Although many outside observers were willing to speculate, no institution suffering such losses was willing to comment publicly on the causes of the Quant Meltdown. To address this lack of transparency, Khandani and Lo analyzed the events of the meltdown by simulating the returns of the contrarian trading strategy of Lehmann and of Lo and MacKinlay on historical data. 8

Their "Unwind Hypothesis" proposed that the losses during the second week of August 2007 were initially due to the forced liquidation of one or more large equity market-neutral portfolios. This large portfolio was not unique, however, but one of many portfolios that had converged on a similar selection of holdings, presumably as a result of a widely adopted financial innovation within the hedge fund industry. The price impact of this massive and sudden unwinding caused these similar but independent portfolios to experience losses. These losses in turn caused some funds to deleverage their portfolios, yielding an additional price impact that led to further losses and more deleveraging, and so on, in a deadly feedback loop. Many of the affected funds were considered to be at the vanguard of industry practice.

The Flash Crash
On May 6, 2010, U.S. equity markets plunged and then recovered within minutes in what became known as the "Flash Crash." The official investigation attributed the crash to an unfortunate feedback loop between a large automated sell order and overwhelmed electronic market makers; in other words, there is no single "culprit" that can be punished for this debacle, nor any new regulation that can guarantee such an event will never happen again. In 2015, however, U.S. authorities charged a London-based futures trader, Navinder Sarao, with contributing to the crash through "spoofing," the placement of large orders he intended to cancel before execution. These charges have not yet been decided upon in a court of law, so they must necessarily remain hypothetical, but there is nothing prima facie implausible about these allegations as a possible component of an explanation for the Flash Crash. Even without fraudulent intent, adding to a large order imbalance in an exchange where market makers were overwhelmed would make the conditions for a flash crash more likely. If these allegations hold, however, they show that the financial system also must be able to cope with innovations that are deliberately antagonistic to the well-being of the system.

The BATS and Facebook IPOs
Because Facebook's initial public offering on May 18, 2012 was expected to generate huge order flows, NASDAQ prided itself on its ability to accommodate a high volume of trades, so capacity was not a concern. In fact, NASDAQ's IPO Cross software was reputed to be able to compute an opening price from a stock's initial bids and offers in less than 40 microseconds, approximately 10,000 times faster than the blink of an eye.
At the start of the Facebook IPO, demand was so heavy that it took NASDAQ's computers up to five milliseconds to calculate its opening price, more than 100 times slower than usual. As these computations were running, NASDAQ's order system allowed investors to change their orders up to the moment the opening trade was printed on the tape. These few milliseconds before the print were more than enough for new orders and cancellations to enter NASDAQ's auction book, causing the IPO software to recalculate the opening trade price, during which time even more orders and cancellations entered its book, compounding the problem. 13 Although scheduled to begin at 11:00am, Facebook's IPO opened a half hour late because of these delays. As of 10:50am, traders had not yet received acknowledgments of pre-opening order cancellations or modifications. Even after NASDAQ formally opened the market, many traders still had not received these critical acknowledgements, creating more uncertainty and anxiety.
Software engineers call this situation a "race condition"; a race between new orders and the print of an opening trade created an infinite loop that could only be broken by manual intervention, something that hundreds of hours of testing had apparently missed.
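The dynamic can be sketched in a few lines of Python. This is a toy model with made-up numbers, not NASDAQ's actual IPO Cross logic: if orders arrive faster than the cross can recompute the opening price, the recomputation never reaches a fixed point.

```python
def compute_opening(book, recalc_time_ms, arrivals_per_ms, max_iters=1000):
    """Toy model of the auction-cross race described above (all numbers
    hypothetical).  The cross repeatedly recomputes an opening price from
    the current book, but any orders or cancels that land *during* a
    recomputation force another recomputation.  If orders arrive faster
    than a recomputation completes, the loop never converges, a race
    condition resolvable only by manual intervention."""
    for i in range(max_iters):
        snapshot = len(book)
        # Orders/cancels that arrive while the price is being computed:
        new_orders = int(recalc_time_ms * arrivals_per_ms)
        book.extend(["order"] * new_orders)
        if len(book) == snapshot:       # book unchanged: price is final
            return i + 1                # iterations until convergence
    return None                         # never converged

print(compute_opening([], recalc_time_ms=5, arrivals_per_ms=0))  # 1
print(compute_opening([], recalc_time_ms=5, arrivals_per_ms=3))  # None
```

The second call models the IPO morning: with three arrivals per millisecond and a five-millisecond recalculation, each pass finds a changed book and starts over.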

Knight Capital Group
At market open on August 1, 2012, the well-known U.S. broker-dealer Knight Capital Group-one of the largest equity traders in the industry at the time and among the most technologically sophisticated-issued a surge of unintended orders electronically.
Many of these orders were executed, resulting in a rapid accumulation of positions "unrestricted by volume caps" that created significant swings in the prices of 148 stocks between 9:30 and 10:00am. 17 What could have caused this disaster? Knight subsequently attributed it to "a technology issue… related to a software installation that resulted in Knight sending erroneous orders into the market." The SEC later determined that this was the result of a program functionality called "Power Peg," which had not been used since 2003 and had not been fully deleted from Knight's systems.
Unable to void most of these unintentional trades, Knight Capital was forced to liquidate them at market prices, resulting in a $457.6 million loss that wiped out its entire capital base. Its share price plunged 70% and Knight was forced to seek a rescuer; it was eventually acquired by competing broker-dealer GETCO in December 2012. 18

The most surprising aspect of this incident was the fact that Knight was widely considered to be one of the best electronic market makers in the industry, with telecommunications systems and trading algorithms far ahead of most of the competition.

The Treasury Flash Crash
Technological fragility has even reached the cornerstone market of U.S. Treasuries, whose plunge in October 2014 is still unexplained. Moreover, while solving Technology 101 issues is clearly important, a financial system that relies on all of its parts functioning at 100% efficiency is brittle to accident and deliberate bad intent. The linkages made possible by technological innovation may have increased systemic financial risk in unforeseen ways, but to lower this new form of systemic risk, the solution must be to make financial technology more robust, not to reach for an illusory perfection. Better software engineering in our financial system is analogous to improvements in our public health system to prevent the ill effects of bugs, but we also need a financial immune system that is able to adapt to circumstances to prevent system-wide catastrophes.
What do the financial failures in the preceding sections have in common? The common hallmark is a coordinated response to unexpected loss. Under normal conditions, unanticipated financial losses affect market participants narrowly, e.g., the individual investor faced with a margin call he is unable to meet. When the losers are sufficiently large in size or number, however, their responses can threaten the financial stability of the system as a whole. Unanticipated losses can cause widespread panic in the form of flights to safety, rapid price declines, and/or the evaporation of liquidity; once triggered, such a panic is impossible to contain. Technological innovation changes the probability of these losses in unanticipated ways.

Adaptive Regulation
One way that the financial system can adapt to these changing circumstances is to employ dynamic regulation in the financial markets. A working precedent is the Chicago Mercantile Exchange's Standard Portfolio Analysis of Risk (SPAN) system, which adjusts margin requirements in response to changing market conditions. Macroprudential regulators could implement systems in the same spirit as SPAN, but the focus of macroprudential policies must necessarily be the entire financial system, the organism as a whole. The CME is able to treat activities outside its purview as exogenous events, while the financial system must address the endogenous nature of systemic risk and the impact of the regulatory requirements themselves.
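As a deliberately simplified sketch of the idea, not the actual SPAN methodology, a dynamically adjusted margin requirement might scale with recent realized volatility, so that margins rise automatically when markets become turbulent and fall back when they calm. The function name, the 20-day window, the multiplier `k`, and the 1% volatility baseline below are all illustrative assumptions.

```python
import statistics

def span_style_margin(daily_returns, base_margin, k=3.0):
    """Hypothetical volatility-scaled margin requirement (NOT the CME's
    actual SPAN algorithm).  The requirement scales with trailing
    realized volatility relative to an assumed 1% baseline, and never
    falls below the base margin."""
    vol = statistics.pstdev(daily_returns[-20:])   # trailing 20-day volatility
    return base_margin * max(1.0, k * vol / 0.01)  # 1% daily vol = baseline

calm = [0.002, -0.001, 0.003, -0.002] * 5   # low-volatility regime
storm = [0.03, -0.04, 0.05, -0.03] * 5      # high-volatility regime
print(span_style_margin(calm, base_margin=1000))   # 1000.0 (vol below baseline)
print(span_style_margin(storm, base_margin=1000))  # much larger requirement
```

The point of the sketch is the feedback rule itself: the regulatory parameter responds to the state of the market rather than being fixed in advance.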

Law Is Code
Therefore, to regulate the financial system as a whole, we need to better understand financial regulation as a whole. The U.S. legal system is a working example of adaptive regulation, based on principles of common law that date back to the Middle Ages, and it incrementally changes in response to societal needs and political pressure. However, it was not designed for periods of rapid change, and many of the Founders saw a deliberative pace in legal change as a positive goal. Codification of federal law began startlingly late in American history (1926), and federal statutes are still poorly organized.
It is fruitful to think of the law as the software of the American operating system. Yet if a team of software engineers were to analyze the corpus of federal law, they would see thousands of pages of poorly documented code, with a multitude of complex, spaghetti-like dependencies between individual modules. 29 Using metrics for measuring the quality of software, Li, Azar, Larochelle, Hill, and Lo (2015) analyzed the entire text of the U.S. legal code (all the permanent laws of the United States) and drew some sobering conclusions about its complexity and potential for unintended consequences. One particularly informative measure is a network-based measure of complexity using the degree of "connectivity" across different sections of the U.S. legal code, where a "connection" between two sections of the code is defined as a simple cross-reference of one section by another. Li et al. (2015) cite Section 37 U.S.C. § 329 as one example. The layout of these connections, often called the "network topology" in the jargon of mathematical graph theory, can also be used to construct quantitative measures of complexity. One such measure is the notion of a "strongly connected" set of nodes, defined to be a set of nodes in which there is a path from every node to every other node in the set. For example, in Figure 7 nodes B and E form a strongly connected set, but nodes (B, E, A) do not because there is no path from E to A within the subset of these three nodes.
When applied to an entire network, it can be shown that the nodes can be partitioned into a finite number of disjoint subsets, each of which is strongly connected, and the union of all these strongly connected subsets is the entire network. A natural measure of complexity can then be defined as the size of the largest strongly connected subset, which is called the "core." In Figure 7, the core is the subset (D, F, G, H) and its size is 4 nodes. The larger the size of the core, the more interconnected is the network; changes to one part of the core could affect every other part of the core (because there exists at least one path from every node to every other node). As the core increases in size, the possible interactions between different nodes grow exponentially. Consider Title 12 of the U.S. Code, which governs the entire banking industry; its network structure is displayed in Figure 8(c). With an extremely large core and many connections between the core and the periphery, it is easy to see how small changes can lead to unpredictable and unintended consequences in other parts of the network. These new tools provide an X-ray of the hidden structures within current banking regulation. It is perhaps unsurprising that the core sections on banking regulation have to do with the powers of the corporation, insurance funds, and holding companies, since that is where the vast majority of financial assets are organized. These sections of the law are of critical importance to the U.S. financial system. To pursue the software analogy further, any effort to reform banking regulation should begin with a systematic "refactoring" and simplification of these sections, improving their internal structure without altering their external behavior, rather than adding increasingly complicated patches to the law whose systemic effects are unknown.
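The core measure is straightforward to compute. The sketch below uses Kosaraju's two-pass algorithm on a small, hypothetical cross-reference graph; the edge list is illustrative only and is not the actual network of Figure 7 or of the U.S. Code.

```python
from collections import defaultdict

def strongly_connected_components(edges):
    """Kosaraju's algorithm: one depth-first pass over the
    cross-reference graph, a second over its transpose."""
    graph, rev, nodes = defaultdict(list), defaultdict(list), set()
    for u, v in edges:
        graph[u].append(v)
        rev[v].append(u)
        nodes.update((u, v))

    # Pass 1: record nodes in order of DFS completion (iterative).
    order, seen = [], set()
    for start in nodes:
        if start in seen:
            continue
        seen.add(start)
        stack = [(start, iter(graph[start]))]
        while stack:
            node, it = stack[-1]
            for nxt in it:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(graph[nxt])))
                    break
            else:
                order.append(node)
                stack.pop()

    # Pass 2: DFS on the transposed graph in reverse finish order;
    # each tree found is one strongly connected component.
    comps, assigned = [], set()
    for start in reversed(order):
        if start in assigned:
            continue
        comp, stack2 = [], [start]
        assigned.add(start)
        while stack2:
            node = stack2.pop()
            comp.append(node)
            for nxt in rev[node]:
                if nxt not in assigned:
                    assigned.add(nxt)
                    stack2.append(nxt)
        comps.append(comp)
    return comps

# Hypothetical cross-reference graph: (u, v) means section u cites
# section v.  D, F, G, H cite one another in a cycle (the "core"),
# B and E form a smaller strongly connected pair, A and C are periphery.
edges = [("A", "B"), ("B", "E"), ("E", "B"), ("E", "D"),
         ("D", "F"), ("F", "G"), ("G", "H"), ("H", "D"), ("C", "D")]
comps = strongly_connected_components(edges)
core = max(comps, key=len)
print(sorted(core))  # ['D', 'F', 'G', 'H'] -- the largest SCC is the core
```

The size of `core` is the complexity measure described above; applied to a real body of law, the nodes would be statute sections and the edges their cross-references.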

Transparency vs. Privacy
One compelling concern about a systemic, macroprudential approach to financial regulation is financial privacy. Most of the financial industry relies on unpatentable business processes to make a living, as Myron Scholes discovered when he confronted Texas Instruments about its infringement on the Black-Scholes formula. As a result, the financial industry necessarily practices security through obscurity, preferring to use trade secrets to protect its intellectual property. Hedge funds and proprietary trading desks take this to an extreme, essentially serving as "black boxes" to investors, as opaque as the law will allow. However, even the average financial institution has a need to limit disclosure of its business processes, methods, and data, if only to protect the privacy of its clients. Accordingly, government policy has trodden carefully on the financial industry's disclosure requirements.
How can financial institutions provide the information that adaptive regulation requires without feeling burdened or threatened by regulatory intrusion? One solution is to keep the interactions between financial institutions and regulators confidential through cryptography. A classic illustration is computing the average salary of a group of people without anyone revealing his or her own salary. Suppose person 1 takes his salary S1 and adds to it a random number of his choosing, X1, to obtain the sum Y1 = S1 + X1, and then shares this sum (but not its components) with person 2. Person 2 then performs the same calculation, adding a random number of her choosing, X2, to her salary S2 and then adding these two values to person 1's information to obtain Y2 = Y1 + S2 + X2. She then passes Y2 to person 3, who adds his random number and salary to it before passing it to the next person, and so on. This process continues from one person to the next until the last person, n, adds his salary and random number, yielding Yn = S1 + S2 + ··· + Sn + X1 + X2 + ··· + Xn.
Now suppose person n passes this sum to person 1 and asks him to subtract his random number X1 from it before passing it to person 2. Person 2 does the same operation, subtracting her random number X2 from the cumulative sum before passing the value to person 3, and so on. Once the process returns to person n, who subtracts his random number, Xn, from the cumulative sum, the value remaining is the sum of all the salaries S1 + S2 + ··· + Sn, which, when divided by the number of participants n, which is observable, yields the average salary in the room. Figure 9 summarizes this simple algorithm. At no point during this process did anyone have to reveal his or her private information, yet by the end of the process, the average salary was computed. Such algorithms are the essence of secure multi-party computation.
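The two passes above can be sketched in a few lines of Python. This is an illustration of the protocol's arithmetic only; a production secure multi-party computation would distribute the passes across real participants over authenticated channels and use a cryptographically secure source of randomness.

```python
import random

def secure_average(salaries, modulus=10**12):
    """Sketch of the ring protocol described above: each participant
    adds a random mask on the first pass and removes it on the second.
    Arithmetic is done modulo a large number so that each masked
    partial sum is uniformly random and reveals nothing by itself.
    (Assumes the true sum of salaries is smaller than the modulus.)"""
    n = len(salaries)
    masks = [random.randrange(modulus) for _ in range(n)]

    # First pass: person i adds salary S_i plus personal mask X_i.
    running = 0
    for s, x in zip(salaries, masks):
        running = (running + s + x) % modulus
        # `running` is all the next participant ever sees.

    # Second pass: each person subtracts only his or her own mask.
    for x in masks:
        running = (running - x) % modulus

    return running / n   # the masks cancel, leaving the true average

print(secure_average([90_000, 120_000, 75_000]))  # 95000.0
```

Because each intermediate value carries at least one mask that has not yet been removed, no participant (other than the owner of a salary) can recover any individual input from what passes through their hands.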
The same approach extends to financial data. The individual time series in Figure 10(a) are the proprietary information of each institution and are only publicly disclosed with a lag. From a systemic-risk perspective, the individual values are of less importance than the aggregate sum, depicted by the area graph in Figure 10(a). Using an algorithm designed for just this purpose, Abbe et al. (2012) show that the individual time series can be encrypted, as in the line graphs in Figure 10(b), yet the sum of the encrypted time series yields the very same aggregate as in Figure 10(a). Aggregate sums can thus be shared by financial institutions while maintaining the privacy of each institution. Using secure multi-party computation tools, it is possible to construct mathematical protocols that allow aggregate measures to be computed without revealing any of the individual inputs. Of course, techniques like secure multi-party computation certainly do not eliminate the need for regulations or regulators (for example, there is no way to ensure that institutions report truthfully other than through periodic examination), but they can lower the economic cost of sharing certain types of information and provide incentives for the private sector to do so voluntarily. If financial institutions can maintain the privacy of their trade secrets while simultaneously sharing information that leads to more accurate measures of threats to financial stability, they stand to benefit as much as the regulators and the public.

Conclusion
These examples show how technology can reduce the additional systemic financial risk brought about by technological innovation. This is not a paradox. Rather, it is a consequence of the symbiotic relationship between finance and technology. Not very long ago, the financial markets were the most informationally intensive places on Earth, the collective intelligence of the markets incorporating the world's data into prices faster than any computer of the time. Today, the financial markets are one informationally intensive system among many, in a symbiotic relationship with search engines, social networks, messaging systems, and the growing colossus of Big Data.
In this brave new networked world, we will need to adopt a more advanced systems approach to financial technology. No financial engineer or programmer or designer of exchange servers should assume that a new product will function in isolation, but should rather imagine a changing financial environment where past statistics almost certainly will not apply. Similarly, no financial regulator should assume that an innovator will not find a way to circumvent a regulation, perhaps in a worse way than what the regulation originally intended to ban. To return to the analogy of software engineering, perhaps we should be assembling tools for financial system administrators to monitor and troubleshoot problems in the markets, similar to the way a sysadmin monitors and troubleshoots problems in a computer system.
To do this effectively, however, we need more and better information about the operation of financial markets. Going back to the example of the Flash Crash, the CFTC investigators were unable to find signs of Sarao's alleged activities because they were only given a list of completed transactions. "Spoofing," however, cancels orders before they are executed, leaving no evidence in the market print. All important market failures and events need to be analyzed scrupulously, and no data should be withheld from investigators.
One potential model for this scrupulous form of analysis already exists. 32 The National Transportation Safety Board (NTSB) has an excellent track record in analyzing and determining the causes of transportation accidents in the U.S. The NTSB has no regulatory authority, freeing the agency to criticize regulations and regulators that it believes may have contributed to the cause of an accident. In addition, the NTSB has subpoena power to obtain the information it needs to make a full analysis of an accident. The NTSB's accident report is not admissible as evidence in lawsuits for civil damages, which allows the stakeholders to be much more candid about their role in an accident. As a result, an NTSB report is able to address the systemic causes of an accident, as it did in its report on USAir Flight 405, which attributed the ultimate cause of that flight's crash to a system-wide failure in de-icing procedures. 33 Under an NTSB-like system, stakeholders in financial system failures would have less reason not to be candid about their possible shortcomings, but if this is still insufficient, secure multi-party methods may allow financial information to be observed without identifying specific financial institutions, in a form of cryptographic redaction.

Better information about financial system failures will also require better tools to remedy those failures. Here, mention must be made of the Food and Drug Administration's (FDA's) call for greater "regulatory science." Like the financial system, the human body is an immensely complicated and hyper-connected assemblage of disparate parts. The FDA's mission for over a century has been to protect that body by prohibiting certain dangerous or fraudulent products and testing the efficacy of others. To continue to do so effectively in the future, the FDA has proposed a broad strategy to harness science to serve regulation. For example, many models and assays currently used in toxicology are of limited accuracy in predicting adverse events in human beings. They are still in use, however, because they are still considered best practice, a state of affairs that should be uncomfortably familiar to many financial regulators. The FDA's regulatory science proposal would clearly define the reliability of these tests and their limitations, also something that should be familiar to financial regulators.

The global financial system has experienced exponential growth as a result of its intimate, symbiotic relationship with Moore's Law and new technologies. This has resulted in an unfortunate expansion of new forms of systemic risk, as new linkages made possible by these new technologies changed previously well-understood probabilities of risk in unexpected ways. However, the same technologies that created these linkages also allow us to monitor and supervise the financial system in ways that would have been unthinkable in earlier years. Because of Moore's Law, it is now possible to regulate margin requirements dynamically, analyze financial regulation as though it were a recalcitrant piece of computer code, and oversee aggregate financial data publicly without violating financial privacy or confidentiality requirements.
Although it is too soon to tell, it may be that the past few years have been a temporary blip in the symbiotic Red Queen's Race between finance and technology. Just as technology can add risk to a system, technology can remove it as well.