David Rice is an internationally recognized information security professional and an accomplished educator and visionary. For a decade he has advised, counseled, and defended global IT networks for government and private industry. David has been awarded by the U.S. Department of Defense for "significant contributions" advancing security of critical national infrastructure and global networks. Additionally, David has authored numerous IT security courses and publications, teaches for the prestigious SANS Institute, and has served as adjunct faculty at James Madison University. He is a frequent speaker at information security conferences and currently Director of The Monterey Group.
You may or may not have an inkling of what insecure software is, how it impacts your life, or why you should be concerned. That is OK. This book attempts to introduce you to the full scope and consequence of software's impact on modern society without baffling you with jargon only experts understand or minutiae only experts care about. The prerequisite for this book is merely a hint of curiosity.
Although we interact with software on a daily basis, carry it on our mobile phones, drive with it in our cars, fly with it in our planes, and use it in our home and business computers, software itself remains essentially shrouded: a ghost in the machine; a mystery that functions, but only part of the time. And therein lies our problem.
Software is the stuff of modern infrastructure. Not only is software infused into a growing number of commercial products we purchase and services we use, but government increasingly uses software to manage the details of our lives, to allocate benefits and public services we enjoy as citizens, and to administer and defend the state as a whole. How and when we touch software and how and when it touches us is less our choice every day. The quality of this software matters greatly; the level of protection this software affords us from harm and exploitation matters even more.
As a case in point, in mid-2007 the country of Estonia, dubbed "the most wired nation in Europe" because of its pervasive use of computer networks for a wide array of private and public activities, had a significant portion of its national infrastructure crippled for over two weeks by cyber attacks launched from hundreds of thousands of individual computers that had been previously hijacked by Russian hackers. Estonia was so overwhelmed by the attacks that Estonian leaders literally severed the country's connection to the Internet, and with it the country's economic and communications lifeline to the rest of the world. As one Estonian official lamented, "We are back to the stone age." The reason for the cyber attack? The Russian government objected to Estonia's removal of a Soviet-era war memorial from the center of its capital, Tallinn, to a military cemetery.
The hundreds of thousands of individual computers that took part in the attack belonged to innocents: businesses, governments, and home users located around the world, unaware their computers were being used as weapons against another nation and another people. Such widespread hijacking was made possible in large part because of insecure software: software that, owing to insufficient manufacturing practices, contains defects that allow, among other things, hackers to hijack and remotely control computer systems. Traditional defensive measures employed by software buyers, such as firewalls, anti-virus, and software patches, did little to help Estonia and nothing to correct the software manufacturing practices that enabled the attacks in the first place.
During the same year, an experienced "security researcher" (a euphemism for a hacker) from IBM's Internet Security Systems was able to remotely break into and hijack computer systems controlling a nuclear power plant in the United States. The plant's owners claimed their computer systems could not be accessed from the Internet. The owners were wrong. As the security researcher later stated after completing the exercise, "It turned out to be the easiest penetration test I'd ever done. By the first day, we had penetrated the network. Within a week, we were controlling a nuclear power plant. I thought, 'Gosh, this is a big problem.'"
Indeed it is.
According to IDC, a global market intelligence firm, 75 percent of computers with access to the Internet have been infected and are actively being used without the owner's knowledge to conduct cyber attacks, distribute unwanted email (spam), and support criminal and terrorist activities. To solely blame hackers, or hundreds of thousands of innocent computer users, or misinformed (and some might say "sloppy") power plant owners for the deplorable state of cyber security is shortsighted and distracts from the deeper issue. The proverbial butterfly that flaps its wings in Brazil, causing a storm somewhere far away, is no match for the consequences brought about by the seemingly innocuous foibles of software manufacturers. As one analyst commented regarding insecure software as it related to the hijacking of the nuclear reactor's computer systems, "These are simple bugs, mistakes in software, but very dangerous ones."
The story of Estonia, the nuclear reactor, and thousands of similar news stories merely hint at the underlying problem of modern infrastructure. The "big problem" is insecure software, and insecure software is everywhere. From our iPhones (in which a critical software weakness was discovered merely two weeks after release) to our laptops, from the XBOX to public utilities, from home computers to financial systems, insecure software is interconnected and woven more tightly into the fabric of civilization with each passing day, and with it, as former U.S. Secretary of Defense William Cohen observed, comes an unprecedented level of vulnerability. Insecure software is making us fragile, vulnerable, and weak.
The threat of global warming might be on everyone's lips, and the polar ice caps might indeed melt, but not for a time. What is happening right now, because of the worldwide interconnection of insecure software, gives social problems once limited by geography a new destructive range. Cyber criminals, terrorists, and even nation states are currently preying on millions upon millions of computer systems (and their owners) and using the proceeds to underwrite further crime, economic espionage, warfare, and terror. We are only now beginning to realize the enormity of the storm set upon us by the tiny fluttering of software manufacturing mistakes and the economic and social costs such mistakes impose. In 2007, "bad" software cost the United States roughly $180 billion; this amount represents nearly 40 percent of the U.S. military defense budget for the same year ($439 billion), or nearly 80 percent more than the estimated cost to the U.S. economy ($100 billion) of Hurricane Katrina, the costliest storm to hit the United States since Hurricane Andrew.1
Since the 1960s, individuals both within and outside the software community have worked hard to improve the quality, reliability, and security of software. Smart people have been looking out for you. For this, they should be commended. But the results of their efforts are mixed.
After 40 years of collaborative effort with software manufacturers to improve software quality, reliability, and security, Carnegie Mellon's Software Engineering Institute (SEI), an important contributor to software research and improvement, declared in the year 2000 that software was getting worse, not better. Such an announcement by SEI is tantamount to the U.S. Food and Drug Administration warning that food quality in the twenty-first century is poorer now than when Upton Sinclair wrote The Jungle in 1906.2 Unlike progress in a vast majority of areas related to consumer protection and national security, progress against "bad" software has been fitful at best.
While technical complications in software manufacturing might be in part to blame for the sorry state of software, this book argues that even if effective technical solutions were widely available, market incentives do not work for, but against, better, more secure software. This has worrisome consequences for us all.
Incentives matter. Human beings are notoriously complex and fickle creatures who will do whatever it takes to make themselves better off. There is nothing intrinsically wrong with this behavior. Looking out for their own best interests is what normal, rational human beings are wont to do. However, the complication is that society is a morass of competing, misaligned, and conflicting incentives that lead to all manner of situations in which one individual's behavior may adversely affect another's. Nowhere is this more obvious than in free market economies. As such, Geekonomics is the story of software told through the lens of humanity, not through the lens of technology.
To see and to understand insecure software merely as a technical phenomenon to be solved by other technical phenomena is to be distracted from the larger issue. Software is a human creation, and it need not be mysterious or magical. It also need not make us fragile, vulnerable, and weak. To understand software and its implications for society requires an understanding of how humans behave, not necessarily how software behaves. More specifically, this book looks at the array of incentives that compel people to manufacture, buy, and exploit insecure software. In short, incentives matter for any human endeavor, and without understanding the incentives that drive people toward or away from a particular behavior, all the potential technical solutions that might help address the problem of insecure software will sit idle, or worse, never be created at all. After 40 years of effort with debatable improvement, this much is evident.
As with any complex issue, and especially with a complex issue such as software manufacturing, there are few "right" answers regarding how to fix the problem. However, some ways of approaching complex issues are more fruitful than others and are worth investigating. Protecting economic and national security from the effects of insecure software is as much an economic issue as it is a technological issue. We know software is as notoriously complex and fickle as the humans who create it, if not more so. But as a human creation, we need not understand insecure software in its entirety; we need merely to get humans to stop creating the stuff. And this is where incentives come in.
At base, economics teaches us, at least in part, how to get incentives right. Of course, economists are not always right when it comes to forecasting the expected effects of a particular incentive, but economics allows us to approach complex issues from a scientific perspective and make reasonable, better-informed decisions. By using and analyzing data, even imperfect data, economics allows us to view the world as it is, to look back at how it was, and to anticipate how it might be. Incentives help navigate the path to a desired future. The desired future of this author is a stable, secure, global infrastructure that propels humanity beyond its wildest dreams.
There are three primary themes in Geekonomics:
First, software is becoming the foundation of modern civilization; software constitutes or will control the products, services, and infrastructure people will rely on for a wide variety of daily activities from the vital to the trivial.
Second, software is not sufficiently engineered at this time to fulfill the role of "foundation." The information infrastructure is the only part of national infrastructure that is destructively tested while in use; that is, software is shipped containing both known and unknown weaknesses of which software buyers are made aware, and which they must fix, only after installation (or after losing control of your nuclear power plant). The consequences are already becoming apparent and augur ill for us all.
Third, important economic, legal, and regulatory incentives that could improve software quality, reliability, and security are not only missing, but the market incentives that do exist are perverted, ineffectual, or distorted. Change the incentives and the story and effects of insecure software change also.
Because of the complexity of software itself and the complexity of manufacturing software, no single discipline, even one as powerful as economics, is sufficient for holistically addressing the topic at hand. As such, this book also contains a splash of psychology, physics, engineering, philosophy, and criminology, mostly framed within the context of incentives. This book does not contain the complete story of insecure software, only those parts that a single author can realistically include in a book meant to inform, entertain, and enlighten.
I like software. I really do. Though the tone of my writing is often forceful and urgent regarding insecure software in general and software manufacturers in particular, I truly appreciate all the things I can do with software that I could not possibly do as quickly, efficiently, or cheaply without it. Writing this book was infinitely easier using a word processor than with a traditional typewriter, which I have not owned for 20 years. But everything has a cost, and not all costs are readily apparent at the time of acquisition. I kept no fewer than three separate storage locations (laptop hard drive, USB key, network storage) for the book's manuscript just in case something should happen, which it inevitably did. My word processor application (which will remain nameless) crashed or froze roughly 40 times in the course of writing this book. Without software, this book might not have been written as quickly compared to older methods. That is not in question. Without reliable backups, however, this book would not have been written at all.
Ironically, in writing this book I attempted to avoid providing a litany of software disasters in hopes of escaping claims that I might be promoting "fear, uncertainty, and doubt," a claim that so often pollutes and plagues discussions regarding software security. Yet many of my non-expert reviewers (for whom the book is intended) thought I was being unfair to software manufacturers because I did not provide the probative evidence necessary to establish why software manufacturers are partly to blame for threatening the foundation of civilization. "What is needed?" I asked. A litany of software disasters would be helpful, came the reply. And so my hope is that the litany of disasters provided in this book is seen as necessary context and perspective for those unfamiliar with the subject and impact of insecure software, rather than as the primary focus of the book.