David Rice is an internationally recognized information security professional and an accomplished educator and visionary. For a decade he has advised, counseled, and defended global IT networks for government and private industry. David has been awarded by the U.S. Department of Defense for "significant contributions" advancing the security of critical national infrastructure and global networks. Additionally, David has authored numerous IT security courses and publications, teaches for the prestigious SANS Institute, and has served as adjunct faculty at James Madison University. He is a frequent speaker at information security conferences and is currently Director of The Monterey Group.
You may or may not have an inkling of what insecure software is, how it impacts your life, or why you should be concerned. That is OK. This book attempts to introduce you to the full scope and consequence of software's impact on modern society without baffling the reader with jargon only experts understand or minutia only experts care about. The prerequisite for this book is merely a hint of curiosity.
Although we interact with software on a daily basis, carry it on our mobile phones, drive with it in our cars, fly with it in our planes, and use it in our home and business computers, software itself remains essentially shrouded: a ghost in the machine; a mystery that functions, but only part of the time. And therein lies our problem.
Software is the stuff of modern infrastructure. Not only is software infused into a growing number of commercial products we purchase and services we use, but government increasingly uses software to manage the details of our lives, to allocate benefits and public services we enjoy as citizens, and to administer and defend the state as a whole. How and when we touch software and how and when it touches us is less our choice every day. The quality of this software matters greatly; the level of protection this software affords us from harm and exploitation matters even more.
As a case in point, in mid-2007 the country of Estonia, dubbed "the most wired nation in Europe" because of its pervasive use of computer networks for a wide array of private and public activities, had a significant portion of its national infrastructure crippled for over two weeks by cyber attacks launched from hundreds of thousands of individual computers that had been previously hijacked by Russian hackers. Estonia was so overwhelmed by the attacks that Estonian leaders literally severed the country's connection to the Internet, and with it the country's economic and communications lifeline to the rest of the world. As one Estonian official lamented, "We are back to the stone age." The reason for the cyber attack? The Russian government objected to Estonia's relocation of a Soviet-era war memorial from the center of its capital, Tallinn, to a military cemetery.
The hundreds of thousands of individual computers that took part in the attack belonged to innocents: businesses, governments, and home users located around the world, unaware their computers were being used as weapons against another nation and another people. Such widespread hijacking was made possible in large part because of insecure software: software that, due to insufficient software manufacturing practices, contains defects that allow, among other things, hackers to hijack and remotely control computer systems. Traditional defensive measures employed by software buyers, such as firewalls, anti-virus, and software patches, did little to help Estonia and nothing to correct the software manufacturing practices that enabled the attacks in the first place.
During the same year, an experienced "security researcher" (a euphemism for a hacker) from IBM's Internet Security Systems was able to remotely break into and hijack computer systems controlling a nuclear power plant in the United States. The plant's owners claimed their computer systems could not be accessed from the Internet. The owners were wrong. As the security researcher later stated after completing the exercise, "It turned out to be the easiest penetration test I'd ever done. By the first day, we had penetrated the network. Within a week, we were controlling a nuclear power plant. I thought, 'Gosh, this is a big problem.'"
Indeed it is.
According to IDC, a global market intelligence firm, 75 percent of computers with access to the Internet have been infected and are actively being used without their owners' knowledge to conduct cyber attacks, distribute unwanted email (spam), and support criminal and terrorist activities. To solely blame hackers, or hundreds of thousands of innocent computer users, or misinformed (and some might say "sloppy") power plant owners for the deplorable state of cyber security is shortsighted and distracts from the deeper issue. The proverbial butterfly that flaps its wings in Brazil, causing a storm somewhere far away, is no match for the consequences brought about by the seemingly innocuous foibles of software manufacturers. As one analyst commented regarding insecure software as it related to the hijacking of the nuclear reactor's computer systems, "These are simple bugs, mistakes in software, but very dangerous ones."
The story of Estonia, the nuclear reactor, and thousands of similar news stories merely hint at the underlying problem of modern infrastructure. The "big problem" is insecure software, and insecure software is everywhere. From our iPhones (in which a critical software weakness was discovered merely two weeks after the product's release) to our laptops, from the XBOX to public utilities, from home computers to financial systems, insecure software is interconnected and woven more tightly into the fabric of civilization with each passing day, bringing with it, as former U.S. Secretary of Defense William Cohen observed, an unprecedented level of vulnerability. Insecure software is making us fragile, vulnerable, and weak.
The threat of global warming might be on everyone's lips, and the polar ice caps might indeed melt, but not for some time. What is happening right now, because of the worldwide interconnection of insecure software, gives social problems once limited by geography a new destructive range. Cyber criminals, terrorists, and even nation states are currently preying on millions upon millions of computer systems (and their owners) and using the proceeds to underwrite further crime, economic espionage, warfare, and terror. We are only now beginning to realize the enormity of the storm set upon us by the tiny fluttering of software manufacturing mistakes and the economic and social costs such mistakes impose. In 2007, "bad" software cost the United States roughly $180 billion; this amount represents nearly 40 percent of the U.S. military defense budget for the same year ($439 billion), or nearly 55 percent more than the estimated cost to the U.S. economy ($100 billion) of Hurricane Katrina, the costliest storm to hit the United States since Hurricane Andrew.1
Since the 1960s, individuals both within and outside the software community have worked hard to improve the quality, reliability, and security of software. Smart people have been looking out for you. For this, they should be commended. But the results of their efforts are mixed.
After 40 years of collaborative effort with software manufacturers to improve software quality, reliability, and security, Carnegie Mellon's Software Engineering Institute (SEI), an important contributor to software research and improvement, declared in the year 2000 that software was getting worse, not better. Such an announcement by SEI is tantamount to the U.S. Food and Drug Administration warning that food quality in the twenty-first century is poorer now than when Upton Sinclair wrote The Jungle in 1906.2 Unlike progress in a vast majority of areas related to consumer protection and national security, progress against "bad" software has been fitful at best.
While technical complications in software manufacturing might be partly to blame for the sorry state of software, this book argues that even if effective technical solutions were widely available, market incentives do not work for, but against, better, more secure software. This has worrisome consequences for us all.
Incentives matter. Human beings are notoriously complex and fickle creatures who will do whatever it takes to make themselves better off. There is nothing intrinsically wrong with this behavior. People looking out for their own best interests are simply doing what normal, rational human beings are wont to do. However, the complication is that society is a morass of competing, misaligned, and conflicting incentives that lead to all manner of situations in which one individual&...