Geekonomics: The Real Cost of Insecure Software Hardcover – Nov 29 2007
About the Author
David Rice is an internationally recognized information security professional and an accomplished educator and visionary. For a decade he has advised, counseled, and defended global IT networks for government and private industry. David has been awarded by the U.S. Department of Defense for "significant contributions" advancing security of critical national infrastructure and global networks. Additionally, David has authored numerous IT security courses and publications, teaches for the prestigious SANS Institute, and has served as adjunct faculty at James Madison University. He is a frequent speaker at information security conferences and is currently Director of The Monterey Group.
Excerpt. © Reprinted by permission. All rights reserved.
You may or may not have an inkling of what insecure software is, how it impacts your life, or why you should be concerned. That is OK. This book attempts to introduce you to the full scope and consequence of software's impact on modern society without baffling the reader with jargon only experts understand or minutia only experts care about. The prerequisite for this book is merely a hint of curiosity.
Although we interact with software on a daily basis, carry it on our mobile phones, drive with it in our cars, fly with it in our planes, and use it in our home and business computers, software itself remains essentially shrouded: a ghost in the machine; a mystery that functions, but only part of the time. And therein lies our problem.
Software is the stuff of modern infrastructure. Not only is software infused into a growing number of commercial products we purchase and services we use, but government increasingly uses software to manage the details of our lives, to allocate benefits and public services we enjoy as citizens, and to administer and defend the state as a whole. How and when we touch software and how and when it touches us is less our choice every day. The quality of this software matters greatly; the level of protection this software affords us from harm and exploitation matters even more.
As a case in point, in mid-2007 the country of Estonia, dubbed "the most wired nation in Europe" because of its pervasive use of computer networks for a wide array of private and public activities, had a significant portion of its national infrastructure crippled for over two weeks by cyber attacks launched from hundreds of thousands of individual computers that had been previously hijacked by Russian hackers. Estonia was so overwhelmed by the attacks that Estonian leaders literally severed the country's connection to the Internet and with it the country's economic and communications lifeline to the rest of the world. As one Estonian official lamented, "We are back to the stone age." The reason for the cyber attack? The Russian government objected to Estonia's removal of a Soviet-era war memorial from the center of its capital Tallinn to a military cemetery.
The hundreds of thousands of individual computers that took part in the attack belonged to innocents: businesses, governments, and home users located around the world, unaware their computers were being used as weapons against another nation and another people. Such widespread hijacking was made possible in large part because of insecure software: software that, due to insufficient manufacturing practices, contains defects that allow, among other things, hackers to hijack and remotely control computer systems. Traditional defensive measures employed by software buyers such as firewalls, anti-virus, and software patches did little to help Estonia and nothing to correct the software manufacturing practices that enabled the attacks in the first place.
During the same year, an experienced "security researcher" (a euphemism for a hacker) from IBM's Internet Security Systems was able to remotely break into and hijack computer systems controlling a nuclear power plant in the United States. The plant's owners claimed their computer systems could not be accessed from the Internet. The owners were wrong. As the security researcher later stated after completing the exercise, "It turned out to be the easiest penetration test I'd ever done. By the first day, we had penetrated the network. Within a week, we were controlling a nuclear power plant. I thought, 'Gosh, this is a big problem.'"
Indeed it is.
According to IDC, a global market intelligence firm, 75 percent of computers having access to the Internet have been infected and are actively being used without the owner's knowledge to conduct cyber attacks, distribute unwanted email (spam), and support criminal and terrorist activities. To solely blame hackers, or hundreds of thousands of innocent computer users, or misinformed (and some might say "sloppy") power plant owners for the deplorable state of cyber security is shortsighted and distracts from the deeper issue. The proverbial butterfly that flaps its wings in Brazil causing a storm somewhere far away is no match for the consequences brought about by seemingly innocuous foibles of software manufacturers. As one analyst commented regarding insecure software as it related to the hijacking of the nuclear reactor's computer systems, "These are simple bugs, mistakes in software, but very dangerous ones."
The story of Estonia, the nuclear reactor, and thousands of similar news stories merely hint at the underlying problem of modern infrastructure. The "big problem" is insecure software, and insecure software is everywhere. From the iPhone (a critical weakness in its software was discovered merely two weeks after its release) to our laptops, from the Xbox to public utilities, from home computers to financial systems, insecure software is interconnected and woven more tightly into the fabric of civilization with each passing day, bringing with it, as former U.S. Secretary of Defense William Cohen observed, an unprecedented level of vulnerability. Insecure software is making us fragile, vulnerable, and weak.
The threat of global warming might be on everyone's lips, and the polar ice caps might indeed melt, but not for some time. What is happening right now because of the worldwide interconnection of insecure software gives social problems once limited by geography a new destructive range. Cyber criminals, terrorists, and even nation states are currently preying on millions upon millions of computer systems (and their owners) and using the proceeds to underwrite further crime, economic espionage, warfare, and terror. We are only now beginning to realize the enormity of the storm set upon us by the tiny fluttering of software manufacturing mistakes and the economic and social costs such mistakes impose. In 2007, "bad" software cost the United States roughly $180 billion; this amount represents nearly 40 percent of the U.S. military defense budget for the same year ($439 billion) or nearly 55 percent more than the estimated cost to the U.S. economy ($100 billion) of Hurricane Katrina, the costliest storm to hit the United States since Hurricane Andrew.1
Since the 1960s, individuals both within and outside the software community have worked hard to improve the quality, reliability, and security of software. Smart people have been looking out for you. For this, they should be commended. But the results of their efforts are mixed.
After 40 years of collaborative effort with software manufacturers to improve software quality, reliability, and security, Carnegie Mellon's Software Engineering Institute (SEI), an important contributor to software research and improvement, declared in the year 2000 that software was getting worse, not better. Such an announcement by SEI is tantamount to the U.S. Food and Drug Administration warning that food quality in the twenty-first century is poorer now than when Upton Sinclair wrote The Jungle in 1906.2 Unlike progress in a vast majority of areas related to consumer protection and national security, progress against "bad" software has been fitful at best.
While technical complications in software manufacturing might be in part to blame for the sorry state of software, this book argues that even if effective technical solutions were widely available, market incentives do not work for, but work against better, more secure software. This has worrisome consequences for us all.
Incentives matter. Human beings are notoriously complex and fickle creatures who will do whatever it takes to make themselves better off. There is nothing intrinsically wrong with this behavior. Looking out for their own best interests is what normal, rational human beings are wont to do. However, the complication is that society is a morass of competing, misaligned, and conflicting incentives that lead to all manner of situations where one individual's behavior may adversely affect another. Nowhere is this more obvious than in free market economies. As such, Geekonomics is the story of software told through the lens of humanity, not through the lens of technology.
To see and to understand insecure software merely as a technical phenomenon to be solved by other technical phenomena is to be distracted from the larger issue. Software is a human creation and it need not be mysterious or magical. It also need not make us fragile, vulnerable, and weak. To understand software and its implications for society requires an understanding of how humans behave, not necessarily how software behaves. More specifically, this book looks at the array of incentives that compel people to manufacture, buy, and exploit insecure software. In short, incentives matter for any human endeavor, and without understanding the incentives that drive people toward or away from a particular behavior, all the potential technical solutions that might help address the problem of insecure software will sit idle, or worse, never be created at all. After 40 years of effort with debatable improvement, this much is evident.
As with any complex issue, and especially with a complex issue such as software manufacturing, there are few "right" answers regarding how to fix the problem. However, there are ways of approaching complex issues more fruitful than others that are worth investigating. Protecting economic and national security from the effects of insecure software is as much an economic issue as it is a technological issue. We know software is as notoriously complex and fickle as the humans that create it, if not more so. But as a human creation, we need not understand insecure software in its entirety; we need merely to get humans to stop creating the stuff. And this is where incentives come in.
At base, economics teaches us, at least in part, how to get incentives right. Of course, economists are not always right when it comes to forecasting the expected effects of a particular incentive, but economics allows us to approach complex issues from a scientific perspective and make reasonable, better-informed decisions. By using and analyzing data (even imperfect data), economics allows us to view the world as it is, look back as it was, and anticipate how it might be. Incentives help navigate the path to a desired future. The desired future of this author is a stable, secure, global infrastructure that propels humanity beyond its wildest dreams.
There are three primary themes in Geekonomics:
First, software is becoming the foundation of modern civilization; software constitutes or will control the products, services, and infrastructure people will rely on for a wide variety of daily activities from the vital to the trivial.
Second, software is not sufficiently engineered at this time to fulfill the role of "foundation." The information infrastructure is the only part of national infrastructure that is destructively tested while in use; that is, software is shipped containing both known and unknown weaknesses of which software buyers are made aware, and which they must fix, only after installation (or after losing control of your nuclear power plant). The consequences are already becoming apparent and augur ill for us all.
Third, important economic, legal, and regulatory incentives that could improve software quality, reliability, and security are not only missing, but the market incentives that do exist are perverted, ineffectual, or distorted. Change the incentives and the story and effects of insecure software change also.
Because of the complexity of software itself and the complexity of manufacturing software, no single discipline, even one as powerful as economics, is sufficient for holistically addressing the topic at hand. As such, this book also contains a splash of psychology, physics, engineering, philosophy, and criminology which are mostly framed within the context of incentives. This book does not contain the complete story of insecure software, only those parts that a single author can realistically include in a book meant to inform, entertain, and enlighten.
I like software. I really do. Though the tone of my writing is often forceful and urgent regarding insecure software in general and software manufacturers in particular, I truly appreciate all the things I can do with software that I could not possibly do as quickly, efficiently, or cheaply without it. Writing this book was infinitely easier using a word processor than it would have been with a traditional typewriter (which I have not owned for 20 years). But everything has a cost, and not all costs are readily apparent at the time of acquisition. I kept no fewer than three separate storage locations (laptop hard drive, USB key, network storage) for the book's manuscript just in case something should happen, which it inevitably did. My word processor application (which will remain nameless) crashed or froze roughly 40 times in the course of writing this book. Without software this book might not have been written as quickly compared to older methods. That is not in question. Without reliable backups, however, this book would not have been written at all.
Ironically, in writing this book I attempted to avoid providing a litany of software disasters in hopes of escaping claims that I might be promoting "fear, uncertainty, and doubt," a claim that so often pollutes and plagues discussions regarding software security; yet many of my non-expert reviewers (at whom the book is aimed) thought I was being unfair to software manufacturers because I did not provide the necessary probative evidence to establish why software manufacturers are partly to blame for threatening the foundation of civilization. "What was needed?" I asked. A litany of software disasters would be helpful, came the reply. And so my hope is that the litany of disasters provided in this book is seen as necessary context and perspective for those unfamiliar with the subject and impact of insecure software, rather than as the primary focus of the book.
Top Customer Reviews
Yes, the Author does have some ideas on how to deal with the software crisis of our time. His suggestions are not technical; they are economic incentives to encourage software manufacturers to produce better, more reliable, and more secure software. Great, but even the Author himself doesn't seem to think that any of them is easy to implement in real life.
Well, I happen to know a few things about software, and I have a problem with the term "software manufacturers" to start with. The software industry is very different from other industries because software is not really manufactured. The process of its development is much more similar to design in traditional industries. It is, by its nature, a highly creative process, which cannot be completely automated, regulated, standardized, or licensed, like it or not. And that is the reason why none of the Book's ideas sound realistic enough.
Sure, I share a lot of the Author's concerns and indignation, but when he compares software with Portland cement, cars, or screws, he can't be serious. Why? Because software is incredibly diverse, and although some kinds of it might resemble cement, screws, and cars, other kinds are more like thermometers, camcorders, or space rockets, or everything in between. Most software, however, is hard to visualise or compare to anything else, and that's where the Book's analogies are deceiving.
Most Helpful Customer Reviews on Amazon.com (beta)
Geekonomics explains the fundamental reasons why software of all types usually fails to deliver what we need, especially security, and the threat that this failure invites. The dangers that Rice describes are on the scale of global warming. Did this statement get your attention? Good, because it's true, and the magnitude and imminence of this problem deserves your attention. Just like the threat of global warming, we dare not ignore the threat of insecure software, because software has become the infrastructure of the modern world.
Geekonomics is not only an important book, it is also a good book. Rice is smart and thoughtful, and he knows how to write. If you rely on software (and who doesn't?), you should read this book. If you produce software, you should read this book. You might not like what you read, but you need to hear it, and we all need to do something about it.
It is important to remember that Geekonomics is almost exclusively a vulnerability-centric book. Remember that the "risk equation" is usually stated as "risk = vulnerability X threat X impact". While it is silly to assign numbers to these factors, you can see that decreasing vulnerability while keeping threat and impact constant results in decreased risk. This is the author's thesis. Rice believes the governing issue in software security is the need to reduce vulnerability.
The problem with this approach is that life is vulnerability. It is simply too difficult to eliminate enough vulnerability in order to reduce risk in the real world. Most real world security is accomplished by reducing threats. In other words, the average citizen does not reduce the risk of being murdered by wearing an electrified, mechanized armor suit, thereby mitigating the vulnerability of his soft flesh and breakable neck. Instead, he relies on the country's legal system and police force to deter, investigate, apprehend, prosecute, and incarcerate threats.
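The reviewer's point about the "risk equation" can be sketched in a few lines of Python. The numbers below are hypothetical, chosen only to illustrate the arithmetic (neither the book nor the review assigns values): in a multiplicative model, halving any one factor halves risk, so vulnerability reduction (Rice's thesis) and threat reduction (the reviewer's preference) are mathematically interchangeable levers; the disagreement is about which lever is practical, not about the math.

```python
# Illustrative sketch of "risk = vulnerability x threat x impact".
# All values are hypothetical, for demonstration only.

def risk(vulnerability: float, threat: float, impact: float) -> float:
    """Multiplicative risk model: each factor scales risk linearly."""
    return vulnerability * threat * impact

baseline = risk(vulnerability=0.5, threat=0.5, impact=100.0)

# Rice's vulnerability-centric approach: halve vulnerability.
fewer_vulns = risk(vulnerability=0.25, threat=0.5, impact=100.0)

# The reviewer's threat-centric alternative: halve threat instead.
fewer_threats = risk(vulnerability=0.5, threat=0.25, impact=100.0)

# Either way, risk drops by the same amount.
assert fewer_vulns == fewer_threats == baseline / 2
```

As the review itself notes, assigning real numbers to these factors is "silly"; the model is useful only for seeing that the two strategies target different terms of the same product.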
Consider now the issue of safety vs security. The author makes comparisons using the London sewer, various aspects of driving, and the New York subway system. Especially in the first two cases, these are exclusively issues of safety, not security. What is the difference? Safety incidents happen because a system fails. Security incidents happen because an intelligent adversary exploits a system. The outcome of the London sewer and driving cases would be much different if the Nazis were bombing the sewer system or Mad Max was shooting at cars or blowing holes in pavement. In short, the author cannot suggest that an approach that works against a safety problem is going to succeed against a security problem. Security problems are more dynamic because the threat perceives, adapts, and returns in ways unexpected by the victim.
As far as open source goes (ch 6), the author makes several statements which show he does not understand the open source world. First, on p 247 the author states "While a binary is easy for a computer to read, it is tremendously difficult for a person -- even the original developer -- to understand." This is absolutely false, and the misunderstandings continue in the same paragraph. Reverse engineering techniques can determine how binaries operate, even to the point that organizations like the Zeroday Emergency Response Team (ZERT) provide patches for Microsoft vulnerabilities without having access to source code!
Second, on p 248 the author states "The essence of open source software is the exact opposite of proprietary software. Open source software is largely an innovation after-the-fact; that is, open source software builds upon an idea already in the marketplace that can be easily replicated or copied." On what planet?
Third, on p 263 the author states "[O]pen source projects are almost always threatened by foreclosure," meaning if the developer loses interest the users are doomed. That claim totally misses the power of open source. When a proprietary software vendor stops coding a product, the customers are out of luck. When an open source software developer stops coding a product, the customers are NOT out of luck. They can 1) hope someone else continues the project; 2) try continuing the project themselves; or 3) hire someone else to continue developing the product. Finally, if the author is worried about open source projects not having an organization upon which liability could be enforced, he should consider the many vendors who sell open source software.
Why then did I love Geekonomics? Aside from these two issues, the rest of the book is excellent. The legal chapter alone would be enough to justify reading the book. Although I took introductory law in college, ch 5 put the law into context for my professional industry (digital security). The author's discussions of disorder, churn, software buyers as crash dummies, adhesion contracts, strict liability, aero charts as products, and many other areas are spot-on and eloquently discussed. I disagree with the author's recommendation for a vulnerability tax, but the fact we can have the discussion is really powerful. (How in the world could vulnerabilities be measured in order to be taxed? Why weren't auto makers taxed for "unsafe" cars? If cars were being bombed on the highway, would auto makers be taxed? And so on.)
I'll leave the platitudes to the previous reviews, but suffice it to say that you should read Geekonomics. The future of software is legal, and Geekonomics is an incredible way to understand what is happening in our industry.
While the cement roads we are putting in will last a hundred or more years, the author points out that software is often essentially obsolete by the time the consumer takes possession of it. In fact, consumers value innovation so much that it is prized above security, even though a quick look at the news shows us the cumulative effect of software failure leading to data breaches. At this exact moment, according to privacyrights.org, 216,770,536 consumer records have been lost. As Rice points out, in the 1970s the criminal underground realized there was more money to be made, at less risk of being caught, trafficking in drugs than in other forms of crime, so it became a big thing. In the past few years, the criminal underground has started to focus on software, specifically vulnerabilities in software that can lead to data breaches that allow identity theft and credit card fraud.
As the book explains, crime begets crime: if you have a neighborhood with broken windows, this can lead to additional problems, because criminals and other worthless fellows are comfortable hanging out and doing whatever they want to do. This too I have seen in my own life; one of my employees has had to abandon her home for a few weeks. The condominium above hers had a broken window that was used to enter that home, and people took up residence in the empty foreclosed unit. They invited their friends, and now the entire complex is less desirable. Geekonomics cites the positive example of the New York subway system's clean car program: all cars had to be clean with no graffiti, and if a car could not be cleaned it was taken out of service until it was. This led to a major improvement in the security and user experience of the subway system. However, as the author points out, you can see graffiti; you cannot necessarily see the flaws in software that attract the criminal element.
Another interesting comparison the book makes is to the interstate highway system in the US. It was designed for safety from the beginning and is a critical part of the national infrastructure. If you want to go somewhere, you can. For all its costs, having this infrastructure in place saves far more money; imagine trying to get fresh milk to market over muddy, pothole-filled roads. However, the Internet, which is the software analog of the highway system, was not built for safety and may well not scale with growth as well as the highway system has.
The book continues example after example to show how our legal system does not help the consumer receive quality and safety from software, but in fact makes the problem worse. Rice does not simply dwell on problems; after strongly establishing his case, he points the way to the changes that need to take place if we, the first generation to be truly dependent on software, are going to prosper. This is an important book. It does not require knowledge of IT or software development to read, and every thinking man and woman should read it and ask: what can I do? Standards, quality, and making incentives achieve the results we want and deserve are key. As the author says, "I believe we have not gone too far down the path to alter course, but we aren't trying hard enough yet." That is the call to action: write your legislator, lobby consumer organizations, do what you can, but advocate rational software. Thank you, David Rice.
So far, the reviewers of this book are all "security people". Please know that there are caveats to such reviews - namely, we are always looking for the "aha" publications that tell the rest of the world what we have known for a while now. This is one of those, and it may very well be the first I've really enjoyed while trying to put myself in the shoes of the "average computer user" in the world today. My usual way of doing this is by asking myself "Will my mom understand this?" I'm very pleased to report that my mom could in fact "get" the big picture David is painting here - namely, that software is something we are relying on as a critical part of society today, and it is just as fundamentally flawed as the early sewer systems he describes early in the book.
What's great about this book, aside from the points already articulated by the other reviewers, is that it takes a problem we all know exists (most software is crappy) and forces you to look at it from a number of different angles. How many books do you read in a year that actually cause you to ask yourself questions? Probably very few, I'd guess. This is a book that challenges you to think about things differently; for instance, a Windows system crashing is not just a "Blue Screen of Death" on your home PC, it's now a critical system controlling a local power grid that just went down. It's not just a poorly-written piece of Web server software, it's a perfectly viable avenue of electronic data theft. And by the way, this little problem affects every one of us. Bravo, David, you've done a great job here. I tend to agree with Richard Bejtlich that a "vulnerability tax" is somewhat infeasible, but at least we're having some interesting conversations. Change usually stems from these, and change is exactly what's on the menu.
Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Britney Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security errors is somewhat of a pipe dream.
Geekonomics is about the lack of consumer protection in the software market and how this impacts economic and national security. Author Dave Rice considers software consumers to be akin to the proverbial crash test dummy. This, combined with how little recourse consumers have for software-related errors and the lack of significant financial and legal liability for the vendors, creates a scenario where computer security is failing.
Most books about software security tend to be about actual coding practices. Geekonomics focuses not on the code, but rather how insecurely written software is an infrastructure problem and an economic issue. Geekonomics has 3 main themes. First -- software is becoming the foundation of modern civilization. Second -- software is not sufficiently engineered to fulfill the role of foundation. And third -- economic, legal and regulatory incentives are needed to change the state of insecure software.
The book notes that bad software cost the US roughly $180 billion in 2007 alone (see Pete Lindstrom's take on that dollar figure). Not only that, the $180 billion might be on the low end, and the state of software security is getting worse, not better, according to the Software Engineering Institute. Additional research shows that 90% of security threats exploit known flaws in software, yet software manufacturers remain immune to almost all of the consequences of their poorly written software. Society tolerates such failure rates in software because it is unaware of the problem. The huge number of software flaws also entices attackers who attempt to take advantage of those vulnerabilities.
The book's seven chapters are systematically written and provide a compelling case for the need for secure software. The book tells how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem. Cement was a crucial part of the project, and the book likens the development of secure software to that of cement that can withstand decades of use and abuse.
One reason software has significant security vulnerabilities, as noted in chapter 2, is that software manufacturers are primarily focused on features, since each additional feature (whether it has real benefit or not) offers a compelling value proposition to the buyer. But on the other side, a lack of software security functionality and controls imposes social costs on the rest of the populace.
Chapter 4 gets into the issues of oversight, standards, licensing, and regulation. Other industries have lived under the watchful eyes of regulators (FAA, FDA, SEC, et al.) for decades. But software is written by unlicensed programmers, removed from oversight. Regulations exist primarily to guard the health, safety, and welfare of the populace, in addition to the environment. Yet oversight amongst software programmers is almost nil, and this lack of oversight and the immunity it confers breed irresponsibility. The book notes that software does not have to be perfect, but it must rise to the level of quality expected of something that is the foundation of an infrastructure. And the only way to remove the irresponsibility is to remove the immunity that the lack of regulation has created.
Chapter 5 gets into more detail about the need to impose liability on software manufacturers. The book's premise is that increased liability will lead to a decrease in software defects, reward socially responsible software companies, and redistribute the costs consumers have traditionally paid for protecting software from exploitation, shifting them back to the software manufacturer, where they belong.
Since regulations and the like are likely years or decades away, chapter 7 notes that short of litigation, contracts are the best legal option software buyers can leverage to address software security problems. Unfortunately, most companies do not use this contractual option to the degree they should, even though it could benefit them.
Overall, Geekonomics is an excellent book that broaches a subject left uncharted for too long. The book does have its flaws, though; its analogies to physical security (bridges, cars, highways, etc.) and safety events don't always coalesce with perfect logic. Also, the trite title may diminish the seriousness of the topic. As the book illustrates, insecure software kills people, and I am not sure a corny book title conveys the importance of the topic. But the book does bring to light significant topics about the state of software, from legal liability and the licensing of computer programmers to consumer rights and more, that are imperatives.
It is clear that regulation of the software industry is inevitable, and it is doubtful that Congress will do it right whenever it eventually gets around to it. Geekonomics shows the effects that such a lack of oversight has caused, and how beneficial it would have been had such oversight been there in the first place.
Someone reading this review may get the impression that Geekonomics is a polemic against the software industry. To a degree it is, but the reality is that it is a two-way street. Software is built for people who buy certain features. To date, security has not been one of those top features. Geekonomics notes that software manufacturers have little to no incentive to build security into their products. Post Geekonomics, let's hope that will change.
Geekonomics will create different feelings amongst different readers. The consumer may be angry and frustrated. The software vendors will know that their vacation from security is over. It's finally time for them to get to work on fixing the problem that Geekonomics has so eloquently written about.
Look for similar items by category
- Books > Business & Investing > Industries & Professions > E-commerce
- Books > Computers & Technology > Computer Science > Software Engineering
- Books > Computers & Technology > History & Culture > Culture
- Books > Computers & Technology > History & Culture > Privacy
- Books > Computers & Technology > Internet & Social Media
- Books > Computers & Technology > Networking & Cloud Computing > Internet, Groupware, & Telecommunications
- Books > Computers & Technology > Networking & Cloud Computing > Network Security
- Books > Computers & Technology > Networking & Cloud Computing > Networks, Protocols & APIs
- Books > Computers & Technology > Programming > Software Design, Testing & Engineering > Quality Control
- Books > Computers & Technology > Programming > Software Design, Testing & Engineering > Software Development
- Books > Computers & Technology > Security & Encryption
- Books > Computers & Technology > Software
- Books > Computers & Technology > Web Development > Security & Encryption > Encryption
- Books > Textbooks > Computer Science & Information Systems > Networking
- Books > Textbooks > Computer Science & Information Systems > Software Design & Engineering