Category Archives: Internet risk management strategies

Growth of Cyber-Risk Insurance

Also in the recent congressional report on homeland cybersecurity there is this passage, citing a Congressional Research Service (CRS) study:

The insurance industry has the ability to contribute to the development of a cost methodology through its customer base but is currently limited in the number of specialized cyber risk policies available. CRS found that the "growth of cyber risk insurance is hindered primarily by a lack of reliable actuarial data related to the incidence and costs of information security breaches; enhanced collection of such figures would probably be the most important contribution that policy can make."

Missing data?  Very interesting!

If information gathering has the potential to reduce costs and risks, why does the data shortfall persist? According to the CRS report, "[T]here are two chief obstacles. First, there are strong incentives that discourage the reporting of breaches of information security. Second, organizations are often unable to quantify the risks of cyber attacks they face, or even to set a dollar value on the cost of attacks that have already taken place. Thus, even if all the confidential and proprietary information that victims have about cyber attacks were disclosed and collected in a central database, measurement of the economic impact would still be problematical."
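What would setting such a dollar value even look like? Here is a minimal sketch, in Python, of the annualized loss expectancy (ALE) calculation commonly used in security risk assessment; every figure in it is invented for illustration.

    # Minimal sketch of annualized loss expectancy (ALE), the kind of
    # actuarial-style estimate a cyber-risk insurer would want.
    # All figures are invented for illustration.

    def ale(asset_value, exposure_factor, annual_rate_of_occurrence):
        """ALE = SLE * ARO, where SLE = asset value * exposure factor."""
        single_loss_expectancy = asset_value * exposure_factor
        return single_loss_expectancy * annual_rate_of_occurrence

    # A hypothetical customer database worth $2M, 30% exposed by a
    # breach, with an estimated half a breach per year:
    print(ale(2_000_000, 0.30, 0.5))  # 300000.0 expected loss per year

The catch is the last parameter: without shared incident data, the rate of occurrence is little more than a guess, which is exactly the actuarial gap CRS describes.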

This summary of another report doesn’t say what the strong disincentives are that discourage reporting information security breaches, but one can guess they may have to do with fear of customers worrying about their information being insecure, fear of resulting lawsuits (see Negligence or Risk?), and fear of further targeted attacks. A program like InfraGard may help with such corporate hesitance by permitting information sharing about breaches without public disclosure. Or the other direction might work: disclose all breaches, thus giving all enterprises incentive to do something about them.

The report makes a very important point that a centralized database of all breaches still wouldn’t address the economic issues, because the breached companies don’t know the costs themselves. For that matter, they often don’t even know they’ve been breached; witness the burgeoning black market in botnets. And they know even less about slowdowns and interruptions outside the firewall that prevent customers from transacting business.

In other words, mandatory reporting, such as the FCC requires of telecommunications companies, won’t solve the problem. The popular suggestion of determining the security state of the Internet by having ISPs or even enterprises report on it would be inadequate.

Regrettably, many people continue to use metrics and methodologies from the physical environment when thinking about cyberspace. As CRS determined, "There is a fundamental difference between a cyber attack and a conventional physical attack in that a cyber attack generally disables — rather than destroys — the target of the attack. Because of that difference, direct comparison with previous large-scale disasters may be of limited use."

This last is all true, although there has been at least one case involving an electric utility in which temporary loss of electrical service was counted as physical damage with corresponding legal liability, even though everything worked correctly once power was restored. The lost business did not automatically come back. A damaged reputation does not automatically recover. Increased expense does not necessarily go away.

There are some other differences between cyberspace and the physical world:

  1. Damage doesn’t have to be the result of a targeted attack. This is different from physical attacks on physical plant; it is more like acts of God such as hurricanes, earthquakes, and floods, which can damage multiple enterprises simultaneously without any human targeting. Even some human attacks aren’t targeted at a particular enterprise; for example, botnet collectors don’t really care who owns the affected computers; they just want a lot of them. We’re not talking Ocean’s 11 here, where a gang of thieves spends a lot of effort cracking a specific casino. That sort of thing does happen in cyberspace, but cyberspace isn’t limited to it.
  2. Such aggregation can be even more widespread than for natural disasters, since the average flood is restricted to a riverbed, the average earthquake to a fault, and the average hurricane to an ocean and its environs. The Internet is worldwide, and as we have seen repeatedly, worms, viruses, and general bug exploits are also worldwide. A given enterprise’s customers may be worldwide, and nonredundant routes, congestion, or cable cuts anywhere in the world can interfere with its business. (A toy simulation after this list makes the aggregation point concrete.)
  3. There are three major electrical grids in the United States, but there is by its nature only one Internet, which also extends worldwide. The Internet is the one infrastructure all enterprises increasingly depend upon.
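To make the aggregation point in item 2 concrete, here is a toy Monte Carlo comparison, with invented parameters, of independent physical-style failures versus a single correlated Internet-wide event:

    # Toy Monte Carlo contrasting independent failures (fires, local
    # floods) with a correlated, Internet-wide failure (a worm) across
    # many firms. All parameters are invented for illustration.
    import random

    FIRMS, TRIALS, LOSS = 1000, 2000, 100_000  # $100k loss per affected firm

    def worst_year(correlated, p=0.01):
        """Largest single-trial loss over many simulated years."""
        worst = 0
        for _ in range(TRIALS):
            if correlated:
                # One worldwide worm either happens or not; if it
                # does, it hits every firm at once.
                hit = FIRMS if random.random() < p else 0
            else:
                # Physical-style: each firm fails independently.
                hit = sum(random.random() < p for _ in range(FIRMS))
            worst = max(worst, hit)
        return worst * LOSS

    print("independent worst year:", worst_year(False))
    print("correlated worst year: ", worst_year(True))

Both versions have the same expected loss; the correlated one concentrates it into rare, enormous years, which is precisely what makes aggregated cyber risk hard to insure.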

-jsq

Protecting the Infrastructure that Connects

It seems that part of Congress has a clue about what needs to be done in cybersecurity for U.S. homeland security, starting with creating an Assistant Secretary for Homeland Security for Cybersecurity. The recent report from the Subcommittee on Cybersecurity, Science, and Research & Development of the U.S. House of Representatives Select Committee on Homeland Security sums up the matter pithily in two sentences:

The information infrastructure is unique among the critical infrastructures because it is owned primarily by the private sector, it changes at the rapid pace of the information technology market, and it is the backbone for many other infrastructures. Therefore, protection of this infrastructure must be given the proper attention throughout government.

The Internet isn’t just another infrastructure: it’s the one that connects all the others.

The report spells this point out in more detail, as well:

Information technology and American ingenuity have revolutionized almost every facet of our lives. From education to recreation and from business to banking, the nation is dependent on telephones, cellular phones, personal digital assistants, computers, and the physical and virtual infrastructure that ties them all together. Almost all data and voice communications now touch the Internet — the global electronic network of computers (including the World Wide Web) that connects people, ideas, and information around the globe.

Technology provides the nation with immeasurable opportunities, giving citizens global access and making daily transactions more affordable, efficient, and interactive. Unfortunately, the same characteristics that make information technology so valuable also make those technologies attractive to criminals, terrorists, and others who would use the same tools to harm society and the economy.

Despite the growing threat, security and efforts to protect information often remain an afterthought, frequently delegated to a Chief Information Officer or a Chief Technology Officer. Cybersecurity should be treated as a cost of doing business by the highest levels of an enterprise’s leadership because the ability to conduct business and assure delivery of services to consumers — whether it is banking, electrical, or manufacturing — depends on ensuring the availability of information and related infrastructure.

CYBERSECURITY FOR THE HOMELAND
December 2004
Report of the Activities and Findings
by the Chairman and Ranking Member
Subcommittee on Cybersecurity, Science, and Research & Development
of the
U.S. House of Representatives Select Committee on Homeland Security

This sounds quite like what Lord Levene, Chairman of Lloyd’s, said last spring: Internet business risk management should be at the top of the priority list for chief officers and board members.

-jsq 

Saving the Internet?

The former CIA Director, George J. Tenet, has said what he thinks needs to be done to improve Internet security:

The way the Internet was built might be part of the problem, he said. Its open architecture allows Web surfing, but that openness makes the system vulnerable, Mr. Tenet said.
   
Access to networks like the World Wide Web might need to be limited to those who can show they take security seriously, he said.

“Tenet calls for Internet security,” by Shaun Waterman, United Press International, December 2, 2004

Well, that would exclude most governments from using the web.

He also says the Internet is a potential Achilles heel, and warns that

… al Qaeda remains a sophisticated group, even though its first-tier leadership largely has been destroyed.

It is "undoubtedly mapping vulnerabilities and weaknesses in our telecommunications networks," he said.

This makes me wonder several things.

  1. Is this "undoubtedly" the same sort as the one by which Saddam undoubtedly had WMD? Some evidence would be useful here.
  2. Suppose there is actually evidence of OBL mapping Internet vulnerabilities. How exactly is destroying the open Internet ourselves, before he can, a solution?
  3. Wouldn’t it make more sense to map them ourselves first, and fix them? Sure, it would be expensive to add redundancy in certain cases, but compared to what?

He also said:

"I know that these actions will be controversial in this age when we still think the Internet is a free and open society with no control or accountability," he told an information-technology security conference in Washington, "but ultimately the Wild West must give way to governance and control."

He said this at an event from which he excluded the press, which makes one wonder whether it is the Internet he is worried about or a free and open society.

Meanwhile, several people have told me that it’s a common mantra inside Microsoft to say that the Internet is the terrorist’s best friend.  I don’t think that’s right, unless you want to extend the same argument to anything else that has free and anonymous communications, such as the Interstate Highway System.

What I think is the terrorist’s and criminal’s best friend is software that ships out of the box with vulnerabilities turned on and that has design flaws that prevent fixing easily exploited bugs.  Mr. Tenet seems to agree on that subject:

Mr. Tenet called for industry to lead the way by "establishing and enforcing" security standards. Products need to be delivered to government and private-sector customers "with a new level of security and risk management already built in."

Maybe that’s what his whole talk was about.  It’s too bad we’ll never know, due to his exclusion of the press.

-jsq

esr at UT with CACTUS

Austin seems to be attracting even more interesting people lately.

Eric S. Raymond, author of the books The Hacker’s Dictionary, The Cathedral and the Bazaar, and The Art of Unix Programming, is speaking at the University of Texas Monday 29 November.

Eric S. Raymond
After the Revolution: Coping with Open Source Success
7PM Monday 29 November 2004
ACES 2.302
University of Texas at Austin

This talk is sponsored by the UT School of Information and the Capital Area Central Texas Unix Society (CACTUS). I believe there is kibitzing going on as well by EFF-Austin and the Austin Linux User’s Group.

-jsq

Information Security Considered Difficult

In a paper back in 2001, “Why Information Security is Hard — An Economic Perspective” Ross Anderson of the University of Cambridge gives a number of reasons why technical means will never be adequate for information security, including Internet security. As he says in the abstract:
“According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved.

“In this note, I put forward a contrary view: information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons.”

He uses a number of examples to make his point, among them distributed denial of service (DDoS) attacks that use subverted machines to launch a combined attack at a target. Particularly good machines to subvert for this purpose are end-user machines, because the typical end user does not have much incentive to pay anything to protect against their machine being used to attack some large corporate entity with which the user feels no identification. In many of the examples, the common thread is that
“In general, where the party who is in a position to protect a system is not the party who would suffer the results of security failure, then problems may be expected.”
Anderson amusingly points out that a tenth-century Saxon village had community mechanisms to deal with this sort of problem, while in the twenty-first century we don’t.
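Anderson’s principle is easy to put into toy numbers. Everything in the sketch below is invented; only the shape of the comparison matters:

    # Toy numbers for Anderson's misaligned incentives: the home user
    # decides whether to secure the machine, but the DDoS target bears
    # most of the loss. All figures are invented for illustration.

    cost_to_secure = 50       # user's price in tools and time
    p_compromise   = 0.20     # chance an unprotected machine is subverted
    loss_to_user   = 100      # user's own harm: slowdown, maybe a rebuild
    loss_to_others = 500      # per-machine share of the attack target's loss

    user_expected_loss   = p_compromise * loss_to_user                     # $20
    social_expected_loss = p_compromise * (loss_to_user + loss_to_others)  # $120

    print(cost_to_secure > user_expected_loss)    # True: the user rationally skips it
    print(cost_to_secure < social_expected_loss)  # True: everyone else wishes otherwise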

The key here is that it is an aggregate problem and we need collective measures to deal with it. In a Saxon village peer pressure may have been enough, and if that didn’t work they may have resorted to the stocks or some similar subtle measure.

Today we may have made some progress with alarming the end users by pointing out that 80% of end-user systems are infected with spyware and botnets of compromised systems are widespread. On the other hand, such numbers indicate that education thus far hasn’t solved the problem. Similarly, that anyone is still using Internet Explorer after the events of this past summer indicates that users are not taking sufficient steps.

A more obvious method would be to make the software vendors liable. Why should operating systems still be sold with open security holes right out of the box, and why should applications still be sold that have bad security designed in? An obvious answer that I don’t think the paper addresses is that some vendors of such software have enough lobbyists to prevent vendor liability laws from being passed. Anderson’s paper goes into more subtle reasons such as ease of use, number of users, switching costs, etc.

There’s an intermediate method that Anderson attributes to Hal Varian, which is to make the Internet Service Providers (ISPs) take responsibility for malign traffic originating from their users. This may be happening somewhat, but it has its own problems, especially in implementation, which I may come back to in another post.
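One concrete mechanism in this direction, not from Anderson’s paper but worth naming, is source-address (ingress) filtering in the spirit of BCP 38: the ISP drops outbound packets whose source addresses it never assigned, which blunts spoofed attack traffic. A minimal sketch of the check, with made-up prefixes:

    # Minimal sketch of BCP 38-style ingress filtering, one way an ISP
    # can take responsibility for traffic leaving its network. The
    # prefixes and addresses are made up (documentation ranges).
    from ipaddress import ip_address, ip_network

    CUSTOMER_PREFIXES = [ip_network("198.51.100.0/24"),
                         ip_network("203.0.113.0/24")]

    def permit(source_ip):
        """Forward only packets whose source address we actually assigned."""
        src = ip_address(source_ip)
        return any(src in net for net in CUSTOMER_PREFIXES)

    print(permit("198.51.100.7"))  # True: legitimate customer source
    print(permit("192.0.2.99"))    # False: spoofed source, drop it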

But the main point of Anderson’s article is clear and compelling: technical means are not sufficient to provide information security. Non-technical information security strategies are needed.

-jsq

InnoTech and InfraGard

At InnoTech I was followed by the FBI. Chronologically, not physically: they spoke next.

They spoke about InfraGard, which is a public-private partnership: the FBI organizes information sharing about intrusions and other security matters among businesses, academia, and law enforcement agencies. It has been going on since 1996, and has been national since 1998.

The InfraGard talk was mostly a good update on the current state of security, both online and physical, plus information on how to join.

-jsq

The Bazaar

It’s been a while since the last post. I plead flu. It has advantages, though: I lost 10 pounds in 2 weeks.

I’m several conferences behind in writeups. Back at Linucon, I chatted a bit with Eric Raymond, author of The Hacker’s Dictionary, The Cathedral & the Bazaar, and The Art of Unix Programming.

Of those books, the most relevant to this post is The Cathedral & the Bazaar. Its thesis is pretty simple, but let me paraphrase and oversimplify it: software built to elaborate specifications by teams of programmers, with flying buttresses and fancy rose windows, isn’t necessarily better (more capable, more robust, more user-friendly, better-selling, etc.) than software built by loosely knit teams of people building the parts they want to use. Closed source vs. open source. Back when I published the first printed version of Eric’s paper on this subject, this was a radical thesis. Not to its practitioners, of course, since the Berkeley Unix system, for example, had been produced by such methods back in the 1980s, and Linux was already spreading rapidly in the 1990s. Yet it was radical to those not familiar with it. Nowadays many companies are using the method, and much open source software has become quite popular.

However, the idea extends beyond software, and it appears that many people have worked out aspects of it from different directions. For example, David Weinberger’s Small Pieces Loosely Joined deals with many of the same ideas related to the World Wide Web. Eric’s most recent book is also relevant, since the Unix philosophy has always involved small pieces connected together in various ways instead of large monolithic programs.

John Robb’s Global Guerrillas blog has explicitly cited the Bazaar open source idea in relation to ideas of asymmetric warfare. Robb had previously cited a long list of books that are more obviously about warfare, the most seminal of which is probably Boyd: The Fighter Pilot Who Changed the Art of War by Robert Coram. This is a biography of John R. Boyd, who started out as a fighter pilot (never defeated), wrote a manual on aerial jet combat that is apparently still in use, “stole” a million dollars worth of computer time in order to develop his theory of why he never lost, which led to designing airplanes including the F-15 and F-16, and eventually, via intensive reading of history, to a theory of warfare that has since been adopted by the U.S. Marine Corps, as well as by other, less savory, parties. It is known by various names, such as “fourth generation warfare,” “asymmetric warfare,” or “highly irregular warfare.”

Someone else approaching many of the same topics is Albert-László Barabási in his book Linked, about scale-free networks; I’ve mentioned his book a number of times already in this blog.
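For readers who haven’t met the term: a scale-free network is one whose connectivity follows a power law, with a few enormous hubs, and the mechanism Barabási proposes is preferential attachment. A quick sketch, with an arbitrary network size:

    # Quick sketch of preferential attachment, the mechanism Barabasi
    # proposes for scale-free networks: newcomers link to nodes in
    # proportion to the links those nodes already have.
    import random

    edges = [(0, 1)]       # seed network: two nodes, one link
    endpoints = [0, 1]     # each node listed once per link it has

    for new_node in range(2, 10_000):
        old_node = random.choice(endpoints)   # rich get richer
        edges.append((new_node, old_node))
        endpoints += [new_node, old_node]

    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1

    print(max(degree.values()))                  # a hub far above the average
    print(sum(d == 1 for d in degree.values()))  # most nodes: exactly one link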

What do all these things have to do with one another? They’re all about organizing loosely joined groups without rigid top-down command and control. They all also have to take into account how such organizations can deal with more traditional command-and-control organizations; which has what advantage; and how.

What does this have to do with Internet risk management strategies? The Internet is a loosely coupled non-hierarchical distributed network. No single organization can control it. Any organization that wants to use it would do well to accept that the Internet introduces sizeable elements that cannot be controlled and therefore risks that must be managed without direct control.

-jsq

Bandwidth Futures

Looking backwards a couple of years, here’s an interesting article about carriers hedging risks by taking out options on future bandwidth prices, among various other forms of risk management (anything except bandwidth trading): “Carriers Seek Rewards of Risk Management” by Josh Long in PHONE+, January 2002. One of the most interesting passages, I think, is this one about carriers not necessarily knowing their own networks:
“Ciara Ryan, a partner in the bandwidth team at global consulting firm Andersen, agrees. Ryan explains the lack of visibility is due in part to mergers and acquisitions creating carriers that are an amalgam of many parts. The information pertaining to these assets has been integrated poorly, making it difficult to employ risk-management tactics, she says.

“Ryan says carriers must be able to extrapolate key bits of information from their databases to manage their network assets properly. This would include, how much they have sold on a particular route, from which point of presence (PoP) it was sold, what the service level agreement (SLA) entailed, whether an option was sold on the contract, whether a contract was a short-term lease or indefeasible rights of use (IRU) agreement and what the definite and projected sales include on particular routes.

“‘Very, very few of them would be able to give you this information,’ Ryan adds.”

And that’s before considering paths all the way from the carrier’s customer to its customers. If the carriers don’t even know what their own networks consist of, it would appear they can’t be expected to provide a holistic and synoptic view of the Internet, either one by one or all together.
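As for the hedging the article describes, a toy example may help: a carrier that must buy capacity next quarter buys a call option on the bandwidth price to cap its cost. All prices below are invented:

    # Toy payoff of hedging a future bandwidth purchase with a call
    # option, as the article describes carriers doing. All prices
    # are invented for illustration.

    strike, premium = 100, 8    # $/Mbps/month cap; what the option cost

    def effective_cost(spot):
        """Per-unit price the hedged carrier ends up paying at delivery."""
        exercise_value = max(spot - strike, 0)  # call pays off above the strike
        return spot - exercise_value + premium  # = min(spot, strike) + premium

    for spot in (70, 100, 150):
        print(spot, "->", effective_cost(spot))
    # 70 -> 78, 100 -> 108, 150 -> 108: the cost is capped no matter
    # how high the spot price goes.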

-jsq