Linucon History

A few days ago I mentioned I was going to give a talk about Internet history at Linucon. That went well, although some of the audience seemed surprised that my estimate of the age of the Internet was about 4600 years older than the nearest contender.

Linucon itself was an interesting attempt to exploit or enable the intersection (maybe 30%) between computing and science fiction fandom. The con had a certain do-it-yourself charm, and the participants seemed pleased. They plan to do it again next year.

-jsq

Decentralizing Energy

This isn’t about the Internet, but it is about a scale-free network: oil production. The big problem with oil isn’t that it’s currently expensive, or that current sources are running short. The problem isn’t even that the U.S. gets most of its oil from the Middle East: it doesn’t; the U.S. imports only a fraction of its oil, and only a fraction of that comes from the Middle East. (One of the main interests of the U.S. in the Middle East and other oil-producing areas is to police them so that no other country decides it must develop the capability to do so.) The problem is that too much oil comes from too few suppliers, starting with Saudi Arabia and working down.

So why consider running out of oil a problem? Why not consider it an opportunity? An opportunity to shift to other, more distributed energy sources, thus removing the need to militarize the Middle East.

Here’s a detailed proposal to do just that, funded partly by Pentagon money, and written by people who have been making practical improvements in energy efficiency for companies large and small for many years: Winning the Oil Endgame, Rocky Mountain Institute; 309 pages; $40.

Amory Lovins proposes doing it not by abandoning suburbia, but by using profit and markets to drive efficiency and shifts in energy production and delivery.

The Economist said about the book:

“Given that America consumes a quarter of the world’s oil but has barely 3% of its proven reserves, it will never be energy-independent until the day it stops using oil altogether.

“How to get there? Amory Lovins has some sharp and sensible ideas. In “Winning the Oil Endgame”, a new book funded partly by America’s Defence Department, this sparky guru sketches out the mix of market-based policies that he thinks will lead to a good life after oil.

“First, he argues, America must double the efficiency of its use of oil, through such advances as lighter vehicles. Then, he argues for a big increase in the use of advanced “biofuels”, made from home-grown crops, that can replace petrol. Finally, he shows how the country can greatly increase efficiency in its use of natural gas, so freeing up a lot of gas to make hydrogen. That matters, for hydrogen fuel can be used to power cars that have clean “fuel cells” instead of dirty petrol engines. It would end the century-long reign of the internal-combustion engine fuelled by petrol, ushering in the hydrogen age.

“And because hydrogen can be made by anybody, anywhere, from windmills or nuclear power or natural gas, there will never be a supplier cartel like OPEC—nor suspicions of “blood for hydrogen”. What then will the conspiracy theorists do?”

In the near term there will no doubt be military actions. In the long run, apparently even the Pentagon thinks we can solve the real problem.

-jsq

Internet History?

In his book Linked, Albert-László Barabási, referring to the early deployment of IMPs (Interface Message Processors) on the ARPANET, says:
“The fifth was delivered to BBN, a Massachusetts consulting firm, late in 1970…”
That must have been a short delivery, considering that BBN was where IMPs were made.

I don’t hold it against ALB that he didn’t know that; when those things were happening, he was in Hungary, which at the time had certain difficulties communicating with the rest of the world. But how many of you, dear readers, have heard of BBN?

Meanwhile, over on Dave Farber’s Interesting People list, history came up. I mentioned my upcoming talk about Internet history at Linucon to some of the posters, which drew Farber to ask “What happened to CSNet!!!!!!!!!!” Nothing, so far as I know; I didn’t try to mention every historical network, but you can be sure I will mention CSNet. Especially considering that Peter Denning has provided a nice writeup about it, in addition to the one in my book, The Matrix.

If you’re in Austin, my history talk is tonight. Y’all come.

-jsq

Small World State Change

The U.S. government has now gone through four cybersecurity czars in less than two years, with the one-day-notice resignation of Amit Yoran, following Howard Schmidt, Rand Beers (who joined the Kerry campaign), and Richard Clarke (who wrote a best-selling book and testified before Congress).

Apparently one argument for pushing cybersecurity down into the bowels of DHS is that the Internet and computers in general are just another homeland infrastructure that needs securing, same as the electrical grid or the airlines. Albert-László Barabási (ALB) in his book Linked remarks on how sufficiently close connectivity can cause a state change, in the same manner as water turning to ice. It isn’t electrical utility grids that are interconnecting the world as never before; it is communications systems, led by the Internet, which is rapidly subsuming many other communications systems. All the other infrastructures increasingly depend on the Internet. Even if it isn’t actually causing a state change, the Internet has global reach and high speed, producing both its great value via Metcalfe’s Law (many users) and Reed’s Law (many groups) and its potential for cascade failure.
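For a rough sense of why those two laws matter, here’s a back-of-the-envelope sketch in Python. The absolute numbers mean nothing; only the growth rates matter. Metcalfe’s Law counts possible pairs of users, Reed’s Law counts possible groups:

```python
# Back-of-the-envelope comparison of two network value models.
# Metcalfe's Law: value grows like the number of possible pairs, n*(n-1)/2.
# Reed's Law: value grows like the number of possible groups, about 2**n.

def metcalfe(n: int) -> int:
    """Potential pairwise connections among n users."""
    return n * (n - 1) // 2

def reed(n: int) -> int:
    """Potential groups of two or more among n users."""
    return 2 ** n - n - 1

for n in (10, 20, 30):
    print(f"n={n:3d}  pairs={metcalfe(n):6,d}  groups={reed(n):15,d}")
```

Group-forming is why the Internet’s value, and the stakes in securing it, grow so much faster than the node count.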

The Internet’s potential for cascade failure also stems from its organization as a scale-free network with many hubs of various sizes. Yet this is also what makes it so robust, unless the hubs are directly targeted. Meanwhile, we hear comparisons to the ability of the FAA to ground all aircraft in U.S. airspace. I don’t really see that an off switch for the U.S. portion of the Internet would make the Internet more robust, even if it were easy to separate out exactly what that portion was.
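That robust-yet-fragile property is easy to demonstrate for yourself. Here is a minimal simulation using the Barabasi-Albert model from the networkx library; the graph size, seed, and removal count are arbitrary choices of mine, purely for illustration:

```python
# Scale-free networks tend to survive random failures but fragment
# quickly when the best-connected hubs are removed first.
import random
import networkx as nx

N, REMOVE = 1000, 100
g = nx.barabasi_albert_graph(N, 2, seed=42)  # a scale-free test graph

def surviving_fraction(h: nx.Graph) -> float:
    """Fraction of the original N nodes still in the largest connected piece."""
    return max((len(c) for c in nx.connected_components(h)), default=0) / N

random.seed(42)
random_g = g.copy()
random_g.remove_nodes_from(random.sample(list(random_g), REMOVE))

targeted_g = g.copy()
hubs = sorted(targeted_g.degree, key=lambda nd: nd[1], reverse=True)[:REMOVE]
targeted_g.remove_nodes_from(node for node, _ in hubs)

print("after 100 random failures:", surviving_fraction(random_g))
print("after 100 hub removals   :", surviving_fraction(targeted_g))
```

The same count of random failures barely dents the largest connected component; removing the hubs first shrinks it dramatically.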

I think the U.S. government would benefit by appointing a new cybersecurity head with sufficient powers to get something done, and preferably somebody with deep understanding of the Internet. How about Bruce Schneier or Dan Geer?

-jsq

Hurricane History

Today is the anniversary of the Galveston Hurricane of 1867, which caused $1 million in damage ($12.3M in today’s dollars, or the equivalent of $1.3 billion as a share of GDP). Also the Louisiana Delta hurricane of 1893, which had winds of 100 miles per hour and a 12-foot storm surge that killed 1500 people. And on this day in 1882, a severe windstorm in northern California and Oregon did heavy crop damage and blew down trees.


How do we (or, in this case, Intellicast) know all this?

Let’s look at the fourth major hurricane on this date: the one that struck the Georgia coast in 1898, washing away an entire island, Campbell Island, and killing 50 residents there, as well as all but one of the residents of St. Catherine’s Island. The storm surge at Brunswick is estimated at 19 feet.

“Worthy of note is the brief period of time which has seen the widespread deployment of remote sensing systems, which may accurately place the center of a landfalling storm in data sparse or lightly populated coastal regions.”
“A Reevaluation of the Georgia and Northeast Florida Hurricane of 2 October 1898 Using Historical Resources,” by Al Sandrik, Lead Forecaster, National Weather Service Office, Jacksonville, FL; last modified Thursday, 8 October 1998, 11:00 AM.

Sandrik provides a convenient table, Technical Advances in systems for observing tropical cyclones, 1871 through 1980. In 1898 there was some sort of Hurricane Watch Service; they had landline telegraph; and historians have access to ships’ logs. Ships didn’t get wireless telegraph until 1905, so ships’ logs were of no use to people on shore at the time.

Sandrik mines newspaper reports, personal letters, and measurements of buildings that are still standing, and even finds some oaks that were blown down but kept growing, recording wind direction at those points. With all this data he concludes that the wind blew only west north of Camden County (the Georgia coastal county just north of Florida), while it reversed and blew east within the county. So the eye had to have come ashore in Camden County, not 30 miles farther north as previously thought. Also, this was at least a category 3 hurricane, maybe a 4, not a category 2 as previously thought.

He compares records for other hurricanes earlier and later, and concurs with another researcher that for nineteenth-century hurricanes,

“…the apparent low frequency on the Gulf coast between Cedar Key and St. Marks is not believed to be real. This area is very sparsely settled and the exact point where many of the storm centers reached the coast is not known, so there has been a natural tendency to place the track too close to the nearest observing point.”(1)

In other words, nineteenth-century hurricanes were more common in places that were then sparsely populated but are now beach resorts. That has consequences for disaster planning. Not only that, but nineteenth-century hurricanes were more severe than previously thought, which means twentieth-century expectations of hurricane severity in the U.S. southeast may have been too rosy. (This is also why the recent spate of hurricanes can’t be used as evidence of global warming, although there are plenty of other pieces of evidence for that.) Understanding the past has value for risk planning.

We don’t have to do all this forensic research and statistical interpolation for modern hurricanes, because we no longer miss seeing one, or its track, or its intensity. That is because we have better means of observation and coordination.

A radiosonde network (weather balloons) was added in the late 1930s, and organized reconnaissance (hurricane-chasing airplanes) in the 1940s. To draw an Internet analogy, if ships’ logs were somewhat like web server logs, weather balloons are perhaps like web monitoring companies.

The satellites and radar we are used to seeing on TV and the Internet date from the 1960s and later. The Internet analog is what InternetPerils is doing: a holistic and synoptic view of the Internet.

Aircraft satellite data links were added in the late 1970s; sort of flying ship log links. Ocean buoys were added about 1973; these are perhaps like honeypots and blacknets.

As Sandrik remarks:

“…the more diverse and accurate the base information used to generate a storm track, the greater the level of confidence that can be ascribed to that particular track.”

This is why multiple sources of different types of data such as are collected by the DNS-OARC participants are important, as is the coordination being attempted by that body. We need both sensing buoys in the ocean of the Internet and ship, radar, aircraft, and satellite reconnaissance, with networks of coordination and display among them.

-jsq

Nameserver Coordination

Today I’m attending by telephone the first-ever meeting of the Domain Name System Operations, Analysis, and Research Center (DNS-OARC). The attendees include operators of root DNS servers, top level domain servers, domain registries, and well known Internet researchers. Much interesting research is going on, and perhaps some of it can be more coordinated. The group also has members from major vendors. InternetPerils is a charter member.


One reason for this meeting is that DNS-OARC has received an NSF grant of $2.38M; kc of CAIDA and other participants were most complimentary of NSF. I hope this grant is a sign that NSF is coming to see collective action as at least as important as faster networks.

I can’t say much about what else went on, given that members sign a confidentiality agreement. Suffice it to say that people with related projects that might not have been aware of each other now are.

One attendee has previously publicly remarked that the Internet won’t die, because nobody has more incentive to keep it running than the miscreants that feed off of it.

I have a request from the DNS-OARC administration to mention that everyone should use BCP 38 and not peer with people who don’t do source address verification at the edges. This is a relatively new Best Current Practice (four years old) that is already widely deployed, although not yet widely enough.
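The configuration details vary by router vendor, but the idea behind BCP 38 ingress filtering is simple enough to sketch. Here it is in Python, using the standard ipaddress module; the prefixes and addresses are made-up documentation examples, not anyone’s real network:

```python
# The BCP 38 idea: an edge router should drop any packet whose source
# address could not legitimately have come from the customer behind
# that interface. Prefixes and addresses below are illustrative only.
from ipaddress import ip_address, ip_network

# Prefixes actually assigned to the customer on this interface.
CUSTOMER_PREFIXES = [ip_network("192.0.2.0/24"), ip_network("198.51.100.0/25")]

def ingress_permit(source_ip: str) -> bool:
    """Permit only packets whose source lies within an assigned prefix."""
    src = ip_address(source_ip)
    return any(src in prefix for prefix in CUSTOMER_PREFIXES)

for src in ("192.0.2.7", "203.0.113.99"):  # the second source is spoofed
    verdict = "forward" if ingress_permit(src) else "drop (spoofed source)"
    print(f"{src}: {verdict}")
```

Filtering at the edge is cheap because the edge knows exactly which prefixes belong there; reconstructing that knowledge deeper in the network is much harder.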

One reason it’s still not deployed widely enough is the same reason nobody wanted to believe a tornado could hit Massachusetts: many people see it as benefiting other people, not themselves, because they don’t believe it could happen to them.

One thing I can do is link to my own presentation.

-jsq

Force is not Security

In his book Linked, Albert-László Barabási (ALB) remarks:

“Real networks are not static, as all graph theoretical models were until recently. Instead growth plays a key role in shaping their topology. They are not as centralized as a star network is. Rather, there is a hierarchy of hubs that keep these networks together, a heavily connected node closely followed by several less connected ones, trailed by dozens of even smaller nodes. No central node sits in the middle of the spider web, controlling and monitoring every link and node. There is no single node whose removal could break the web. A scale-free network is a web without a spider.”

This is not news to those of us who were involved in USENET. For example, I ran ut-sally, which was the second node in traffic behind seismo. And there were many other nodes of varying degrees of connectivity and traffic. The most connected node also changed from time to time; earlier it was decvax, and then ihnp4.

ALB goes on to refer to Valdis Krebs’ topological examination of the 9/11 hijackers’ network, which indicated that even if the most connected person involved had been apprehended, the rest could probably have proceeded. ALB generalizes the point, noting that terrorist networks are themselves organized similarly.

John Robb has taken this idea further in his Global Guerrillas blog, in which he examines in depth how such organizations thrive on decentralized funding and communications.

Force alone will not stop such organizations. This is not to say we can eschew force; in the best of all possible worlds that might be possible, but not in this one. Yet something else is also needed.

The solution is not as simple as McNamara thought when he left the U.S. government to join the World Bank; poverty alone is not the cause of terrorism, and wealth alone is not the solution, nor is lack of education the problem. Most of the 9/11 hijackers were not poor, and most suicide bombers are relatively highly educated by local standards. Nor are terrorism and suicide attacks unique to Islam; the only organization in the world to have killed two heads of state (Rajiv Gandhi of India and Ranasinghe Premadasa of Sri Lanka) with suicide attacks is the Tamil Tigers, whose members tend to be Hindu.

There is a common cause of suicide attacks, according to a recent article in New Scientist:

“The decision to engage in suicide terrorism is political and strategic, Pape says. What is more, the aim is always the same: to coerce a government, through force of popular opinion (apart from a few isolated cases, modern suicide terrorism has only ever been used against democracies), to withdraw from territory the group considers its homeland.”
“The making of a suicide bomber,” by Michael Bond
and editorial from New Scientist vol 182 issue 2447.

This might indicate two ways of dealing with that particular problem: withdraw from the territory the terrorists consider occupied, or change one’s government to something other than a democracy. Not only do those options not seem terribly attractive, but suicide terrorism is only one form of terrorism, and withdrawal isn’t the only demand of, for example, Al Qaeda.

ALB proposes eliminating the “underlying social, economic, and political roots that fuel the network’s growth,” and offering “a chance to belong to more constructive and meaningful webs.”

Here’s another view on that:

“In the past few years, something has gone wrong in the broader relationship between the so-called West and the countries of the Arab and Muslim world. Distrust, recriminations and resentment have mounted. Minor misunderstandings or disagreements have taken on highly symbolic importance and fed the cycle of suspicion.”

“More dialogue per se may not guarantee better relations, but it can help and would at least reduce the barriers of ignorance. Thus we need a dramatic expansion of scholarship programmes and workplace exchange schemes so that more people know about life on the other side. Europe has been transformed through political and market integration, driven by supranational institutions. But the most successful EU programme has been the Erasmus scheme, which gives tens of thousands of students the chance to do part of their university degree in another EU country. Similar schemes also operate for professors and other categories of workers. Together with low-cost airlines, they have probably done more for European unity than the deadweight of the common agricultural policy. We need a similar scheme to link educational establishments in the West to those of the Arab and Muslim world. And, why not, we must also explore the possibilities of introducing low cost air travel on routes to and from the Middle East. There is no reason, other than politically inspired protectionism, why a ticket from London to Beirut or Jeddah should cost twice as much as one to New York. The overwhelming evidence suggests that if people are exposed to more factual information and different experiences, they moderate their views and factor in greater complexity. We may still differ on many things, but at least we should get the facts straight.”
“Why We Do not Get On? And What to Do About It?” by Dr. Steven Everts, Al-Hayat 2004/09/25

And of course the Marshall Plan and the Eurail Pass have probably had an effect on U.S.-European relations, because they involved many Americans and Europeans interacting.

Sometimes you have to fight force with force, but that alone only leads to more fights. The best way to fix a broken world network may not be to break it further. Better may be to make it more connected.

As McNamara said in 1966:

“The decisive factor for a powerful nation already adequately armed is the character of its relationships with the world.”

How do we get more nations to put that into practice?

-jsq

Time for a de facto electronic mail authentication system?

David Berlind of ZDNet News says in “Catastrophic Loss for unencumbered Standards” that the IETF working group on the most promising mail authentication system has been shut down, due to technical and business differences among its participants, and because Microsoft appears to be trying to patent the solution the working group was pursuing.

That leaves Meng Weng Wong’s Sender Policy Framework (SPF) as the main non-proprietary solution in this space, not to mention the most widely adopted.
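For those who haven’t run into it, SPF works by letting a domain publish, in a DNS TXT record, which hosts may send mail on its behalf; receiving mail servers check the connecting IP against that policy. Here is a toy sketch of the idea in Python, handling only the ip4: mechanism and a final -all; real SPF has many more mechanisms, and the record shown is a hypothetical example, not any real domain’s policy:

```python
# Toy illustration of the SPF idea. A domain publishes a policy such as
#   "v=spf1 ip4:192.0.2.0/24 -all"
# in a DNS TXT record, and receivers check the sending IP against it.
# This handles only ip4: and -all; real SPF also has a, mx, include, etc.
from ipaddress import ip_address, ip_network

def check_spf(policy: str, sender_ip: str) -> str:
    """Evaluate a simplified subset of an SPF policy for one sender IP."""
    src = ip_address(sender_ip)
    for term in policy.split()[1:]:          # skip the "v=spf1" version tag
        if term.startswith("ip4:") and src in ip_network(term[4:]):
            return "pass"                    # an authorized mail server
        if term == "-all":
            return "fail"                    # not authorized to send
    return "neutral"

policy = "v=spf1 ip4:192.0.2.0/24 -all"      # hypothetical example record
print(check_spf(policy, "192.0.2.25"))       # pass: listed mail server
print(check_spf(policy, "203.0.113.9"))      # fail: likely forged sender
```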

Berlind calls for the Internet mail industry to follow the precedent of the financial industry, in which the principal vendors banded together and set a de facto standard for Electronic Funds Transfer (EFT).

One of the groups most likely to do this has been meeting in DC yesterday and today: the Anti-Phishing Working Group. Both Meng Weng and someone from Microsoft are there, as well as representatives from many well-known Internet security companies and many companies affected by phishing and spam.

I don’t see an industry-wide standard coming out of this meeting, but there are more meetings planned in short order….

-jsq

PS: Thanks to Bruce Sterling for blogging about Berlind’s article.

Internet2 This Week

I’m heading down to the Internet2 conference, which is in Austin this week. They invited me to be on a panel tomorrow: “Can we get ahead of the Crackers?”

My answer is: yes, if we leverage technology with collective action. After all, force majeure events are aggregated; they affect multiple organizations. So we need strategies that pool risk across, and beyond, the set of affected enterprises. This is what insurance does, and there are financial and other risk management strategies beyond that.
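Why pooling helps is just arithmetic: independent losses averaged over a pool are far more predictable than any one member’s losses. A toy simulation, with all the numbers invented purely for illustration:

```python
# Toy illustration (all numbers invented): pooling independent risks
# makes each member's share of losses far more predictable than going alone.
import random
import statistics

random.seed(1)

def annual_loss() -> float:
    """One firm's yearly loss: a 5% chance of a $1,000,000 incident."""
    return 1_000_000.0 if random.random() < 0.05 else 0.0

YEARS, POOL = 10_000, 50
solo = [annual_loss() for _ in range(YEARS)]
pooled = [sum(annual_loss() for _ in range(POOL)) / POOL for _ in range(YEARS)]

print("mean loss, solo  :", round(statistics.mean(solo)))
print("stdev,     solo  :", round(statistics.stdev(solo)))
print("mean loss, pooled:", round(statistics.mean(pooled)))
print("stdev,     pooled:", round(statistics.stdev(pooled)))
```

The mean loss is the same either way; what pooling buys is a much smaller variance, which is what makes the risk insurable.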

The panel will be webcast.

-jsq

Reliability more important than Price in ISP selection

According to a recent survey by In-Stat/MDR,


  • “Seventy-three percent of respondents said service quality/reliability was the most important criteria in selecting an Internet service provider.
  • “Sixty-nine percent selected price.
  • “Twenty-one percent of respondents selected company reputation, knowledgeable customer service staff, and availability at multiple locations/national footprint.”

It seems that performance and reliability have moved ahead of price in picking ISPs (the percentages overlap because respondents could evidently choose more than one criterion), and that availability, reach, and topology are also significant. Apparently Scott Bradner has been right all these years in saying that ISPs need to have a business model beyond price competition.

Given this situation, it would also seem that an ISP with a risk management plan would have a competitive edge.

-jsq