Category Archives: Reputation Systems

Airline Volunteers

Here’s a Katrina story I haven’t heard tied together anywhere: airline volunteers.
“None of the airlines involved required a contract or any written guarantee of payment before sending their planes and volunteer crews,” Simon wrote of the American Airlines flights. “One official said if Gore promised to pay, that was good enough for them.”
Gore airlifts victims from New Orleans
Former vice president chartered two private aircraft
Saturday, September 10, 2005; Posted: 7:22 a.m. EDT (11:22 GMT)
CNN.com
KNOXVILLE, Tennessee (AP) — Al Gore helped airlift some 270 Katrina evacuees on two private charters from New Orleans, acting at the urging of a doctor who saved the life of the former vice president’s son.
Continue reading

Battle of New Orleans, 2005

Stratfor has posted this analysis: New Orleans: A Geopolitical Prize, by George Friedman. Excerpts:

All of the rivers flowed into one — the Mississippi — and the Mississippi flowed to the ports in and around one city: New Orleans. It was in New Orleans that the barges from upstream were unloaded and their cargos stored, sold and reloaded on ocean-going vessels. Until last Sunday, New Orleans was, in many ways, the pivot of the American economy.

For a port to operate, there must be places for river ships to unload, warehouses for intermediate storage, and places for ocean ships to load, and the reverse. That’s why New Orleans is where it is.

Continue reading

Mexican Army Enters Texas

For the first time in 136 years, the Mexican Army entered Texas today. Once again they headed for San Antonio, but this time they did not attack the Alamo.  Instead, they carried two mobile kitchens that can feed seven thousand people a day, and provisions to do so for three weeks. Mexico also sent a relief ship to Mississippi.

Continue reading

Vulnerability Bounties

TippingPoint (owned by 3Com) and iDefense (owned by Verisign) are both offering bounties for disclosure of vulnerabilities. Both firms apparently intend to reveal the disclosures to the affected vendors, rather than to the public. Mozilla has for some time been paying $500 per bug found.

And of course there are numerous other organizations looking for flaws in everyone’s code; many of these organizations won’t tell the vendor first.

Maybe it’s better to encourage as many friendly eyes as possible to look at your code, so they’ll tell you about a vulnerability before somebody else exploits it or tells the public before telling you. Hm, this sounds a lot like open source software.

-jsq

Vulnerability Restraints or Reputation Suicide?

Doubtless anyone who follows Internet security has heard by now of the case of Michael Lynn, currently under a restraining order obtained by Cisco and Internet Security Systems (ISS). While working for ISS, Lynn discovered a vulnerability in Cisco router code and told Cisco about it in April. Apparently the flaw was fixed shortly afterwards. Lynn was scheduled to give a presentation on the flaw at the Black Hat Conference in Las Vegas this week, with the cooperation of Cisco and ISS. However, Cisco decided not to permit that, and went so far as to have its employees physically remove the ten-page presentation from the already-printed conference proceedings.

Nonetheless, within two hours of the scheduled presentation time, Lynn quit his job with ISS and proceeded to give the presentation anyway, wearing a white hat labelled Good. Shortly afterwards, Cisco and ISS slapped a restraining order on Lynn and the conference to stop them from distributing the presentation or discussing it.

The rest of the chattering classes were not under restraining order, however, and within two days of the presentation a PDF of Michael Lynn’s slides was available on the Internet. Discussions were rampant everywhere, from security professionals such as Bruce Schneier, who could be expected to defend Lynn, to the Wall Street Journal (WSJ).

Update: the link to the slides now displays a cease-and-desist letter and a copy of the injunction; a copy of the slides has turned up in Germany.

Continue reading

Tsunami Smith

In 1998 the former chief meteorologist of Thailand said “a tsunami is going to occur for sure”. Smith Dharmasaroja was called a mad dog for that. On Sunday 26 December 2004, after the earthquake but before the tsunami hit Thailand, he tried again to warn the Thai meteorological department, but could not get them to respond. (A case of denial and damage, just as happened years before in the U.S. regarding hurricanes.)

How did Mr. Smith know? He didn’t accept the received wisdom that earthquakes off Indonesia would only happen on the other side of Sumatra from Thailand. He studied seismology and discovered there was a fault line that would put the Thai tourist resort of Phuket in the direct path of a tsunami. His public warning in 1998 came after a tsunami from the same fault hit Papua New Guinea (the Aitape tsunami of Friday 17 July 1998; there was no tsunami warning system for that part of the Pacific at that time, either).

 

"You’d really have to go digging into very old historical records and the scientific literature and extrapolate from what’s there to find that yes, there could be effects (leading to tsunamis) in Thailand," says Phil Cummins, a seismologist who studies the region at Australia’s national geological agency. "But he was correct."

Such an earthquake did occur and the resulting tsunami hit Phuket.

Two weeks after the 2004 tsunami, the Thai government called Smith Dharmasaroja out of retirement to head its new tsunami warning system.

The economic damages of the 2004 tsunami are estimated at $14 billion by Munich Re, the world’s largest reinsurer. Maybe it would be prudent to do some historical exploration and to set up an early warning system for Internet events that could cause $50 to $100 billion in economic damages.

-jsq

Tsunami Warning System

It seems likely that tsunami insurance would be easier to get if there were an early warning system. There is one for the Pacific but none for the Indian Ocean.

Australia, which participates in the Pacific warning system, may have been the first country to volunteer (on 27 December) to help set up a tsunami warning system for the Indian Ocean. Since then India has announced it is building one to be operational within 3 years, PM Koizumi has ordered one for Japan, the U.S. has come out in favor of one, Thailand is lobbying to be the location for one, and there’s been a meeting in Indonesia about the need for one.

Maybe early warning and tracking systems would be useful for other fields of likely major economic damage.

-jsq

Small World State Change

The U.S. government has now gone through four cybersecurity czars in less than two years, with the resignation, on one day’s notice, of Amit Yoran, following Howard Schmidt, Rand Beers (who joined the Kerry campaign), and Richard Clarke (who wrote a best-selling book and testified before Congress).

Apparently one argument for pushing cybersecurity down into the bowels of DHS is that the Internet and computers in general are just another homeland infrastructure that needs securing, same as the electrical grid or the airlines. Albert-László Barabási (ALB), in his book Linked, remarks on how sufficiently close connectivity can cause a state change, in the same manner as water turning to ice. It isn’t electrical utility grids that are interconnecting the world as never before; it is communications systems, led by the Internet, which is rapidly subsuming many other communications systems. All the other infrastructures increasingly depend on the Internet. Even if it isn’t actually causing a state change, the Internet has global reach and high speed, producing both its great value via Metcalfe’s Law (many users) and Reed’s Law (many groups) and its potential for cascade failure.
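For a rough sense of how differently those two laws scale, here is a minimal sketch in Python (mine, not from Barabási’s book): it counts the pairwise connections behind Metcalfe’s Law and the possible subgroups behind Reed’s Law for a few network sizes. The constants and sizes are arbitrary; real networks saturate long before an exponential does.

# Rough comparison of Metcalfe's Law (value ~ n^2, pairwise connections)
# and Reed's Law (value ~ 2^n, possible subgroups). Sizes are arbitrary.

def metcalfe_value(n: int) -> int:
    """Pairwise connections among n users: n * (n - 1) / 2."""
    return n * (n - 1) // 2

def reed_value(n: int) -> int:
    """Nontrivial subgroups of n users: 2^n - n - 1."""
    return 2**n - n - 1

if __name__ == "__main__":
    for n in (2, 5, 10, 20, 30):
        print(f"n={n:3d}  Metcalfe={metcalfe_value(n):12d}  Reed={reed_value(n):15d}")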

The Internet’s potential for cascade failure also stems from its organization as a scale-free network with many hubs of various sizes. Yet this is also what makes it so robust unless the hubs are directly targeted. Meanwhile, we hear comparisons to the ability of the FAA to ground all aircraft in U.S. airspace. I don’t really see how an off switch for the U.S. portion of the Internet would make the Internet more robust, even if it were easy to separate out exactly what that portion was.
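To make the robustness point concrete, here is a small simulation sketch, assuming the networkx library and an arbitrary graph size: it grows a Barabási-Albert scale-free graph and compares how the largest connected component shrinks under random node failures versus targeted removal of the highest-degree hubs.

# Sketch: scale-free networks tolerate random failures but not targeted hub removal.
# Requires the networkx library; graph size and removal count are arbitrary choices.
import random
import networkx as nx

def largest_component_fraction(G: nx.Graph) -> float:
    """Fraction of remaining nodes that sit in the largest connected component."""
    if G.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(G), key=len)) / G.number_of_nodes()

def remove_and_measure(G: nx.Graph, nodes_to_remove) -> float:
    """Copy the graph, remove the given nodes, and measure what holds together."""
    H = G.copy()
    H.remove_nodes_from(nodes_to_remove)
    return largest_component_fraction(H)

if __name__ == "__main__":
    random.seed(42)
    G = nx.barabasi_albert_graph(n=2000, m=2, seed=42)  # scale-free-ish graph
    k = 100                                             # remove 5% of the nodes

    random_nodes = random.sample(list(G.nodes), k)
    hubs = sorted(G.nodes, key=G.degree, reverse=True)[:k]

    print("random failures :", remove_and_measure(G, random_nodes))
    print("targeted hubs   :", remove_and_measure(G, hubs))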

I think the U.S. government would benefit by appointing a new cybersecurity head with sufficient powers to get something done, and preferably somebody with deep understanding of the Internet. How about Bruce Schneier or Dan Geer?

-jsq

Hurricane History

Today is the anniversary of the Galveston Hurricane of 1867, which caused $1 million in damage ($12.3M in today’s dollars, or the equivalent of $1.3 billion as a share of GDP). It is also the anniversary of the Louisiana Delta hurricane of 1893, which had winds of 100 miles per hour and a 12-foot storm surge that killed 1,500 people. And in 1882 on this day a severe windstorm in northern California and Oregon caused extensive crop damage and blew down trees.
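The two modern figures differ so much because they use different yardsticks: one scales the 1867 loss by consumer prices, the other by the growth of the whole economy. A minimal sketch of the arithmetic, with ratios simply backed out of the figures above rather than taken from any official index series:

# Two ways to restate the 1867 Galveston hurricane's $1 million in damage.
# The ratios below are backed out of the figures quoted in this post
# ($12.3M via prices, $1.3B via share of GDP); they are not official index values.

damage_1867 = 1_000_000

price_ratio = 12.3        # implied growth of consumer prices since 1867
gdp_share_ratio = 1_300   # implied growth of total GDP since 1867

print(f"Price-adjusted:    ${damage_1867 * price_ratio:,.0f}")
print(f"Same share of GDP: ${damage_1867 * gdp_share_ratio:,.0f}")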


How do we (or, in this case, Intellicast) know all this?

Let’s look at the fourth major hurricane on this date: the one that struck the Georgia coast in 1898, washing away an entire island, Campbell Island, and killing 50 residents there, as well as all but one of the residents of St. Catherine’s Island. The storm surge at Brunswick is estimated at 19 feet.

“Worthy of note is the brief period of time which has seen the widespread deployment of remote sensing systems, which may accurately place the center of a landfalling storm in data sparse or lightly populated coastal regions.”
“A Reevaluation of the Georgia and Northeast Florida Hurricane of 2 October 1898 Using Historical Resources,” by Al Sandrik, Lead Forecaster, National Weather Service Office, Jacksonville, FL (last modified Thursday, 8 October 1998, 11:00 AM).

Sandrik provides a convenient table, Technical Advances in systems for observing tropical cyclones, 1871 through 1980. In 1898 there was some sort of Hurricane Watch Service, and there was landline telegraph; historians also have access to ships’ logs. Ships didn’t get wireless telegraph until 1905, though, so ships’ logs were no use to people on shore at the time.

Sandrik mines newspaper reports, personal letters, and measurements of buildings that are still standing, and he even finds some oaks that were blown down but kept growing, which record the wind direction at those spots. With all this data he concludes that north of Camden County (the Georgia coastal county just north of Florida) the wind blew only toward the west, while in Camden County it reversed and blew toward the east. So the eye had to have come ashore in Camden County, not 30 miles farther north as previously thought. Also, this was at least a Category 3 hurricane, maybe a 4, not a Category 2 as previously thought.

He compares records for other hurricanes earlier and later, and concurs with another researcher that for nineteenth century hurricanes,

“…the apparent low frequency on the Gulf coast between Cedar Key and St. Marks is not believed to be real. This area is very sparsely settled and the exact point where many of the storm centers reached the coast is not known, so there has been a natural tendency to place the track too close to the nearest observing point.”(1)

In other words, nineteenth-century hurricanes were more common in places that were then sparsely populated but now are beach resorts. This has consequences for disaster planning. Not only that, but nineteenth-century hurricanes were more severe than previously thought, which means that twentieth-century expectations of hurricane severity in the U.S. southeast may have been too rosy (this is why the recent spate of hurricanes can’t be used as evidence of global warming, although there are plenty of other pieces of evidence for that). Understanding the past has value for risk planning.

We don’t have to do all this forensic research and statistical interpolation for modern hurricanes, because we no longer miss seeing one, or its track or intensity. This is because we have better means of observation and coordination.

A radiosonde network (weather balloons) was added in the late 1930s, and organized reconnaissance (hurricane chasing airplanes) in the 1940s. To draw an Internet analogy, if ship logs were somewhat like web server logs, weather balloons are perhaps like web monitoring companies.

The satellites and radar we are all used to seeing on TV and the Internet date from the 1960s and later. The Internet analogy here is what InternetPerils is doing: building a holistic and synoptic view of the Internet.

Aircraft satellite data links were added in the late 1970s; sort of flying ship log links. Ocean buoys were added about 1973; these are perhaps like honeypots and blacknets.

As Sandrik remarks:

“…the more diverse and accurate the base information used to generate a storm track, the greater the level of confidence that can be ascribed to that particular track.”

This is why multiple sources of different types of data, such as those collected by the DNS-OARC participants, are important, as is the coordination being attempted by that body. We need both sensing buoys in the ocean of the Internet and ship, radar, aircraft, and satellite reconnaissance, with networks of coordination and display among them.
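As a toy illustration of why diverse base information raises confidence, here is a sketch with entirely made-up numbers (the “latency” being measured and the vantage points are hypothetical, not DNS-OARC data): averaging independent noisy observations of the same quantity narrows the spread of the estimate roughly as the square root of the number of sources.

# Toy illustration: independent noisy observations of the same quantity,
# combined, give an estimate whose spread shrinks roughly as 1/sqrt(n).
# The "latency" and the vantage points are invented for the example.
import random
import statistics

random.seed(7)
TRUE_LATENCY_MS = 120.0

def observe(noise_ms: float) -> float:
    """One vantage point's noisy measurement of the true value."""
    return random.gauss(TRUE_LATENCY_MS, noise_ms)

for n_sources in (1, 4, 16, 64):
    estimates = []
    for _ in range(1000):  # repeat the experiment to see the spread
        readings = [observe(noise_ms=30.0) for _ in range(n_sources)]
        estimates.append(statistics.mean(readings))
    print(f"{n_sources:3d} sources -> spread of estimate: "
          f"{statistics.stdev(estimates):5.1f} ms")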

-jsq