Category Archives: IT Security

Bill Gates Considered as Evil Primitive Bacterium

Has Freeman Dyson become an evolution denier?

Whatever Carl Woese writes, even in a speculative vein, needs to be taken seriously. In his "New Biology" article, he is postulating a golden age of pre-Darwinian life, when horizontal gene transfer was universal and separate species did not yet exist. Life was then a community of cells of various kinds, sharing their genetic information so that clever chemical tricks and catalytic processes invented by one creature could be inherited by all of them. Evolution was a communal affair, the whole community advancing in metabolic and reproductive efficiency as the genes of the most efficient cells were shared. Evolution could be rapid, as new chemical devices could be evolved simultaneously by cells of different kinds working in parallel and then reassembled in a single cell by horizontal gene transfer.

But then, one evil day, a cell resembling a primitive bacterium happened to find itself one jump ahead of its neighbors in efficiency. That cell, anticipating Bill Gates by three billion years, separated itself from the community and refused to share. Its offspring became the first species of bacteria—and the first species of any kind—reserving their intellectual property for their own private use. With their superior efficiency, the bacteria continued to prosper and to evolve separately, while the rest of the community continued its communal life. Some millions of years later, another cell separated itself from the community and became the ancestor of the archea. Some time after that, a third cell separated itself and became the ancestor of the eukaryotes. And so it went on, until nothing was left of the community and all life was divided into species. The Darwinian interlude had begun.

Our Biotech Future, By Freeman Dyson, New York Review of Books, Volume 54, Number 12 · July 19, 2007

Has he sold out for an admittedly very fetching simile?

Continue reading

Precision Can Hide Accuracy

Metrics are good, but just because they’re precise doesn’t mean they’re useful:
I’ve been thinking a little bit about “threat/vulnerability” pairing. You know the drill, go out, get a scan – match the scan data to existing exploits, and voila! You’ve got risk.

Now regular readers and FAIR practitioners know that I don’t believe this exercise gives you risk at all. In fact, in FAIR terms, I’m not sure this exercise does much for finding Vulnerability.

My Assertion To You: The industry loves T/V pairing because it is precise. It looks good on paper, and if you’re a consultant doing it, it looks like you’ve earned your hourly rate. We love it. The precision of T/V pairing gives us a false sense of accuracy.

Accuracy, Precision, And Threat/Vulnerability Pairing, Alex, RiskAnalys.is, 23 July 2007

He goes on to point out that you also need to consider who’s likely to attack you, since such Threat Agents, as he calls them, may be too stupid to use a given exploit, or too smart to use it because they’ve got a better way. He recommends some statistical analysis to help out.
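
To make the contrast concrete, here is a minimal sketch in Python. The scan findings, the exploit catalog, and the skill-ranking scheme are all invented for illustration; this is not FAIR, just a toy showing how a naive pairing counts every finding with a published exploit, while asking who the threat agent actually is changes which findings matter.

    # Hypothetical scan findings and exploit catalog -- illustrative data only.
    scan_findings = [
        {"host": "web01",  "cve": "CVE-2007-0001", "skill_needed": "low"},
        {"host": "db01",   "cve": "CVE-2007-0002", "skill_needed": "high"},
        {"host": "mail01", "cve": "CVE-2007-0003", "skill_needed": "high"},
    ]
    known_exploits = {"CVE-2007-0001", "CVE-2007-0002", "CVE-2007-0003"}

    # Naive T/V pairing: every finding with a published exploit counts as "risk".
    naive_pairs = [f for f in scan_findings if f["cve"] in known_exploits]
    print("naive T/V pairs:", len(naive_pairs))

    # Toy threat-agent filter: an agent can only use exploits at or below its skill.
    SKILL_RANK = {"low": 1, "medium": 2, "high": 3}

    def relevant(finding, agent_skill):
        return (finding["cve"] in known_exploits
                and SKILL_RANK[agent_skill] >= SKILL_RANK[finding["skill_needed"]])

    agent = "low"  # hypothetical low-skill attacker
    relevant_pairs = [f for f in scan_findings if relevant(f, agent)]
    print("pairs this threat agent could actually use:", len(relevant_pairs))

The naive count says three risks; against this particular (made-up) attacker, only one of them is in play.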

I’d also recommend more basic steps, such as not using IE and shifting away from other monoculture software until you’ve got a mix of software from different sources. Those things will usually get you in trouble with sales and marketing, however, because hey, they’ve never had any problems, well, not many, and it’s not their job to fix them. The precise thing isn’t necessarily the right thing.

-jsq

Identity Theft Prevention

Here’s a useful list of mobile computing security guidelines, plus some links to collections of information loss incidents:

http://attrition.org/dataloss/,
http://breachalerts.trustedid.com/,
http://doj.nh.gov/consumer/breaches.html,
http://www.privacyrights.org/ar/ChronDataBreaches.htm

Information Security Policy 101 – Mobile Computing Policy, by The Trusted Toolkit, The Trusted Toolkit Blog, 23 July 2007

-jsq

Military Information Security

I suppose we shouldn’t be surprised that the U.S. military doesn’t seem to be any better about information security than companies or other parts of government:
Detailed schematics of a military detainee holding facility in southern Iraq. Geographical surveys and aerial photographs of two military airfields outside Baghdad. Plans for a new fuel farm at Bagram Air Base in Afghanistan.

The military calls it “need-to-know” information that would pose a direct threat to U.S. troops if it were to fall into the hands of terrorists. It’s material so sensitive that officials refused to release the documents when asked.

But it’s already out there, posted carelessly to file servers by government agencies and contractors, accessible to anyone with an Internet connection.

Military files left unprotected online, By Mike Baker, Associated Press Writer, Thu Jul 12, 8:03 AM ET

Surely they know better than this? Continue reading

Security Executive

Well, this should seem obvious:

For quite a while now, I’ve been claiming that in order for InfoSec to do its job properly, it needs to understand the business.

Whose Line Is It Anyway? Arthur, Emergent Chaos, 10 July 2007

Let’s go a bit farther:

Yesterday, Jack Jones again showed that he’s in the same camp when he asked us: "Risk Decision Making: Whose call is it?" There he shares his thoughts on how to decide whether the Information Security team should be making information risk decisions for a company or whether that should come from upper management.

I would claim that this shouldn’t be an either/or question: it’s a both/and.

Continue reading

FISMA Failing

Shades of SOX complaints: the U.S. GAO reports that the Federal Information Security Management Act (FISMA) is failing:

When we go out and conduct our security control reviews at federal agencies, we often find serious and significant vulnerabilities in systems that have been certified and accredited. Part of it, I think, is just that agencies may be focusing on just trying to get the systems certified and accredited but not effectively implementing the processes that the certification and accreditation is supposed to reflect.

Q&A: Federal info security isn’t just about FISMA compliance, auditor says, Most agencies still have security gaps, according to Gregory Wilshusen, by Jaikumar Vijayan, Computerworld, June 14, 2007

Sounds like they haven’t implemented numerous simple security measures that were known before FISMA, they don’t have processes to do so, and they don’t adequately report what they’re doing, even with FISMA. What to do?

Continue reading

WS-Anasazi

Gunnar usually says it better than I did:
Coordinated detection and response is the logical conclusion to defense in depth security architecture. I think the reason that we have standards for authentication, authorization, and encryption is because these are the things that people typically focus on at design time. Monitoring and auditing are seen as runtime operational activities, but if there were standards based ways to communicate security information and events, then there would be an opportunity for the tooling and processes to improve, which is ultimately what we need.

Building Coordinated Response In – Learning from the Anasazis, Gunnar Peterson, 1 Raindrop, 14 June 2007

Security shouldn’t be a bag of uncoordinated aftermarket tricks. It should be a process that starts with design and continues through operations.
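
Gunnar’s point about "standards based ways to communicate security information and events" is easy to sketch, though there is no single blessed format implied here; the field names and the JSON-lines shape below are assumptions for illustration only. The idea is simply that every component emits the same structured record, so a collector can correlate events instead of scraping ad hoc log lines.

    import datetime
    import json
    import socket

    def security_event(source, action, outcome, subject, **extra):
        # Field names are illustrative, not any particular standard.
        event = {
            "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
            "host": socket.gethostname(),
            "source": source,     # which component saw it
            "action": action,     # what was attempted
            "outcome": outcome,   # success / failure / denied
            "subject": subject,   # who or what did it
        }
        event.update(extra)
        return json.dumps(event)

    # Different layers log the same shape, so one collector can correlate them.
    print(security_event("webapp",   "login",   "failure", "alice", attempts=3))
    print(security_event("firewall", "connect", "denied",  "10.0.0.5", port=445))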

-jsq

Breach Discovery

If people know about security breaches, maybe the companies whose customers they are, and the governments whose constituents they are, will have an incentive to do something about them, so this is good news:

New Hampshire, one of a handful of U.S. states that require breaches involving personal information to be reported to the state as well as to affected individuals, has made at least some breach notices it has received available on the net.

New Hampshire gets it, Chris Walsh, Emergent Chaos, 13 June 2007

Or at least if we know what’s really going on, maybe unfounded scare

Continue reading

No Word?

Got my hopes up on this one:
It appears that Science, the journal of the American Association for the Advancement of Science, itself the largest scientific society in the world, has updated its authoring guidelines to include advice for Office 2007 users. The news is not good.

“Because of changes Microsoft has made in its recent Word release that are incompatible with our internal workflow, which was built around previous versions of the software, Science cannot at present accept any files in the new .docx format produced through Microsoft Word 2007, either for initial submission or for revision. Users of this release of Word should convert these files to a format compatible with Word 2003 or Word for Macintosh 2004 (or, for initial submission, to a PDF file) before submitting to Science.”

SCIENCE PUBS REJECT ARTICLES WRITTEN IN WORD 2007, by Rob Weir, Rob Weir Blog, Thursday, May 31, 2007

And here I thought maybe they were rejecting Word entirely. Ah, it could happen. Most papers in physics, mathematics, and computer science journals are already formatted in TeX, if I’m not mistaken. So there is some diversity in publishing software; it’s not all a monoculture.

Meanwhile, the main reason Science rejected Word 2007 is that it is not backwards compatible with previous versions of Word, thus illustrating the Microsoft dilemma: stick with the old and retain customers, or fix problems and lose some. Not so big a dilemma with Word, perhaps. How many submitters to Science are there, as compared with business Word users? But much more of a problem for security fixes that require breaking backwards compatibility.
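
Telling the two formats apart mechanically is at least straightforward, should a publisher want to screen submissions automatically. A rough sketch, relying on the facts that a .docx file is a ZIP archive containing word/document.xml while a legacy .doc is an OLE2 compound file with a fixed magic number; the function name and the rejection policy suggested in the comment are just illustrations.

    import zipfile

    OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # legacy .doc (also .xls, .ppt)

    def word_format(path):
        # Rough guess at which Word format a submitted file is in.
        with open(path, "rb") as f:
            head = f.read(8)
        if head.startswith(OLE2_MAGIC):
            return "legacy .doc (Word 97-2003)"
        if head[:2] == b"PK" and zipfile.is_zipfile(path):
            with zipfile.ZipFile(path) as z:
                if "word/document.xml" in z.namelist():
                    return ".docx (Word 2007)"
        return "unknown"

    # e.g., bounce the submission if word_format(path) starts with ".docx"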

-jsq

Third-Party Measurement

It seems it’s not enough to just believe what ISPs tell you about bandwidth use:

NetForecast has been running live measurements of the ten Apdex Alliance Contributing Members.  The results from five different locations across the US show a great range of performance as seen by the users.  The measurement data is then summarized using both typical averaging methods and the Apdex method.  The results are documented in "Averages Hide the Real End-User Experience: Apdex Tells the Full Story," NFR 5086 by Peter Sevcik, April 2007.

The Apdex reports of the very same measurement data uncover many more performance issues.  For example, it finds a region where the users see chronic poor performance that was completely hidden by the averaging methods.  The ten weeks of data show that averages significantly under reported true end-user performance issues. It makes a clear case for reporting your measurements with Apdex!

The ISPs may not know where the users see chronic poor performance, either. Do they know how to ask the users? Do they know what to compare to in other ISPs?
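
The arithmetic behind Sevcik’s point is simple enough to sketch. Apdex buckets each response time against a target T (satisfied up to T, tolerating up to 4T, frustrated beyond that) and reports (satisfied + tolerating/2) / total, so a pocket of chronically slow users shows up even when the overall mean looks fine. The response times, region names, and the 4-second target below are invented for illustration.

    def apdex(samples, target=4.0):
        # Apdex: satisfied <= T, tolerating <= 4T, frustrated beyond 4T.
        satisfied  = sum(1 for t in samples if t <= target)
        tolerating = sum(1 for t in samples if target < t <= 4 * target)
        return (satisfied + tolerating / 2) / len(samples)

    # Invented response times (seconds): one region is chronically slow.
    regions = {
        "east":    [0.8, 1.0, 1.2, 0.9] * 12,
        "central": [1.1, 1.4, 0.9, 1.3] * 12,
        "west":    [16.0, 18.0, 20.0] * 4,
    }

    all_samples = [t for samples in regions.values() for t in samples]
    print("overall average: %.1f s" % (sum(all_samples) / len(all_samples)))  # looks fine
    print("overall Apdex:   %.2f" % apdex(all_samples))
    for name, samples in regions.items():
        print("Apdex %-8s %.2f" % (name, apdex(samples)))                     # west stands out

The overall average comes out under the target and looks healthy; the per-region Apdex scores make the chronically slow region impossible to miss.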

Continue reading