ACM News Service

“Boeing, U. of I. to Work on Computer Trust Issues”
Chicago Sun-Times (01/06/05); Knowles, Francine

Boeing and the University of Illinois at Urbana-Champaign’s Information Trust Institute have teamed up to design trustworthy, reliable, and secure networked systems and software employed in critical infrastructures, with a focus on basic breakthroughs that can become practical and important commercial products within five to 10 years, according to institute director William H. Sanders. The Information Trust Institute will receive an undisclosed amount of funding from Boeing’s Phantom Works unit over the next five years for the purpose of investigating “trusted software,” and Phantom Works VP Gary Fitzmire says U. of I. was chosen on the strength of its trusted software research. The institute’s mission includes setting up science and technology for creating trustworthy networked information systems, the development of methods for evaluating such systems’ trustworthiness, and the application of those methods to systems including e-commerce, finance, emergency response, data and information processing, and aerospace. The institute has embarked on research projects that include misbehavior detection in wireless networks and a railcar health monitoring system, while the U. of I. last month solicited research project proposals based on the Boeing agreement. Submitted proposals included new software security and survivability techniques, and reliable and robust control of automated aerial vehicles.

From the article
The institute, which launched late last year, is part of the university’s College of Engineering, which recently signed one of its largest master research agreements ever with Boeing’s Phantom Works unit. The unit is Boeing’s advanced research and development arm, and it is providing undisclosed funding to the university over the next five years to support research in “trusted” software. The research will span topics related to security, privacy, reliability, safety, and survivability.

The collaboration will focus on “fundamental innovations that can become viable and significant” marketable products in a five-to-10-year time frame, said William H. Sanders, director of the institute.

I found this idea a compelling illustration of decentralization because it highlights that there can be *more* trust in a hydra-headed system run by the masses than a single-point-of-Google. Of course, the storage ratio should probably be 1:10 — meaning each byte could be backed up to 10 random machines to ensure that some of them are back online when you need it.

The party-pooper aspect is that asymmetric upload/download links like DSL mean it takes much longer to push data out than to pull it back. But then again, he’s right to focus on backup, rather than interactive storage.

Another aspect of the solution is better metadata management — you don’t need to keep three copies of a digital photo, you just need to keep in touch with the three relatives you already mailed a copy to and the website that’s hosting your notes. In other words, most of the time, The Data’s Already Out There…

PBS | I, Cringely . Archived Column

That $3.95 per month fee covers any amount of storage the user wants, limited only by how much storage they are WILLING TO DONATE TO THE SYSTEM. Think of this as an alternate and quite a bit more sophisticated Napster. First, it is for BACKUP, so recovery has to be slow enough so people won’t think of it as another hard drive. Baxter is data insurance and nothing more. It’s a RAID system using donated disk space on a wide area network. Your data is compressed, then cut into chunks, and those chunks are distributed to dozens of places with enough forward error correction thrown in to cover any storage that is lost or happens to be down when recovery is needed. The data is both encrypted (on the customer end, so unencrypted data never enters the system and that vulnerability is eliminated) and split into chunks so no one person has enough to make any sense of it even if they could decrypt it. The Baxter business provides client software, handles divvying-up the RAID information, and keeps track of what chunks go where.

Even though it is Napster-like in that it knows where all the chunks are, Baxter doesn’t know what the chunks are, nor is the end-user in a position to use it as a Napster-like system for music sharing, since data recovery is deliberately slow…
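The chunking-plus-redundancy scheme described above can be sketched with a toy single-parity code. This is my own illustration, not Baxter’s actual design: real forward error correction would be Reed-Solomon-style and tolerate many missing chunks, and the client-side encryption step is omitted here; `split_with_parity` and `recover` are hypothetical names.

```python
import zlib

# Toy sketch of the backup pipeline: compress, cut into k chunks, and add
# one XOR parity chunk (RAID-4/5 style), which lets us rebuild any single
# chunk that is lost or offline when recovery is needed.

def split_with_parity(data: bytes, k: int = 4):
    comp = zlib.compress(data)
    size = -(-len(comp) // k)  # ceiling division: bytes per chunk
    chunks = [comp[i * size:(i + 1) * size].ljust(size, b"\0")
              for i in range(k)]
    parity = bytearray(size)
    for c in chunks:           # parity = XOR of all k chunks
        for j, b in enumerate(c):
            parity[j] ^= b
    return chunks, bytes(parity), len(comp)

def recover(surviving: dict, parity: bytes, k: int, comp_len: int) -> bytes:
    """Rebuild the data from any k-1 chunks plus the parity chunk."""
    rebuilt = bytearray(parity)
    for c in surviving.values():  # XOR the known chunks back out
        for j, b in enumerate(c):
            rebuilt[j] ^= b
    chunks = [surviving.get(i, bytes(rebuilt)) for i in range(k)]
    return zlib.decompress(b"".join(chunks)[:comp_len])

original = b"my irreplaceable photos" * 100
chunks, parity, n = split_with_parity(original)
online = {i: c for i, c in enumerate(chunks) if i != 2}  # chunk 2 is down
assert recover(online, parity, 4, n) == original
```

With more parity chunks (real FEC), the same idea scales toward the sort of 1:10 redundancy mentioned earlier, so recovery succeeds even when most of the donated machines happen to be offline.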

Some new attacks against the commonly-used SHA-1 and MD5 secure hash algorithms were announced at a rump session at the Crypto 2004 conference on Tuesday, as were attacks on some less-commonly-used secure hash algorithms, including the original SHA (now called SHA-0 to avoid confusion), RIPEMD, HAVAL-128, and MD4. Although these attacks in their present form probably won’t enable any system compromises, they have algorithm, protocol, and system designers looking around anxiously for replacements.

As background, secure
hash algorithms
are intended to fulfill two requirements: first,
that it be computationally infeasible to find two strings that hash to
the same hash value (“collision-resistance”), and second, that it be
computationally infeasible to find a string that hashes to a given
hash value (“preimage-resistance”). Collision-resistance is clearly a
stronger requirement, since you can construct a collision once you can
find a preimage, but producing collisions does not necessarily imply
that you can find a preimage for any given hash. Most systems don’t
depend strongly on the collision-resistance property. Even without
finding a flaw in the algorithm, there’s a property known as
the birthday paradox
that means that finding a collision by brute
force takes a lot less work than finding a preimage by brute force.
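The gap the birthday paradox creates is easy to see on a deliberately weakened hash. The sketch below truncates SHA-1 to 24 bits purely for illustration (the helper `h` is mine, not from any of the papers): a brute-force preimage would take on the order of 2^24 tries, but a collision typically turns up after only a few thousand.

```python
import hashlib

def h(data: bytes, bits: int = 24) -> int:
    """SHA-1 truncated to `bits` bits -- a toy hash weak enough to attack."""
    digest = hashlib.sha1(data).digest()
    return int.from_bytes(digest[:4], "big") >> (32 - bits)

# Birthday search: remember every hash value seen; a repeat is a collision.
seen = {}
i = 0
while True:
    msg = str(i).encode()
    v = h(msg)
    if v in seen:          # two distinct inputs, same 24-bit hash
        break
    seen[v] = msg
    i += 1
print(f"collision after {i + 1} hashes: {seen[v]!r} vs {msg!r}")
```

Roughly sqrt(2^24) = 2^12 ≈ 4,000 evaluations suffice for the collision, versus ~16 million for a preimage; that square-root advantage is why collision-resistance is always the first property to fall.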

None of the attacks provide a way to find a preimage for either
SHA-1 or MD5, but
collisions have been found for the first time in MD5, in a
reduced-round version of SHA-1, and in SHA-0. I’m not sure whether
collisions had previously been found in the other algorithms, but I
don’t think so.

It was just becoming possible to find an MD5 collision by brute
force, and there was a $10,000 bounty for
a successful collision and a project
to find a collision and claim the bounty
through massively
parallel distributed computing. The attacks presented at Crypto 2004
require substantially less computational work than the brute-force
attack used in this project.

It was also strongly suspected that there were weaknesses in SHA-0,
and some weaker attacks had been found on MD4 and MD5 in the past,
making the breakage of those algorithms somewhat unsurprising.

So the only algorithm left standing is SHA-1, and even it looks weak, and there isn’t an obvious replacement, although Tiger, longer-hash versions of SHA-1, and AES-CBC have been suggested. Bruce Schneier has called upon the National Institute of Standards and Technology to initiate a search for a replacement algorithm, and Hal Finney thinks that at least the technique Joux used against SHA-0 could be used against a wide variety of secure hash functions.

The first version of the MD5 paper had some minor errors, which have been corrected in the current version (main page, PDF). Markku-Juhani O. Saarinen posted some thoughts and the extracted data files from the paper: 1, 2.

It’s interesting that none of the three papers presented at this conference was authored by a citizen of the United States; the MD5 (etc.) paper was published by a Chinese team headed by Xiaoyun Wang,
the attack
on SHA-0
was presented by French cryptographer Antoine Joux, and
the attack on SHA-1 was presented by Israeli cryptographers Biham and Chen.

Cryptographic research in the US has suffered somewhat from the
Digital Millennium Copyright Act. For example, Ian Goldberg, a
prominent cryptographer who worked at the University of California at
Berkeley before the Digital Millennium Copyright Act was enacted,
explains why he left the United States in his comments on proposals to tighten Canadian copyright restrictions:

On an individual note, I have personally been involved in the mess
that is the US DMCA; some of my own work as a cryptographic
researcher, as well as that of my colleagues, has come under
question as to whether merely publishing an academic paper is a
violation of its anticircumvention provisions. Canada has
developed a strong cryptographic industry, partially as a result
of a more restrictive US legal regime in this area, and this
industry, as well as our high quality of research and education,
would be directly threatened if DMCA-like provisions were
introduced here. I will not live or work in a country that imposes
such restrictions on scientific inquiry. We must not allow
academic speech to be chilled, stifled, and censored by any
person, group of people, or industry.

Other cryptographers in the US have also scaled back their research
to avoid running afoul of the DMCA. Who can say whether one of these
attacks would have been discovered by an American cryptographer in the
absence of that law? Our national security depends in part on our
ability to deploy cryptographic algorithms, such as secure hash
functions, before the intelligence services of other nations find a
way to crack them. The DMCA may therefore be putting our national
security in peril if, for example, Israel’s Mossad has progressed
farther than Biham and Chen on cracking SHA-1.

(I took some other notes for this post.)

Toyota Manages Quick Recovery from Fire

Wall Street Journal
8 May 1997
Page A-1
by Valerie Reitman, staff reporter

Toyota Motor Shows Its Mettle
After Fire Destroys Parts Plant

KARIYA, Japan — No one knows what sparked the fire that roared through Aisin Seiki Co.’s Factory No. 1 here before dawn on Saturday, Feb. 1, leveling the huge auto-parts plant. But one thing is clear: The crisis-control efforts that followed it dramatically illustrate one reason Toyota Motor Corp. is among the world’s most admired and feared manufacturers.

Decentralized Intelligence – What Toyota can teach the 9/11 commission about intelligence gathering. By Duncan Watts

I found this a captivating read, though it’s about risk management and problem solving, not decentralization of power per se. The vivid lessons excerpted in the full entry are a reminder that complete risk analysis is impossible, so the only sure strategy is containment.

We don’t do that often enough in software — tracing how errors propagate as ever-longer chains of legacy software are automated together…

Adam and I just got back from a few days in Las Vegas to see the (astonishingly young!) face of the computer security, ahem, ‘adversary’ community at the 12th annual DEFCON. We found several useful sessions on the fringes of electronic commerce (‘real-time penetration of credit card networks’, ‘the farmer’s market model of pseudonymous dealing’, debates about the broadcast flag and other DRM), as well as several object lessons in how insecure today’s economy already is — from Googling up lists of cc#s to seeing a CommerceNet mail password compromised on the spot (hint: let’s turn on IMAP over SSL already! ;-)

The best technical content is actually released a few days earlier, at the more “official” Black Hat briefings. The best single presentation I attended was the release of a practical toolkit for SSH-over-DNS. That’s right, DNS name lookup queries form perfectly useful covert channels to send and receive data through all manner of firewalls. And it’s that way by design: the whole point is that DNS provides a way to invoke a small RPC at the remote end, namely lookup("symbol"). An even better hack that takes further advantage of DNS’s caching semantics is ‘DNS Radio’ — audio streaming! For more info, see NomDe (either as in ‘Plume’ or ‘Guerre,’ your pick :-).
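Mechanically, the covert channel is just data packed into the labels of a query name. Here is a minimal sketch of the encoding side only; the `tunnel.example.com` zone and the function names are hypothetical illustrations, not NomDe’s actual interface:

```python
import base64

MAX_LABEL = 63  # DNS caps each dot-separated label at 63 bytes

def encode_query(payload: bytes, zone: str = "tunnel.example.com") -> str:
    """Pack payload bytes into the labels of a DNS query name."""
    b32 = base64.b32encode(payload).decode().rstrip("=").lower()
    labels = [b32[i:i + MAX_LABEL] for i in range(0, len(b32), MAX_LABEL)]
    return ".".join(labels + [zone])

def decode_query(name: str, zone: str = "tunnel.example.com") -> bytes:
    """What the nameserver owning the zone does: read the payload back out."""
    b32 = "".join(name[: -len(zone) - 1].split(".")).upper()
    b32 += "=" * (-len(b32) % 8)  # restore base32 padding
    return base64.b32decode(b32)

q = encode_query(b"uptime;id")
assert decode_query(q) == b"uptime;id"
# Resolving q through any firewall-permitted resolver delivers the data;
# the reply channel is whatever the zone's server stuffs into the answer
# (e.g. TXT records). Whole names are capped at 255 bytes, so longer
# payloads span many queries.
```

This is the lookup("symbol")-as-RPC point in miniature: the query itself carries the outbound data, so no inbound connection is ever needed.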

Another warning that decentralized systems require tracking the conflicting interests of external agencies: not every other computer on the network is necessarily ‘helpful’ when it answers back!

A sign of how paranoid they were about not keeping records: Defcon 12 FAQs

$80 USD covers the entire 3 days of the conference
Cash only. Absolutely no checks, credit cards, money orders, traveler’s checks or foreign currency will be accepted. There is no on-line pre-registration. Everyone must register on-site.

By far the most intriguing single session was hearing about the state of post-9/11 computer security investigation priorities from the horse’s, well, choose the equine body part of your choice: DEFCON 12: Feds Yes, Anarchy No.

DEFCON 12’s de facto guest of honor was Robert Morris, National Security Agency’s Chief Scientist from 1986 to 1994. With his scraggly beard and unfiltered Camels, Morris would have blended in well with the retirees pumping quarters in the slot machines over at Sam’s Town. Morris was quite happy wandering about talking to the gawking youth and dropping hints that he didn’t really like John Ashcroft.

Morris was one of a group of current and former U.S. government employees who appeared on the “Meet the Fed” panel on Saturday afternoon. It was the first time in several years that Feds had officially spoken at DEFCON, and the panelists used the first half of their presentation as an ad hoc recruiting pitch. Uncle Sam wants talented and clean (i.e., no arrests or documented bad behavior) computer security people. “You can get up to 70 or 75 percent of your student loans forgiven,” repeated one official. The U.S. government has a large number of open computer sec positions to fill and has a tough time retaining employees. Entry-level employees join up, learn the ropes, and then end up departing 3 to 4 years later for more lucrative private sector positions.

Another lesson we learned: far too many businesses have decided to place WiFi networks between the point-of-sale machines that take credit card information and the gateways or front-end processors that do the credit card authorization, so hackers discovering sensitive data on wireless networks is going to become an ever-more-common occurrence. It’s not that WiFi is inherently less secure than a wire; rather, getting onto an open wireless network takes far less effort than tapping a physical line, so these cracks become far more accessible. And since most Internet applications still don’t encrypt on the wire, such systems are especially vulnerable in an environment where anyone can read what gets sent over the air.