Thursday, December 30, 2010

The PS3 Private Key Break

At the December 2010 Chaos Communication Congress in Berlin, a group calling itself "fail0verflow" announced it had succeeded in bypassing a number of the Sony PlayStation 3's restrictive measures, including recovering the private key Sony uses to sign code. Sony was using a technology called public key cryptography to ensure that only programs it approved could run on the PS3. Public key cryptography uses one key, which is kept private, to create a mathematical puzzle that is very hard to solve but can be unlocked using a different key that can safely be made public. These puzzles can then be used to create electronic signatures that can be verified by any computer that has the public key. Absent unforeseen mathematical breakthroughs, if the public key algorithm is implemented correctly and the private key is kept secret, no one can forge such a signature.
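
For intuition, here is a toy example of that sign-with-private, verify-with-public pattern. This is insecure "textbook RSA" with tiny numbers, chosen only for readability; it is not the scheme Sony uses, which is discussed below.

    # Toy "textbook RSA" signatures: tiny, insecure numbers, for intuition
    # only. (Requires Python 3.8+ for pow(x, -1, m).)
    p, q = 61, 53
    N = p * q                          # public modulus
    e = 17                             # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, kept secret

    def sign(msg_hash):                # only the private key holder can do this
        return pow(msg_hash, d, N)

    def verify(msg_hash, sig):         # anyone holding (N, e) can check
        return pow(sig, e, N) == msg_hash

    sig = sign(42)
    print(verify(42, sig), verify(43, sig))   # True False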

The PS3 private key recovery was made possible by an improper implementation of the public key cryptography system Sony employed, the Elliptic Curve Digital Signature Algorithm, or ECDSA. ECDSA requires a secret random number to be generated for each signature, and those secret numbers must be different for every signature. If two signatures are found that used the same secret number, even if that number itself is not known, the private key can be recovered by simple algebra. Sony reportedly used the same secret number for all of its signatures. As a result, anyone with the now-public private key can sign code in a way that is indistinguishable from a signature issued by Sony.
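
To see why the algebra is simple, note that each ECDSA signature value s satisfies s = (z + r*d)/k mod n, where d is the private key, z is the message hash, k is the secret random number, and r is a value derived from k. Two signatures made with the same k share the same r, so s1 - s2 = (z1 - z2)/k mod n, which reveals k, and from k the private key d falls out. Here is a minimal sketch in Python, using a toy prime modulus and a stand-in for r rather than real elliptic-curve arithmetic:

    # A toy demonstration of the nonce-reuse attack. The modulus n and the
    # value r are stand-ins: in real ECDSA, n is the curve's group order and
    # r comes from elliptic-curve arithmetic on k. Only the modular algebra
    # matters here. (Requires Python 3.8+ for pow(x, -1, n).)
    n = 2**127 - 1        # a prime, standing in for the curve's group order
    d = 123456789         # the private signing key we will recover
    k = 987654321         # the per-signature secret number that was reused
    r = pow(7, k, n)      # stand-in for the value derived from k

    def sign(z):
        # s = (z + r*d) / k  mod n
        return (pow(k, -1, n) * (z + r * d)) % n

    z1, z2 = 1111, 2222   # hashes of two different messages
    s1, s2 = sign(z1), sign(z2)

    # Same k in both signatures, so s1 - s2 = (z1 - z2)/k mod n:
    k_found = ((z1 - z2) * pow(s1 - s2, -1, n)) % n
    # With k in hand, solve s1 = (z1 + r*d)/k for the private key d:
    d_found = ((s1 * k_found - z1) * pow(r, -1, n)) % n

    assert (k_found, d_found) == (k, d)
    print("recovered private key:", d_found)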

This flaw was not in any of the PS3 software or in the IBM Cell processor at the heart of the PS3. It was a mistake in the program used to sign software approved to run on the PS3, a program that presumably runs only on some highly guarded server at a facility controlled by Sony. The signing program could have been fixed by adding one line of code: a call to a strong random number generator, such as /dev/random, to produce a new random number for each signature. You may have to go back to the U.S. and U.K. Venona exploit during the Cold War to find an example of a large organization (the USSR in that case, which reused one-time pad pages that should never have been used twice) screwing up what should have been an unbreakable system.
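
As a sketch of that one-line fix, in Python the per-signature secret number might be drawn like this; the secrets module reads from the operating system's cryptographic random number generator, the same pool behind /dev/random and /dev/urandom on Unix systems:

    # Draw a fresh, unpredictable secret number for every signature instead
    # of reusing one.
    import secrets

    n = 2**127 - 1   # stand-in for the curve's group order, as in the sketch above

    def fresh_nonce():
        # A new, uniform random k in [1, n-1], generated once per signature
        # and never reused.
        return secrets.randbelow(n - 1) + 1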

The lesson from this fiasco is that in cryptography, details are crucial. Cryptographic software must be intensively reviewed by experts, not by programmers who recently read a book or took a course on the subject. The best solution may be open source implementations that have been available for public scrutiny for many years, and even then, an expert review of the final software is still essential. And all users of ECDSA and similar algorithms should review their signing software to ensure it does not contain the same bug. This barn may contain more than one horse, so it's worth checking the door.

Here is a YouTube link to the complete presentation: http://www.youtube.com/watch?v=hcbaeKA2moE 

Wednesday, December 22, 2010

Thoughts on Wikileaks and the U.S. Government

Governments have a right and a duty to keep some things secret. Whether it's individual medical records at government hospitals or nuclear weapon design parameters, some data should never become available for all to see. That said, it's the government's responsibility to take measures to protect those secrets. Leaks by individuals trusted with secrets have always been a problem, but computer technology amplifies the danger enormously. Modern computers are built to share information, not protect it, and the price and size of data storage have dropped so low that 250,000 classified diplomatic cables can comfortably fit on a MicroSD storage card the size of a fingernail, selling for under $10. Keeping something that small from moving in or out of even the most secure facility is nearly impossible.

Nor is it enough to restrict access to people who have been vetted for security clearances. Assuming the vetting process is 99.9% effective and 1,000 cleared people have access to the data, the probability of at least one leak is 63% (1 - 0.999^1000).
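
The arithmetic, checked in Python:

    # If each of 1000 vetted people independently has a 0.1% chance of
    # leaking, the chance that at least one leaks is one minus the chance
    # that nobody does.
    print(1 - 0.999 ** 1000)   # about 0.632, i.e. 63%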

When I drove my son back to school after Thanksgiving this year, I took him to the campus bookstore to buy him a new laptop. Driving home, I got a call from my credit card company asking me to verify a recent transaction. Their computers apparently found it odd that I had made a large purchase 100 miles from home. Now, a credit card company has a strong interest in quickly stopping misuse of a stolen card, but doesn't the U.S. Government have an even stronger interest in protecting classified information? It's not as if the risk Wikileaks posed was unknown: the U.S. Army wrote a report about the security hazards posed by Wikileaks, a report that itself found its way onto the Wikileaks site.

After the 9/11 attacks in 2001, there was a strong push to break down the barriers sequestering information that, if shared, could prevent future attacks. Large amounts of classified data, including hundreds of thousands of diplomatic cables and wartime incident reports, were made accessible over special networks restricted to people cleared at the Secret level. Was software developed in parallel to track unusual usage by individuals? Was it deployed in the field? If not, why not? Who made the decision to allow sharing without adequate precautions? Is such software deployed universally now? Those are the questions the U.S. Government should be pursuing, instead of the embarrassingly pointless effort to keep people from reading cables that have already been published.
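
For illustration only, here is a minimal sketch of the kind of usage tracking those questions have in mind, along the lines of the credit card company's check described above. The log format and the threshold are hypothetical, not a description of any deployed system:

    # Flag users whose single-day document-access count is wildly out of
    # line with their own history. A real system would be far more subtle;
    # this only shows the shape of the idea.
    from collections import defaultdict
    from statistics import mean, stdev

    def flag_unusual_users(access_log, threshold=4.0):
        # access_log: list of (user, day, docs_fetched) tuples.
        per_user = defaultdict(list)
        for user, day, count in access_log:
            per_user[user].append(count)
        flagged = []
        for user, counts in per_user.items():
            if len(counts) < 2:
                continue  # not enough history to judge
            m, s = mean(counts), stdev(counts)
            if s > 0 and max(counts) > m + threshold * s:
                flagged.append(user)
        return flagged

    log = [("analyst1", day, 20 + day % 3) for day in range(30)]
    log.append(("analyst1", 30, 250000))  # a quarter-million cables in one day
    print(flag_unusual_users(log))        # ['analyst1']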

So far the leaked cables have caused more embarrassment than danger. Even the cable listing potential terrorist targets worldwide should have only a modest impact. It's not as if the terrorists had run out of potential targets they already knew about.

My big worry is highly classified data that was not leaked, particularly the software used to design nuclear bombs. The vast majority of U.S. nuclear weapons were designed between 1945 and 1983 using a series of ever more powerful supercomputers. However, none of those supercomputers comes close to the power of the average desktop computer in use today. Indeed, most were puny compared to the Macintosh G4 released in 1999 that was the subject of the famous Apple Tank Ad, which you can watch at http://www.youtube.com/watch?v=7Eb1yih5kNY. Those G4s now gather dust in thousands of basements. Software programs for bomb design would likely take up much less space on a memory card or thumb drive than all those diplomatic cables. What is being done to ensure they never leak?

Getting started...

Computer and Internet security is a mess, and it's getting worse. Big corporations and the government seem unable to solve their own security problems, much less lead the way to a safer information technology world. I hope to spread some useful ideas and common sense here, and one more blog on the topic can't do much damage. For starters, check out my Diceware.com page, with suggestions on picking and remembering strong passphrases and passwords.