Charlie Miller On Hacked Batteries, Cloud Security, And The iPad

Security In The Cloud

Alan: What about "mathematically proving" that software is "correct" and performs as expected to meet the design requirements? That's done with avionics software. Can we do that with regular software? Can you mathematically prove that something is secure, or at least impervious to specific attack patterns like fuzzing or SQL injection?

Charlie: It's probably possible sometimes, but it is not done. We're still really in the Stone Age of software security. At this point, the only practical thing to do is fuzz, audit, and analyze the hell out of it. Microsoft fuzzes everything, but obviously there are still plenty of bugs in its stuff. I've found critical bugs in software that had been analyzed by static analysis tools. Research indicates that different fuzzers find different bugs. Finding all (or even most) critical software vulnerabilities is really hard, time-intensive, and expensive. NASA might have the time and money to make sure the software on its Mars rover is perfect, but software vendors want to ship software and make money, and they're willing to live with a "few" vulnerabilities.
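
Charlie's "fuzz, audit, and analyze" answer is easier to picture with a concrete example. The sketch below is a minimal mutation fuzzer in Python, not anything Miller describes using: parse_record() is a hypothetical stand-in for a real target, with a deliberately planted bug, and the harness simply overwrites a few random bytes in a known-good input and reports any input that makes the target throw.

    import random

    def parse_record(data: bytes) -> None:
        # Hypothetical target with a planted bug so the harness has something
        # to find; in practice this would be the parser you actually care about.
        if data and data[0] == 0x7F:
            raise ValueError("parser rejected malformed header byte")

    def mutate(seed: bytes, max_flips: int = 8) -> bytes:
        # Corrupt a known-good input by overwriting a few random bytes.
        buf = bytearray(seed)
        for _ in range(random.randint(1, max_flips)):
            buf[random.randrange(len(buf))] = random.randrange(256)
        return bytes(buf)

    def fuzz(seed: bytes, iterations: int = 50_000) -> None:
        crashes = []
        for i in range(iterations):
            case = mutate(seed)
            try:
                parse_record(case)
            except Exception as exc:
                crashes.append(case)
                print(f"iteration {i}: exception on input {case.hex()} ({exc})")
        print(f"done: {len(crashes)} crashing inputs out of {iterations}")

    if __name__ == "__main__":
        fuzz(seed=b"RECHDRv1:payload")

Real fuzzers layer coverage feedback, corpus management, and crash triage on top of this basic loop, which is part of why, as Miller notes, different fuzzers tend to find different bugs.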

Alan: Besides better software, what about hardware issues? Joanna Rutkowska published the SMM attack a couple of years ago, and you recently talked about your firmware attack on Apple batteries. How do we approach this problem?

Charlie: This is really hard. Another example you left out is Ralf-Philipp Weinmann and his mobile baseband attacks. There are lots of different chips in all of our electronics that you don't think about. This is one of the reasons I was interested in the battery research. The worst thing about hardware is that it is hard and expensive to analyze. We can all download Internet Explorer and audit the code or fuzz it, but it takes equipment and special skills to look at hardware. I probably spent $1,000 on equipment for the battery research, and that was just for fun. These barriers make the systems less secure because they discourage researchers like me from analyzing them.

Alan: Where does cloud computing fit into this? You're putting a lot of trust in the company developing the cloud software and the company actually hosting the cloud. If their software is bad, or worse, if their privacy policies are incomplete or their employees are unethical, you are at significant risk of data compromise. In addition, a big database like that would be a prime target for hackers. On the other hand, companies like Amazon, Apple, Microsoft, and Google should be better equipped than the average end user when it comes to security.

Charlie: Yes, cloud security is tough because it can't be independently validated very easily. We can all tear apart MS PowerPoint to see what it does with our data, but when you ship your data off to the cloud, researchers like me cannot look at the software to try to find bugs. In fact, poking around on a provider's Web site is illegal. Using software that is not on your system, and thus cannot be torn apart and reverse engineered, means you are putting a large amount of trust in whoever is writing that software. Guys like me won't be able to help you. As for whether these big companies are better than the average person, I'm not sure. Sony might be a good counterexample to your argument.