How Viable Are Heterogeneous Computing Environments?
Alan: Maybe the right approach is application security by isolation and services security by design. The servers running the cloud should have many of their applications secured by isolation, but you still have to rely on some security by design?
Joanna: Sure. As I said, security of server-side software is a different field than security of desktop systems.
Alan: Along those lines, thinking as a biologist, it seems as if we, as a community, should avoid standardization on any single piece of hardware or software. When a hospital buys new computers, it ends up choosing hundreds of the same model. If the motherboard or CPU is found to have a flaw, the entire hospital is at risk of attack. Should big organizations, going into the future, consider a heterogeneous computing environment? Some Intel PCs and some AMD PCs? Some Windows, some Mac, and some Linux?
Joanna: Well, that is actually a "Security by Obscurity" approach. If we care about DoS attacks, then surely it is helpful. If we, however, are afraid of information being stolen, which implies a somewhat more targeted attack, then I guess it only provides a false sense of security--I assume the hospital would still use some popular OS, not a home-brew, recompiled Linux, right?
Alan: Depends on how sophisticated the hospital is. A lot of infrastructure is run in hospitals on *nix machines, while most user machines are Windows or Mac. Many hospitals rely on Citrix-based terminals and the like.
Joanna: But there would still be some mainstream Linux distros, not recompiled, customized OSes. The CEO would still use a specific OS (either Windows or Mac or maybe even some Linux, but a popular distro). For the attacker that is going after data records, it would be irrelevant what the other computers are using.
Alan: Well, it’s the layered approach. You can go directly for the information that is stored somewhere in some cloud, or you can go for the end-user systems that access the information from the cloud. So, if a bug were discovered in Windows that allowed full compromise of the system, an organization with heterogeneous computing capabilities could quickly take all Windows machines off the network and still operate using the Linux/OS X machines.
Joanna: As I said earlier, this is good in mitigating DoS attacks, but not information leak attacks.
Interestingly, a variant of this "Security by Obscurity" approach has been widely adopted in recent years on most mainstream OSes. For example, address space layout randomization (ASLR) was first introduced on Linux by the PaX patch, later brought to Vista, and is now also coming to Mac OS X. When we think about it, ASLR is nothing else than Security by Obscurity.
Another anti-exploitation technique is stack protection through so-called "canaries," magic values placed on the stack to detect stack overflows. That’s, again, nothing but Security by Obscurity. It was introduced by StackGuard on Linux a decade ago, and for quite a few years now it has also been present in Microsoft's Visual Studio compiler.
So, rather than investing lots of money and effort into buying heterogeneous systems for a corporation, which will likely provide no additional security, I'd recommend using those dedicated anti-exploitation techniques, which are also based on this concept of providing a somewhat heterogeneous environment.
Alan: Unless you were paranoid and used all of those dedicated anti-exploitation approaches on multiple machines.
Joanna: And what benefit would it offer, besides DoS protection?