I’ve mentioned a number of software laws in various posts, like Cargill’s Ninety-Ninety Rule or Occam’s Razor. And there are tons of laws you probably already know, like Metcalfe’s Law or Moore’s Law.
I’ve found a very thorough list of laws regarding software development (I highly recommend reading that link; I’ll wait, go ahead). But from that list, we seem to have developed a complete blind spot for five in particular. Let’s look at these five and how our collective ignorance of them continues to impact software development today:
Law #1: Amdahl’s Law
Gene Amdahl first published this idea in a 1967 paper. The law punctures the mistaken notion that “All We Need Are More Parallel Processors and Our Software Will Run Faster”: the speedup you can get from parallelizing a program is capped by its serial portion, no matter how many cores you throw at it.
The Damning Evidence: Pop quiz: have you bought a new multi-core machine in the past four years? Were you a little disappointed when you checked the processor usage and found that not every one of those shiny new cores was busy all the time, no matter which of your apps you ran?
We buy new hardware with the mistaken impression that our old programs will run even faster than before, because we expect our software to take advantage of all those friggin cores! But software never runs as fast as we expect it to on multi-core hardware, because the parallel component of the program is often missing, underdeveloped, or poorly understood by the developer. Thus, our software continues to disappoint us even on shiny, new multi-core hardware.
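To see why, plug some numbers into the formula behind the law: speedup = 1 / ((1 − p) + p/n), where p is the fraction of the program that can run in parallel and n is the number of cores. Here’s a minimal sketch; the percentages are illustrative, not measurements of any particular application:

```java
// Amdahl's Law: speedup = 1 / ((1 - p) + p / n)
public class AmdahlDemo {
    // p: fraction of the program that can run in parallel, n: number of cores
    static double speedup(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    public static void main(String[] args) {
        // A program that is only 50% parallel barely notices the extra cores:
        System.out.printf("50%% parallel,  4 cores: %.2fx%n", speedup(0.50, 4));  // ~1.60x
        System.out.printf("50%% parallel, 16 cores: %.2fx%n", speedup(0.50, 16)); // ~1.88x
        // Even with infinite cores, the serial half caps the speedup at 2x.
        // A mostly-parallel program (rendering, encoding) fares far better:
        System.out.printf("95%% parallel, 16 cores: %.2fx%n", speedup(0.95, 16)); // ~9.14x
    }
}
```

The serial portion, not the core count, is what decides how fast your program gets.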
Exceptions: Some applications have been expressly written to be massively parallel and they continue to kick ass and take names on new multi-core hardware (e.g. rendering, scientific and encoding applications). By and large, most applications simply don’t benefit from those extra cores because they weren’t written to do so.
Law #2: The Law of False Alerts
First introduced by George Spafford in this article, the law states that the more false or erroneous alerts users are presented with, the more they will ignore real alerts in the system.
The Damning Evidence: Windows Vista is the classic current example. Every bloody operation in it required your permission via a User Account Control prompt. After a while, you just madly clicked “Yeah, sure, whatever…” for every warning that popped up. This, of course, robs the operating system of any ability to protect you from a real threat, because the feature has already trained you to ignore it.
Of course, people still design applications like this:
- “Are you sure you want to delete?”
- “No, really, are you REALLY sure you want to delete?”
- “OK, look, I’ve asked already but just so I can’t be blamed for anything, are you SUPER-DUPER-ABSOLUTELY, 110% sure you want to delete?”
Stop the insanity. If they click delete and they weren’t supposed to, how about offering an undo operation? Too hard, you say? Then you’re not trying hard enough. Don’t punish your users for your bad design.
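In case “undo” sounds exotic, here’s a minimal sketch of the idea: delete immediately, but keep the deleted item around so it can be restored. The class and method names are hypothetical, and a real implementation would also restore the item to its original position:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Hypothetical sketch: no "Are you sure?" dialog, just an immediate delete
// that can be reversed.
class UndoableList<T> {
    private final List<T> items;
    private final Deque<T> trash = new ArrayDeque<>();

    UndoableList(List<T> items) {
        this.items = items;
    }

    void delete(T item) {
        if (items.remove(item)) {
            trash.push(item); // remember it instead of nagging the user
        }
    }

    void undo() {
        if (!trash.isEmpty()) {
            items.add(trash.pop()); // bring back the most recent deletion
        }
    }
}
```

The point isn’t the data structure; it’s that recovering from a mistake costs the user one click instead of three warnings before it.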
Law #3: Jakob’s Law of Internet Experience
This one comes from web usability guru Jakob Nielsen, who observes that users spend only a small fraction of their time on your site compared to all the other sites they visit. Your site’s experience should therefore be similar to other sites’ to minimize the learning curve and maximize usability.
The Damning Evidence: Well, things like Firefox Personas (which distract your users from the actual content of a site) aside, we still can’t seem to come up with a consistent way to build user interfaces for the web. Thanks to Web 2.0, everyone is now trying to copy the success of Facebook, Twitter, and other social networks by creating wild, experimental web pages that are just plain awful to use.
Don’t get me wrong here: I’m not saying different is bad, I’m saying different is hard to get right. Users (especially “Normals”) don’t like being made to think about how to use things. But that doesn’t seem to stop us from creating web pages with crazy stuff on them.
Exceptions: Sometimes a user interface is a giant evolutionary step that simply lies outside the boundaries we’ve come to expect, and that’s acceptable. The iPhone was a perfect example: no one had really mastered the touch interface until Cupertino & Co. came out with it, and they didn’t exactly follow any of the old-school rules. But it was still a major success and now sets the standard for smartphones. However, most everyone else thinks they’re creating the exception when they’re really just breaking the rules poorly.
Law #4: The Pesticide Paradox
Attributed to Boris Beizer, the law states that every method you use to prevent or find bugs leaves a residue of subtler bugs against which those methods are ineffectual.
The Damning Evidence: Things like Test-Driven Development and unit testing give us the false impression that we’ve quashed the major bugs in the system, when all we’ve really done is quash the obvious ones, leaving the more subtle, painful, and difficult bugs behind. Many of these bugs involve concurrency or particularly complex data conditions that are difficult to express as unit tests.
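Here’s a minimal, hypothetical sketch of the kind of residue I mean: a single-threaded check of this counter passes every time, while the race condition in the unsynchronized increment only surfaces under concurrent load, which the test never exercises:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical example: the "obvious" behavior is easy to unit test,
// but the subtle concurrency bug slips right past that test.
class Counter {
    private int count = 0;
    void increment() { count++; } // not atomic: read, add, write
    int value() { return count; }
}

public class PesticideDemo {
    public static void main(String[] args) throws InterruptedException {
        // What a typical unit test checks: single-threaded, deterministic.
        Counter tested = new Counter();
        for (int i = 0; i < 1_000; i++) tested.increment();
        System.out.println("Single-threaded: " + tested.value()); // always 1000

        // What production does to it: concurrent increments get lost.
        Counter shared = new Counter();
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int t = 0; t < 8; t++) {
            pool.submit(() -> {
                for (int i = 0; i < 100_000; i++) shared.increment();
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("Concurrent: " + shared.value()); // usually well under 800000
    }
}
```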
Before anyone rants in the comment section claiming I think TDD is bad or unit testing is evil, please hear me correctly: unit testing and TDD can leave a false sense of security that we’ve managed to create stable software. They are a starting point for more complete testing, but they are not the end. The meaningful problems are often in integration with other systems and modules, which is often left out of test plans because of time constraints, schedule pressure, laziness, and sometimes plain arrogance.
Exceptions: Small, simple systems rarely suffer from these issues because they are much easier to test. This is mostly a complex-software problem, at the level of enterprise development, large applications (e.g. Microsoft Word), or operating systems.
Law #5: Fisher’s Fundamental Theorem of Natural Selection
While this law stems from genetic research by R.A. Fisher, the application in software is somewhat obvious: The more highly adapted an organism becomes, the less adaptable it is to any new change.
The Damning Evidence: We strive to create complex, interesting, and highly useful frameworks: Hibernate, Struts, Flex, ExtJS, and jQuery, to name a few. But every version we release generates new requests from users for missing features or enhancements. Each change adds more complexity, and the more complex the software, the lower the chance those changes can be easily accommodated in subsequent versions.
For example, Struts went through a major rewrite for version 2.0, which speaks volumes about the original version’s adaptability to change. Spring’s major update for AOP was a breaking change from 1.0. ExtJS did the same between its 1.0 and 2.0 releases.
Exceptions: Probably none; this seems to be the inherent nature of frameworks. But if you know of something, please prove me wrong in the comment section. I’d love to hear about some piece of software that didn’t follow this rule.