“Every man believes he knows how to do three things: drink liquor, woo a woman, and write.” – anonymous
You’ve just installed a shiny new multimillion-dollar computer system used to dispatch ambulances in a large, metropolitan city. You and your colleagues have spent years developing the software, which allows first-responders to efficiently locate the nearest ambulance in any part of the city. When an emergency call comes in, your computer immediately pinpoints the nearest ambulance, alerts the dispatcher, and sends help on the way. It’s a triumph.
Except that it doesn’t work. In fact, it’s so bad that experienced dispatchers would rather use a pencil and paper instead of your horribly buggy system. Rather than speeding up response times, ambulances are getting delayed or misdirected. Some aren’t even dispatched at all, long after the emergency call has come in. These unexplainable delays have probably cost lives. And it’s your fault.
Worse yet, this isn’t a hypothetical scenario. It really happened, in London in 1992 – and again in 2006, and again in 2011, and again in 2012. The London Ambulance Service’s repeated computerization efforts were spectacularly counterproductive, plagued with problems and bugs and crashes, all apparently caused by poor programming.
Whom do you sue over a catastrophe like this?
Are you, the individual programmer, at fault? Is your team of fellow programmers liable? Or does your employer take the heat for your group failure? Is there insurance for this kind of thing? Does somebody lose their job, or drive the company into bankruptcy? Who, exactly, pays for this fiasco?
As with most legal issues, there’s no clear answer. But a lot of smart people are working hard to try to draw the black-and-white lines that define a programmer’s liability. If your code costs lives, should you be allowed to keep coding?
We accept that some software flaws are probably inevitable, just like there are occasional flaws in steel or iron. Once in a while you get a bad Ethernet cable, or a bad shrimp at the restaurant. Mistakes happen. It’s not a perfect world.
But software is always created by someone; it’s not a random natural variation in raw materials. And there are ways to guarantee – or at least improve – the reliability of the software we create. All too often, though, we ignore the rules, skip the classes, or skirt the procedures that help us generate better code, even though we know we shouldn’t. So who carries the ultimate responsibility when our system crashes – literally – as in the case of a car’s faulty drive-by-wire system?
Set aside for a moment the impenetrable thicket of actual laws in this and other countries; how do you think this should be handled? If you could wave a magic wand and declare the law of the land, how much responsibility would programmers bear for their own creations? Does it rest with the employer or with the individual? And how much buggy code is okay – “it’s just one of those things” – versus actual, legal liability? How bad does it have to get before someone can sue you?
There’s a good article on this topic in The Journal of Defense Software Engineering. In it, author Karen Mercedes Goertzel points out that we don’t even know what laws cover software. Is it a product or a service? The distinction makes a big difference to the applicable laws. Software is obviously intangible (CD-ROMs and USB sticks don’t count), so that makes it seem like a service. But it’s also a thing… sort of. Goertzel points out that software isn’t really sold; it’s licensed, so that makes it a service in the eyes of the law. But at the same time, software (especially commercial software) is delivered from a single vendor to a multitude of unknown customers in an open market, which makes it a product.
It doesn’t get any easier from there. If your code is faulty, you might be guilty of violating your warranty, even if there wasn’t much of a warranty to begin with. We’re all familiar with click-through EULAs (end user license agreements), which appear to absolve the software vendor of any and all liability, up to and including nuclear holocaust, zombie apocalypse, and loss of data. But apart from that, most software (again, especially mass-produced commercial software) comes with an implied warranty of fitness for its ordinary purpose. Word-processing software is expected to, well, process words. A game is expected to be at least somewhat entertaining (No Man’s Sky excepted). There are certain baseline functions that your software is expected to perform, even if you never say so anywhere.
That puts you under the jurisdiction of contract law. If your code doesn’t do what it’s supposed to do, you’re in breach of contract. That strategy has been employed successfully by plaintiffs to sue software vendors who delivered buggy products, or products that weren’t exactly buggy but that didn’t deliver the expected feature set.
This all sounds very squishy and unclear, and it is. But some situations are obvious no-nos. You can be sued for fraud if you knowingly misrepresent or withhold information from a potential customer. For example, it’s not okay to hide the fact that you know about a vulnerability in your code, even if you intend to fix it later. This isn’t the same as having a vague suspicion that bugs might be lurking somewhere; it applies when you already know about a specific vulnerability but choose not to disclose that fact. You’re also on the hook if you lie about inadequate testing. If your code hasn’t yet been through all the tests that you’d normally perform, you need to say so. If that testing later uncovers something that you could have found earlier, you could be liable for not disclosing it sooner.
Contracts, warranties, and misrepresentations typically limit your financial exposure to the price of the product. In other words, a simple refund. That’s good news for you. But lawyers often categorize software malfeasance as product liability, a branch of tort law. Under tort law, you can be sued for a whole lot of money, far beyond just the price of the software or the system that runs it.
On the plus side, product liability applies only when there’s actual harm or damage, not just inconvenience or misrepresentation. On the negative side, there’s been actual harm or damage. Your software has caused real injury, so the stakes are high.
One insidious aspect of product liability is that you’re responsible for other people’s malicious actions. If a third-party hacker finds a vulnerability in your code and exploits it to cause damage, you’re answerable for it. In short, security is your responsibility, and lax security puts you on the hook for any fallout from hacking. Good luck with that.
Don’t panic just yet. Throughout all of this, lawyers rely on the concept of “reasonable care.” That is, you’re not expected to be perfect, just good. The prevailing legal view is that it’s impossible (or nearly so) to create perfectly secure and reliable code, so nobody’s handing out death sentences for every little bug. Some kinds of mistakes are okay. But you are expected to exercise “reasonable care” in writing your code, testing it, and ensuring that it does what it’s supposed to do – while also not doing things it’s not supposed to. You need to be professional, in other words.
So… what exactly constitutes “reasonable care”? Nobody knows. Not every medical doctor is expected to save every single patient she sees – just the ones that a reasonably proficient and properly trained doctor could aid. Medical failure doesn’t automatically trigger a malpractice suit, and, with programmers, not every bug is grounds for a legal case. But beyond some hard-to-define point, a panel of experts might decide that your code wasn’t properly and professionally developed, and hold you liable.
To help clarify that situation, there are movements afoot to certify programmers, the same way that doctors, lawyers, and accountants are certified. On the one hand, that seems like an onerous and unnecessary intrusion into our business. Who wants to pass a Programmers’ Bar Exam every year? Or prove they’re staying abreast of all the latest developments in coding, debugging, and certification? What a nuisance.
On the other hand, certification confers certain benefits – like legal protection. If you’re officially certified, you’re automatically covered by “reasonable care.” You’re a professional, not a hacker, and your participation in a project means it’s a stand-up job done to professional levels of quality. Nobody can impugn your programming skills when you’ve got your certificate right there on the wall.
Software lawsuits aren’t going away, whether they’re under the doctrine of contract law, implied warranty, product liability, tort, malpractice, or fraud. Nor should they. Programming isn’t a special class of profession immune from liability, and coders aren’t above the law any more than plumbers, doctors, or heavy-equipment operators. Clarity is what we need. Clarity about the limits of liability, the ways to avoid it, and the remedies available when reasonable care is not exercised. Once we all know the rules, we can play the game on a level field.