There is a legal and ethical concept known as “a duty of care.” It’s simple, really: it’s the idea that a person can be held liable for harm to someone else if that harm was foreseeable and resulted from that person’s actions.
If you knowingly take a shortcut around fire regulations when you build a building, you have breached your duty of care to the building’s eventual occupants. If something happens as a result of your actions or inactions, you can be found liable for the resulting damage. Simple, right?
Well, over the weekend, the U.S. National Security Agency (NSA) was back in the news after the discovery of the Heartbleed computer bug, a software coding mistake that has had software makers scrambling to close a loophole that gives hackers a way around encryption software.
On Friday, the White House said it had had no prior knowledge of Heartbleed, despite revelations by NSA whistleblower Edward Snowden showing that the agency had been seeking just such a flaw in web-based encryption.
But the revelations didn’t end there. The question was, if the NSA did find a major backdoor leak, would it move to protect the security of hundreds of thousands of web users, or would it simply exploit the flaw for its own ends?
Well, a little of both.
“This process is biased toward responsibly disclosing such vulnerabilities,” an administration spokeswoman said.
What does that mean?
Well, it’s being taken to mean that, in most cases, NSA-discovered flaws that allow backdoor entry into computers and software will be revealed to manufacturers so that they can be repaired. In a broad range of other cases — if the NSA can show a strategically defensible use for the flaws — it may not reveal what it knows, even to software manufacturers.
That treads on some particularly significant moral and legal ground. You can understand why a spy agency wouldn’t want to disclose its strategic weapons: four particular software flaws, for example, let the NSA damage scores of centrifuges being used in Iran’s nuclear enrichment program.
But Heartbleed raises an interesting question: the flaw has the potential to do billions of dollars in damage to ordinary citizens who had every expectation of protection by their own government. In this particular case, the U.S. government has said it did not know about or exploit the flaw, but that same government never said it wouldn’t have used the flaw had it known about it.
If government agencies are willing to simply sit on the sidelines and watch damage being done to citizens, have they not breached a clear-cut duty of care? Sure, the defence is an old and careworn one, something we hear more and more from governments: it’s the war on terror, and the end justifies the means.
Well, no, it does not.
When a government can’t be trusted to always put its citizens first, the government has become part of the problem.