
The Delta Between Law and Technology

Updated: Nov 26

The Spectre of Meltdowns Past

Yes, I know that Spooky Season is over, but the ghosts of vulnerabilities past have come back to haunt me. One of the great fun things about being a cybersecurity attorney is the ability to learn all sorts of cool tech stuff. Seven years ago, I remember getting a lesson in the hardware vulnerabilities that could be exploited in side-channel attacks impacting nearly every computer in the world. Because replacing every CPU manufactured since the mid-1990s was onerous (to say the least), the onus fell on software providers like Microsoft to mitigate the risk. This might also explain why Intel was facing 32 class action lawsuits (not sure how many are still in play) while I am unaware of any against Microsoft.


The level of technical knowledge necessary to understand Spectre and Meltdown from a legal perspective was pretty low – most claims were based on contract rather than strict liability or generalized torts. The first major lawsuit filed against CrowdStrike for the failures resulting from an error in its Falcon endpoint software has landed with a $500 million claim from Delta. For those lucky enough not to have caused such widespread outages, I’m going to break down the jargon so that this can be a great learning experience on how the law might treat liability for software.


You Keep Using That Word…

Cyber folks, jump ahead. Legal folks, you might find this helpful.


Before we get into the Delta complaint, though, we’re going to need a vocab lesson. Why? Because Delta uses a number of these terms extensively in its complaint, and to paraphrase Inigo Montoya, I don’t think they mean what Delta thinks they mean. So, let me give you an idea as to what these terms – and some terms necessary to explain what really happened – mean to the cybersecurity community.


  1. Attack – The intentional exploitation of a vulnerability by a threat actor.

  2. Backdoor/door – Command sequences or similar (think of a “secret knock” on a door) that allow individuals with knowledge of the backdoor to bypass normal access restrictions. Backdoors are generally undocumented for regular users and can serve both legitimate and illegitimate – usually malicious – purposes.

  3. Bug – An unintentional error or defect in a computer program that causes it to produce incorrect or unexpected results or to behave in unanticipated or unintended ways. Code defects can lead to malfunction.

  4. Exploit – Anything (such as a human action or a computer program) that takes advantage of a vulnerability in a system, typically for malicious purposes.

  5. Hack – An overused, undefined term used mainly by the media when they lack the capability or willingness to describe what actually happened. Generally accepted in the cybersecurity community as an indicator of a blowhard who wants to use jargon in an attempt to sound important while hiding the fact that they lack the necessary skill to use precise language.

  6. Malware – Software or code designed to perform or assist in the performance of malicious activities on a computer system. Malware is often used to exploit vulnerabilities.

  7. Risk – The probability that a threat will exploit a vulnerability to cause harm to an asset. In the professional world of risk management, this is a calculable number based on a probability function and expected harm measured in dollars (see the sketch after this list).

  8. Threat – Any action or inaction that could cause damage, destruction, alteration, loss, or disclosure of an asset or that could block necessary access to or prevent maintenance of an asset. Threats are typically expressed as an impact to one or more of the three elements of cybersecurity: confidentiality, integrity, or availability.

  9. Threat Actor – An individual or group that causes a threat. Threat actors can be outsiders or insiders and may act intentionally or negligently.

  10. Vulnerability – A weakness in an asset (which is anything in an environment that needs to be protected – including people, process, and technology) that can be leveraged by a threat actor to cause an unpermitted action.
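To make item 7 concrete: quantitative risk management typically expresses risk as an annualized loss expectancy (ALE) – the cost of a single incident multiplied by how often it is expected to occur. Here is a minimal sketch of that arithmetic; all figures are hypothetical, invented purely for illustration.

```python
# Minimal sketch of quantitative risk analysis (hypothetical figures).
# SLE (single loss expectancy)        = asset value x exposure factor
# ARO (annualized rate of occurrence) = expected incidents per year
# ALE (annualized loss expectancy)    = SLE x ARO

asset_value = 2_000_000          # dollar value of the asset at risk
exposure_factor = 0.25           # fraction of value lost per incident
annual_rate_of_occurrence = 0.1  # one incident expected every 10 years

single_loss_expectancy = asset_value * exposure_factor
annualized_loss_expectancy = single_loss_expectancy * annual_rate_of_occurrence

print(f"SLE: ${single_loss_expectancy:,.0f}")      # SLE: $500,000
print(f"ALE: ${annualized_loss_expectancy:,.0f}")  # ALE: $50,000
```

That dollar figure is what lets a risk be weighed against the cost of mitigating it – the sense in which the profession uses the term, and a far cry from how it gets thrown around in complaints.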


The Delta Complaint

Disclaimer for the non-lawyer reader: this is only Delta’s side of the story. We are still waiting for CrowdStrike’s response. So, remember that these are allegations made by Delta. The defense has not yet been mounted.


A total of nine claims for damages are made by Delta (“Counts”), some of which I can see being easily dismissed through a 12(b)(6) motion by CrowdStrike. I’m just going to focus on the ones that cause me the most trepidation: Product Defect & Gross Negligence. For my non-lawyer followers, I’ve offered a crib-notes view of the legal requirements around each claim. For my lawyer readers – skip that, and yes, I know it’s more complicated.


Cyber Crime-like Stuff

These claims are related to civil and criminal statutes of Georgia. The first is a state law similar to the Computer Fraud and Abuse Act (which readers may be familiar with, but Delta makes no claim under that law). The Georgia law makes it a crime to use a computer without authority and with the intention of (summarized) deleting programs or data, interfering with programs or data, or causing a computer to malfunction. Delta also makes a claim for trespass to “personalty” (personal property) that is related to harm to property.


The striking part of Delta’s claim is the allegation that the content update for the Falcon software was an intentional exploit of a vulnerability for the purpose of gaining “unauthorized access” to a computer/network. This is based on the fact that the channel file updates were automatically pushed to the kernel. We will have to wait to see whether CrowdStrike’s response points to contract terms similar to its General Terms and Conditions (see Section 5), which actually authorize that access. Assuming CrowdStrike had a tech-savvy legal team (and I know some of them, so that’s a safe bet), this is probably not a huge legal risk. So, let’s examine the next claim.


Strict Liability: Product Defect 

Strict liability is liability without a need to prove fault. It is typically found in product liability statutes, which allow for entire supply chains to be held liable when a product they produce causes harm. In the U.S.[1], software has not historically (or, more importantly, through legal precedent) been considered a “product” or a “good” for which the Uniform Commercial Code or strict product liability statutes apply. When challenged, legal analysis would focus on whether the software was “tangible” and whether the “goods aspect” or the “service aspect” predominated the transaction.


One of the main characteristics of the original classification of software as a more intangible service was the need for updates. The law has long recognized that software can include bugs that impact performance, including bugs as significant as “fatal flaws”. The law also recognized that software manufacturers – even with excellent quality controls designed to catch bugs – built the cost of bug fixes into their product pricing and distribution models. What they did not build into that model was strict liability.


Those legal analyses date from the olden days of downloadable software distributed on floppy disks and CD-ROMs. Since then, software has become much more of a service than a good: delivered, used, and paid for as a service, with continuous on-demand updates that patch bugs and offer improved security. Delta’s complaint, which claims damages under strict product liability, cites no legal authority indicating this long-standing precedent should be overturned, relying instead only on the level of impact of the event.


Changing this characterization through judicial action, rather than thought-out legislation, risks creating a de facto requirement that software be free from certain types of vulnerabilities or bugs, as liability would be based on impact rather than fault. This is a fatal position, because it is a provable mathematical certainty that software cannot be guaranteed free of these vulnerabilities or bugs. The impact of such a ruling might be anything from higher service prices to a stifling of innovation as founders become unwilling or unable to take on the risk of huge legal debt due to code errors.
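For readers who want the receipts on that mathematical certainty: it follows from the halting problem and Rice’s theorem, which prove that no general procedure can decide nontrivial behavioral properties of arbitrary programs – “contains no bugs” being one such property. A minimal sketch of the classic contradiction, in Python (the oracle is hypothetical by construction; that is the point):

```python
# Sketch of the halting-problem argument (Turing, 1936).
# Suppose a perfect analyzer existed: halts(f) returns True if calling
# f() eventually halts and False if it loops forever.

def halts(f) -> bool:
    """Hypothetical oracle; the argument shows none can exist."""
    raise NotImplementedError

def paradox():
    # Do the opposite of whatever the oracle predicts about us.
    if halts(paradox):
        while True:  # oracle said we halt, so loop forever
            pass
    # oracle said we loop forever, so halt immediately

# halts(paradox) cannot return True (paradox would then loop forever)
# and cannot return False (paradox would then halt). No such oracle can
# exist, and by Rice's theorem the same holds for any nontrivial
# behavioral property, including "is free of bugs of class X".
```

No testing or certification regime, however rigorous, escapes this limit; it can only reduce the odds.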


Luckily, the bar for strict liability in the U.S. remains fairly high – it’s far more associated with physical harm than mere monetary harm, particularly when the harmed party could have taken mitigating actions but didn’t. That leaves one more particularly sticky wicket: Gross Negligence.


Gross Negligence

For the non-lawyers: Gross Negligence is like negligence, but worse. What does this mean? Well, negligence requires (1) a duty to either do (or refrain from doing) something, (2) that the [allegedly negligent] party failed to do that thing, and (3) that the failure caused some sort of harm. In ordinary negligence, the failure to meet the duty was basically a normal human mistake. Gross negligence is such a severe deviation from the required duty that it shows a reckless disregard for a severe level of foreseeable harm.


In Georgia, there is a gross negligence statute, GA Code § 51-1-4 (2020), in which “gross negligence” is defined as the absence of “slight diligence”, which is in turn defined as “that care which every man of common sense, however inattentive he may be, takes of his own property”. Interestingly, the complaint does not cite this statute, although it does copy the statute’s language. Perhaps the governing law of the contract is not Georgia’s. Or perhaps Delta does not want CrowdStrike’s software development lifecycle and release processes to be compared to its own under that standard. Who knows? But it does leave a burning question that Delta did not address in its complaint: WHAT IS THE DUTY OF CARE FOR SOFTWARE QUALITY, ASSURANCE, AND DELIVERY?


Technology and the Duty of Care

Duty of Care is a legal concept that, in the U.S., is generally established through precedent. Cases addressing the duty of care for software and software-like offerings are few and far between. If Delta were to win this case, it could establish a baseline duty of care (at least in Georgia). So, let’s look at what Delta alleges and what duty it believes exists.


Delta alleges that CrowdStrike “create[ed] and exploit[ed] an unauthorized door within Microsoft’s OS” both intentionally (that’s the Computer Crime-related claim) and through its gross negligence. Yet it fails to state what standard of care was violated. Let’s walk through the claim, the defenses, and how this might develop into a standard of care.


No doubt there was a type of door in the Microsoft OS. It was a known and documented door that had a specific purpose of allowing kernel-level channel file updates. For security software, this is a good thing because it means protections can be updated quickly as new threats are identified. That door was not likely to have been created by CrowdStrike, though. Delta itself, in the factual allegations, specifically walks the court through Microsoft’s kernel-level certification process, a process that would be very weird indeed if the door was unauthorized – why have a test to make sure you are allowed through a door if the door itself is unauthorized? Perhaps this (and the contract) will kill any claims based on lack of authorization.


The question, then, is whether participation in the Microsoft kernel-level certification process was indicative of a “failure to exercise even the slightest diligence.” Again, Delta does not cite or propose a standard of care; it merely alleges that the following failed to meet it: (1) no additional verification; (2) testing did not catch the error; (3) Delta did not actively participate in the rollout; and (4) there were no rollback capabilities. Now, a cynic would say that if there was a certification program, then there must have been more than “even the slightest diligence” – and would certainly question why Delta specifically called out the “additional” verification steps, which kinda sorta acknowledges that there was something present. Delta, however, alleges that this not only fails to meet the duty of care but does not even qualify as the “slightest diligence”. This seems to be a question of fact or law, which means (should this go to trial) it will be decided by a jury (fact) or a judge (law). Keep scrolling.


Delta further alleges that the rollout of the Falcon update was done without minimal testing, routine quality assurance, or rollback capabilities. Again, though, no standard of care is cited or proposed. “Minimal” and “routine” are not defined by Delta, nor is any industry standard or practice referenced in terms of what should be expected. I take less umbrage at “rollback capabilities”, which is possibly self-descriptive (a sketch of what such a capability looks like follows below). The lack of a cited, stated, or even proposed standard of care puts this allegation, again, into the hands of judge and jury. Put together, it signals that Delta wants the courts to decide the standard of care. That is a danger to the software industry and to cybersecurity.
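For the curious, here is what a staged rollout with “rollback capabilities” tends to mean in practice. This is a minimal sketch only; the fleet names, stages, thresholds, and functions are all hypothetical, invented for illustration, and say nothing about CrowdStrike’s actual processes.

```python
# Minimal sketch of a staged (canary) rollout with automatic rollback.
# All names, stages, and thresholds here are hypothetical.

from dataclasses import dataclass

@dataclass
class Fleet:
    name: str
    hosts: int

# Push the update to progressively larger slices of the install base,
# checking health before each expansion.
STAGES = [
    Fleet("internal canary", 100),
    Fleet("early-adopter ring", 10_000),
    Fleet("general availability", 8_500_000),
]

CRASH_THRESHOLD = 0.001  # halt the rollout if >0.1% of hosts crash

def deploy(update_id: str, fleet: Fleet) -> float:
    """Push the update to one fleet and return the observed crash rate.
    Stubbed: a real system would collect crash telemetry here."""
    return 0.0  # simulated healthy fleet

def rollback(update_id: str, fleet: Fleet) -> None:
    """Revert the fleet to the last known-good version. Stubbed."""

def staged_rollout(update_id: str) -> bool:
    for fleet in STAGES:
        crash_rate = deploy(update_id, fleet)
        if crash_rate > CRASH_THRESHOLD:
            # Health check failed: revert this stage and stop before
            # the update ever reaches the wider fleet.
            rollback(update_id, fleet)
            return False
    return True

if __name__ == "__main__":
    print("rollout succeeded:", staged_rollout("channel-file-update"))
```

Whether a process like this is “routine”, and whether skipping it amounts to the absence of “even the slightest diligence”, is precisely the question Delta is asking a court to answer without ever proposing the yardstick.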


When Courts Decide Duty of Care

Delta cites no ISO or NIST control set alleged or even suggested to be a controlling standard. No industry association is quoted as guidance or “best practice”. Just “what was done did not protect us”. In short, Delta alleges that a standard of care was breached without ever stating what the standard of care is. That, dear reader, is what bugs me the most about this complaint. Delta seems to allege that the duty of care lies in the results rather than the process.


Placing the duty of care in the results of software operation or failure would create great legal risks for software companies, particularly if the difference between “ordinary” and “gross” negligence is based on the level of harm (as Delta seems to want). That is because public policy, statutes, and other legal mechanisms prohibit disclaiming warranties or limiting liability for gross negligence. What does that mean for the non-lawyers out there? Lawyers may not be able to save your butts with contracts. The law will extend beyond the contract.


Apologies if this surprises you, but there is more to tech law than contracts.


Why We Can’t Have Easy Things

In the U.S., there is no federal or state law specifically governing software quality, security, or liability generally [2]. Instead, we rely on general laws and private law (e.g., contracts), such as the agreements between Delta and CrowdStrike. If judicial action defines a standard of care for the development and deployment of software, or subjects software to strict product liability, it could wholly upend the business models and cost structures of the digital economy. And that’s before we consider what happens as the limits of mathematics become more visible in our increasingly digital lives.


As software, technology, and artificial intelligence become more and more advanced, the likelihood of hitting the limitations of mathematics and computer science increases. It’s just an issue of volume. We know this because we have been hitting those limits through the Blue Screen of Death for a couple of decades now. If a court were to decide that strict liability applies, or that the standard of care is based on the nature of a failure, it would put a ticking time bomb in every piece of software out there. Yes, it might give harmed parties and governments the ability to look smug while bragging about how they can stick it to tech companies, but at what cost? The cost of tech investment or U.S. dominance in the global economy? What about the cost of actual security of digital infrastructure?

 

Creating a legal standard in case law that adopts Delta’s position on CrowdStrike’s actions could undermine cybersecurity at a fundamental level. By characterizing the Falcon content updates as “unauthorized alterations” and “hacks” into its systems, Delta upends decades of industry understanding of these terms and turns accepted, good industry practices into security events on par with cybersecurity attacks. In doing so, it would ignore the grim reality that software is never free of bugs – no quality process, however comprehensive, can correct for that – and would disincentivize automated updates and just-in-time protection. Perversely, that would weaken security and increase the likelihood of harm through cybersecurity attacks, not to mention vastly increase the price of services.


What could possibly go wrong?

 


[1] Updates to the EU Product Liability Directive are changing this position in the EU. Delta’s claim may fail here, but a similar claim may well be possible in the EU once the updated Directive takes effect.

[2] Yes, I am aware of sector-specific laws, like the DFARS. Don’t @ me.
