Image: a metallurgical plant at dawn, with smoke and smog (Андрей Трубицын/Adobe Stock)

Physicality Of Data And The Road To Inherently Safer Computing

The software industry today is precisely where the chemical industry was in 1978; hazard control is a mere afterthought

“The Physicality Of Data And The Road To Inherently Safer Computing” was originally published by Forbes, August 24, 2021. David Kruger is Co-Founder and VP of Strategy for Absio Corporation and a co-inventor of Absio’s Software-defined Distributed Key Cryptography (SDKC).

David Kruger

Our current concept of cybersecurity is to defend against attacks and remedy failure by erecting more and better defenses. That’s a fundamental mistake in thinking that guarantees failure. Why? Because it’s mathematically impossible for a defensive strategy to fully succeed, as explained in the previous installment of this article series. Another even more fundamental mistake in thinking is that cyberattackers are the cause of our woes. They aren’t. They’re the effect.

A hazard is a potential source of harm. Cyberattackers target certain physical data objects for the purpose of converting them into harmful information. Targeted data objects become hazardous whenever a lack or a loss of control renders them usable by cyberattackers. Examples of the resulting harm include theft of intellectual property and of trade, military and diplomatic secrets; business and critical infrastructure shutdowns; identity theft; loss of privacy; and financial losses of nearly $1 trillion in 2020 alone.

That’s why security is the wrong way to think about curing our cyber woes. Security is reactive and defensive; time, money and attention are being wasted by overly focusing on treating symptoms (cyberattacks) instead of curing the disease: out-of-control data. Safety is proactive and preventive; it’s the art and science of controlling hazards.

Inherently safer design (ISD) is an engineering methodology for addressing safety issues in the design of processes that use and/or produce hazardous materials. ISD seeks to:

  • Eliminate the hazards it can.
  • Reduce the hazards it can’t eliminate.
  • Control what remains. 
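
Since this series argues that the same methodology applies when the hazardous "material" is data, here is a minimal, hypothetical Python sketch of how the three-step hierarchy can read in software. The record fields and the policy it applies are invented for illustration only; nothing here is taken from the article or from any particular product.

```python
# Hypothetical illustration: the ISD hierarchy (eliminate, reduce, control)
# applied to a collected data record instead of a chemical process.
# Field names and policy choices are invented for this sketch.

def apply_isd(record: dict) -> dict:
    """Return a less hazardous copy of a collected data record."""
    safer = dict(record)

    # Eliminate the hazards you can: don't retain data you don't need.
    safer.pop("ssn", None)

    # Reduce the hazards you can't eliminate: keep a less harmful form.
    if "card_number" in safer:
        safer["card_number"] = "****" + safer["card_number"][-4:]

    # Control what remains: flag it so it is stored or shipped only encrypted.
    safer["requires_encryption"] = True
    return safer


if __name__ == "__main__":
    raw = {"name": "Jane Doe", "ssn": "078-05-1120",
           "card_number": "4111111111111111"}
    print(apply_isd(raw))
```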

In 1978, after a series of fatal accidents, Trevor Kletz, the father of ISD, suggested that chemical process design engineers focus on eliminating or reducing hazards instead of relying on extensive and expensive safety systems and complex operating procedures. The refocus was necessary because, until then, the primary design goals were maximizing production and minimizing costs. Designs were thrown over the wall to safety engineers whose job was to bolt on layers of compensating safety controls to keep chemical plants from burning down, blowing up or spewing toxins into the environment. If this sounds familiar, it's because, in terms of design thinking, the software industry today is precisely where the chemical industry was in 1978.

Ironically, in March 2010, the same month the NSA made defense-in-depth cybersecurity official U.S. doctrine, the Department of Homeland Security, in conjunction with the Center for Chemical Process Safety and the American Institute of Chemical Engineers, issued a formal definition of Inherently Safer Technology:

“Inherently Safer Technology (IST), also known as Inherently Safer Design (ISD), permanently eliminates or reduces hazards to avoid or reduce the consequences of incidents. IST is a philosophy, applied to the design and operation life cycle, including manufacture, transport, storage, use, and disposal. IST is an iterative process that considers such options, including eliminating a hazard, reducing a hazard, substituting a less hazardous material, using less hazardous process conditions, and designing a process to reduce the potential for, or consequences of, human error, equipment failure, or intentional harm.” 

Let’s conduct a thought experiment to see if hazardous out-of-control data correlates well with a highly hazardous chemical (HHC). We’ll posit that:

  • Huge quantities of HHC are needed because it’s the only material out of which trillions of dollars of vitally necessary and highly desirable products can be made.
  • HHC manufacturing, transport and storage systems are everywhere, but they’re highly leaky, and strangers with unknown intentions can readily access process, storage and shipping areas. 
  • Modern society utterly depends on HHC, so thieves and nation-state adversaries are perpetually dreaming up new methods for stealing it, holding it for ransom or probing for ways to impede or destroy manufacturing, storage and transport systems.

For our chemical, we’ll use dimethyl cadmium. It’s truly nasty stuff — inexpensive and simple to make and it manifests multiple kinds of harm: If you inhale a few millionths of a gram, it’s absorbed instantly into the bloodstream and attacks the heart, lungs, kidneys, liver, brain and nervous system. If you survive (unlikely), it’s a potent carcinogen. If you spill it, it’s likely to catch fire and give off poisonous cadmium oxide smoke. If it doesn’t catch fire, it reacts with oxygen to form a crust of dimethyl cadmium peroxide, a friction-sensitive explosive. If the dimethyl cadmium peroxide explodes, the remaining dimethyl cadmium will be aerosolized, ready for inhalation.

In our scenario, dimethyl cadmium manufacturing, storage and transport systems leak like a sieve and let potentially malevolent strangers wander about. What are the chances that people, organizations and governments have been or are about to be harmed by it or one of its derivatives? Pretty high, right?

“But,” you respond, “it would be insane to make, store and ship dimethyl cadmium with such inadequately designed systems! Who would do that and why would we let them?” That’s a really good question because our dimethyl cadmium thought experiment correlates precisely with how most software currently manufactures, stores and ships hazardous target data. There are exceptions, but most software leaks like a sieve and doesn’t consistently keep strangers (cyberattackers) out. That’s not hyperbole; read the news.

The failure to design software capable of controlling data throughout its lifecycle is an exact analog to failing to engineer the safe creation, storage, transport, processing and disposal of highly hazardous chemicals — except the potential consequences of software failure are far more severe. 

The incorporation of ISD into chemical processing, storage and shipping resulted not only in tremendous increases in safety but also in substantially lower costs. Building safety in, instead of bolting it on, shed multiple layers of costly and complex compensatory systems.
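
As one small illustration of what "building safety in" can look like for data, the sketch below encrypts a data object at the moment it is created, so a copy that leaks is not usable by whoever captures it. It uses the open-source Python cryptography package's Fernet recipe purely as an assumed stand-in; it is not a description of Absio's Software-defined Distributed Key Cryptography or of any other specific product.

```python
# Illustrative sketch only (assumes the open-source "cryptography" package):
# a data object that is encrypted from the moment it exists, so losing control
# of the ciphertext does not hand an attacker usable information.
from cryptography.fernet import Fernet


def create_protected_object(plaintext: bytes) -> tuple[bytes, bytes]:
    """Create a data object whose safety is built in at creation."""
    key = Fernet.generate_key()          # a per-object key
    ciphertext = Fernet(key).encrypt(plaintext)
    return ciphertext, key               # store/ship the ciphertext; guard the key


def read_protected_object(ciphertext: bytes, key: bytes) -> bytes:
    """Only a holder of the key can turn the object back into information."""
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    obj, key = create_protected_object(b"Q3 acquisition plan: confidential")
    print(read_protected_object(obj, key).decode())
```

In safety terms, the hazard is reduced at the source: a stolen ciphertext without its key is far less hazardous than a stolen plaintext file, no matter how many perimeter defenses fail.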

My hope in writing this series of articles is to start a movement to repeat that success in software design and engineering.

What’s the road to inherently safer computing? It begins with leaders coming to grips intellectually and emotionally with the reasons why what we’ve been doing hasn’t worked and why it never will. It’s time for a major rethink and a reset. Do that, and then let’s be on our way.

Future articles in this series on the physicality of data will focus on practical implementations of inherently safer computing.



Here are all of David Kruger’s columns on cybersecurity to date — browse and enjoy:

The physicality of data and the road to personal data ownership. When data was first digitized in the 1950s, no controls were built in to protect it from unauthorized use or misuse. Confusion between information in our minds and physical data, as stored on computers, hampers efforts to control how our data is used.

The physicality of data and the road to cybersecurity. With cyberattacks trending upward, remember that cyberattack potential is always greater than cyberdefense potential. Data objects can defend themselves with encryption, which makes them unusable if captured by cyberattackers. Sadly, little data today is encrypted.

The physicality of data and the road to inherently safer computing. The software industry today is precisely where the chemical industry was in 1978; hazard control is a mere afterthought.
Most software leaks like a sieve and doesn’t consistently keep strangers (cyberattackers) out. That’s not hyperbole; read the news. — David Kruger

and

Physicality of data: The road to inherently safer authentication. Even though the world is arguably far more at risk from uncontrolled data than from uncontrolled HHCs, there are no hordes of people demanding solutions — yet. Many people don’t realize that — unlike the chemical industry — software/hardware makers aren’t typically held accountable for harm caused by preventable hacks.

