Cybersecurity: Put a Lid on the Risks. We Already Own the Lid
Security specialist David Kruger says, data must be contained when it is in storage and transit and controlled when it is in use.

Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity and we appreciate the opportunity to republish them here as a series. Last week's discussion focused on the ingredients that cybersecurity needs to work. Today, the focus is on putting a lid on risks.
Put a Lid on It
Fortunately, we have at our disposal untold millions of man hours of safety engineering focused on safely extracting benefits from the use of hazardous things. For example, our homes and the highways we travel on are chock full of beneficial things that can easily kill us, such as high voltage electricity, flammable/explosive natural gas, and tanker trucks filled with flammable or toxic chemicals driving right next to us. These very rarely do us harm because the hazards are contained in storage and in transit, and their usage is controlled. Containment keeps hazardous things in until they are released for use. Controls enable hazardous things to be used safely.
Containers and controls enable the safe use of hazardous things. If you are familiar with propane grills, think of the tank, tank valve, pressure regulator, and burner knobs. They are each engineered to safely extract a specific benefit — delicious grilled food — from highly hazardous propane. The tank is the container which safely contains propane in storage and in transport. The tank valve and pressure regulator are system controls. Even if the tank valve is opened, gas won't flow, because a safety mechanism in the valve constrains the flow of gas unless a pressure regulator is properly attached. The pressure regulator constrains the flow of gas to a specified maximum volume and pressure. The burner knobs are user controls. They enable the user to instruct the grill to operate within a user-specified temperature range. So, a universal design principle for systems intended to extract a benefit from the use of a hazardous material can be formulated as follows: The hazardous material shall be safely contained until it's put into use, the user shall be provided controls for extracting the specified benefit from use of the hazardous material, and system controls shall enable the user's instructions to be carried out safely. How does this apply to the problem of open data?
Key Point: Data is physical and hazardous; therefore, the only way to use it safely is to contain it when it's in storage and in transit and control it when it's in use.
Data can be contained with strong encryption. If a cyberattacker gains control of strongly encrypted data but has no access to its keys, they can't get it out of containment and do harmful things with it. When continuous, unrelenting cyberattack is a known operating condition, there is no good reason not to encrypt all data by default the moment it is created, and from then on, only decrypt it temporarily for use. Only a tiny fraction of all data created is intended to be public. If you are its rightful owner, you can decrypt it and make it public whenever and wherever you choose. Can software encrypt data by default? Of course it can. It's known art.
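As a rough illustration of what encrypt-by-default could look like, here is a minimal sketch in Python using the Fernet recipe from the widely used cryptography package. The ContainedData wrapper and its API are illustrative assumptions, not a description of any particular product: the payload is encrypted the moment it is created and decrypted only transiently for use.

```python
from cryptography.fernet import Fernet

class ContainedData:
    """Hypothetical wrapper: the payload is encrypted on creation."""

    def __init__(self, plaintext: bytes, key: bytes):
        # Encrypt immediately; the plaintext is never stored on the object.
        self._ciphertext = Fernet(key).encrypt(plaintext)

    def use(self, key: bytes) -> bytes:
        # Decrypt temporarily for use; without the key, the data
        # stays in containment.
        return Fernet(key).decrypt(self._ciphertext)

key = Fernet.generate_key()              # held by the data's rightful owner
record = ContainedData(b"not intended to be public", key)
print(record.use(key))                   # works only for a key holder
```

The point of the design is that the plaintext never exists at rest: anyone holding a copy without the key holds only ciphertext.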
Shot Caller
The first principle of controlling data is that control must be continuous. Data is distributed by making copies, and the copies can be processed by every compatible instance of software in existence. Therefore, the original and every copy must be accompanied by its user’s instructions. If those instructions don’t accompany the data, the recipient of the data, licit or illicit, can do whatever they want with it, and we are back to square one—open data.
The second principle of control is that each instance of data must have a unique, verifiable identity to support updateability and auditability. User instructions may need to be updated, such as changing access to data. The unique, verifiable identity supports traceability, usage logging, and proof of ownership, which means that the creation, distribution, and use of data can be fully audited.
To accomplish this, software must make and manage a third data component. Open data has two components, the payload and metadata. The third component is instructions. When software takes the data out of containment, it consumes the data owner's instructions and carries them out. When two-component data is shared, data owners are at the mercy of whomever is in control of the copy. When three-component data is shared, each copy acts as a dynamic proxy for the data owner; it carries with it the data owner's will and choices and can be updated and audited as needed. For brevity, we'll call three-component data that is encrypted by default "controllable data."
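To make the three components concrete, here is a hedged Python sketch of what a controllable data object might carry. The field names and example instructions are assumptions for illustration only, not a format any product defines.

```python
from dataclasses import dataclass
from cryptography.fernet import Fernet

@dataclass
class ControllableData:
    ciphertext: bytes    # component 1: the payload, encrypted by default
    metadata: dict       # component 2: descriptive facts about the payload
    instructions: dict   # component 3: the owner's usage rules

key = Fernet.generate_key()
item = ControllableData(
    ciphertext=Fernet(key).encrypt(b"quarterly forecast"),
    metadata={"owner": "alice", "created": "2021-11-01"},
    instructions={
        "expires": "2022-01-01",                   # temporal control
        "allowed_networks": ["corp.example.com"],  # locale control
        "allow_forward": False,                    # intended-use control
    },
)
# Every copy carries item.instructions, so compliant software can consult
# the owner's rules before taking the payload out of containment.
```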
Controls provided by software enable data owners to instruct the system how their data can and cannot be used. To use data safely, the minimum controls are listed below (a sketch of how software might enforce them follows the list):
- Authentication Controls. Authentication determines who and what may temporarily decrypt data for use. A user must authenticate themselves to use their own devices safely, but when connecting their device to another device with which data will be shared, it is unsafe to authenticate the user only. Here’s why. To do work, computers require three physical actors working in unison:
- a user issuing instructions
- to an instance of software installed on
- an instance of hardware.
Cyberattackers only need to compromise one of these three actors to take control of data. Without consistently authenticating the user, instance of software, and instance of hardware requesting to connect, it is not possible to be certain who or what is on the other end of the line. Because each actor has unique physical characteristics, each combination of user, instance of software, and instance of hardware can be cryptographically authenticated. This process can be automated and made invisible to the user. It's known art. We'll refer to authenticating the user, instance of software, and instance of hardware as "full-scope authentication."
- Temporal Controls. Most data isn’t intended to last forever, so data owners need to be able to control when and for how long their data can be used, and revoke access to shared data when recipients no longer need it.
- Geographical Controls. There are many use cases where data can only be used safely within specified physical or virtual locales. For example: physical location controls enable use only within a specified country. Virtual location controls enable use only within a specified organization’s network.
- Intended Use Controls. Usage controls constrain data to specified uses. For example, software can use data for purpose A, B, and C but not for purpose X, Y, or Z. Intended use controls can be customized for specific use cases, such as turning off a recipient’s ability to forward data to others or to export it from the controlling application. Intended use controls can be set directly by the user or they can be imported. When data is shared with a trusted third party, pre-agreed upon intended use controls can be imported from the third party and applied to the user’s data, and the software will objectively manage the use of the data for both parties.
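Below is a hedged sketch, in the same illustrative Python style, of how compliant software might evaluate these minimum controls, including full-scope authentication, before taking data out of containment. The hash-based composite identity and the policy field names are assumptions, not a specification.

```python
import hashlib
from datetime import datetime

def full_scope_identity(user_id: str, software_hash: str, hardware_id: str) -> str:
    # Bind the user, the software instance, and the hardware instance into
    # one verifiable identity; compromising a single actor is not enough.
    raw = f"{user_id}|{software_hash}|{hardware_id}".encode()
    return hashlib.sha256(raw).hexdigest()

def may_decrypt(instructions: dict, requester: dict) -> bool:
    # Authentication control: the user/software/hardware triple must match
    # an identity the data owner has authorized.
    identity = full_scope_identity(
        requester["user"], requester["software"], requester["hardware"]
    )
    if identity not in instructions["authorized_identities"]:
        return False
    # Temporal control: access ends when the owner says it ends.
    if datetime.utcnow() > datetime.fromisoformat(instructions["expires"]):
        return False
    # Geographic/locale control: use only within the specified networks.
    if requester["network"] not in instructions["allowed_networks"]:
        return False
    # Intended-use control: the requested purpose must be on the allow list.
    return requester["purpose"] in instructions["allowed_purposes"]
```

The design point is that a single compromised actor (a stolen password, a tampered binary, or a cloned device) does not by itself produce an authorized full-scope identity, and each of the owner's other controls is checked independently.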
It Wasn’t Me
Cyberattackers make a handy scapegoat. They provide endless revenue opportunities for symptomatic point solution providers and shift responsibility away from software makers, but the fundamental mistake was ours; we allowed open data to metastasize throughout the connected world. For the reasons explained above, it is not possible to cure our open data cancer by treating its symptoms with a couple of aspirin, a few dabs of antibiotic cream, and some bandages.
Key Point: A hard truth about our current cybersecurity crisis is that we did this to ourselves.
Key Point: We got into this mess one piece of software and data at a time, so we’ll have to get out of it one piece of software and data at a time.
Next: The sweet science of agile software development
Here are all thirteen segments in the series:
The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.
What's wrong with cybersecurity technology? Know your enemy: The target isn't networks, computers, or users; they are pathways to the target: gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.
Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the 50s, and the Internet had never been invented. Forbes Council’s David Kruger says, the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).
Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says, data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created one piece of data or software at a time.
The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to change their priorities to making products safe rather than adding the next cool new feature will by no means be easy.
Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.
The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.
Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.
How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.
How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.
Cybersecurity: Why a poke in the eye does not work. The current system punishes small businesses for data breaches they could not have prevented. Computer security expert David Kruger says the current system makes as much sense as fining the hit-and-run victim for not jumping out of the way.
Is your data about yourself too complex for you to manage? That’s the argument human data collectors (HDCs) make for why they should be allowed to collect and own your data. Policymakers should declare that human data is the property of the individual, not of the data collector, computer security expert David Kruger argues.
and
How software makers will push back against reforms Software makers will grumble but insurers may force their hand. That, however, is NOT the Big Battle… the Big Battle: Wall Street will oppose reforms that restore control to you because the market cap of Big Tech depends on human data collection.