Mind Matters Natural and Artificial Intelligence News and Analysis
Image: Business, technology, internet and networking concept. Young businesswoman working on her laptop in the office, selecting the security icon on a virtual display. (Adobe Stock)

Computer Safety Expert: Start Helping Ruin Cybercriminals’ Lives

Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions…

Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity and we appreciate the opportunity to republish them here as a series. Yesterday’s discussion focused on agile software development. Today’s discussion looks at making life somewhat less comfortable for the guy who wants to steal your credit card number.

Ruining the Economics of Cyberattack

Would fully implementing controllable data and full scope authentication prevent every cybersecurity failure? Of course not. There are scenarios, particularly those aided by human gullibility, ineptitude, and negligence, in which cybersecurity can and will continue to fail. However, cyberattacks are carried out by human beings for the purpose of acquiring money and/or exercising power, and there is a cost/benefit analysis behind every attack. Controllable data and full scope authentication, even though imperfect, increase the cost of illicitly gaining control of data by several orders of magnitude, thereby significantly diminishing the motivation to attack—and that’s the point.


Programming Ethics

The staff and management of many software makers are completely unaware of the inherent hazardousness of open data and partial authentication and their causal link to preventable cybersecurity harms. Many are genuinely committed to programming ethics, but their concept of cybersecurity is based on the symptomatic point solution fallacy. The fallacy is continually reinforced by their professors, peers, textbooks, trade publications, and endless articles about cybersecurity, most of which lead with images of a scary faceless hooded figure hunched over a keyboard—the dreaded cyberattacker. It would be unreasonable to hold them responsible for believing what they’ve been taught, especially given that symptomatic point solutions actually do thwart most cyberattacks; they’re just inherently insufficient due to the asymmetrical nature of attack and defense. That being said, once staff and management understand that cybersecurity failure is caused by software design, not by cyberattackers, many professing adherence to programming ethics will have some hard decisions to make.

Part Two – Cybersecurity Policy

Lesson Learned 5. The designer neglected to compensate for a known operating condition; therefore, the designer is responsible for fixing both existing and new designs.

When it comes to fixing a root cause, there are two questions. The first is “Who is able to apply the fix?” and the second is “Who is responsible for applying the fix?” The “who is able” question is about engineering because it’s about redesigning an engineered process. That was the subject of Part One—Cybersecurity Technology.

“Who is responsible” is about policy because the responsibility for preventing harm, and the liability for failing to prevent it, is assigned by policymakers, that is, by legislators and regulators. The role of policymakers is crucial because the strategy of software makers who cause preventable harm is to evade that responsibility. That’s the subject of Part Two—Cybersecurity Policy.

The first question was answered earlier: Only software makers can apply the fix, because data is the hazard, and the form of data is whatever software makes it to be. Logically, you would expect the answer to “Who is responsible for applying the fix?” to be “Obviously, software makers are responsible because 1) their product is causing preventable harm, and 2) they are the only ones able to fix it.” That entirely reasonable expectation is buttressed by the fact that essentially every other kind of manufacturer of potentially harmful things, such as planes, trains, automobiles, chemical plants, pharmaceuticals, mining and pipeline equipment, children’s toys, and electrical appliances, is held responsible and liable for design shortcomings when they cause preventable harm.

Unfortunately, perhaps tragically, policymakers aren’t holding software makers responsible for the preventable harms they are causing, because policymakers too are caught up in the symptomatic point solution fallacy. In Part Two, we are going to focus on examining software maker motives, evasion tactics, and the preventable harms resulting from impeding the flow of data, and finish with policy recommendations and a look toward the future. Hold on tight—this long and bumpy road is about to get a lot rougher.

Next: The cybercriminal isn’t necessarily who you think… Let’s find out more about who he is.


Here are all thirteen segments in the series:

The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.

What’s wrong with cybersecurity technology? Know your enemy: The target isn’t networks, computers, or users; they are pathways to the target—gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.

Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the 50s, and the Internet had never been invented. Forbes Council’s David Kruger says, the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).

Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says, data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created one piece of data or software at a time.

The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to change their priorities to making products safe rather than chasing the next cool new feature will by no means be easy.

Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.

The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.

Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.

How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.

How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.

Cybersecurity: Why a poke in the eye does not work. The current system punishes small businesses for data breaches they could not have prevented. Computer security expert David Kruger says the current system makes as much sense as fining the hit-and-run victim for not jumping out of the way.

Is your data about yourself too complex for you to manage? That’s the argument human data collectors (HDCs) make for why they should be allowed to collect and own your data. Policymakers should declare that human data is the property of the individual, not of the data collector, computer security expert David Kruger argues.

and

How software makers will push back against reforms Software makers will grumble but insurers may force their hand. That, however, is NOT the Big Battle… the Big Battle: Wall Street will oppose reforms that restore control to you because the market cap of Big Tech depends on human data collection.


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
