Mind Matters Natural and Artificial Intelligence News and Analysis
Close up businesswoman collecting data information converting into statistics, planning strategy gathering resources creating visual graphical graphs using computer laptop and smart mobile device

How Online Human Data Collectors Get Free From Responsibility

Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does

Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity and we appreciate the opportunity to republish them here as a series. Last week, we looked at how search engine results can be distorted. This week, we look at how HDCs (human data collectors) free themselves from any responsibility for outcomes.

Brave Old World

HDCs’ licensing strategy is designed to free them from any vestige of fiduciary duty. Fiduciary law traces its roots back to the Code of Hammurabi in 1790 BC, through the Roman Empire, early British law, and up to the present day.

David Kruger

The purpose of fiduciary law is to compensate for two sad facts of human nature. In unequally powered business relationships, 1) businesses with more power will abuse customers with less power, and 2) the greater the disparity of power between the business and the customer, the more likely customer abuse will occur if left unchecked. The purpose of fiduciary law is to inhibit customer abuse by assigning the business statutory duties to act in the best interests of their customers. There are many unequal power relationships between many kinds of businesses and customers, so there is an enormous amount of common and black letter fiduciary law for policymakers to draw on. Common fiduciary duties include:

  1. Duty of Care. Businesses have a duty to not harm their customers.
  2. Duty of Loyalty. Businesses have a duty to not place their interests above the interests of their customers.
  3. Duty of Good Faith. Businesses have a duty to act in good faith, meaning they must deal fairly with customers. Examples of acting in bad faith towards customers include lying to them, using deceptive practices, and shirking their obligations.
  4. Duty of Confidentiality. Businesses have a duty to protect their customers’ sensitive or confidential information.
  5. Duty of Disclosure. Businesses have a duty to act with candor, that is, to answer customers’ and regulators’ questions honestly.

A Slap on the Wrist

The high number of government-brought lawsuits against HDCs all around the world, the thousands of pages of laws and regulations designed to rein in HDCs’ bad behavior, the employment of thousands of regulators, and fines, penalties, judgments, and settlements in the billions of dollars make it abundantly clear that policymakers are aware of the harms HDCs are causing.

However, it is also abundantly clear that policymakers have fallen prey to the symptomatic point solution fallacy in two ways. First, to date, legislation, regulation, and litigation designed to reduce cybersecurity failure have been deterrence-based, that is, if you don’t adhere to behavior A, you’ll get punishment B. Just like technological symptomatic point solutions, deterrence policy is an attempt to stop bad behavior (symptoms) instead of eliminating the deficiencies in policy that enable HDC bad behavior (fixing the root cause).

Deterrence-based policy, like its technological symptomatic point solution cousin, is afflicted with a math problem. Deterrence implemented as criminal prosecution or political or military reprisal for successful cyberattacks cannot achieve a high enough ratio of successful prosecutions or reprisals to successful attacks to generate any real fear on the part of the cyberattackers. What minuscule success deterrence policy has achieved is perceived by criminal and military/terroristic cyberattackers as acceptable risk.

The same applies to deterrence measures contained within privacy laws and regulations. The ratio of punishments to revenues generated while violating laws and regulations is so low that big tech HDCs absorb them as merely the cost of doing business. Millions to billions of dollars in annual monetary penalties might sting a bit, but when the aggregate cost of non-compliance is a small percentage of annual revenue, and can be offset by charging captive advertisers slightly higher ad rates, the penalties accomplish little. The tens of thousands of small HDCs clogging up app stores tend to be domiciled overseas and too small to be worth prosecuting.
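The "cost of doing business" argument above is ultimately arithmetic. A minimal sketch, using invented figures purely for illustration (none of these numbers come from the article), shows why a penalty that looks enormous in isolation can be negligible relative to revenue:

```python
# Hypothetical illustration of the "cost of doing business" arithmetic.
# All figures are invented for illustration; they are not drawn from the article.

def noncompliance_cost_ratio(annual_revenue: float, annual_penalties: float) -> float:
    """Return aggregate fines/penalties as a fraction of annual revenue."""
    return annual_penalties / annual_revenue

# Suppose a large HDC earns $200B per year and pays $2B in fines and settlements.
ratio = noncompliance_cost_ratio(200e9, 2e9)
print(f"Penalties are {ratio:.1%} of revenue")  # prints "Penalties are 1.0% of revenue"
```

Under those assumed numbers, even billions in penalties amount to a rounding error, which is the article's point: deterrence priced this way changes no behavior.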

That’s why deterrence has been little more than a speed bump to cyberattackers, including big tech HDCs in their drive to acquire all human data and continue using it in harmful ways.

Key Point: If the metric for deterrence policy success is the degree to which it has decreased successful cyberattacks, including breaches, human data collection, lying, inveiglement, deception, and user manipulation, it’s had little success.

Key Point: The root cause of ineffective policy isn’t insufficient deterrence; it’s allowing software makers to arbitrarily exempt themselves from fiduciary duty and transfer their risk to their users.

Next: Why a poke in the eye does not work.


Here are all thirteen segments in the series:

The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.

What’s wrong with cybersecurity technology? Know your enemy: The target isn’t networks, computers, or users; they are pathways to the target: gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.

Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the ’50s and the Internet had never been invented. Forbes Council’s David Kruger says the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).

Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created one piece of data or software at a time.

The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to change their priorities to making products safe rather than shipping the next cool new feature will by no means be easy.

Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.

The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.

Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.

How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.

How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.

Cybersecurity: Why a poke in the eye does not work. The current system punishes small businesses for data breaches they could not have prevented. Computer security expert David Kruger says the current system makes as much sense as fining a hit-and-run victim for not jumping out of the way.

Is your data about yourself too complex for you to manage? That’s the argument human data collectors (HDCs) make for why they should be allowed to collect and own your data. Policymakers should declare that human data is the property of the individual, not of the data collector, computer security expert David Kruger argues.

and

How software makers will push back against reforms Software makers will grumble but insurers may force their hand. That, however, is NOT the Big Battle… the Big Battle: Wall Street will oppose reforms that restore control to you because the market cap of Big Tech depends on human data collection.


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
