
Sometimes, Money Really Is the Explanation

Today's internet is a concentration of power, in terms of information, never before seen in history

Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity. We appreciate the opportunity to republish them here as a series. Last week we looked at the fact that the cybercriminal isn’t necessarily the weirdo in the hoodie. He could just be a boring corporate bureaucrat collecting data on you that his boss plans to use later.

Now we look at where the money in the business is:

David Kruger

It’s All About the Benjamins

Why are HDCs [human data collectors] so willing to abuse their own users? For the money, and the power that comes from having lots of it. In 2002, Google discovered that the raw human data it was collecting from its users to improve the quality of the user experience could be repurposed to deliver targeted ads, that is, ads delivered to an individual’s screen in real time based on what the individual was currently searching for, and that those ads could be shown repeatedly, a practice called ad retargeting. That capability turned out to be astoundingly lucrative. As of February 2021, Google’s market capitalization was approximately 1.4 trillion US dollars, and about 85% of its revenue comes from advertising. About 95% of Facebook’s revenue comes from selling ads.
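To make the mechanism concrete, here is a minimal illustrative sketch of real-time targeting and retargeting. It is not Google’s actual system; the ad inventory, keyword matching, and user history store below are hypothetical stand-ins for what is, in reality, an auction-based pipeline of enormous complexity.

```python
from typing import Optional

# Hypothetical sketch: match a live search query to an ad (targeting),
# remember what was shown, and repeat it on later page views (retargeting).
AD_INVENTORY = {
    "hiking boots": "Ad: 20% off trail boots",
    "car insurance": "Ad: compare insurance quotes in minutes",
}

# Per-user record of ads already shown: the raw material of retargeting.
user_ad_history: dict = {}

def target_ad(user_id: str, current_query: str) -> Optional[str]:
    """Pick an ad based on what the user is searching for right now."""
    for keyword, ad in AD_INVENTORY.items():
        if keyword in current_query.lower():
            user_ad_history.setdefault(user_id, []).append(ad)
            return ad
    return None

def retarget_ad(user_id: str) -> Optional[str]:
    """On a later page view, repeat an ad this user has already seen."""
    history = user_ad_history.get(user_id, [])
    return history[-1] if history else None

# The first call targets in real time; the second repeats the ad later.
print(target_ad("user42", "best hiking boots for winter"))
print(retarget_ad("user42"))
```

The point of the sketch is only that targeting requires a live behavioral signal, while retargeting requires retaining a record of the user over time.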

That’s No Moon

Knowledge really is power, and HDCs act as gatekeepers to the sum of all digitized surface web content plus the sum of all the digitized human data they have collected to date. That’s a concentration of power never before seen in human history. Let’s take a closer look at current preventable harms enabled by that concentration.

Spilt Milk

HDCs are creatures of open data; they could not have come into existence, or continue to exist in their current form, without it. Their internal use of open data and their dependence on symptomatic point solutions have resulted in multiple preventable, harmful breaches of users’ personal information, and there is no reason to believe such breaches have come to an end. Further preventable breach harms should be expected.

Free Spirit

In the list of cybersecurity failure types described previously, impeding the flow of data is not well understood. Usually, it’s defined only as disrupting the flow of data, as happens in a denial-of-service attack. But another, more insidious and arguably more harmful, form of impedance is distorting the flow of information.

The ideal of the early Internet was to be the world’s public library, one that would provide near instantaneous and unrestrained access to the sum of all information available on the surface web (with one notable universal exception—child pornography).

Nobody expected that the information on the new-fangled world wide web would be completely accurate, truthful, and non-contradictory. Why? Because truth, lies, mistakes, misinformation, disinformation, bias, libel, slander, gossip, and the means to broadcast them to enormous audiences existed (gasp) before the Internet. A vital characteristic of a free society, pre-Internet and now, is that people 1) have the right to unimpeded access to public information, 2) are responsible for their own due diligence, and 3) are free to arrive at their own conclusions. Distorting the flow of public information diminishes each of these, and harms individuals and society as a whole.

Nudge, Nudge, Wink, Wink

Ads range from useful to useless and from entertaining to irritating, but producers nonetheless have a legitimate need to market to their prospects. Advertising and persuasive marketing copy are neither illegal nor immoral. Targeting and retargeting ads based on real-time human behavior gave advertisers a genuinely new capability, explained below by Shoshana Zuboff in “The Age of Surveillance Capitalism” (reviewed by Expensivity here):

“Advertising had always been a guessing game: art, relationships, conventional wisdom, standard practice, but never “science.” The idea of being able to deliver a particular message to a particular person at just the moment when it might have a high probability of actually influencing his or her behavior was, and had always been, the holy grail of advertising.”

However, Google and other HDCs didn’t stop there—and therein lies the fundamental policy problem.

Google, followed shortly by Facebook and others, discovered that, for a given individual, the greater the volume and diversity of raw human data they can collect and the longer they can collect it, the more effectively that data can be used for slow, surreptitious algorithmic nudging that changes the user’s beliefs and behaviors. In other words, HDCs treat human beings as perpetual guinea pigs in an endless and thoroughly unethical experiment, using software designed to learn how to manipulate each user most effectively. This is unethical because the intent of the HDCs is to use their software to diminish personal autonomy, and they hide that intent from their users for the most obvious of reasons: if users became aware of how they were being manipulated, and for what purposes, they would likely be angered and demand that the manipulation stop.
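As a purely illustrative sketch of the dynamic described above, and not a description of any HDC’s actual code, the nudging loop can be thought of as a simple bandit-style optimizer: try content variants, measure the behavioral response, and increasingly serve whichever variant moves the user most. The variant names and the measure_response stub below are hypothetical.

```python
import random

# Illustrative only: an epsilon-greedy loop that learns which content
# variant produces the largest measured change in user behavior.
VARIANTS = ["variant_a", "variant_b", "variant_c"]
scores = {v: 0.0 for v in VARIANTS}   # running average "effectiveness" per variant
counts = {v: 0 for v in VARIANTS}

def measure_response(variant: str) -> float:
    """Stand-in for an observed behavioral signal (clicks, time on site, etc.)."""
    return random.random()

def choose_variant(explore_rate: float = 0.1) -> str:
    """Mostly exploit the best-performing variant; occasionally explore."""
    if random.random() < explore_rate:
        return random.choice(VARIANTS)
    return max(VARIANTS, key=lambda v: scores[v])

for _ in range(1000):
    v = choose_variant()
    reward = measure_response(v)
    counts[v] += 1
    scores[v] += (reward - scores[v]) / counts[v]   # incremental mean update

# Over time the loop converges on whichever variant is most "effective."
print(scores)
```

The longer such a loop runs and the more behavioral signals it can measure, the better its estimates become, which is precisely why the volume and duration of data collection matter.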

In addition to nudging, since users see more ads the longer they stay logged on, HDCs began using their newfound user-manipulation capability to addict users to their software. The mechanisms of addiction are beyond the scope of this article, but most rely on presenting information and controlling its flow in a manner designed to generate and reinforce a dopamine hit or to amplify negative emotions such as fear, anger, envy, guilt, revenge, and lust. HDCs’ algorithmic nudging and intentional addiction are increasingly understood to be harmful to individuals and to society at large, as attested by numerous studies and whistleblower testimony. HDCs are keenly aware of the harm, but it hasn’t stopped them.

Key Point: Advertising isn’t the problem; user manipulation via surreptitious algorithmic nudging and intentionally addicting users is.

Key Point: The ability to manipulate users for one purpose creates the ability to manipulate users for any purpose.

Next: How search engine results can be distorted


Here are all thirteen segments in the series:

The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.

What’s wrong with cybersecurity technology? Know your enemy: The target isn’t networks, computers, or users; they are pathways to the target —gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.

Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the 50s and the Internet had never been invented. Forbes Council’s David Kruger says the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).

Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created, one piece of data or software at a time.

The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to change their priorities to making products safe, rather than adding the next cool new feature, will by no means be easy.

Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.

The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.

Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.

How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.

How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.

Cybersecurity: Why a poke in the eye does not work. The current system punishes small businesses for data breaches they could not have prevented. Computer security expert David Kruger says the current system makes as much sense as fining the hit-and-run victim for not jumping out of the way.

Is your data about yourself too complex for you to manage? That’s the argument human data collectors (HDCs) make for why they should be allowed to collect and own your data. Policymakers should declare that human data is the property of the individual, not of the data collector, computer security expert David Kruger argues.

and

How software makers will push back against reforms Software makers will grumble but insurers may force their hand. That, however, is NOT the Big Battle… the Big Battle: Wall Street will oppose reforms that restore control to you because the market cap of Big Tech depends on human data collection.


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
