
How Search Engine Results Can Be Distorted

Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users

Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity, and we appreciate the opportunity to republish them here as a series. Last week, we looked at the way data is collected on us and marketed. This week, we look at how search engine results may not be what they seem:


Off Target

The promise and purpose of search technology is that with it a user can find what they are looking for, not what the search engine provider deems worthy of being found. That creates an inherent conflict of interest when search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Distortion, in this context, is defined as arbitrarily differentiating search results between users, changing their order, and/or withholding results for the purpose of changing users’ beliefs or behavior. The distortion of search results, whether under the guise of “helping users make good decisions” or of selling advertising, is still distortion. The quid pro quo of distorted search is: “You give us all of your human data, and we’ll use it to decide what we think is best for you to know.” Such distortion is enabled by enormously complex search algorithms that are claimed as trade secrets. The use of complex algorithms is not the problem; holding them secret is.
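To make that definition concrete, here is a toy Python sketch of the three forms of distortion just named: differentiating results between users, reordering them, and withholding them. Every field, weight, and user profile below is hypothetical; this is an illustration of the concept, not any actual search provider’s algorithm.

    # Toy model of the three forms of distortion defined above. All fields,
    # weights, and profiles are hypothetical, not any real provider's system.
    def distorted_results(results, profile):
        """Rank results for one user using an undisclosed, per-user score."""
        visible = []
        for r in results:
            # Withholding: silently drop results the provider deems unworthy.
            if r["topic"] in profile["suppressed_topics"]:
                continue
            # Differentiating: mix objective relevance with a hidden, per-user
            # "affinity" term, so identical queries are scored differently.
            score = r["relevance"] + 0.5 * profile["affinities"].get(r["topic"], 0.0)
            visible.append((score, r))
        # Reordering: the final order follows the adjusted score, not relevance.
        visible.sort(key=lambda pair: pair[0], reverse=True)
        return [r for _, r in visible]

    shared_results = [
        {"topic": "reference", "relevance": 0.9},
        {"topic": "shopping", "relevance": 0.6},
        {"topic": "forum_posts", "relevance": 0.8},
    ]
    alice = {"suppressed_topics": {"forum_posts"}, "affinities": {"shopping": 2.0}}
    bob = {"suppressed_topics": set(), "affinities": {}}
    print(distorted_results(shared_results, alice))  # shopping boosted, forum post gone
    print(distorted_results(shared_results, bob))    # ordered purely by relevance

Because the scoring function is held secret, neither user can tell that the identical query produced different lists, which is precisely the manipulation the key point below describes.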

Key Point: When search results are distorted and search algorithms are held secret, the user cannot know how search results are being used to manipulate them.

A Day at the Races

Another manifestation of coupling advertising rates to user manipulation is Search Engine Optimization (SEO). In horse racing, a “tout” is a person who charges bettors for inside information about upcoming races. Touts are good for racetrack owners because people who pay for their knowledge are likely to bet more often and in larger amounts, especially if the tout really does facilitate the occasional win.

That’s a pretty good description of the SEO business—they are touts for Google’s and other search providers’ racetracks. In 2020, businesses spent about $39 billion USD and millions of man-hours producing SEO content. The problem with SEO is not that it is ineffective; rather, the requirement to do SEO just to increase the odds of being found smacks of restraint of trade. The SEO tout/racetrack game is exclusionary. Many businesses, especially the approximately five million US small businesses with fewer than twenty employees, may not have the skill or money to engage in SEO; it’s not cheap. But without paying the touts, they cannot be assured of being found.

Stage IV Cancer

Thanks largely to Google’s and Facebook’s success, the collection of raw human data for purposes of monetization has metastasized throughout a significant portion of the software-making world. Some HDCs collect raw human data for their own use, but most collect it for resale. There are millions of HDC apps in the various app stores that are surveillance platforms first and apps second. These smaller HDC software makers sell human data to data brokers, who altogether do about $200 billion a year in human data trafficking. In the last few years, HDC software makers have been joined by some of the world’s largest hard-goods manufacturers whose products happen to contain software that connects to the internet. Examples include automakers; television and other home entertainment device makers; home appliance makers; computer, mobile phone, and tablet makers; mobile device providers; toymakers; and Internet service providers, all eager to cash in on raw human data.

Despite all this, in a fine example of Orwellian doublespeak, HDCs publicly proclaim themselves to be the champions and protectors of privacy while simultaneously hoovering up as much raw human data as they possibly can. They have redefined privacy from “I, as an individual, decide what, when, and with whom I’ll share information” to “We, as a company, will collect every scrap of your raw human data we can, declare it to be company property, do with it what we will, share it with whom we want, guard it from our competitors—and call the whole thing privacy.” When HDCs say, “We take extraordinary measures to protect your privacy!”, what they mean is “We take extraordinary measures to protect our property!”

Unnecessary Roughness

Many believe that mass raw human data collection is inevitable because advertising-supported HDCs must have it to provide their services for free. The HDC value equation has been “For users to benefit from our service for free, we must collect identifiable human information to fund our operation by selling targeted ads.”

That’s no longer true.

Privacy-enhancing technologies (PETs) that didn’t exist a few years ago are able to extract the user attribute data needed to target ads from raw human data without extracting identity information. Software can make such attribute-only data controllable, so we’ll refer to it as controllable attribute-only data. Modern PETs use advances in math to assure that attribute-only data cannot be analyzed to identify specific individuals; additionally, such analysis can be reliably prevented because the data is controllable. Modern PETs should not be confused with older data anonymization technologies, which suffered from recurrent data re-identification problems.
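The following sketch illustrates only the basic idea of attribute-only extraction: coarse, targetable attributes are separated out, while identity fields are not released. Real PETs rest on much stronger mathematics (differential privacy and cryptographic controls, for example), and every field name below is hypothetical.

    # Illustrative only: separate targetable attributes from identity.
    # Real PETs use far stronger math; these field names are hypothetical.
    RAW_RECORD = {
        "name": "Jane Doe",           # identity: never leaves the device
        "email": "jane@example.com",  # identity: never leaves the device
        "device_id": "a1b2c3",        # identity: never leaves the device
        "age": 34,
        "interests": ["cycling", "home improvement"],
    }

    def extract_attributes(record):
        """Return ad-targeting attributes with identity information removed."""
        return {
            # Generalize: an exact age becomes a coarse bracket (34 -> "30s").
            "age_bracket": f"{(record['age'] // 10) * 10}s",
            "interests": sorted(record["interests"]),
        }

    print(extract_attributes(RAW_RECORD))
    # {'age_bracket': '30s', 'interests': ['cycling', 'home improvement']}

An advertiser can target “30s, interested in cycling” without ever learning who the person is; controllability is what keeps such records from being recombined and re-identified.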

The advent of controllable attribute-only data has a profound implication that policymakers should factor into their thinking. As before, since this is a big-picture article, technical detail isn’t provided for the following assertion, but, like the other technologies described above, it’s achievable with existing technology:

Key Point: HDCs can be monetized by targeted advertising without collecting raw human information.

Additionally, there are search engines that:

  • Record zero information about the searcher
  • Do not distort search results
  • Enable users to make their own customizable, persistent search filters. In other words, the user controls the search algorithm, not the search engine provider (see the sketch below).
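As a rough illustration of that last point, here is a toy sketch of a user-owned persistent filter applied on the user’s side of the search. The rule format and field names are hypothetical; no particular search engine’s API is implied.

    # Toy user-controlled filter: the user writes and keeps the rules,
    # so the provider's ranking can be pruned and re-ranked locally.
    # The rule format and fields are hypothetical.
    MY_RULES = {
        "block_domains": {"contentmill.example"},
        "prefer_domains": {"docs.example.org"},
    }

    def apply_user_filter(results, rules):
        """Prune and re-rank results according to the user's saved rules."""
        kept = [r for r in results if r["domain"] not in rules["block_domains"]]
        # Stable sort: user-preferred domains float to the top; everything
        # else keeps the order in which it arrived.
        kept.sort(key=lambda r: r["domain"] not in rules["prefer_domains"])
        return kept

    results = [
        {"title": "SEO-stuffed page", "domain": "contentmill.example"},
        {"title": "Primary documentation", "domain": "docs.example.org"},
        {"title": "Neutral result", "domain": "news.example.net"},
    ]
    print(apply_user_filter(results, MY_RULES))

Because the filter persists and belongs to the user, the same rules apply to every future search until the user changes them.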

The technology to offer privacy-preserving, undistorted, user-controllable search supported by privacy-preserving targeted advertising exists. There is nothing to prevent existing advertising-supported search engines such as Google from “reforming,” and the same goes for advertising-supported social media. The point is that advertising-supported HDCs can reform; whether they will remains to be seen.

These Are Not the Droids You’re Looking For

Before suggesting specific policy fixes, it’s important to understand exactly what policy needs to fix. HDCs have been able to evade responsibility for the preventable harms they cause by 1) blame shifting and 2) arbitrarily transferring risk to their users.

HDCs blame cyberattackers for problems they themselves cause and only they can cure. They transfer what should be their own risk to their users by presenting them with a Hobson’s choice embodied in license agreements. These agreements are filled with legalese so dense that even attorneys who don’t specialize in writing them have a hard time figuring out what they mean; the general public doesn’t have a chance. So, as a public service, I’ve translated and summarized them here:

  • “You must click Accept, otherwise you can’t use our software. If you click Accept, you acknowledge that you can never ever hold us responsible for anything, and that the raw human data we take from you is our property, not yours, so we can do whatever we want to with it.”

When a user (or their attorney, or state attorney general, or federal official) complains, HDCs point to the user’s acceptance of the license and declare they aren’t responsible, no matter how egregious the harm.

Next: The Brave Old World


Here are all thirteen segments in the series:

The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.

What’s wrong with cybersecurity technology? Know your enemy: The target isn’t networks, computers, or users; they are pathways to the target — gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.

Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the ’50s and the Internet had never been invented. Forbes Council’s David Kruger says the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).

Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created one piece of data or software at a time.

The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to change their priorities to making products safe rather than adding the next cool new feature will by no means be easy.

Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.

The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.

Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.

How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.

How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.

Cybersecurity: Why a poke in the eye does not work. The current system punishes small businesses for data breaches they could not have prevented. Computer security expert David Kruger says the current system makes as much sense as fining the hit-and-run victim for not jumping out of the way.

Is your data about yourself too complex for you to manage? That’s the argument human data collectors (HDCs) make for why they should be allowed to collect and own your data. Policymakers should declare that human data is the property of the individual, not of the data collector, computer security expert David Kruger argues.

and

How software makers will push back against reforms Software makers will grumble but insurers may force their hand. That, however, is NOT the Big Battle… the Big Battle: Wall Street will oppose reforms that restore control to you because the market cap of Big Tech depends on human data collection.

