Google Engineer Reveals Search Engine Bias
He found Google pretty neutral in 2014; the bias started with the US 2016 election
Recently, senior Google engineer Gregory Coppola contacted watchdog group Project Veritas to talk about the Google bias that affects the information users get about political choices. (Here’s the interview.)
Coppola, who grew up in Canada, told Project Veritas that he had experienced Google as a pretty neutral place politically when he first started working on Google Assistant there in 2014. Things changed during the US 2016 election when “every tech company, everybody in New York, everybody in the field of computer science basically believed that” anyone who supported Donald Trump was a racist. Inconveniently for himself, Coppola kind of liked Trump.
As another American election looms, he relayed his concerns to Project Veritas. From the video:
Narrator: According to Coppola, Big Tech’s agenda is dangerous, in large part, because it is hidden.
Coppola: I think we had a long period, of ten years, let’s say, where we had search and social media that didn’t have a political bias and we kind of got used to the idea that top search was that, Google was probably the answer. And Robert Epstein, who testified before Congress last week, looked into it and showed that the vast majority of people think that if something is more highly rated on Google Search than another story, then it would be more important and more correct. And, and we haven’t had time to absorb the fact that tech might have an agenda. I mean, it’s something we’ve only started to talk about now. That’s why I’m talking to you. [transcribed by Denyse O’Leary, Ottawa, July 27, 2019]
Coppola is referring to the recent testimony of widely published behavioral research psychologist Robert Epstein (who described himself as “center/center-left” before a Senate committee):
Data I’ve collected since 2016 show that Google displays content to the American public that is biased in favor of one political party (Epstein & Williams, 2019) – a party I happen to like, but that’s irrelevant. No private company should have either the right or the power to manipulate large populations without their knowledge. …
I reach out to diverse, different audiences because the threats posed by Google, and, to a lesser extent, Facebook, are so serious that I think everyone in the world needs to know about them. I put my own political leanings aside when I report my data and concerns because the problems these companies present eclipse personal politics. To put this another way, I love humanity, my country, and democracy more than I love any particular party or candidate. And democracy as originally conceived cannot survive Big Tech as currently empowered.
If you were to examine the data I have been collecting over the past 6-and-a-half years, every one of you would put partisanship aside and collaborate to rein in the extraordinary power that Google and Facebook now wield with unabashed arrogance.
Robert Epstein, “Why Google Poses a Serious Threat to Democracy, and How to End That Threat (transcript)” at American Institute for Behavioral Research and Technology
Epstein offers five “disturbing findings,” among which is
In 2016, biased search results generated by Google’s search algorithm likely impacted undecided voters in a way that gave at least 2.6 million votes to Hillary Clinton (whom I supported). I know this because I preserved more than 13,000 election-related searches conducted by a diverse group of Americans on Google, Bing, and Yahoo in the weeks leading up to the election, and Google search results – which dominate search in the U.S. and worldwide – were significantly biased in favor of Secretary Clinton in all 10 positions on the first page of search results in both blue states and red states.
Robert Epstein, “Why Google Poses a Serious Threat to Democracy, and How to End That Threat (transcript)” at American Institute for Behavioral Research and Technology
Epstein’s testimony links up with earlier observations such as that “searches for Hillary Clinton did not autocomplete to words that were popular searches if they reflected negatively on the Democratic candidate.”
Users, of course, don’t know that the algorithm is configured that way. We think that’s the best information available in some neutral sense. But as Coppola told Project Veritas, the algorithms — the series of commands to computers — “don’t write themselves.” People may write their own opinions into an algorithm, knowingly or otherwise.
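Coppola’s point that algorithms “don’t write themselves” can be illustrated with a toy example. The sketch below is purely hypothetical and has nothing to do with Google’s actual ranking system; it shows only how a single hand-chosen per-source weight, written into a ranking function by a person, can flip the order of results without touching the underlying data.

```python
# Hypothetical toy ranking function (NOT Google's algorithm): results are
# ordered by relevance score multiplied by a per-source weight that a
# human has chosen. A weight of 1.0 means "no thumb on the scale."
def rank(results, source_weights):
    """Sort results by relevance times a per-source weight, highest first."""
    return sorted(
        results,
        key=lambda r: r["relevance"] * source_weights.get(r["source"], 1.0),
        reverse=True,
    )

# Placeholder stories and scores, invented for illustration.
results = [
    {"title": "Story A", "source": "outlet1.example", "relevance": 0.90},
    {"title": "Story B", "source": "outlet2.example", "relevance": 0.85},
]

# With neutral weights, Story A ranks first on relevance alone.
neutral = rank(results, {})

# One edited weight flips the order; the data itself never changed.
tilted = rank(results, {"outlet2.example": 1.2})

print([r["title"] for r in neutral])  # ['Story A', 'Story B']
print([r["title"] for r in tilted])   # ['Story B', 'Story A']
```

A user who sees only the final ordering has no way to tell whether it came from relevance alone or from a weight someone typed in, which is exactly why hidden bias of this kind is hard to detect from the outside.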
In a recent article, Coppola offers more details of the mechanics of opinion manipulation, using Google News as an example and choosing “donald trump” as a subject:
The most-used site, CNN, is selected in 20% of all articles! In other words, even with the millions of sites on the Internet, 1 out of every 5 stories about “donald trump” from Google News is from CNN.
Gregory Coppola, “Analyzing Google News: Introduction” at Medium
The significance of that fact is that, among the larger American media networks, CNN is widely regarded as the most hostile to the current White House. For example, the network recently sued the White House over the denial of a press pass to one of its reporters.
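The kind of tally Coppola describes can in principle be reproduced by anyone: collect the result links for a query and compute each outlet’s share. The sketch below uses invented placeholder URLs, not Coppola’s data or real Google News results, to show the counting step.

```python
from collections import Counter
from urllib.parse import urlparse

def source_shares(urls):
    """Return each domain's fraction of a list of result URLs."""
    domains = [urlparse(u).netloc for u in urls]
    counts = Counter(domains)
    total = len(domains)
    return {domain: n / total for domain, n in counts.items()}

# Placeholder data for illustration only — not real collected results.
sample = [
    "https://cnn.example/story-1",
    "https://cnn.example/story-2",
    "https://othersite.example/story-3",
    "https://thirdsite.example/story-4",
    "https://cnn.example/story-5",
]

shares = source_shares(sample)
print(shares["cnn.example"])  # 0.6 — 3 of the 5 placeholder stories
```

Run over thousands of real results for a query, a tally like this is what supports a claim such as “1 out of every 5 stories is from CNN.”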
From the partial transcript at Project Veritas:
… COPPOLA: But I guess I just, you know, I look at search and I look at Google News and I see what it’s doing and I see Google executives go to Congress and say that it’s not manipulated. It’s not political. And I’m just so sure that’s not true. That it’s, you know, it becomes a lot less fun to work on the product. So it affects you that much. Yeah, definitely. I mean, the thing about Google is if you leave, um, you know, any other salary at any other company will be lower. Hmm. So I do think it’s a sacrifice. …
COPPOLA: “I just want to say to all the non-programmers that I really don’t buy the idea that big tech is politically neutral, and I think we need to start incorporating that into whatever strategy we use to have a democracy going forward.”
Coppola’s attitude to his bosses is ambivalent. He respects CEO Sundar Pichai and does not believe that he or other top Google executives intentionally assert falsehoods at hearings. But he cannot accept the information given as objective fact. Although Coppola also insists that Google treats customer data confidentially and tries to be fair, he notes that it only takes a few people to influence a result: “And in fact, I think it would only take a couple out of an organization of 100,000, you know, to make sure that the product is a certain way…”
He is certainly not claiming that it is all some kind of conspiracy; rather, the favor shown to a “handful of sites” that are “vitriolically opposed” to Trump is the outcome of concentrated group opinion at Google. That said, he insists, it amounts to “interference in the American election.”
Coppola has, of course, been placed on administrative leave. He has started a fund-raising campaign, seeking living expenses, “to spend four months publishing content about issues in politics and technology” from his home in New York City.
Some Democrats are getting antsy too. A longshot candidate for the Democratic Party’s 2020 presidential nomination, Tulsi Gabbard, is suing Google because “Google’s discriminatory actions against my campaign are reflective of how dangerous their complete dominance over internet search is.”
One thing we can count on is greater public interest in how algorithms are constructed, a subject that has emptied auditoriums in the past.
Note: Coppola received his engineering degree from the University of Waterloo and his MA in linguistics from Simon Fraser University, both in Canada, and his doctorate in engineering from the University of Edinburgh, according to his LinkedIn profile.
Further reading: Algorithms can also be unknowingly biased. See, for example:
Did AI teach itself to “not like” women? No, the program did not teach itself anything. But the situation taught the company something important about what we can safely automate.
Can an algorithm be racist? No, the machine has no opinion. It processes vast tracts of data. And, as a result, the troubling hidden roots of some data are exposed.