In the first of four podcasts with Gary Smith, author of The AI Delusion, he and Walter Bradley Center director Robert J. Marks discussed the 2016 US election, which analysts expected Hillary Clinton to win. For one thing, Clinton had inherited sophisticated artificial intelligence tools from her predecessor and onetime rival for the Democratic party nomination, Barack Obama.
What went wrong for her is instructive if we want to understand what data can and can’t do.
Marks: Barack Obama succeeded handsomely in his presidential campaign with a geek-driven campaign. But Hillary Clinton’s presidential campaign did not succeed with the same approach. Why not?
The Obama campaign put every potential voter into its database, along with hundreds of tidbits of personal information: age, gender, marital status, race, religion, address, occupation, income, car registrations, home value, donation history, magazine subscriptions, leisure activities, Facebook friends, and anything else they could find that seemed relevant…
After Obama secured the nomination, the fund-raising continued. For the full 2008 election campaign, Obama raised $780 million, more than twice the amount raised by his Republican opponent, John McCain. McCain didn’t have a realistic chance of winning, and he didn’t—with only 173 electoral votes to Obama’s 365. – Gary Smith, “How Trump beat Ada’s big data” at OUP Blog
That should have worked for Clinton too, surely. Her team of sixty mathematicians and statisticians created a program called Ada (named after the pioneering mathematician Ada Lovelace (1815–1852)) to replicate that success, which was widely expected. But, as Smith explains to Marks:
Smith: Some of the stuff that made a difference, they couldn’t put in a computer. Like the enthusiasm factor. When Bernie Sanders and Donald Trump gave speeches, tens of thousands of people showed up and yelled and screamed and were excited. And when Hillary Clinton gave a speech, a couple of hundred people would show up and sit politely. And you couldn’t put that in a computer. The computer algorithm had no idea she was in trouble.
He observed that they also couldn’t enter into the computer the underlying concern about job erosion and loss, which can’t really be quantified. Some people say that if you can’t measure it, it doesn’t count, but sometimes the things that count can’t be measured.
Ada is just a computer program and, like all computer programs, has no common sense or wisdom. Any human who had been paying the slightest bit of attention noticed Clinton’s vulnerability against Bernie Sanders, a virtually unknown 74-year-old Socialist senator from Vermont, who was not even a Democrat until he decided to challenge Clinton. A human would have tried to figure out why Sanders was doing so well; Ada didn’t.
When Clinton suffered a shock defeat to Sanders in the Michigan primary, it was obvious to seasoned political experts and campaign workers who were on the ground talking to real voters that Sanders’ populist message had tremendous appeal and that the blue-collar vote could not be taken for granted; Ada didn’t notice. – Gary Smith, “How Trump beat Ada’s big data” at OUP Blog
In the aftermath, a Democratic pollster considered it “malpractice” to rely more on the machine than on Electoral College numbers. And Bill Clinton was so frustrated trying to discuss the risks that he reportedly threw his phone out of a window.
In his writings, Smith quotes the aphorism that, if you torture Big Data long enough, it will confess. But one can hardly turn around and expect a confession extracted that way to be reliable.
Also featuring Gary Smith: AI Delusions: A statistics expert sets us straight. We learn why Watson’s programmers did not want certain Jeopardy questions asked.
Next week: Why IBM Watson is Going Toes Up
Here’s a list of podcasts from Mind Matters News
Explore more of the paradoxes of Big Data: Big Data can lie: Simpson’s Paradox. The paradox illustrates the importance of human interpretation of the results of data mining.
Study shows eating raisins causes plantar warts. Sure. Because, if you torture Big Data enough, it will confess to anything.
IBM’s Watson is NOT our new computer overlord