As the third and final speaker at the Dallas launch of the Walter Bradley Center for Natural and Artificial Intelligence, philosopher of technology George Gilder, author of Life after Google, offered some insights into the ultimate vision of the current AI technocrats:
Seriously, the Google people do not believe that there will be life after Google. Their vision of AI usurping human minds really represents what I call an eschaton, a final thing, almost like an eschatological vision.
They believe that AI will achieve what my friend Ray Kurzweil calls the Singularity, when it will attain capabilities far beyond human minds and thus be able to reproduce itself, project itself into the universe, and seed the universe with a cascade of ever more intelligent machines, thus kind of propagating human life throughout the universe.
And since many of the people at Google — most of them probably — believe in multiple parallel universes (which is a preposterous view), these AI machines will multiply on. Larry Page and Sergey Brin and Ray Kurzweil and all these Singularitarians can fly off to a nearby planet with Elon Musk, leaving the rest of us back on the beach in the United States collecting guaranteed annual income.
If that is what they believe, Google’s business activity is surely worth a glance. The other day, senior research scientist Jack Poulson revealed that he had quit Google over its involvement with totalitarian AI censorship in China. He was told,
We can forgive your politics and focus on your technical contributions as long as you don’t do something unforgivable, like speaking to the press.
This was the parting advice given to me during my exit interview from Google after spending a month internally arguing, resignation letter in hand, for the company to clarify its ethical red lines around Project Dragonfly, the effort to modify Search to meet the censorship and surveillance demands of the Chinese Communist Party.
Jack Poulson, “I Used to Work for Google. I Am a Conscientious Objector.” at The New York Times (April 23, 2019)
If the surveillance technology developed for China catches on in the West, however numberless the Googlers’ infinite parallel universes may be, Americans will be constantly and closely observed while left behind on the beach.
But Gilder doesn’t think that the story will really end that way:
He compares the Googlers’ utopia to Marx’s utopia, in which the distinction between the state and society is abolished. Both visions of the future aim to eliminate the need for human productivity. But, he insists, “AI will make human beings more productive. It will make them more employable. It will generate the very capital that can endow new work. This is what technology has always done and it is no different today.”
If so, the Singularity is not only far off but ever receding, and Google’s utopia is as unlikely as Marx’s. Thus, he concludes, “I think almost everything in the Google philosophy is wrong.” And, in his view, the enterprise is not too big to fail.
See also: George Gilder: Life after Google will be okay
George Gilder talks tech at World News Daily. In a three-part interview, he explains why he thinks Google is doomed
George Gilder explains what’s wrong with “Google Marxism”
Also from the Dallas launch: Computer science professor Robert J. Marks: Are there things about human beings that you cannot write code for? As Bradley Center director, he relates that question to the Center’s goals and to human aspirations
Also from the Dallas launch: Jay Richards: Creative freedom, not robots, is the future of work The Officially Smart people are telling us two scenarios, good and bad, about the impact of artificial intelligence (AI), says Jay Richards, a research professor at the Busch School of Business at the Catholic University of America. He disagrees with both. In an information economy, he says, there will be a place where the human person is at the very center
Walter Bradley: Tell people about AI, not sci-fi. His struggle to bring reality to “sci-fi” origin of life research is the Center’s inspiration.