
Technology Centralizes by Its Very Nature

Here are some other truths about technology, some uncomfortable ones

In the last decade, we’ve witnessed the global expansion of AI, driven largely by one algorithm, Deep Learning, coupled with Big Data. Deep Learning and Big Data excel at tasks like visual object recognition. Once-difficult AI problems, like recognizing faces in photos, became easier, so companies like Google and Facebook began offering AI-powered photo recognition services. Suddenly “the machine” could spot pictures of you and your friends and suggest new groups, new tags, or, hey, a new conversation.

In the same decade, China used essentially the same AI technologies, together with 170 million CCTV cameras, to blanket Beijing and other major cities with pervasive surveillance. The same technology that recognizes your friend or your cat can also catch you jaywalking or looking at goods in a shop window. Those who worry about the technology’s double-sided nature are told that it’s up to well-meaning individuals, companies, and governments to be vigilant, protecting our hallowed sphere of liberties and freedom.

All that is true, of course. But there’s a deeper problem with the growth of technology that explodes the distinction between “AI” and digital technology generally and suggests that the angel vs. devil view of technology is, unfortunately, a shallow take.

This is the article I didn’t want to write. But it’s time to say it.

Here are some truths, as far as I can tell. Some uncomfortable ones:

1) Technology in general cannot be rolled back. Once it’s developed and finds a practical or scientific use, it’s pressed into service somewhere, somehow. Thus technology grows in our lives as a function of time. Rewind to 2010. Less technology. To 2000. Even less. Go back a hundred years and you would experience, for all practical purposes, a different realm and a different social world.

You would experience massive gaps in the technological fabric that we’re all “wearing” today. Telecommunication would be mostly voice (“You there! Oh. He can’t hear me.” would capture a lot of interpersonal communication in 1920). Road signs would exist, but it would seem as though drunken teenagers had stolen half of them. Warning labels, too, would be stripped down, as if common sense were suddenly assumed to exist everywhere. Anything that a normal adult wouldn’t try to swallow would not be identified as a choking hazard. Nutrition labels, too, would be mysteriously missing. And without all the buzz of artificial lights, the night sky would shine ever so brightly.

Sure, people would talk differently and you’d notice other social differences. But mostly you’d notice the changes in technology. Here’s an upside to your adventure: by virtue of rewinding time alone, you are less trapped in the thick fabric of technology that surrounds us now. Conversely, to understand the future, roll the dial forward and see more and more technology. All the cinematic takes on the future, like Minority Report (2002) or Blade Runner (1982), capture this basic intuition. In Minority Report, it’s hard to get off the grid anywhere. Ads follow you through stores, offering you a Guinness when you seem stressed (so: the future is using “AI” to recognize your face). The future, barring an apocalypse, is just chock-full of technology someone managed to foist onto the rest of us. So (1) is an axiom. A truth.

2) Technology, by its very nature, centralizes. Early Web visionaries espoused a radical freedom, a kind of libertarian ideal where the point of going online was to escape the increasing fuss and surveillance of the real world. But the nature of the technology virtually (no pun intended) guaranteed that, over time, the Web would begin centralizing. It would begin tracking down the various pockets of privacy-lovers, linking them together, coaxing them onto Twitter, moving servers into server farms, moving entrepreneurial ideas themselves into Big Tech. In the space of a couple of decades, Web start-ups morphed, for all practical purposes, into massive monopolies.

If today’s digital technology were a government, it would be the Soviet Union. In the early 2000s someone figured out that social networks were cool; now we have Facebook, with its more than a billion members. Ditto photo-sharing sites (it’s mostly all Instagram these days). And of course search (but we don’t even search anymore, we “google”). All this concentration gets variously excused or blamed by pundits and critics as a failure of business, of government, or of short-sighted customers.

At root, though, these factors are not the real cause. The technology itself is the real cause. It grows. It moves to the center. It centralizes itself. No matter what tech we develop, if people find it useful, it will grow into a centrally controlled, massive structure. To see what I mean, consider a counterexample: non-digital technologies, like crafts or tools. If you buy a shovel or a Buck knife, you don’t thereby link yourself into a vast database of shovel-users. The shovel doesn’t keep track of your shoveling, read your biometrics, or store a file on you-as-shoveler somewhere. It’s a thing, an artifact. So you see, digital technology itself is the problem. No Matrix could be built with tools and artifacts.

3) Because (1) means that tech grows by its nature and (2) means that it grows into a central node or structure, our world will pretty much naturally tend toward centralizing power and thereby threaten privacy and even liberty. This may seem dystopian, but it’s already happening. Anonymity and privacy are becoming commodities. And the difference between a liberal democracy and China, while hugely important, is still not the most basic problem. The great hope and conceit is that smart protectors of small government can do something about technology (pass a law, fight back) to keep the world a place worth living in. The problem is that they can’t control the natural tendency of technology. (1) and (2) will happen, no matter what.

I call this the Depressing Thesis. I didn’t want to believe it. But, unfortunately, the empirical evidence for (1) and (2) is overwhelming. And once you see it, you see it everywhere. Big Tech? Of course. China and supersurveillance? Had to happen. (Trump and Twitter? Guess so.)

The first signs of my own discontent came early on, while reading Francis Fukuyama’s brilliant but ill-timed The End of History and the Last Man (1992). Fukuyama suggested that liberal democracy represented the final political innovation in human history, the best idea in all our various attempts to get along in society. Fukuyama, almost alone among political theorists in taking technology in and of itself as a serious force, argued for the truth of a version of (1) without admitting it directly. Tech can’t be “rolled back,” he explained, pointing out that once an innovation is available and proves helpful, it grows like a fungus everywhere. So: People tinker. An idea comes. It gets implemented. People like it. It’s HERE. The only way to get rid of a tech is to replace it with something that works better or that people like more. We can get rid of black-and-white TVs by developing Technicolor ones. We can lose our TVs by watching the content on our laptops or smartphones, which, helpfully, tend to follow us around all day now, unlike the older TVs.

4) Technology wins wars. (See Robert Marks’s latest book, The Case for Killer Robots.) Because the denizens of Fukuyama’s best-of-the-rest free democracies don’t really want to end up part of China or Russia, their existence necessitates a constant buildup of ever more advanced military technology. To them it’s for freedom. But for a People’s Republic, it is for control. Tech builds up in both places, by a simple and unavoidable logic. Ugh. Thus it is that technology, by its very nature and its connection to us, expands to fill all spaces. No wonder the modern world is so confusing (and just wait for the future).

What, then, to do? You can’t kill AI. For one thing, it’s not even “AI”; it’s math run on supercomputers. You can fulminate about despotism and China, but you have to live somewhere, and the issue is more a matter of time than of nation (see (1) above). On the larger issue: the essential character of technology is like that of a bacterial infection. There are deadly bacteria, but there are also healthy gut bacteria, and they grow together.

I confess, I’m flummoxed. To me, to say that technology grows is tantamount to saying that we all die, or that love and war capture the important gritty stuff in life, or that Shakespeare was a good poet. Truisms. The problem is that technology masquerades as malleable and decidable and controllable, just as wars may sometimes masquerade as predictable and avoidable to a naive Pollyanna. We should call out China’s aggressive moves to track its citizens and intimidate them in various ugly ways, no doubt. But so, too, should we all “call out” the modern world. New ideas are needed. The problem is much deeper than AI itself. Anyone?


Other recent musings from our Analysis desk:

Superintelligent AI is still a myth. Neither the old classical approaches nor the new data scientific angle can make any headway on good ol’ common sense.

and

Can AI help Hollywood predict the next big hit? AI analysis sifts the past finely. But how well does the past predict the future?
