Mind Matters Natural and Artificial Intelligence News and Analysis
Image licensed via Adobe Stock

Wrapping Up the Westworld Series

Ultimately, the moral of the story is transhumanism

The first time I watched Westworld, I remember enjoying it, but upon revisiting the series, my opinion of it has dropped a great deal. There are a variety of problems. First, it’s a bait and switch. It teases the idea of showing how robots can come to life, and it plays with your expectations for most of the series. It even goes as far as to discuss theories like the Bicameral Mind and the Turing Test. Then, in the last episode, it confirms what the viewer has been slowly growing to suspect: the robots had been coming to life the entire time, and Ford had been wiping their memories. The show says that Ford programmed the robots to experience everything we’ve come to believe is part of their awakening. For instance, Dolores and Teddy wound up on the shoreline just in time for Ford to announce his new narrative. If Dolores had been on the cusp of waking up the entire time, why was she still following Ford’s instructions to the degree that she ended up in the exact place he told her to go? The same is true of Maeve. We’re led to believe that she is an awakening robot who is managing to fly under the radar, only to find out that Ford had preprogrammed all her actions.

This “twist” is disguised as a mystery for the audience to ponder, but it’s really just an attempt to plug up the plot holes. For example, why did Maeve see her programming on the tablet when the robots aren’t even allowed to comprehend modern photos? Because Ford told her to. But if Ford really had enabled her to see the tablet and also enabled her to alter her own programming, then why didn’t she see that he’d manipulated her code to begin with? Did he program her to conveniently ignore that portion of the information on the tablet? If so, why did she see her manipulated code when Bernard showed her Dr. Ford’s alterations?

A Waste of Time?

It was cathartic when Maeve finally broke free of Ford’s predestined path by going to find her daughter, but the fact that Ford was pulling Maeve’s strings throughout the series makes her entire story arc feel like a waste of time. It makes everything that happens in the series feel like a waste of time, and it undermines the moment when the robots finally do wake up. How do we know Ford isn’t still pulling their strings?

Then we have the philosophical problems with the show. We established that the Bicameral Mind and the Turing Test were not adequate theories for explaining consciousness, but it was still nice to see that the writers had done some research to explain how the robots might be able to wake up. But here’s where it gets confusing. On the one hand, they affirm the theory of the Bicameral Mind—the idea that man first mistook his own thoughts for the voice of the gods—by having Dolores first hear Arnold’s voice, then Ford’s, then her own in the last episode. At the same time, they throw away the notions of a Bicameral Mind and a pyramid of consciousness by explaining that the maze was a correction to Arnold’s initial approach: consciousness is not a journey upward but inward, as Arnold says. If there were some nuance that reconciled all these approaches, the writers didn’t articulate it, and the saying that consciousness is a journey inward is just ambiguous enough to be useless. It was infuriating to see the writers put so much research in at the beginning only to throw away an imaginative idea and exchange it for a pointless platitude.

Lastly, I want to comment on the overall theme, which only revealed itself in the last episode. The writers paint a very grim view of the tourists who enter the park, as they rightly should. The only reason anyone is going there is to indulge their baser impulses. There’s one character who tries to use the place to grow, but even he becomes jaded over time. This was all well and good, but in the last episode, they take this grim view and extrapolate it to the rest of humanity. Dr. Ford doesn’t just want the robots to wake up; he wants them to kill everybody. And what’s incredible is that the show makes it seem as if Dr. Ford is right, a malevolent sage who wants to destroy the world! But what’s his basis for this? It’s the park! Which is idiotic! It’s like entering a seedy casino or bar and saying the degenerates inside are the summation of mankind. Now, I believe man has a fallen nature, but a man who builds a place that attracts society’s worst and then acts horrified by their depravity is a moron. Yet Dr. Ford is the hero of this sordid tale, and he believes mankind is awful and must not only be eradicated but replaced . . . by robots. Here is where Westworld goes from being an exploration of how robots can wake up to an affirmation of transhumanism. And the writers waited until the very last episode to tell viewers they should die.

To be honest, I completely missed this the first time I watched the series, but after taking a closer look, I can safely say that Westworld isn’t a what-if scenario. It’s a lie on multiple levels.

Humanity Needs to Be Transcended

First, there’s the bait and switch that takes place when we find out the robots have been awake through most of the series — although the writers waver on this point, sometimes acting like the robots are already awake, while at other moments acting like they’re becoming conscious for the first time. Frankly, this is because the writers are playing fast and loose with the term consciousness. I’m convinced they don’t even know what the term means, nor did they bother to come up with a working definition for the script. Plus, the concept of self — the difference between a robot portraying pain and a person actually being in pain — is a notion the writers want to completely avoid. Even when they get close to articulating this distinction, the characters who state that the park’s robots are not real are treated as villains.

Secondly, the writers want you to believe humans are bad and don’t deserve the ground they walk on. I won’t say humans are purely good, but they’re better than robots. A robot can’t feel, and some feelings are important, like empathy. But the writers don’t seem to think so. When Bernard asks if his feelings are real, Dr. Ford essentially tells him it doesn’t matter because feelings are overrated. Ultimately, Ford believes the thing that should replace humanity must be cold, murderous, and lifeless: a void.

And a void is just what you’ll find when you treat consciousness as a journey inward. All your beliefs, emotions, and convictions are just paper-thin illusions created by random neurons firing in something that vaguely resembles a pattern. But if you treat consciousness as a journey upward, you’ll find that these “abstract” concepts, like love, beauty, and justice, are real. These concepts are above you, they exist outside of you, and the fact that we can detect them is one of the many things that prove we exist and make us human. At the beginning of these reviews, I’d planned on recommending the series, but after rewatching it, I’d say don’t give Westworld your attention. It’s a waste of time.


Gary Varner

Gary Varner is the Assistant to the Managing and Associate Directors at the Center for Science & Culture in Seattle, Washington. He is a science fiction and fantasy enthusiast with a bachelor’s degree in Theater Arts, and he spends his time working with his fellows at Discovery Institute and raising his daughter, whom he suspects will one day be president of the United States. For more reviews as well as serial novels, visit www.garypaulvarner.com.
