
Westworld: Episode 8 Review

Is misery the key to consciousness?

Episode Eight is one of the stronger episodes in the series. It starts with Bernard waking up after killing Theresa. He is distraught over killing his former girlfriend, and he is horrified to realize that he is a robot, which means, of course, that his memories of his son are not real. Ford explains that Bernard is a robotic version of Arnold. The two men were on a quest to discover consciousness, but when Arnold died, Ford needed someone to help him continue his work, so he built Bernard. This is a change in Ford’s story: initially, Ford expressed no real interest in Arnold’s work, but now he says the two were working together to create a version of consciousness. Of course, none of this matters much to poor Bernard, who is experiencing a great deal of guilt and pain. Ford offers to make him forget (a funny offer, given that Ford was going to wipe the robot’s memory anyway) if Bernard covers up Theresa’s murder. Bernard agrees because, really, he doesn’t have a choice.

Are My Feelings Real?

Bernard does as Ford commands, and just before erasing his memory, Ford asks Bernard what it is like to be in his particular situation: a robot who knows he is a robot and understands exactly how he was made. Bernard says he understands what he is, but he wonders if his emotions are real. Ford’s answer is essentially a cop-out. He tells Bernard that he isn’t missing anything when it comes to feeling human emotion; in his mind, there is ultimately no difference.

But if we explore the question for ourselves, we return to the same problem raised in previous reviews. Ford casually mentions that he and Arnold were able to program emotions into the robots, and that the programming has advanced over time, but how does anyone know whether a robot is genuinely experiencing an emotion or merely mimicking the behavior? We don’t. In fact, there is no reason to infer that Bernard is experiencing a sense of self, and that sense of self would have to exist before his emotions could have any validity. There is a difference between simply acting sad and meaning the words, “I am sad.” How does one translate sadness into binary? This is the all-important question when it comes to the plausibility of inert matter becoming conscious, and the show ignores it. Not that I blame the writers too much, because there is no answer to the question. At any rate, Bernard accepts Ford’s answer, and Ford wipes his memory.
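To make the acting-versus-feeling distinction concrete, here is a minimal sketch of how an “emotion” is typically programmed. This is my own hypothetical illustration, not anything from the show: a stimulus is looked up and mapped to a scripted display behavior, and nothing in the code feels anything.

```python
# Hypothetical, stripped-down "emotion" module: stimulus in, behavior out.
# Note what is absent: there is no inner experience anywhere in this code,
# only a lookup table from inputs to scripted outputs.

SAD_TRIGGERS = {"loss", "grief", "memory_of_son"}

def emotional_response(stimulus: str) -> str:
    """Map a stimulus to a display behavior. This is mimicry, not feeling."""
    if stimulus in SAD_TRIGGERS:
        # The robot *acts* sad: lowered head, slowed speech, and even the
        # sentence "I am sad," which is just another scripted output.
        return "lower_head; slow_speech; say('I am sad.')"
    return "say('I am fine.')"

print(emotional_response("memory_of_son"))  # prints the full "sad" routine
```

The program can utter “I am sad,” but the utterance is only another output. Whether there is a self behind the words is a question this kind of code cannot even pose.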

In the end, Theresa is outed as the mole who tried to steal the data, and Ford subtly lets Hale know that he is aware of her role in hacking one of the robots to attack humans. As far as he’s concerned, Hale isn’t a threat, at least not yet.

Returning to Maeve, we see her grow increasingly disgusted with her existence inside the park. However, she hits a snag in her escape plan. A bomb has been planted in every robot’s spine, so if she or any of the others tries to leave the company premises, they will explode. To escape, she must deactivate the bomb, which will require getting a new body, but the writers reserve that problem for another episode. First, she decides to make some new additions to her programming.

When Felix is finished with the changes, she can control the other robots with voice commands, and she uses this ability to help the bandits who repeatedly attack the town steal a safe. We’re not sure what her plan is, but we know it involves their help somehow. Not everything runs smoothly for Maeve, though. At a random moment in the episode, she has a flashback in which we learn that the Man in Black was the one who killed her daughter. During this flashback, she accidentally kills one of the other robots, thinking it is the Man in Black. This prompts the park’s staff to come retrieve her, and that is where the episode leaves Maeve.

Meanwhile, the Man in Black and Teddy find a group of people who were attacked by Wyatt. They manage to save a girl, but during the fight, Teddy begins to remember the Man in Black and his misdeeds. Teddy attacks the Man in Black and ties him up. Here, we learn the strange man’s backstory. The Man in Black is a business bigwig of some kind whose wife committed suicide before the series began. His daughter blamed him for his wife’s death, accusing him of being a monster. The Man in Black decided to see if she was right, so he came to the park to find out whether he could do something truly terrible. He ends up murdering a homesteader and her daughter, who turn out to be Maeve and her child. But before Maeve dies, something about the way she fights the Man in Black convinces him that the robot is alive. He concludes that robots can come to life if they are put through continuous pain, and that misery is the key to consciousness.

There Must Be a Self for Emotion to Bear Meaning

It’s a dark idea to be sure, but I can’t help thinking we run into the same problem as before. There has to be a self before any experience or emotion can mean anything. But can repetition create a self? Perhaps repeated painful experiences, assuming they are remembered, can create a sense of self-preservation, and self-preservation can become the bedrock of one’s personhood. The idea is intriguing. If something were to die over and over again, could the desire to avoid that pain trigger the development of something beyond blind obedience to commands? Personally, I don’t think so. There would have to be some additional component, something that turns the pain into an abstraction, something that makes the pain an existential threat. Where would the desire to avoid pain come from? How would you translate that desire into binary? You could teach a robot to avoid a knife; the shape of a knife and its dimensions can be coded. But how do you code an abstract concept? You can’t. You can only program behaviors. Again, we return to the same root problem: how do we know the robot is feeling pain? How do we know it isn’t parroting behavior when certain stimuli are provided? The Man in Black’s theory and Ford’s theory suffer from the same flaw. Both assume a self, which is the crux of the matter to begin with.
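The knife example can be made concrete, too. Here is a sketch, again my own hypothetical illustration: the knife’s position and a danger radius are easy to encode as numbers, and the response is easy to encode as a movement command, but notice that what gets programmed is an avoidance behavior, not a fear of pain.

```python
import math

# Hypothetical knife-avoidance routine. The knife reduces to numbers (a
# position and a danger radius); the robot's response reduces to a movement
# command. The abstraction "pain is bad" appears nowhere in the code.

DANGER_RADIUS = 0.5  # meters; an arbitrary safety margin

def avoid_knife(robot_xy, knife_xy):
    """Step directly away from the knife if it is within the danger radius."""
    dx, dy = robot_xy[0] - knife_xy[0], robot_xy[1] - knife_xy[1]
    dist = math.hypot(dx, dy)
    if 0 < dist < DANGER_RADIUS:
        # Move one unit along the vector pointing away from the knife.
        return (robot_xy[0] + dx / dist, robot_xy[1] + dy / dist)
    return robot_xy  # no threat detected; stay put

print(avoid_knife((0.0, 0.0), (0.3, 0.0)))  # robot retreats along the x-axis
```

Every line of this is behavior. Whether the robot minds the knife, or dreads it, is a question the code has no way to express.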

Our final plot in the episode follows Dolores and William. She finds the site of her original hometown, but the whole place is buried for some reason. She starts to think she’s crazy, but William comforts her. They begin their trek back to Sweetwater but run into Logan, William’s future brother-in-law, who has joined the Confederados. William and Dolores are captured. We’ll pick up what happens to them in the next review.


Gary Varner

Gary Varner is the Assistant to the Managing and Associate Directors at the Center for Science & Culture in Seattle, Washington. He is a science fiction and fantasy enthusiast with a bachelor’s degree in Theater Arts, and he spends his time working with his fellows at Discovery Institute and raising his daughter, who he suspects will one day be president of the United States. For more reviews as well as serial novels, visit www.garypaulvarner.com.
