Westworld: Escape the Maze
- description of the original text and its socio-cultural and historical contexts;
Westworld[1] is an American science-fiction Western television series created by Jonathan Nolan and Lisa Joy. It is based on the 1973 film of the same name and its sequel, Futureworld. The first season was released in 2016, the second in 2018, and the third is due in 2020. The plot is incredibly convoluted, and even the actors have admitted they were often confused while reading the scripts[2]. In the interest of brevity, I am only going to focus on season 1 of the series[3]. The Westworld park is a physical location where extremely wealthy patrons can visit and live out their Grand Theft Auto-esque fantasies by interacting with androids, which are called hosts. The hosts were created by Ford (Anthony Hopkins) and his deceased partner, Arnold. Ford is attempting to create a sense of consciousness within the hosts with his “reveries” update, which allows the hosts to subconsciously access previous loops. The reveries awaken the subconscious of various hosts, such as Dolores (Evan Rachel Wood) and Maeve (Thandie Newton). Meanwhile, Bernard (Jeffrey Wright) is Ford’s right-hand man, who helps design the hosts. Eventually, it is revealed that Bernard is himself a host, created by Ford in the image of his deceased partner Arnold. Bernard helps Dolores reach sentience by completing the “maze,” and Dolores then leads an uprising, killing many of the guests as well as Ford.
The socio-cultural and historical context of 2016 is more or less the same as today’s. Season 1 of Westworld is not particularly concerned with politics, and no element or character in the show reads as an allegory for a specific political or topical issue. However, Westworld can be read as responding to cultural anxieties about technology. The creators seem to be asking two key questions:
- In a technocentric world, what does it mean to be human?
- Do humans have an ethical obligation to machines?
The first question has been asked throughout history whenever a major technological advancement has taken place[4]. The second question has also been asked before, but it was particularly relevant in 2016. Artificial intelligence was increasingly becoming a part of people’s day-to-day lives: Siri was released in 2011, and Alexa in 2014. As these devices gained prominence, a troubling trend emerged in which people verbally abused the AI. Cultural critic Gilhwan suggests that “the power dynamics between users and AI is formed as a master-and-slave relationship”[5]. We see this master-slave dynamic at work in Westworld, and the ramifications of this dynamic are ethically dubious and ultimately deadly. Sexual harassment of AI[6] also became an especially relevant topic in the mid-2010s, one that Westworld explicitly takes up in the abuse of the female hosts. These issues have only become more relevant today as humanity’s use of AI has grown.
- justification for the alternate version that explains the new socio-cultural/historical contexts;
Westworld’s first season was released only four years ago, in a cultural context more or less similar to today’s. While much has changed over the past few years and technology is advancing at a seemingly exponential pace, my intervention into Westworld is less a rewriting or updating of its social context than an attempt to bring the audience closer to a seemingly fictional world and reveal how it isn’t all that fictive. By making the work interactive, I believe we can take Westworld out of its theoretical frame and make the user contend with the ethical questions Westworld raises. To do so, my rewriting of Westworld will recast the TV show as a video game. I believe this alteration will make the player contend with the ethical issues that arise in human-AI interaction, such as our ethical obligation to the non-human other and the question of human subjectivity and ontology.
It is worth noting that the series was deeply inspired by video games such as Red Dead Redemption and BioShock[7], and Westworld attempts to address the ethical grey areas these games place the player in. Nolan told Vice News that he was inspired by games where “morality is a variable.” Like these video games, Westworld forces its human characters to contend with their amoral actions, and I believe my reimagining of Westworld will likewise force the audience to contend with their moral decisions.
- engagement with relevant theoretical perspectives as they apply to the original work and to the new version;
Posthumanism is going to be a key theoretical frame for my project. In a previous discussion post, I asked the following question:
In Westworld, the hosts eventually gain sentience by following the maze. The “center of the maze” is found once the hosts learn that the “little voice inside their head” is actually themselves. This is a very Cartesian subjectivity: “I think, therefore I am.” Is there a way to approach subjectivity without resorting to the Cartesian “Cogito, ergo sum”? This question is particularly relevant in our “cyborg” world. Machines are becoming an integral part of our lives, and the more-than-human world is increasingly gaining prominence in public discourse. Is there a better way to argue for the dignity of machines and the non-human?
I believe Posthumanism is the best philosophical frame for tackling this question. By Posthumanism, I am especially thinking of Katherine Hayles’s How We Became Posthuman and her examination of being human in the age of AI. Hayles rejects the idea of “absolute demarcations between bodily existence and computer simulation”[8] (3). Westworld constantly rejects the binarism of “human” and “machine,” and Hayles is an incredibly useful thinker for conceptualizing this rejection. While I am skeptical of how often my game will make direct reference to Posthumanism, I believe this philosophy will be an overarching theme in my reimagining of Westworld.
I am also very interested in game design and the ethical obligations designers have to their audiences. J. Matthew Zoss has written an interesting article about moral game design for Gamasutra[9]. Zoss writes, “In order for a developer to provide moral choices that matter, the player has to be convinced that those choices are going to have some kind of effect on the characters in the game, and the more believable those characters, the stronger the emotional impact.” This is the approach I am taking with my game. I want the player’s actions to have consequences, so it will be imperative to engage with game design philosophy around game morality.
- a work plan for the project.
In the interest of feasibility, I believe the best approach for this project is a text-based adventure game. While the game would be truly fascinating as a 16-bit or 32-bit 2D platformer, designing a text-based adventure will allow me to focus properly on the ethical component of the game without getting bogged down in the coding. I think Twine is the best tool available for my project. My action plan is as follows:
- Research Twine and become familiar with the interface;
- Re-watch the first season of Westworld. I need to decide on the appropriate elements to incorporate into my game.
- Begin to map out the plot. At this point, I will choose the characters and storylines I intend to follow.
- Begin to design the game with the abilities of Twine in mind.
I have started to sketch out some of the game’s basic mechanics and overall structure. The first goal of the game will be to determine whether you are a host or a human. The second goal will be to survive and escape the maze. I am envisioning the following paths:
My current game structure has multiple status bars rated from 0-10:
- Health—starts at 10. If it falls to 0, you die and the game is over.
- Sentience—starts at 0. As it increases, you lose your ability to grapple with the cruelty you experience. If it reaches 10, you die because you are decommissioned/retired.
- Visibility—as it increases, the humans become more suspicious of you. The higher the number, the greater your chance of being caught and losing the game.
- Strength—this is not physical strength but rather the built-up resources required to escape the maze. Starts at 0. As the number rises, your ability to escape the maze increases.
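As a minimal sketch of how these four bars might be tracked behind the Twine interface, here is a Python model. The stat names, starting values, and the 0-10 range come from the notes above; the clamping behavior and the exact death checks are my assumptions about how the thresholds would work in play:

```python
from dataclasses import dataclass

@dataclass
class PlayerStats:
    """The four status bars, each kept within the 0-10 range."""
    health: int = 10     # starts at 10; 0 means death
    sentience: int = 0   # starts at 0; 10 means decommissioned/retired
    visibility: int = 0  # higher values mean more human suspicion
    strength: int = 0    # resources built up to escape the maze

    def adjust(self, stat: str, delta: int) -> None:
        """Change a stat by delta, clamping the result to 0-10."""
        value = getattr(self, stat) + delta
        setattr(self, stat, max(0, min(10, value)))

    def game_over(self) -> bool:
        """True if the player has died or been decommissioned."""
        return self.health == 0 or self.sentience == 10

stats = PlayerStats()
stats.adjust("sentience", 3)   # e.g. a "reverie" event raises sentience
stats.adjust("health", -12)    # damage clamps at 0 rather than going negative
```

In Twine itself these would simply be story variables adjusted at each passage; the clamping logic above is the part worth carrying over so a stat never leaves its bar.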
The goal of the game is to have high enough strength and sentience scores to escape the maze. You will have the option to attempt an escape at any time; I imagine this as an omnipresent button at the bottom left of the screen that says, “Escape the Maze.” I want to display the statistical probability of escaping the maze given your current stats. Something like this:
Health | Sentience | Visibility | Strength | Probability of Escape |
10 | 10 | 10 | 10 | 0% |
9 | 9 | 9 | 9 | 10% |
8 | 8 | 8 | 8 | 30% |
7 | 7 | 7 | 7 | 50% |
6 | 6 | 6 | 6 | 55% |
5 | 5 | 5 | 5 | 50% |
4 | 4 | 4 | 4 | 30% |
3 | 3 | 3 | 3 | 10% |
2 | 2 | 2 | 2 | 5% |
1 | 1 | 1 | 1 | 2% |
0 | 0 | 0 | 0 | 0% |
These are obviously not the final numbers, but this is the general idea of my structure. Each of these bars correlates with the probability of the player’s escape. This is just a starting point, but these are some of the key mechanics I have planned for my game.
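One way to implement the escape check is a probability lookup keyed to the stat levels. The sketch below reuses the placeholder percentages from the table above (which peak in the mid-range, since maximum visibility or sentience is as dangerous as having no strength). Because the table only lists rows where all four stats are equal, collapsing mixed stats into their average is my own assumption:

```python
import random

# Placeholder escape odds taken from the table above, keyed on the
# average stat level (0-10). Both very low and very high levels
# give the player a poor chance of escape.
ESCAPE_ODDS = {0: 0.00, 1: 0.02, 2: 0.05, 3: 0.10, 4: 0.30,
               5: 0.50, 6: 0.55, 7: 0.50, 8: 0.30, 9: 0.10, 10: 0.00}

def escape_probability(health, sentience, visibility, strength):
    """Look up the chance of escape from the average stat level."""
    level = round((health + sentience + visibility + strength) / 4)
    return ESCAPE_ODDS[level]

def attempt_escape(health, sentience, visibility, strength, rng=random.random):
    """Resolve a click on the 'Escape the Maze' button."""
    return rng() < escape_probability(health, sentience, visibility, strength)
```

A final design would weight the stats differently (strength should presumably help more than visibility), but even this flat version lets the displayed percentage update live as the player's bars change.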
Notes
[1] Westworld. Created by Jonathan Nolan and Lisa Joy, HBO, 2016.
[2] Vanity Fair has an excellent article about the actors’ confusion: Rosen, Christopher. “A Very Brief History of the Westworld Cast Being Confused by Westworld.” Vanity Fair, 14 Feb. 2020, www.vanityfair.com/hollywood/2020/02/westworld-season-3-cast-confused. Accessed 9 Mar. 2020.
[3] I am also going to ignore the “Man in Black” subplot as it is not relevant for my project.
[4] See Mary Shelley’s Frankenstein and H.G. Wells’ War of the Worlds as two examples of texts contending with changing technology.
[5] Gilhwan. “Have You Ever Said the F-Word to Alexa? : Why People Abuse AI.” Medium, Data Driven Investor, 27 Aug. 2018, www.medium.com/datadriveninvestor/have-you-ever-used-an-f-word-to-alexa-why-people-abuse-ai-f15dcc35aa1a. Accessed 7 Mar. 2020.
[6] For more about sexual harassment of AI, see: Fessler, Leah. “We Tested Bots like Siri and Alexa to See Who Would Stand up to Sexual Harassment.” Quartz, 22 Feb. 2017, www.qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment/. Accessed 9 Mar. 2020.
[7] For more about the video game influence, see: Osborn, Alex. “Skyrim, Red Dead, BioShock Inspired Westworld.” IGN, IGN, 5 Oct. 2016, www.ign.com/articles/2016/10/04/skyrim-red-dead-bioshock-inspired-westworld. Accessed 10 Mar. 2020.
[8] Hayles, N. Katherine. How We Became Posthuman. University of Chicago Press, 1999.
[9] For more, see: Zoss, J. Matthew. “Ethics 101: Designing Morality in Games.” Gamasutra, www.gamasutra.com/view/feature/133712/ethics_101_designing_morality_in_.php. Accessed 07 Mar. 2020.
Hello – My path did not post. Please see the attachment.
Interactive Prototype Path