Media
Asimov, Isaac. 2004 (1950). “Runaround” and “Reason” in I, Robot. New York: Bantam Dell.
One of
- Lang, Fritz. (1927). Metropolis. New York, NY: Kino International. (Available through NYU Stream and NYU Kanopy)
- Scott, Ridley. (1982). Blade Runner – The Final Cut. Burbank, CA: Warner Home Video. (Available through NYU Stream)
Geek of the Week
- Huda
- About Cowboy Bebop: Set in 2071, in a post-apocalyptic world where Earth has become largely uninhabitable, the story follows a ragtag group of bounty hunters, known as cowboys, aboard the spaceship “Bebop.” As they traverse planets and moons in search of wanted fugitives, each cowboy contends with shadows from the past they can’t outrun.
- Yoshiyuki Takei and Shin’ichirô Watanabe, “Speak Like a Child,” Cowboy Bebop, October 28, 2001. (Available through NYU Stream)
- Yoshiyuki Takei and Shin’ichirô Watanabe, “Brain Scratch,” Cowboy Bebop, November 11, 2001. (Available through NYU Stream)
- Steven
- Alex Garland et al., Ex Machina, Drama, Sci-Fi, Thriller (A24, Universal Pictures, Film4, 2015). (Available through NYU Stream) (Selection TBA)
- Scene 1 – 35:43 – 45:25
- Scene 2 – 1:08:43 – 1:18:09
- Scene 3 – 1:28:52 – 1:42:33
- Derek Thompson, “What Jobs Will the Robots Take?,” The Atlantic, January 23, 2014.
- Maddie Stone, “Can AI-Powered Robots Solve the Smartphone E-Waste Crisis?,” The Verge, February 3, 2022.
Theory and Commentary
Battaglia, Debbora. 2001. “Multiplicities: An Anthropologist’s Thoughts on Replicants and Clones in Popular Film.” Critical Inquiry 27(3): 493–514.
Gray, C. H., ed. 1995. Selections from Part 4: In the Imagination. In The Cyborg Handbook. New York: Routledge.
Assignments
Essay 2 Due
1. If clones are devoid of memory, have they lost the human “essence that separates man from machine”?
2. How can we read Battaglia’s argument on “clon-isms” in relation to the image economy?
3. What does Oehlert mean by the statement that “there is a real chance that the dominant cyborg body politics of the future will be military information societies”? How does the corporation (think Meta) pose a greater threat to humanity than the individual?
1. What is the most important criterion for distinguishing an android from a human: emotions, consciousness, or something else?
2. Is it possible that androids can gradually become emotional like those in Blade Runner (1982)?
3. Is passing the Turing test sufficient to prove an artificial life is conscious?
1. According to futuretimeline.net, traditional employment will become obsolete by the 2200s. What do you think human beings will do with the bulk of their time once they no longer have to work for 40+ hours per week?
2. Cowboy Bebop serves as a model of an imagined anime version of 2071, a version crafted in the 1990s. Based on the changes in technology between the show’s creation in 1998 and today, what do you think the year 2071 will look like?
3. Typically, when new technology arises, it is adopted early by the sex industry. With regard to the integration of AI and robotics in this industry, what do you think the ethics of consent will look like once human beings begin forming relationships with nearly conscious robots?
See this link for question #1:
https://futuretimeline.net/23rdcentury/2200-2249.htm
1. Does our essential self and personhood remain the same if our memories are removed?
2. Is a self-driving car a cyborg?
3. Why would a creator want to imbue a cyborg with desire?
1) While The Atlantic article suggests that humans are better at caring than robots, is this a matter of fact, or is it simply that the ability to care is left out when robots are created? If the latter, is it due to humans’ fear that giving robots such an ability might truly make humans obsolete?
2) What is the significance, in the Cowboy Bebop episode, of a character’s memory being triggered by a video of her past self?
3) Could a world exist in which humans and machines coexist in a non-exploitative manner?
1. Why do we want robots (cyborgs/gynoids/etc.) to resemble humans in appearance as well as emotion? Is it because of our (not to generalize) inherent (or not) desire to control others? Since we cannot do that with real humans, are we resorting to AI robotics? How do you feel about treating robots as slaves?
2. With the world’s population increasing more than ever, why do we keep wanting to replace human labor with machines? Why aren’t we satisfied with the standards of human labor?
3. Realistically, unless we magically find a new infinite energy source to replace fossil fuels, the world 50 years from now will either be not so different from today or a little worse. Do we perhaps need a little more realism in upcoming works of science fiction, or should we keep portraying the world metaphorically?
1. A hierarchical, graded social structure is constructed in Metropolis. Rather than a religious metaphor, I think it is better read as the structure of modern society, of modernity; the Hugo Award-winning work Folding Beijing has a similar setting. I’m very curious about this question: I find a very close relationship between modernity, sci-fi, and utopia, but I don’t know how to understand that relationship, except that they always appear together, typically manifesting in some form of architecture or social organization. What exactly is the relationship between utopia and modernity, and between sci-fi and modernity?
2. Despite being a novel in form, I, Robot reads more like a series of Asimov’s thought experiments on the Three Laws of Robotics. Is there any philosophical analysis of the Three Laws? Are they logically reliable?
3. In her paper, Battaglia uses anthropological methods to analyze clones and replicants in popular movies, noting that “popular films are major cultural documents of the social life of the public moment.” I doubt whether this is reliable as media analysis: popular films, like other cultural products, cannot directly reflect what people were thinking; they merely show what people were watching at the time.
1. If multiplicity is justified as a kind of “antiwaste” because of the accumulated knowledge plus knowledge potential, why not just copy or duplicate the memory? What, physically, about a human (or Doug, in this case) needs to be cloned?
2. Can knowledge only be claimed as fact when verified by material experience? If so, how can the memory stored in, and inherent to, a clone really be considered real? (the theory of knowledge + the knowledge realm)
3. How are the socio-political markers of the 1970s mirrored in Blade Runner?
1. Many of our texts this week treated memory as a defining trait of humanity. What does this reveal about how memory loss is uniquely dehumanizing?
2. We also saw humans gain empathy for robots, yet in Ex Machina Ava did not reciprocate Caleb’s (Domhnall Gleeson’s) empathy, while in Blade Runner Roy saved Rick. What disparate messages are being communicated here? Is Blade Runner arguing that the robot/human split can be bridged through individual relationships, or that, symbolically speaking, robots are the saviors of humanity?
3. The two Cowboy Bebop episodes gave us two examples of our personal, human relationships with technology. In “Brain Scratch,” humans essentially download their memory/personality/soul into a computer system. In “Speak Like a Child,” a child uses a video cassette to create an externalized support system. What interests me most is that in “Speak Like a Child,” the video is meant only for the eyes of the girl. Excluding social media and other performative or social platforms, how do we externalize our personality/soul/memories with the aid of technology? (Think private Spotify playlists, private Pinterest boards, digital photos, etc.)
1. Should we feel empathy for that jumping robot?
2. Should we anthropomorphise AI?
3. Do we measure AI by its ability to mimic us and make decisions, rather than by some other measure of consciousness? What does it mean to be human?
1. Is Deckard a replicant?
1′. Do we care?
2. Is the fear of death the root of human experience? What becomes of a thinking machine that’s afraid to die? Is it consciousness that produces the fear of death, or is it in fearing death that the mind gains consciousness?
3. Will we ever really want to get rid of our bodies?
1. If memories build the soul and feeling of a “human,” how do we distinguish between real experiences and fake, implanted memories in defining consciousness?
2. How do we distinguish antisocial people, who lack the capacity for empathy, from androids?
3. Are we going to meet our creations halfway through our development?
Robots have a place in our minds that may, on consideration, be a dangerous distraction. The software at Bloomberg that writes news is a “robot,” just not in a physical form. The software that reads X-rays is, for human radiologists with school bills to pay, very, very real. Even if Boston Dynamics robots never pick up a gun and march into battle, the software that replaces humans with better, faster, cheaper solutions is set to upend our economy. And the somewhat utopian vision that we’ll no longer need to work, so we can do “other things,” may have a fatal flaw built into humans: we may need a mission, and without work, that may not be easy to find.