There is an ongoing debate over whether large charities and NGOs are actually helpful in times of crisis. Ken Liu’s short story “Byzantine Empathy” enters this debate by presenting two former roommates who embody opposing sides of the argument. While both women represent long-debated positions, a new tool emerges that radically changes how the world sees humanitarian crises: immersive virtual reality. The intensity of that VR immersion is beyond our current technology, but Liu’s story parallels how the interactivity and dissemination of information today influence action, for good or ill. 

In “Byzantine Empathy,” Jianwen creates a new cryptocurrency, Empathium, which allows users to donate directly to refugees. Sophia is the head of an international humanitarian charity, Refugees Without Borders (RWB). RWB joins Empathium to win back donations from younger funders. RWB brings in more money and more users, and with this exposure, Jianwen shares an immersive VR experience with those users. It places viewers in the perspective of a Han Chinese child as she watches her mother get gunned down and her baby sister trampled. The experience pushes Empathium users to fund projects for Han Chinese refugees struggling against Burmese soldiers on the border between China and Myanmar. This is a setback for RWB, because it makes the West appear to be overstepping boundaries and weakens diplomatic negotiations. In the story, VR makes it remarkably easy for users to empathize with victims and mobilizes them to act. A similar situation arises today from other media, and, even more specifically, out of Myanmar. 

While Facebook is not, strictly speaking, a VR experience, its interactivity and its ability to sway user emotions are analogous to Liu’s VR technology. In the years leading up to the story’s publication, two events led governments and users to question Facebook’s power. The first was a study conducted on users without their knowledge: in 2012, the site allowed data scientists to expose some users to more negative words and others to more positive ones (Meyer). By the end of the week, the monitored users mirrored the emotions they had been exposed to (Meyer). This kind of manipulation through interactive media grew into an even larger problem in Myanmar. Between roughly 2010 and 2015, leaders weaponized Facebook’s inability to flag and remove hate speech in Burmese, posting misinformation and dehumanizing content about Rohingya Muslims (McLaughlin). As the 2012 study found, users empathized with the particular language and imagery presented to them and acted on that information; in this case, it incited violence against Rohingya Muslims, and outside criticism was silenced by further misinformation. At first, “Byzantine Empathy” seems to show the positives of interactive media, since exposure to information allows Empathium users to force humanitarian action, but the story does not end there. 

Sophia travels to the conflict area and, upon arrival, is seemingly attacked by Han Chinese rebels. While the scene stretches the truth, her equipment was transmitting a new VR experience, and as she tells Jianwen, that twisted perception is all she needs: “…it doesn’t need much editing—there will be outrage at home. A defenseless American woman, the head of a charity dedicated to helping refugees, is brutalized by ethnic Han Chinese rebels armed with guns bought with money from Empathium…” (Liu 95). It is an unfortunate end for the refugees on the border, but it is also a metaphor for how far interactive media can twist perceptions of reality. Viewers, like Facebook users, cannot verify the context of the media, and because it comes from a figure of power and leadership, the text, image, video, or alleged crime is taken as truth. 

Dissecting how Empathium and VR function in “Byzantine Empathy” is startling because it shows how easily our exposure to crises can be manipulated. More importantly, it reminds us that this future is not disconnected from our present; it is one Facebook login away. 

Works Cited

Liu, Ken. “Byzantine Empathy.” Twelve Tomorrows, edited by Wade Roush, MIT Press, 2018, pp. 69–99. 

McLaughlin, Timothy. “How Facebook’s Rise Fueled Chaos and Confusion in Myanmar.” Wired, Condé Nast, 6 July 2018, www.wired.com/story/how-facebooks-rise-fueled-chaos-and-confusion-in-myanmar/. 

Meyer, Robinson. “Everything We Know About Facebook’s Secret Mood Manipulation Experiment.” The Atlantic, Atlantic Media Company, 9 Sept. 2014, www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/.