The Pandemic


'The Pandemic' tells the story of an alien creature that accidentally helps humans reflect on themselves. The project was created for the course Creative Technology 2 in the spring term of 2020, instructed by Phil van Allen. It is a collaborative project with Yuanyuan Gong and Jooga Dong.

My role: Designed the story narrative and robot interaction; edited the video (00:00-03:30).


Delft AI Toolkit, Adobe Creative Suite, DSLR.


Fiction, AI


2020 Spring




On Planet Patous, aliens tell time with the help of Rosakas, their timekeepers. Rosakas love following aliens around like shadows. Every Patous-hour (equal to 908 days on Earth), Rosakas choose a group of aliens with a common characteristic to follow, so that the aliens continually see them, and light up a color in the sequence pink, blue, green, and purple. A Patous-day starts with pink and ends with purple, and the aliens change their behaviors accordingly.

2. Story Narrative

People gradually discovered that many robots followed people with pink hair, which caused panic among humans. People named the robot Rosaka and began to guess at its purpose. Because of the camera-like device on Rosaka, people speculated that Rosaka was a tool used by the government to monitor certain groups of people.

Rosakas like to follow: the faster people go, the more tightly they follow, and sometimes they just suddenly stop. People hated them; people were afraid of them. Gradually, out of aversion to the Rosakas, people developed discrimination against pink hair. As the discrimination grew, people with pink hair could not even enter a grocery store. After about three months, people had learned to live alongside the Rosakas. However, pink-haired people had become the common enemy of humanity. Some changed their hair and deleted all the selfies they had posted on Instagram.

Meanwhile, some people took pride in their pink hair and would rather endure discrimination than dye it another color. The price of hair dye rose frantically, and robberies of barbershops greatly increased... One day, all the Rosakas suddenly started to rotate. At the same time, the Rosakas on Planet Patous were doing the same thing.

Rosakas were the timekeepers on Patous. Unlike Earth, Planet Patous has no nighttime, so it is impossible to tell the time from changing light. Although the Patousians have developed technology to tell time from the stars, they still retain the oldest timekeeping method on their planet: the Rosaka, a domesticated creature that likes to run after Patousians. Rosakas change color every Patous-hour (approximately three months on Earth). During each Patous-hour, they follow the Patousians who share their current color; more specifically, they follow the pink, blue, green, and purple Patousians in succession. Having brought this habit to Earth, the Rosakas automatically started following people with different hair colors, thinking they were Patousians.

About two days later, all the Rosakas stopped rotating, which was their celebration of the new hour, and started following blue-haired people. A man who had just dyed his pink hair blue walked out of the barbershop and couldn't believe his eyes.

3. Robot Interaction

Rosakas have difficulty communicating with humans. When humans shout or wave at a Rosaka, trying to scare it away, the Rosaka follows even more closely, because it reads loud sounds and waving as signals asking for the time. Only a gentle touch on its head is understood by the Rosaka as a request to turn its timekeeping function off. This mismatch in interaction is what builds the story's conflict.
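The interaction rules above can be sketched as a small decision function. This is purely illustrative: the function name, stimulus labels, and speed threshold are hypothetical, and the actual project was built with the Delft AI Toolkit, whose API is not reproduced here.

```python
# Hypothetical sketch of Rosaka's interaction rules, as described in the story.
# All names and the speed threshold are illustrative assumptions.

def rosaka_response(stimulus: str, target_speed: float) -> str:
    """Map a human stimulus to Rosaka's following behavior."""
    if stimulus in ("shout", "wave"):
        # Loud sounds and waving read as "asking the time",
        # so Rosaka closes in instead of backing off.
        return "follow closer"
    if stimulus == "gentle head touch":
        # The only signal Rosaka understands as "stop timekeeping".
        return "stop following"
    # Otherwise, the faster the person moves, the tighter Rosaka follows.
    return "follow tightly" if target_speed > 1.5 else "follow"
```

The point of the design is the inversion: the behaviors humans use to drive the robot away are exactly the ones that attract it, and the only effective signal (a gentle touch) is the one a frightened person would never try.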

Point of View

The story is a narrative of how discrimination forms. Discrimination begins with an unnecessary emphasis on a more or less arbitrary human trait, which is then misread and overblown and eventually becomes a taboo. The story tracks this process and critiques a problematic system that allows people to establish discrimination against the "otherings." The discrimination is infectious, and in our story it becomes a pandemic. It is ironic but realistic.

Left: a Photoshopped image used in the video. Right: a photo taken during segregation.


1. Pink hair and COVID-19?

COVID-19 did give us some inspiration, but it was not the starting point of the project. Our intention is more about discussing a system that allows people to generate bias against other people. Bias and discrimination can attach to anything: hair color, skin color, weight, gender, and even wearing a mask or not. One thing about the COVID-19 pandemic that resonates with the project: by the time the project started, the CDC did not encourage masks, reasoning that washing hands was enough and that people should leave the masks to healthcare workers. People who covered their faces were the "otherings" of society. By the time the project ended, masks and face coverings were encouraged and had even become mandatory in some cities. This is exactly the same ending as our story's: people change attitudes, but the discrimination doesn't end. It transfers to other groups.

2. Will empathy reduce prejudice?

In our video there is a story hidden in the screenshots: a pink-haired man didn't want to change his hair color at first, but decided to dye it blue in the end; coincidentally, the robots then changed their tracking target to the blue-haired. With this detail we are challenging human empathy. Some people suggest that by putting ourselves in another person's shoes, we can reduce our unconscious biases and change our perspective on those who look different from us. When the hair-color discrimination transfers to other groups and the blue-haired people are forced to "put themselves in each other's shoes," will they change their attitude toward the robots? The ending is open and leaves space for viewers to imagine what would happen if they were part of the story.

3. Is AI more just than humans?

AI can arguably help reduce systemic bias in government, business, educational institutions, and other organizations, giving everyone an equal chance; in that sense, using AI in decision-making could be more just than relying on humans.

Humans can't perceive all the characteristics of a group (e.g., after seeing someone with a mental illness kill, people assume other patients are potential murderers) and can't eliminate instinctive xenophobia (e.g., when seeing people of other races, they feel the difference). Both limitations contribute to biased perceptions of others. AI could help humans avoid being distracted by statistics or innate biases and make decisions based solely on individuals... But will AI form new discrimination of its own?

Outcome & Reflection