An AIN.Capital editor talked to a Ukrainian woman who works as a TikTok moderator about her work, the war, and prohibited content. She wished to remain anonymous.

How long have you been working as a TikTok moderator? How did you get the job? Do you work for an outsourcing company or directly for ByteDance?

Currently, I am in Poland, and since September this year, I have been working at an international outsourcing company as a content moderator. ByteDance is our client, so I have no direct contact with them: the dialogue between the company and the client takes place at the highest level, and we – the content moderators – just do our job.

What exactly are your responsibilities? Tell us, how do you work? What regions does your team cover?

I work in the Ukrainian team, and our responsibilities cover the moderation of content from Ukraine and Russia because, unfortunately, we still have joint moderation. I work in the first line of moderation; our task is to monitor videos, profiles, audio, and comments.

As for videos, profiles, and comments, we moderate reported materials or materials that the AI suspects of violating the platform’s rules. Audio, however, is moderated by absolutely everyone.

What is your work and rest schedule? Is it regulated, or do you decide it yourself?

Moderation takes place 24/7, so according to the regulations, we have three shifts of 9 hours each, which include 8 hours of work, a one-hour break, and half an hour for various activities organized by the company (a light workout, interactive games, literature and other discussions, short lectures, etc.).

Working time is organized so that the workload is distributed evenly: at any given time, a certain number of moderators is assigned to each line (for example, videos or profiles, audio or comments), depending on the volume of work.

How stressful is your job? What are the pros and cons?

The content varies widely, and every so often it can be frankly unpleasant or difficult, but it is worth noting that any moderator can, if necessary, immediately seek help from a psychologist or simply take a break during work. The company really cares about our well-being.

Among the pros, I would name good corporate communication. Because of COVID-19, we can work online, but at the same time, there is always contact with our managers during a shift, and you can always ask for help with difficult cases. I have not yet encountered any cons in the workflow during this short period.

Tell us, what content do you have to deal with?

The content is very diverse: from innocent dancing or singing, which is quite popular on the platform, to pornographic images, abuse, and outright violence. Keep in mind that we mostly receive content that someone has reported, so these are sometimes low-quality, meaningless cases.

The most common cases involve children under the age of 13, whose accounts are automatically deleted under the platform’s age policy. There is quite a lot of content with teenagers drinking alcohol, smoking, or dancing in a somewhat explicit manner – the policy on minors is quite strict. Another constant is erotic content for adults, which in most cases is also prohibited. And one of the biggest categories is the Russian-Ukrainian war, which has its own rules that are updated to reflect the situation on the ground.

What were the weirdest, most interesting, most horrible, or funny cases?

Hundreds of different cases pass through me every day, so it’s hard to single out one thing, but some of the oddest for me are new trends like “you are my enemy”, where frankly strange content is posted, ranging from sexual fetishes to animal abuse.

What has interested and impressed me most is some of the content from ORDLO [Russian-occupied territories of Ukraine] and occupied Crimea. From time to time, there are videos by teenagers who grew up under occupation and do not remember what it is like to live in Ukraine, yet despite constant Russian propaganda, they still try to convey the truth about the war in their videos and to show other victims of the occupation, ongoing since 2014, where the truth is and where the lies are.

The most horrible for me so far are Russian videos of our captured and killed soldiers. They publish such things with ever greater cruelty, and it always hurts to see.

The funniest and cutest are probably videos where parents film their children singing Ukrainian songs, and of course, a ton of content with cute dogs and cats.

How do you handle fakes and outright propaganda in the context of the war in Ukraine unleashed by Russia? Share some examples.

The topic of the war is monitored very closely. Almost every claim, fake, and piece of propaganda is vigorously contested by our management. Generally speaking, the platform’s rules adhere to the official position of Ukraine and to internationally recognized statements. For example, the Russian fake claiming that Bucha was allegedly staged by Ukraine is a violation; the platform’s official position is on Ukraine’s side. Russian accusations that there are Nazis in Ukraine or that Ukraine is governed from the outside also fall into the category of “dangerous disinformation”.

In fact, behind every fake and statement is the hard work of our team members. 

It is also worth mentioning the debate over Russian insignia and the promotion of the “Z” symbol, which many already perceive as a modern Russian swastika. Over these 8 months, many discussions were held, and it was finally recognized recently that depicting the symbol to glorify the Russian military or to threaten Ukraine or Ukrainians on social media falls under the category of “threat of violence”.

Also, in my opinion, the most painful point for our entire team is the classification of our Azov as international terrorists. Any mention or image of Azov is blocked. We are working to remove them from this category, but so far, unfortunately, to no avail.

What struck you the most when you got this job? What would you advise TikTok users as a moderator?

Perhaps what struck me the most was the sheer variety of the content, some of it downright disgusting and unpleasant. People post absolutely everything, from ordinary everyday things to all kinds of perversion and cruelty.

I would advise users – especially parents – to supervise their children and what they post. Children and teenagers want to look like adults, so they often try to emulate adult trends. 

And for most users, I would advise the following: if you don’t want your videos to be deleted, learn what slurs are and don’t use them. This is probably one of the most common reasons for a ban. Unfortunately, such popular expressions as “Rusnia”, “Moskal”, and “Katsap” are also slurs and are prohibited. Also, be tolerant – you cannot openly wish death on Russians, no matter how much we would like to, because bullying is prohibited on TikTok.

If your video has been deleted, do not re-post it: it will be deleted anyway, and your account may be banned for repeated violations.

I would also like to wish everyone: create more content, diverse content, whatever you like, and enjoy the process. And perhaps a small request from someone who sees all the content from Ukrainians and Russians from the outside: think carefully about whether it is worth dancing to yet another popular Russian song, because Russians publish videos of tortured and murdered Ukrainians to that same song.