On May 22, a young woman in Cairo named Aya Khamees uploaded a video to TikTok that would alter the course of her life. The 18-year-old was already a small-time star on the social media platform, where she posted singing and dancing clips under the name Menna Abdel Aziz. In the video, which is no longer available on the app, Khamees appears to have a black eye, and the right side of her face is scratched. Looking into the camera, she recounts how she was beaten and brutally raped at a party, as her attackers filmed the acts. “If the government is watching, I want them to go out and get me my rights,” she says.
The clip quickly went viral. Police later arrested everyone who attended the gathering, including Khamees, The New York Times reported. She was charged with drug use, prostitution, and violating “family values.” On social media, a flood of supporters began using hashtags like #حقمنهعبدالعزيز (It’s Menna Abdel Aziz’s right), sparking a reckoning in Egypt over how the country treats victims of sexual violence. After the case received widespread attention and Khamees completed a rehabilitation program, the charges against her were dropped. But her experience is far from an anomaly.
Since April, Egyptian authorities have arrested at least ten female TikTok influencers for violating draconian internet laws, according to Amnesty International. The troubling incidents are a reflection of Egypt’s conservative, male-dominated government, but they are also an extreme example of the political and content moderation challenges that TikTok is facing around the world. TikTok is owned by the Chinese tech giant ByteDance, and according to Egyptian media, the Chinese ambassador informed Egyptian officials that China respected the country’s decisions in the cases and would remove illegal content in the future.
Over the past two years, TikTok has quickly grown into a global sensation. Roughly 700 million people now use the platform each month. The app has been installed close to 940 million times in 2020, up 30% from 2019, according to estimates from the market research firm Sensor Tower. Some turn to TikTok for entertainment or distraction, but others are increasingly using it to push boundaries, shaping the future of digital culture in the process.
Those users’ fate depends on whether TikTok will effectively manage the massive community it has built or, instead, make the same mistakes its predecessors did when it comes to free speech, misinformation, and safety. I’ve spent the better part of the past 18 months covering TikTok, first as a reporter at Wired and now as an editor at Rest of World. Over time, I came to realize that the company faces three related problems, all of which will be familiar to anyone who has followed Google or Facebook over the past decade. The difference is that TikTok may be less equipped to handle them.
The first problem is related to content moderation and trust. When TikTok began exploding in popularity in 2018, its website listed a short set of rules banning extreme forms of behavior, such as violence and pornography. Facebook, meanwhile, has long had extensive community guidelines covering hundreds of scenarios (though they’re far from totally effective). TikTok has strengthened its policies several times in the past two years, prohibiting hate speech and limiting the spread of conspiracy theories like QAnon. But these improvements came only after it had weathered a series of damaging scandals over censorship.
TikTok has said that, in its early days, the company “took a blunt approach to minimizing conflict on the platform,” penalizing users for discussing controversial topics, like tensions between ethnic or religious groups. The company appeared determined to avoid any chance of controversy. (It often claimed its mission was to “bring joy” rather than, say, host political discussions.) That strategy may have been an extension of how ByteDance is forced to operate in China, where it runs a domestic version of TikTok, as well as other widely used social media platforms.
Internet companies in China are required to abide by broad and ever-changing guidelines regarding the types of speech that must be censored. A few years ago, the Chinese government cracked down on ByteDance, accusing it of hosting salacious material, including videos of teen mothers glamorizing their pregnancies. CEO Zhang Yiming quickly apologized and promised that his company would do a better job policing users in the future.
Media outlets like The Guardian later reported that TikTok had instructed moderators to blacklist videos advocating for gay rights and featuring topics sensitive to Beijing, including Tibetan independence. In March, The Intercept revealed that TikTok had suppressed posts from “ugly” and poor users, allegedly in an attempt to stem bullying. TikTok said that the guidelines were outdated, but it still looked as though the company had copied and pasted policies from China to the rest of the world.
The fallout was disastrous: Users and journalists accused TikTok of censoring other content that might upset the Chinese government, such as protests in Hong Kong and a video that raised awareness about China’s internment camps for Muslim minorities (TikTok denied the claims). In response to growing suspicion from users and government officials, the company spent much of 2020 trying to move away from China and remodel itself as a global social media company.
Its efforts have included releasing regular reports about state requests for video removals, which indicated it had never received any from Beijing. It also opened a transparency and accountability center where people can witness TikTok’s moderation practices up close. But it’s not clear whether these changes are enough to convince people that TikTok isn’t influenced by its home country. The app is the first social media platform from China to become a truly global success, but its rise comes as residents of many Western nations increasingly view the country’s government unfavorably.
There’s another reason TikTok may be struggling to earn people’s trust: the way the platform is designed. TikTok is the only major social media app driven almost exclusively by artificial intelligence. Instagram and Twitter have also embraced algorithmic curation but are still largely oriented around whom you know or choose to follow. When you open TikTok, you’re greeted by the platform’s central feature, the For You page — an endless stream of videos chosen by an algorithm. What you see is determined by your previous behavior, along with other factors, such as the country you’re in and the device you’re using.
The app encourages passive consumption, allowing the For You page to determine your preferences, well, for you. There’s no need to follow specific accounts or seek out particular kinds of content (though you certainly can). “That makes the structure and priorities of the algorithm even more important, as it increasingly determines what we watch, read, and hear, and what people are incentivized to create in digital spaces to get attention,” the writer Kyle Chayka wrote in a recent essay about TikTok.
The For You page has proved to be a potent vehicle for creating hit songs and viral memes as well as for launching the careers of new celebrities. But its inner workings are still largely opaque. It’s hard for users to know what’s actually on the app, or what they might be missing. More than almost any other platform, TikTok may be inadvertently pushing its users into separate spheres, and there are fewer avenues for them to discover when and if that’s happening. (TikTok has said it’s studying the issue.)
The risk is that TikTok will worsen political polarization by directing users to more and more extreme content. “I think algorithmic feeds are playing a big role in the perpetuation and amplification of disinformation,” said Benjamin Grosser, an artist and a professor of new media at the University of Illinois at Urbana-Champaign. His recent project is Not For You, a browser extension that deliberately tries to confuse TikTok’s For You page in order to surface content you might not otherwise see. The signals the plug-in sends are random, but after experimenting with it for a while, Grosser said, he noticed that TikTok consistently promoted videos from the same handful of popular creators. TikTok’s algorithm is a powerful tool for producing culture, and it may be biased in ways that are independent of its users’ preferences.
The most obvious challenge TikTok must confront is that governments around the globe simply don’t like it. After a clash broke out along the border between India and China, Indian authorities blocked TikTok along with hundreds of other apps made by Chinese developers. In the United States, the Trump administration similarly moved to ban the platform amid rising tensions with China; the company is still entangled in a complicated legal battle in U.S. courts. Pakistan also briefly censored TikTok, though it quickly reversed the decision.
American companies like Facebook and Twitter frequently find themselves banned by repressive regimes. But rarely are they used as a scapegoat for their country of origin. TikTok will likely continue struggling to dissociate itself from ByteDance and China, especially as other nations begin scrutinizing the company. And Beijing has signaled that it’s willing to step in if ByteDance makes any moves it disapproves of, like selling TikTok to an American company, as the Trump administration has demanded.
Aside from the world’s biggest superpowers, TikTok will need to navigate thorny relationships with leaders from a slew of other nations where it’s quickly gaining millions of new users. Its stated values will likely be tested. Two years ago in Indonesia, for example, the government reportedly unblocked TikTok only after the platform agreed to remove “all negative content.” The company will need to decide whether it’s willing to make similar sacrifices in the future.
In many countries, the people joining TikTok look a lot like Khamees, the woman from Egypt: fearless, young, and eager to use a powerful new medium for free expression. The question is whether TikTok can fairly and adequately support them.