Nasreddine Benmadi had watched so many TikTok videos of migrants crossing the Mediterranean, between North Africa and Europe’s coast, that he had a good idea what the journey would look like.
At midnight, a small boat would leave from Algiers for the coast of Spain. It would take about six hours for Djamaa El Kebir, the Great Mosque of Algiers, to fade into the distance — a marker that they had crossed the European maritime border. He knew to take some clothes, 100 euros in cash, and flares to shoot into the sky in case they got in trouble at sea.
Nineteen-year-old Benmadi had clicked through countless TikTok videos and Instagram accounts, scrutinizing videos of boats, mostly filled with young men, drifting on calm, glassy water. On accounts that aggregated these videos, footage of sea journeys would be cut alongside images of the same migrants walking along well-groomed European streets and posing in front of monuments like the Eiffel Tower. The journey was tagged as harka or haraga, a colloquial name in North Africa for the Mediterranean crossing.
Though he was aware of the risks, all Benmadi could think about was getting out of Algeria. Europe was his vision. “I was waking up and going to bed with that in mind,” he told Rest of World.
He departed from the Algerian coast at midnight with 14 others. When he had a strong enough phone signal, Benmadi paused to film a video for TikTok. The filter-enhanced water and sky were vivid blue; wearing an orange life vest, he smiled into the camera. After two days, the boat would come ashore in Spain, where Benmadi would push on toward Paris.
The content that Benmadi watched in preparation for his trip — and then produced himself — is all part of migrant TikTok, an ecosystem of content by and for migrants often repurposed to advertise and promote perilous, sometimes deadly journeys across closed European borders.
Speaking to Rest of World, experts pointed to migrant TikTok as a new entry point for young people into the world of irregular migration. The absence of reliable information means that social media has long played a role in helping people share advice, with Facebook groups and other private channels acting as informal hubs for knowledge: how to travel, whom to contact. But with the rise of apps like TikTok where posts are public, compounded by recommender algorithms that repeatedly suggest similar content, virality has given this information greater reach among people who aren’t actively searching for it.
“It can be seen by virtually anybody,” said Petra Molnar, a researcher at the University of Toronto specializing in migration, technology, and human rights.
The content helps mainstream the very idea of migration, pointed out Amine Ghoulidi, a Ph.D. candidate at King’s College London who has researched social media and irregular migration in North Africa. He said this exposes people to information that they otherwise wouldn’t have encountered. “You don’t have to work hard to stumble upon this content.”
In February, viral haraga TikTok videos, particularly from two Tunisian women, Chaima Ben Mahmoude and Sabee al Saidi, drew media attention, which led to criticism from conservative outlets for glamorizing illegal migration. Ben Mahmoude’s video, which pictured her on a boat with other migrants at sea, racked up around 2 million views. These, like other videos and accounts following the same theme, tend to originate on TikTok before they are reposted to Instagram, where they acquire another round of viewers.
Aside from visual clues, like orange life vests, they can look like standard TikTok clips: young people posting videos of travel by land and sea or of arrival at their European destinations. There's little context about them, apart from the posters' TikTok usernames and the occasional direct reference to immigration in their captions. The videos share common keywords, like ghourba, meaning abroad or overseas exile in Algerian dialect; sometimes an implicit reference is made through emojis of boats, European flags, hearts, and crying faces. The videos don't appear inflammatory or offensive, and can often be found alongside innocuous and humorous content related to the struggles ordinary migrants face in getting European visas. Captions are often in North African dialects of Arabic, as is the audio.
The videos imply — but don't overtly state — that the passage was illegal, and therefore arranged by smugglers. Videos that reference smuggling services are subject to moderation and removal, Meta, which owns Instagram, told Rest of World.
“We don’t allow coyotes [professional people smugglers], criminal organizations, and other human smugglers to operate on Facebook and Instagram and remove content facilitating smuggling whenever we find it — whether that’s through proactive detection or reports from our community,” a Meta spokesperson said in a written statement.
However, the company also said it does not want to limit conversations around migration, especially for those fleeing violence and seeking asylum. TikTok did not respond to requests for comment.
Moderating these videos is complicated, in part because social media platforms can struggle to interpret the context of regional languages that might refer to an illegal act like people smuggling. There’s also often no clear link between migrant accounts and the smugglers who, ultimately, may have helped arrange their journey.
Alexandre Alaphilippe, executive director of the anti-disinformation nonprofit EU DisinfoLab, said that posters often recognize this, and can deliberately choose language that keeps their content from being flagged. "You have these tactics that are being used to both be recognized as a community, as a language, and to be able to escape moderation," Alaphilippe said. "It's always a cat-and-mouse game."
The hashtag "haraga" shows 13,000+ uses on Instagram, and some aggregator accounts on TikTok that incorporate that keyword in their name have upward of 150,000 followers. Social media companies like TikTok and Meta increasingly employ AI systems to moderate content at scale. But since these AI systems are context-blind, digital rights activists say they can end up missing, for example, a keyword in a regional dialect. That same keyword may then continue to feed similar content onto a user's timeline.
“Content recommender systems are trained on certain data sets,” said Eliška Pírková, Europe policy analyst and Global Freedom of Expression lead at Access Now. “If these data sets are being misrepresented or they are simply not adequate, [they] can easily amplify potentially harmful content.”
New arrivals to Europe are usually aware of how influential their viral content can be. Hocine Beg, 22, came to Paris as a student from Algeria and started making TikTok videos as a hobby. His content doesn’t address migration; most of his videos show him posing in front of monuments and lip-syncing to popular music. But his videos get recommended alongside content that does, since he uses some of the same hashtags.
When people contact him on social media asking him how to get to Europe, he feels an obligation to guide them, including on which routes are safe to take. “My loved ones and family crossed through this path and told me about it; they gave me ideas. So, I’m trying to help other people now,” Beg told Rest of World.
In Europe, the social media platforms and governments trying to rein in information sharing on illegal migration can't agree how to do it, or to what extent. Platforms cite people's right to discuss seeking asylum, while EU law enforcement wants the content taken down entirely, a responsibility that, under European law, falls back on the platforms themselves.
New EU legislation attempts to mandate the monitoring of online smuggling networks and even algorithm transparency, while agencies like Frontex and Europol have tried to use data scraping to inform predictive models of the routes irregular migrants might use. So far, it's resulted in a tug of war that leaves the content largely up and available.
“From the perspective of law enforcement, the interest is in flagging content — then it’s the responsibility of platforms to take it down,” said Angeliki Dimitriadi, a research fellow at the Global Public Policy Institute who specializes in migration. “But the platforms are also placing the responsibility on the states.”
When there are few available routes to migrate legally, there’s a vacuum of quality information about alternatives. Dr. Ysabel Gerrard, a lecturer on digital media and society at the University of Sheffield, sits on Meta’s suicide and self-injury advisory board. She holds that moderation decisions should account for why people are drawn to share the information in the first place. “There are ways that this is a reaction to difficult political and social climates,” she said. “There are often really good reasons why people are doing this, and it shouldn’t necessarily be something that is feared.”
She added: “When I say content moderation is hard, I mean it — because I’m not just standing on the sidelines saying what should be done, I’m trying to do it myself.”
In Paris, Benmadi and Beg often hang around other young North Africans who post about their daily lives on TikTok. A scroll of Benmadi's TikTok account reflects the pattern of migrant TikTok: posing in streetwear, Eiffel Tower aglow in the background, interspersed with the occasional throwback post to his sunlit Mediterranean crossing. He recently created a YouTube account to host videos of his crossing experience.
Benmadi limits his involvement to creating content, though. He gets plenty of inquiries from others who are hoping to make the journey, he said, but chooses not to give them advice.
“It’s hard at night, in the middle of the sea, without food. One small move could turn the boat upside down, and goodbye,” Benmadi said. There was another, less social-media-friendly side to his journey that didn’t make it to TikTok: how the driver who led his group forgot to turn on the GPS, which caused them to travel without clear direction for more than 40 hours, with only the compass on an iPhone to point them in the right general direction. One of Benmadi’s fellow passengers almost lit a flare next to the can of gasoline. They survived overnight off water and dates.
“When your decision is made, it has to be between you and yourself.”