When Microsoft acquired GitHub in 2018, there were concerns that the company would struggle to moderate controversial code on the world’s largest repository of open-source software. Chief among them was whether GitHub would continue to host deepfake code that’s used, among other things, to transpose the faces of celebrities onto pornographic videos without their consent. GitHub power users also worried that Microsoft would remove code that undermined its business interests. The worst predictions never came true, but four years on, the lack of moderation on GitHub has drawn Microsoft into the eye of a completely different storm.
On January 1, hundreds of Muslim women in India found their names and Twitter profile pictures displayed on a fake auction site named “Bulli Bai” — a slur against Muslim women — without their consent. The code for the app was hosted on GitHub.
While GitHub quickly took down the app following massive social media backlash, this is the second time in seven months that the platform has been used to target Muslim women in India. In mid-2021, a similar web application called “Sulli Deals” was hosted on GitHub, putting Muslim women up for a fake “auction” without their consent. That app was online for weeks before it was taken down.
Many of the women targeted on Bulli Bai are outspoken critics of Hindu nationalism and have criticized Prime Minister Narendra Modi’s handling of religious minorities in the country. The list included journalist Ismat Ara and veteran activist Khalida Parveen, among others.
Quratulain Rehbar, a freelance journalist based in Kashmir, who was on the list of targets, believes GitHub should be held accountable for enabling the bigoted app. “No one should offer a platform to spread hate. Then this becomes a norm,” Rehbar told Rest of World.
GitHub is used by over 73 million developers globally as a kind of social networking platform where they can share their code. That code can be searched, reused, or sifted through for errors by developers across the globe. Dedicated communities of volunteers develop and maintain the open-source code that underpins a significant chunk of our multitrillion-dollar internet economy. In 2021, India emerged as GitHub’s fastest-growing open-source contributor community, with 5.8 million developers.
GitHub does not moderate user-uploaded code unless it receives complaints. The user-generated nature of the platform exposes it to censorship and blockage by sovereign governments, and the platform has landed in hot water multiple times in the past. In 2013, GitHub was blocked in China and, just two years later, the code-sharing site suffered a massive denial-of-service attack, which was subsequently linked to the Chinese government and said to be in response to political information posted on the platform. GitHub has since been unblocked and is widely used in China, where it is often dubbed the “last land of free expression.” The platform has faced similar government-mandated blocks in Russia, Turkey, and India.
But unlike censorship mandated by an all-powerful state, the demand for moderation of code in the recent Bulli Bai controversy comes from the victims themselves. The question raised by developers and activists is: How could a company like GitHub not have a mechanism for preventing removed code from being published again?
Padmini Ray Murray, a digital rights advocate and the founder of Design Beku, a tech and design collective, said GitHub India’s response in both cases — Sulli Deals and Bulli Bai — has been inadequate. “We need to know @github @GitHubIndia why it is so difficult for you to not monitor & take down content that clearly violates your policies? Is it that your moderation teams are not diverse enough? or do you just not care enough to be more vigilant?” Murray tweeted after reports of the clone app emerged. Despite the takedown of Sulli Deals in 2021, Murray said, GitHub never published any blog or investigation into the issue and hasn’t been transparent about how it moderates code that enables communal hate.
GitHub did not respond to Rest of World’s specific questions on moderation, but a company spokesperson said in an email statement that it has longstanding policies against content and conduct involving harassment, discrimination, and inciting violence. “We also continue to follow applicable laws and policies as we engage in good faith with authorities,” it said.
While the criticism of GitHub’s inadequacies sounds reasonable, tech policy experts believe it might be hard to come up with a solution. For instance, with only a one-line modification allowing it to source photographs of politicians, the same code used in Bulli Bai wouldn’t violate GitHub’s community standards, according to Pranesh Prakash, co-founder of the Center for Internet and Society. “It would then be making a point about public corruption instead,” he said. “The similarity of the code can’t by itself tell you very much about whether the new repository violates GitHub’s community standards or not.”
This is not GitHub’s first rodeo in moderating code used to create objectionable content. According to reports, GitHub continues to host deepfake code, possibly on the grounds that it serves educational purposes. In late 2020, GitHub removed the repository youtube-dl, a free software tool for downloading YouTube videos, because of copyright complaints. But several weeks later, it reinstated the code after pushback from activists and journalists who used it as a reporting tool to gather crucial video evidence.
Syed Sami, a Chennai-based software engineer and an online sleuth investigating the origins of the Bulli Bai application, suggested that a prescreening mechanism could help reduce abuse of the platform: GitHub users would have to list the name of the owner and the purpose of each new page, which GitHub could then greenlight before publication.
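In rough terms, the gate Sami describes might look something like the sketch below. Everything here is an illustrative assumption, not GitHub’s actual review process: the Submission fields, the keyword list, and the prescreen function are all hypothetical, standing in for whatever human or automated review GitHub might run on a declared purpose before a page goes live.

```python
from dataclasses import dataclass

# Illustrative keyword list for the sketch, not any real GitHub policy.
BLOCKED_TERMS = {"auction", "doxx"}

@dataclass
class Submission:
    """A hypothetical pre-publication declaration from a user."""
    owner: str
    purpose: str

def prescreen(sub: Submission) -> bool:
    """Return True if the declared purpose passes the (toy) screening check."""
    if not sub.owner or not sub.purpose:
        return False  # incomplete declarations are held back for review
    text = sub.purpose.lower()
    return not any(term in text for term in BLOCKED_TERMS)

print(prescreen(Submission("dev1", "static site generator")))      # True
print(prescreen(Submission("anon", "auction profiles of users")))  # False
```

Even this toy version shows why the idea is fragile: a bad actor can simply lie in the declared purpose, while honest projects risk being caught by overbroad keywords.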
But Sami’s solution of prior censorship at scale comes with its own problems. It runs counter to the ethos of open-source projects, and legitimate code could be flagged as problematic, producing “false positives” even when it isn’t doing anything harmful. Even Sami is cognizant of the complexities of moderation: “I think GitHub cannot help much on the issue.”