Tracey Breeden is the founder of Disrupt the Landing, a trust and safety consultancy that works to foster safe, respectful, and inclusive digital spaces. Formerly the head of safety and social advocacy at Match Group and the head of global women’s safety at Uber, Breeden has focused her career on rooting out bad actors on international platforms.

She spoke to Rest of World about how trust and safety work changes at the global scale, and what mistakes platforms make in responding to online harm.

This interview has been edited for clarity and brevity.

How would you describe the work that you’ve done at these companies?

At the highest level, my vision is to create safe, equitable, and respectful spaces for all people — but most importantly, for the most marginalized, so for women and people of color and historically marginalized communities. If I create safety for those who are most marginalized, then it creates safe spaces for everyone.

What does it mean to think about these issues globally?

These are global issues already. Online harassment and abuse is a global epidemic. Gender-based violence is a global epidemic. And it shows up in every country and every region. So the first thing is to help people understand that, whether you're in the U.S., South Africa, Brazil, India, or Japan, harm and abuse show up in every region, and they tend to follow similar trends. You'll have cultural nuances that come into play, but ultimately, you'll see similar patterns across the globe.

And what that tells me is that we can actually approach it in a similar way, and disrupt and prevent it across the globe by using similar approaches while being sensitive to the cultural nuances we may see in different regions of the world.

What sorts of blind spots do you see from platforms right now?

To me, the biggest blind spot is the lack of investment. People know that harm is happening on these platforms, but they just don't know how to address it. There's a little bit of hesitancy around transparency, because they think: If we're transparent about this, people are going to think we're unsafe. And the reality is that people already know from their own experience that there is an unsafe component. What happens in our culture shows up on our platforms. It does. It shows up on every platform.

When we're transparent about what's happening on these platforms, it opens the door to getting help from the outside. There are experts out there who can come in and help you address these things appropriately. So there's a lack of understanding of what our role is in actually addressing what happens on our platforms.