
The world’s largest community for creators and their patrons provides visual artists, musicians, and other creators with business tools to run their own subscription content service, along with content moderation. Members can build relationships, offer their subscribers exclusive perks, and get funding directly from fans and patrons. The platform has over 100,000 monthly active creators and over 2 million monthly active patrons worldwide.

Since its inception, the community has had to navigate some choppy waters. It must vigilantly draw the line between funding art and funding adult content, and between encouraging free discourse and removing speech some may consider offensive. As much as the site’s openness allows original content to thrive, it can also become a megaphone for people with hateful views on race, gender, and sexuality.

The Challenge: Timely and efficient content moderation

Enforcing Community Guidelines for thousands of members comes with many challenges. Some posted material is widely considered borderline pornographic, racist, or sexist, and the platform wanted to flag these offenders and remove them from the site. With over 100,000 creator accounts, the team decided to leverage a mix of technology and human review for the audit process.

Their Trust & Safety Team partnered with Boldr to thoroughly audit over 7,000 creator profiles within a week, an unprecedented pace.

The Approach: Content review, one by one

The community needed a reliable partner who could audit large numbers of flagged profiles with discernment and sensitivity. David Sudolsky, Boldr’s founder and CEO, says, “Our goal is to ensure our partners succeed, and one of the ways we do this is by jumping in and helping out where they need immediate support. We can’t take full credit, though. The collaboration across both teams is what made this a huge success.”

Finding the right “chemistry” between client and partner is crucial when building a Content Moderation partnership. When values and culture are aligned, communication and trust flow effortlessly.

To expedite the partnership, the community flew two members of its Trust & Safety Team to Manila to train the Boldr Team. In turn, we assigned ten of Boldr’s sharpest analysts to review profiles, respond to creator questions, and resolve cases.

The Measurable Impact: Keeping the Community Safe

With the help and guidance of the Trust & Safety Team, we completed the audit of 6,851 flagged accounts, finishing 20% faster than the goal and allowing Boldr to start handling more complex Content Moderation processes.

“We’re thankful for Boldr’s fast and good quality work!”
Tyler Palmer, VP of Operations

With a dedicated outsourced team of content moderators serving and interacting with the broader creator community, the platform was able to create a safer environment for creators and patrons. The partnership helped them stay committed to their mission: empowering creators and allowing their patrons to connect with them on a whole new level.