According to Mashable, OnlyFans CEO Keily Blair announced that the platform will begin conducting criminal background checks on US creators through Checkr, an AI-powered screening provider. Checkr says 99% of its checks are automated, with human quality assurance on top. Its screening has been used by Uber and DoorDash, but the company faced dozens of lawsuits between 2015 and 2020 alleging violations of the Fair Credit Reporting Act (FCRA). Additional lawsuits in 2022 and 2023 claimed Checkr failed to catch drivers using stolen identities and missed criminal histories. The move follows Pornhub's similar announcement last month about background checks for content partners.
The Checkr controversy
Here's the thing about Checkr: the company has a messy legal history. We're talking dozens of lawsuits between 2015 and 2020 alone, with plaintiffs claiming it misreported crimes and flagged offenses that were too old to legally report under the FCRA. And it didn't stop there. In 2022, a lawsuit alleged that Checkr and Uber failed to catch drivers using stolen identities. Then in 2023, another case claimed its "quick checks" missed a DoorDash driver's criminal history entirely. That's a lot of red flags for a company whose whole job is catching red flags.
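To make the "too old to legally report" claim concrete, here's a deliberately simplified Python sketch of the kind of age filter the FCRA requires of a screening provider. It assumes the common reading of the federal rule: non-conviction records such as arrests generally age out after seven years, while convictions have no federal age limit (though some states impose their own). The `Record` type and its fields are hypothetical for illustration, not anything from Checkr's actual systems.

```python
from dataclasses import dataclass
from datetime import date

# Federal limit for non-conviction records (15 U.S.C. § 1681c);
# assumption: we ignore stricter state-level rules for simplicity.
FCRA_LOOKBACK_YEARS = 7

@dataclass
class Record:
    description: str
    record_date: date
    is_conviction: bool

def is_reportable(record: Record, as_of: date) -> bool:
    """Return True if a record may appear on a consumer report.

    Simplified: convictions are federally exempt from the seven-year
    limit (state law may still restrict them); arrests and other
    non-conviction records age out after seven years.
    """
    if record.is_conviction:
        return True  # no federal age limit on convictions
    age_years = (as_of - record.record_date).days / 365.25
    return age_years <= FCRA_LOOKBACK_YEARS

# Example: an eight-plus-year-old arrest should be filtered out, not reported.
stale_arrest = Record("arrest, charges dropped", date(2016, 3, 1), is_conviction=False)
print(is_reportable(stale_arrest, as_of=date(2025, 6, 1)))  # False
```

A filter like this being skipped or misapplied at scale is, in essence, what the plaintiffs in those suits alleged.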
What this means for creators
For OnlyFans creators, this adds another layer to an already intensive verification process. Creators must provide government ID and banking information just to post content; now criminal background checks join the mix. The platform's transparency center explains its verification requirements, but this new layer raises questions about accuracy and privacy. What happens if Checkr makes a mistake? An incorrect or outdated record could block a legitimate creator from earning on the platform, and while the FCRA lets consumers dispute inaccurate reports, reinvestigations typically take up to 30 days, during which income stops.
Bigger industry shift
This isn't just an OnlyFans thing; it's part of a broader trend. Pornhub announced similar checks last month, though it's focusing on studios rather than individual creators. Both companies frame this as a safety measure, but let's be real: it's also happening amid mounting legislative pressure. Age-verification laws are popping up everywhere despite studies showing they don't actually work. Platforms are trying to get ahead of regulation by demonstrating they take "safety" seriously. The open question is whether background checks actually make platforms safer or just create more barriers for legitimate workers.
The screening reality check
Checkr's marketing materials paint a picture of seamless, accurate screening, but the legal record tells a different story. The archived Protocol article details how these automated systems can go wrong. When people's livelihoods are at stake, accuracy matters: a single error could cost someone their income source. And given Checkr's track record with Uber and DoorDash, creators have legitimate reasons to worry about how this will play out.
