A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.
Did it? Or did it make them look elsewhere?
The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.
given the amount of extremely edgy content already on Pornhub, this is kinda sus
Yeah…I am honestly curious what these search terms were, and how many of those searches were ACTUALLY looking for CP. And of those…how many are now flagged somehow?
4.4 million sounds a bit excessive. Facebook Marketplace intercepted my search for “unwanted gift” once and insisted I seek help. These things have a lot of false positives.
Imagine a porn site telling you to seek help because you’re a filthy pervert. That’s gotta push some people to actually get help, I’d think.
Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline, too: it didn’t stop anything, it just told them “Not here.”