
PornHub moderators had to review ‘700 videos per day’ but were expected to watch more

An ex-moderator interviewed by Netflix says employees had a massive workload and he couldn't determine the ages of people in the clips.

Netflix’s new documentary, which delves into the controversies surrounding PornHub, includes a startling testimony from a moderator who claims he had to watch a staggering number of explicit videos as part of his day-to-day job.

Money Shot: The PornHub Story explains how PornHub has been criticised by anti-child trafficking and exploitation campaigners who say that it hosts clips that feature minors, as well as people who did not give their consent to be filmed.

Following the launch of the viral #Traffickinghub campaign by Laila Mickelwait, who wants the site shut down for good, a ground-breaking New York Times investigation examined these claims and revealed the horrific story of Serena Fleites, a 14-year-old girl who found a video of herself on the site and struggled to get it removed.

The article blew up and preceded a crackdown on the content uploaded to PornHub, with only verified accounts able to share content.

Dani Pinter, senior legal counsel at the National Center on Sexual Exploitation (NCOSE), an organisation that has worked with Laila in a bid to take down PornHub, said the moderators weren't able to do their jobs.

“That’s impossible," she said regarding the workload. "So, of course, they were fast-forwarding, skipping through. No sound, which is key because sometimes the women and – or children – in the videos are crying, yelling, saying ‘No,’ saying, ‘Stop.’ And they’re not catching any of that.”

The ex-moderator struggled to determine age in a lot of the videos uploaded to PornHub.
Netflix

The former moderator who appeared in the Netflix doc - and whose identity was withheld - explained how he worked for the porn website for less than two years.

He said that some of the moderation happened in Canada - where PornHub’s parent company MindGeek is based - but most of it happened in Cyprus.

There were 'a little over' 30 moderators during the man’s stint at the company and they had to watch a lot of porn.

“Every moderator had to review 700 videos per day, but it was expected for us to do more," he said.

The shifts are said to have been a fairly regular eight hours, which makes for a lot of sexually explicit content for one person to consume.

“We were scrubbing through videos as fast as we could,” he recalled. “Even if we thought that we were being diligent with our work, we would still miss a few videos every now and then.”

The moderator went on to confess he found it hard to tell the ages of some people in the videos he had to watch: “I can’t really tell from a video the age of somebody. It’s a really hard thing to determine if a 17-year-old is more than 18. They could be 14, they could be 19.

“Basically, we would just guess, then my manager would decide if the video would be taken down for good or if it will go live again. The rules constantly changed.”

PornHub's execs praised the site's moderation strategy.
Siraj Ahmad / Alamy Stock Photo

In February 2021, Canadian Parliament questioned MindGeek about PornHub's moderation practices after the website suspended all content from unverified accounts – around 80 percent of the site’s content at the time – due to concerns about exploitation.

MindGeek’s Chief Operating Officer David Marmorstein Tassillo praised PornHub’s moderating system and said it was the first site to have ‘human moderation’ of its content.

“We had human moderation on our sites when it was a word that didn't exist... These were things that we started," he said.

After calling its moderation practices a ‘constant evolution,’ he continued: “We weren't public about it, but these are things we did since the beginning.”

A spokesperson for MindGeek told LADbible that the company has ‘zero tolerance for illegal material’ and its policy is to remove content in violation of its terms of service or reported by users ‘no questions asked’.

Users can only upload content to MindGeek’s platforms - which also includes RedTube, YouPorn and Brazzers - if they have a government-issued ID that passes third-party verification, the spokesperson explained.

Dani Pinter in Netflix's PornHub documentary.
Netflix

Any user can fill out the Content Removal Request Form to ‘disable a piece of content’ on PornHub for further review. This is a ‘small part of MindGeek’s industry-leading safeguards,’ the statement continued, which also highlighted that the National Center for Missing & Exploited Children (NCMEC) reported that PornHub ‘has fewer incidents of Child Sexual Abuse Material (CSAM), and removes cases of CSAM in the shortest amount of time after being notified, among all major platforms, including Facebook, Instagram, Twitter, YouTube, and more’.

Following on from the claims in Netflix’s doc, the spokesperson told LADbible that ‘any insinuation that we do not have enough moderators to thoroughly review all uploaded content is categorically false’.

They added: “The fight against illegal material on the internet must be led by effective policies, data and facts, and MindGeek is committed to remaining at the forefront of this fight.”

Money Shot: The PornHub Story drops on Netflix today.

If you have been affected by any of the issues in this article and wish to speak to someone in confidence, contact The Survivor’s Trust for free on 08088 010 818, or through their website thesurvivorstrust.org

Featured Image Credit: Netflix
