
MrDeepFakes has chilling warning about future of site that gets 13 million visits a month

Deepfakes are basically videos in which someone else’s face is swapped onto the body of another person

Look, we all love Deep Tom Cruise. Watching an AI-generated version of the actor falling over will never get old, but that doesn’t mean deepfake technology should be taken lightly.

For anyone who doesn’t know, deepfakes are basically videos in which someone else’s face - more often than not, a celebrity’s - is swapped onto the body of another person.

Now would be a good time to introduce Mr DeepFakes, who runs a deepfake website where people can request that a celebrity's face be superimposed into a sex tape.

Mr DeepFakes issued a chilling warning about the future of deepfake technology. Credit: BBC

Unsurprisingly, Mr DeepFakes' site racks up around 13 million visitors each month and has 250,000 members, so business is good.

But that hasn't stopped him issuing a chilling warning about the future of deepfake technology, noting that eventually any one of us - celebrity or not - could be deepfaked into sex tapes.

Mr DeepFakes was speaking in the BBC's new documentary Deepfake Porn: Could You Be Next?, which explores the rise of deepfake pornography.

Mr DeepFakes said in the doc: “Currently on Mr DeepFakes we have over 20,000 deepfake porn videos hosted.”

Speaking anonymously from the USA, he added: “The technology will only get better and it’s increasingly difficult to distinguish between a real video and a fake video. There will be a point in the future where any of us could be deepfaked.”

However, defending his living, the website owner added: “I think that as long as you’re not trying to pass it off as the real thing, that shouldn’t really matter because it’s basically fake. I don’t really feel that consent is required, it’s a fantasy, it’s not real.”

Over the summer, proposed criminal law reforms reported on by The Guardian took aim at deepfake pornography.

In July, the Law Commission of England and Wales recommended that sharing deepfake pornography carry a sentence of up to three years behind bars.

Prof Penney Lewis, the law commissioner for criminal law, said at the time: “Sharing intimate images of a person without their consent can be incredibly distressing and harmful for victims, with the experience often scarring them for life.”

She added: “Current laws on taking or sharing sexual or nude images of someone without their consent are inconsistent, based on a narrow set of motivations and do not go far enough to cover disturbing and abusive new behaviours born in the smartphone era.”

Featured Image Credit: BBC / devilmaya / Alamy

Topics: Technology, Sex and Relationships, Celebrity