Daughter's desperate plea after mother begins close relationship with fake Owen Wilson


Scammers are back at it with the celebrity deepfakes

Ah Artificial Intelligence, you've done it again.

Technology and AI are progressing at a rapid pace, to the point where people are struggling to tell the difference between real and fake online.

AI videos have been doing the rounds on X in recent times, with millions of users shocked by how realistic they seemed.

A street video that eerily mirrors the viral 'Hawk Tuah' video has caught the eye of many, though they were shocked to find out that it was all created with Google's 'state-of-the-art video generation model', Veo 3.

But it looks like the confusion is only just beginning, with other videos, including one of a woman whispering into the camera, also shocking users.

Scammers are catching on too, with one woman falling victim to a fake Brad Pitt who scammed her out of £700,000 earlier this year.

Now, one woman has taken to Reddit to explain that her mum is about to fall for something similar.

Owen Wilson, is that you? (Reddit)

Writing in the r/Scams group, she listed a 'plethora of red flags' and asked for help to convince her mum that she's about to be scammed.

The user's mother believes that she has been talking to actor Owen Wilson, who is known for his roles in Cars and Wedding Crashers among other titles.

Spoiler alert though, it's not him, and the video that was sent as 'proof' is very obviously AI.

"My sister & I have been telling her it’s a scam but she’s just not hearing us," she said.

The user explained: "She met him on Yahtzee with Friends. He claims he’d mistaken her for a person he knows in real life. He only talks to her on WhatsApp. Voice calls.

"She says FaceTime too but I have doubts."

She added that photos of Wilson that were sent to her were easy to find on the internet and on fan accounts.

Her mum claims that he hasn't asked for money or bank details, but that he got her a job at Warner Bros, where she 'can make $5000 a month by liking social media posts'.

"The job has sent her a couple $10 payments through Cashapp for her first trainings. She says they’ll send her $1,000 through CashApp when she finishes training," she continued.

The user said that 'Owen Wilson' told her mum that he'd be buying a house in their small coastal town and wants both her parents to live there and be caretakers while he's away.

She added: "He had an actual realtor from this gated community call her to discuss their options."

The still is more realistic than the video, in all fairness (Reddit)

This part was believable to her though, as the realtor 'mentioned my sister in law’s mother’s uncommon name', even though they aren't connected on social media.

To top it all off, the video above was sent as 'proof' that he was real, prompting the user to ask others in the group for advice on how to convince her mum that it was all fake.

One user pointed out: "Wow that’s scary, you can tell the tone is off and robotic. Face also looks slightly different than Owen but wow that’s crazy for someone who wouldn’t know any better."

Another said: "It's only going to get worse, unfortunately. Deepfake accuracy will only continue to improve," while a third advised: "What a frustrating situation.

"Be careful not to alienate her. It can make these situations more difficult."

Featured Image Credit: Reddit

Topics: AI, Artificial Intelligence, Technology, Weird, Reddit