
When it comes to ChatGPT, there seem to be two camps: those who vow never to use it, and those who use it to outsource the most mundane, time-consuming and even human elements of life.
Some people, for whatever reason, use this form of artificial intelligence as a therapist or friend in whom they confide and from whom they seek personal advice.
But be careful of befriending ChatGPT and telling the tool too many of your secrets – you never know where it'll end up and how it could be exploited.
Don't forget that there's also a human cost to creating speedy software such as ChatGPT, not least for workers in the global south, including Kenyan workers who were forced to endure vivid descriptions of sexual abuse, violence, and racist and hateful text.

An Oxford University computer science professor has shared his warning against making the AI platform your best friend. Mike Wooldridge told The Daily Mail: "It has no empathy. It has no sympathy.
"That's absolutely not what the technology is doing and crucially, it's never experienced anything. The technology is basically designed to try to tell you what you want to hear – that's literally all it's doing."
And considering that human connection is built on compassion and empathy, especially when you're in need of interpersonal advice, perhaps a set of code behind a screen isn't the best option.

Not only that, Professor Wooldridge also warned about the potential for data breaches when sharing sensitive information. In 2023, Italy became the first Western country to ban ChatGPT over the way user data was being fed back in to train the system. The Italian data-protection authority said the app had suffered a breach involving users' conversations and payment information.
The watchdog said OpenAI had no legal justification for 'the mass collection and storage of personal data for the purpose of 'training' the algorithms underlying the operation of the platform.'
And although ChatGPT says users between the ages of 13 and 18 need to obtain parental consent before using it, the Italian watchdog claimed it still 'exposes minors to absolutely unsuitable answers compared to their degree of development and awareness.'
Professor Wooldridge echoed these concerns, saying: "You should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT.
"It's extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions on ChatGPT."