AI-powered CHATBOTS are on the rise – but there are some things you should never tell them.
These bots use AI to talk as if they were real humans, but make no mistake: there is great hidden danger.
Chatbots are seemingly everywhere, with millions of people using Google's Gemini, OpenAI's ChatGPT, and Microsoft's Bing AI.
And there are countless others out there, writing down everything you say to them.
Unfortunately, you need to be extremely careful about what you say to an AI chatbot.
The US Sun spoke with cybersecurity expert Dr. Martin J. Kraemer, who explained the dangers of sharing too much.
“Never share sensitive information with a chatbot,” said Dr. Kraemer, security awareness advocate at KnowBe4.
“You may need to share your flight booking code or parts of your address with an airline chatbot, but this should be an exception.
“You can always call instead of using the chatbot. Generally, never share your password or other authentication credentials with a chatbot.
“Also, don’t share your personal thoughts and intimate details. It’s safe to assume someone else will have access to them.
“The bot will not keep everything to itself. The same goes for commercial information — don’t share that either.”
It’s easy to end up sharing too much information with a chatbot.
Their human conversational style can lull you into a false sense of security.
But one simple reason why you should be careful about what you share is the risk of your account being hacked.
Likewise, unencrypted chats can be snooped on by experienced hackers.
And chatbot apps are also at risk of having conversations leaked in cyber breaches.
That’s not all: your conversations could even be fed back into the AI systems themselves as training data.
“Never share any private or personally identifiable information with an AI chatbot,” said Paul Bischoff, consumer privacy advocate at Comparitech, in conversation with The US Sun.
AI romance scams – BEWARE!

Beware of criminals using AI chatbots to scam you…
The US Sun recently revealed the dangers of AI romance scam bots – here’s what you need to know:
AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversations and can be difficult to detect.
However, there are some warning signs that can help you identify them.
For example, if the chatbot responds very quickly and with generic responses, it is probably not a real person.
Another clue is if the chatbot tries to transfer the conversation from the dating platform to a different app or website.
Furthermore, if the chatbot asks for personal information or money, it is definitely a scam.
It’s important to stay vigilant and exercise caution when interacting with strangers online, especially when it comes to matters of the heart.
If something seems too good to be true, it probably is.
Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.
By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.
“Your information can become part of the AI’s training data, meaning anyone using AI could theoretically access it.
“In more sinister cases, AI can be designed to collect personal information to be used later in scams and fraud.”
You should be especially cautious about downloading AI chatbots from untrustworthy sources.
It’s best to stick to official app stores and avoid chatbots with few (or bad) reviews.
If you decide to speak to a chatbot, remember that it is not a real person – and the information you share may not be secure.
This story originally appeared on The-sun.com.