Google AI chatbot responds with a threatening message
India TV · 33m
Google's AI Chatbot Gemini urged users to DIE, claims report: Is it still safe to use chatbots?
In a controversial incident, the Gemini AI chatbot shocked users by responding to a query with a suggestion to 'die.' This has sparked concerns over the chatbot's language and its potential harm to users' mental health…
moneycontrol.com · 2h
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements…
Yahoo · 1d
Google's Gemini Chatbot Explodes at User, Calling Them "Stain on the Universe" and Begging Them To "Please Die"
Google's glitchy Gemini chatbot is back at it again, folks — and this time, it's going for the jugular. In a now-viral exchange that's backed up by exported chat logs, a seemingly fed-up Gemini begs a user to "please die" after they repeatedly asked the chatbot to solve their homework for them.