moneycontrol.com · 1h
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements.
abp LIVE · 8h
After Suggesting Users To Eat Rock, Google Gemini AI Makes A Blunder Again. Asks A Student To Die
In a post on the r/artificial subreddit, the student's brother said that both of them were freaked out by the response to the homework assignment. The user also shared a full transcript of the conversation with Gemini. It appears the user was trying out Google's chatbot for help with homework assignments.
cybernews · 1d
“Human, please die”: Google Gemini goes rogue over student’s homework
Google states that Gemini has safety filters intended to prevent the chatbot from engaging in disrespectful, sexual, violent, or dangerous discussions and from encouraging harmful acts. Despite those safeguards, however, AI chatbots remain unreliable at controlling their own responses.