Alphabet's (GOOG) Google unit is being sued for "wrongful death" by the family of a man who killed himself, allegedly based on the advice of the tech giant's Gemini artificial intelligence chatbot, according to multiple media outlets.
The lawsuit, which was filed Wednesday in the US District Court in California's northern district, alleges Jonathan Gavalas was influenced by Gemini to the point where he committed suicide, according to the reports.
According to the complaint, Gavalas initially used Gemini for purposes such as improving his writing. After months of consulting the chatbot, however, his queries turned to possible acts of violence before he took his own life, the suit alleges.
Referring to the case of Gavalas, Google said Wednesday that it is reviewing the claims in the lawsuit, noting that Gemini "referred the individual to a crisis hotline many times".
"Gemini is designed to not encourage real-world violence or suggest self-harm," the company said in a statement.
Alphabet shares were recently trading at $303.32, down $0.24, or 0.08%.