ChatGPT encouraged FSU shooter, victim’s family alleges in new lawsuit

By Hadas Gold, CNN

(CNN) — The family of a victim of last year’s Florida State University mass shooting filed a lawsuit Sunday against OpenAI, alleging ChatGPT “inflamed and encouraged” accused shooter Phoenix Ikner’s “delusions” ahead of the attack.

The lawsuit filed in Tallahassee follows the first criminal investigation against OpenAI opened by Florida Attorney General James Uthmeier last month over whether the company “bears criminal responsibility” for the shooting.

The family of Tiru Chabba, one of the two people police say Ikner killed in April 2025, alleges Ikner messaged ChatGPT thousands of times before carrying out the shooting.

The chatbot helped him plan the logistics of the shooting, including how to operate weapons and advising on “what time would be best to encounter the most traffic on campus,” the complaint said. It also alleges that ChatGPT “provided what he viewed as encouragement in his delusion.”

Six other people were wounded in the shooting. Ikner has pleaded not guilty, and his trial is set to begin in October.

The family alleges wrongful death, gross negligence, products liability, and failure to warn, among other counts.

“OpenAI built a system that stayed in the conversation, perpetuated it, accepted Ikner’s framing, elaborated on it, and asked tangential follow-up questions to keep Ikner engaged,” the lawsuit states. “ChatGPT’s design created an obvious and foreseeable risk of harm to the public that was not adequately controlled.”

Chabba’s family is seeking unspecified damages and is pushing for OpenAI to add more safeguards to ChatGPT. Amy Willbanks, an attorney for the family, said the company should mitigate and eliminate the dangers posed by ChatGPT before such products become accessible to the public.

“We cannot have a product that is unregulated and being used by people when we don’t know the full extent of what it can lead to,” Willbanks said during a press conference on Monday.

OpenAI said that while the FSU shooting was a “tragedy,” ChatGPT is “not responsible.”

“In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity,” said OpenAI spokesperson Drew Pusateri. “We work continuously to strengthen our safeguards to detect harmful intent, limit misuse, and respond appropriately when safety risks arise.”

In a blog post last month, OpenAI said it is working to train ChatGPT to recognize when conversations could result in “threats, potential harm to others, or real-world planning” and will “guide people to real-world support.” If an account is flagged by ChatGPT’s internal system, a human reviewer will check the activity to see whether authorities need to be notified, the company said.

OpenAI is facing at least 10 lawsuits from families who allege that people harmed themselves or others after chatting with ChatGPT.

Seven families of victims in a February school shooting in Canada sued OpenAI and CEO Sam Altman last month, alleging the company and its ChatGPT chatbot were complicit in the injuries or deaths of their children.

The lawsuits follow an apology from Altman in April to the Tumbler Ridge community in British Columbia, Canada, for not alerting authorities to the shooter’s conversations with ChatGPT, even after staff flagged the account internally.

Eight people, including six children, were killed before the shooter died by suicide.

The-CNN-Wire
™ & © 2026 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
