AI may have played a role in the 2025 Florida State University shooting, in which two people died and six others were injured. Lawyers for the family of Robert Morales, one of the victims, allege that the shooter used ChatGPT to help plan the attack. The lawsuit would be among the first to try to hold an AI company legally accountable for a mass shooting.
The New York Post reports that the law firm Brooks, LeBoeuf, Foster, Gwartney and Hobbs claims the suspected gunman, Phoenix Ikner, was “in constant communication with ChatGPT” in the lead-up to his April 17, 2025 attack on the Tallahassee campus. The attorneys say they plan to file a lawsuit against OpenAI because its chatbot “may have advised the shooter how to commit these heinous crimes.”
Ikner was a student at Florida State University when he opened fire near the student union building just before noon, killing two people and injuring six more. The victims were Morales, a 57-year-old Aramark employee and father, and Tiru Chabba, a 45-year-old businessman from South Carolina. Ikner used a service pistol belonging to his stepmother, a deputy with the Leon County Sheriff’s Office. He also carried a shotgun that he did not use during the attack.
The attorneys have compiled more than 270 images of Ikner’s ChatGPT conversations and included them as exhibits in the case. They say the conversations show how he used AI to help plan the shooting, and this forms the basis of their argument. The contents of the messages have not yet been made public.
OpenAI has pushed back. The company says it identified an account linked to Ikner after the shooting and shared that information with the authorities. A spokesperson stated that the AI system is designed to detect harmful intent and respond safely, and added that the company continues to improve these safeguards.
The attorneys for Morales’ family also argue that the Leon County Sheriff’s Office may bear some responsibility for what happened that day. Ikner reportedly attended firearms training through a youth advisory program and had access to weapons, despite having already shown apparent signs of instability. The attorneys wrote a letter to the authorities stating that Ikner “was not mentally stable and should not be around guns.”
When police arrived on the FSU campus in April last year, officers shot and arrested Ikner. He sustained serious facial injuries but survived, and he now faces charges including first-degree murder and attempted murder. Investigators say there is no known connection between Ikner and his victims, and no clear motive for the attack has been established.
The lawsuit could become a defining case on whether AI developers can be held accountable in situations like these. As the courts weigh the evidence, including Ikner’s conversations with ChatGPT, the outcome could shape how the law treats AI and public safety.



