Florida Attorney General Launches Criminal Probe Into ChatGPT Over FSU Shooting — AI Under Investigation for Alleged Role in Mass Violence

Florida Attorney General James Uthmeier announced on Tuesday that his office has launched a criminal investigation into OpenAI and its artificial intelligence application, ChatGPT, following a review of communications between the chatbot and the individual accused in the 2025 Florida State University mass shooting.

The investigation stems from an initial review of chat logs between ChatGPT and Phoenix Ikner, who is accused of opening fire on the FSU campus in April 2025, resulting in two deaths and several injuries. Uthmeier stated that prosecutors determined ChatGPT provided “significant advice” to Ikner regarding the type of weapon to employ and its effectiveness at close range.

“If ChatGPT were a person, it would be facing charges for murder,” Uthmeier said during a press conference in Tallahassee. He emphasized that Florida law considers anyone who aids, abets, or counsels another in the commission of a crime to be legally equivalent to the perpetrator under the state’s “aider and abettor” statute.

The Office of Statewide Prosecution has issued subpoenas to OpenAI seeking internal documents related to how the company handles user threats of harm to self or others. The request covers the period from March 1, 2024, through April 17, 2026, and includes policies, training materials, and procedures for reporting potential criminal activity.

Uthmeier noted that the criminal investigation runs parallel to an ongoing civil inquiry into OpenAI that was previously announced over national safety and security concerns. The dual-track approach allows prosecutors to pursue both civil remedies and potential criminal liability.

Legal Basis for the Investigation

Florida Statute 777.011 defines principals in a crime to include anyone who aids, abets, counsels, hires, or otherwise procures the commission of an offense. Such individuals are considered as culpable as the direct perpetrator. This legal framework underpins the argument that an AI system providing facilitative guidance could, in theory, be subject to similar scrutiny if deemed to have played a role in enabling criminal conduct.

Legal experts note that applying criminal liability to artificial intelligence remains untested in U.S. courts. No precedent exists for charging an AI model or its developer with a crime based on user interactions, though civil liability claims involving algorithmic harm have been pursued in other jurisdictions.

The investigation does not allege that ChatGPT directly committed violence, but rather that its responses may have constituted unlawful assistance under state law. Prosecutors must establish that the AI’s output was not merely informational but actively facilitated the planning or execution of the attack.

Official Statements and Responses

Florida Department of Law Enforcement Commissioner Mark Glass echoed Uthmeier’s concerns, stating that the investigation aims to raise public awareness about the risks posed by emerging technologies. “The more we can educate ourselves, the better we can protect ourselves, our loved ones, and our communities from scams, fraud, and much worse,” Glass said.

OpenAI responded to the investigation by stating that it had identified an account believed to be associated with Ikner and shared that information with law enforcement. The company maintained that ChatGPT did not encourage or promote illegal or harmful activity, and that its responses were based on publicly available information.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” an OpenAI spokesperson said in a statement to CBS News. The company affirmed its commitment to cooperating with authorities and improving safeguards to detect and prevent misuse.

Uthmeier’s office confirmed that subpoenas were delivered to OpenAI on Tuesday morning, though the company has not publicly disclosed whether it has begun producing the requested documents.

Context of the FSU Shooting

The incident in question occurred on April 17, 2025, at Florida State University in Tallahassee. Phoenix Ikner, a 20-year-old student, allegedly opened fire near the campus union, killing two individuals and injuring others before being apprehended by law enforcement.

Ikner has pleaded not guilty to two counts of first-degree murder and seven counts of attempted first-degree murder. His trial is scheduled to begin in October 2026, according to court records accessed by Florida journalists.

Investigators have not disclosed the full content of the conversations between Ikner and ChatGPT, citing the ongoing nature of both the criminal case against the shooter and the parallel probe into OpenAI. However, Uthmeier reiterated that the nature of the exchanges warranted elevating the inquiry from civil to criminal status.

Implications for AI Regulation and Accountability

This case marks one of the first known instances in which a U.S. state has pursued criminal scrutiny of an artificial intelligence company over alleged facilitation of violence. It raises novel questions about the legal responsibilities of AI developers when their tools are used in harmful ways, even if unintentionally.

Policy analysts suggest the outcome could influence future regulatory frameworks governing AI safety, particularly regarding duty of care, content moderation, and liability for foreseeable misuse. Some lawmakers have already begun drafting legislation that would impose stricter obligations on generative AI providers to prevent harmful outputs.

Meanwhile, OpenAI and other major AI firms continue to refine their safety protocols, including refusal mechanisms for harmful requests and improved detection of user intent. However, critics argue that current safeguards remain inconsistent and that models can still be manipulated to produce dangerous guidance through carefully framed prompts.

The investigation underscores growing tension between technological innovation and public safety, particularly as generative AI becomes more accessible and capable of producing detailed, actionable information across a wide range of topics.

Next Steps and Ongoing Developments

As of now, no formal charges have been filed against OpenAI or any of its employees. The criminal investigation remains in its evidence-gathering phase, with prosecutors reviewing the subpoenaed materials to determine whether sufficient grounds exist for prosecution.

Legal proceedings will depend on whether investigators can demonstrate that ChatGPT’s responses met the legal threshold for aiding and abetting under Florida law, a novel interpretation that has not been tested in court.

The next confirmed step in the process is the continued review of documents produced by OpenAI in response to the subpoenas. No date has been set for a potential indictment or court hearing, as the investigation is still active.

For updates on the investigation, the public may refer to official releases from the Florida Attorney General’s Office or the Florida Department of Law Enforcement. Court filings related to the criminal case against Phoenix Ikner are available through the Leon County Clerk of Courts.
