Every day, millions of people type their thoughts, worries, and questions into AI chatbots like ChatGPT. Many treat these conversations the way they would a diary or a talk with a trusted friend — honest, unfiltered, and deeply personal. But there is something most people do not realize: unlike a conversation with a lawyer, a doctor, or a therapist, nothing you say to an AI chatbot is legally private. Those conversations are stored on company servers, and law enforcement can access them with a warrant or subpoena.
This new reality is already changing the way crimes are investigated and prosecuted in the United States and around the world.
What Makes AI Chats Different from Google Searches?
For years, police have used internet search histories as evidence. If a suspect Googled "how to clean up bloodstains" the night of a murder, prosecutors could present that search to a jury. But AI chatbot conversations go much further. Instead of typing a few words into a search bar, people have full back-and-forth conversations with chatbots, often sharing detailed thoughts, emotions, and plans.
Legal experts say this makes chatbot logs far more revealing than a simple web search. Kyle Valente, an attorney who has written extensively on the topic, has said that AI chatbot logs will inevitably take center stage in courtroom evidence disputes across the country. Prosecutors can use these conversations to establish what a suspect was thinking, whether they knew what they were doing was wrong, and whether they planned their actions in advance.
As one legal expert put it, chatbot conversations are like "your diary times ten."
Real Cases Where AI Chats Became Evidence
Several criminal cases have already shown how chatbot conversations can play a role in the justice system. Here are some notable examples.
The Vandalism Confession (Missouri, 2025)
In one of the most widely reported cases, a 19-year-old sophomore at Missouri State University named Ryan Schaefer was charged with felony property damage after allegedly vandalizing 17 cars in a campus parking lot at 3 a.m. What made this case unusual was not the crime itself, but the evidence. Just ten minutes after leaving the scene, Schaefer reportedly opened ChatGPT on his phone and began telling the chatbot what he had done. He described smashing car windshields, asked whether he would go to jail, and asked whether there was any way he could be identified.
When police later searched his phone — with his consent — they found the entire conversation. Prosecutors described it as a "troubling dialogue" that documented his state of mind immediately after the alleged crime. The chat logs, combined with surveillance footage and cell phone location data, became the foundation of the criminal case against him.
The Arson Investigation (Florida, 2025)
In Florida, a 29-year-old man named Rinderknecht was charged in connection with a series of fires. Investigators highlighted his ChatGPT history as a key piece of evidence. According to authorities, the suspect had used the chatbot to generate images of cities and forests engulfed in flames, told the chatbot he felt "liberated" after burning a Bible, and — shortly after allegedly starting a brush fire on New Year's Day — asked ChatGPT whether someone could be held responsible if a fire started because of their cigarettes. The case illustrated how a suspect's chatbot conversations could reveal not just actions but also mindset and intent.
The Child Exploitation Investigation (Federal, 2025)
In what was reported as the first known federal search warrant asking OpenAI for user data, the Department of Homeland Security sought ChatGPT records as part of a child exploitation investigation. The suspect, who allegedly operated multiple illegal websites on the encrypted Tor network, had discussed his use of ChatGPT with an undercover federal agent. This led investigators to seek a warrant for his chatbot conversation history. While the ChatGPT conversations themselves were not directly related to the crimes, investigators believed they could help build the broader case. The suspect was charged with conspiracy to advertise child sexual abuse material and remains in custody.
AI Used to Escape a Trafficker (Missouri, 2025)
Not all criminal cases involving AI chatbots are about catching suspects. In one remarkable case, a woman in Kansas City used ChatGPT to escape from an alleged sex trafficker. According to court records, the woman had been recruited into prostitution and was being physically abused and threatened. She used ChatGPT to create a fake email claiming she needed to return a rental car, which gave her a reason to leave the hotel where she was being held. She then drove to the airport, where she met police and family members. Her alleged trafficker was arrested the following day. The case showed that AI chatbots can also serve as tools for victims seeking safety.
Why AI Conversations Are Not Protected
When you speak with a lawyer, those conversations are protected by attorney-client privilege. When you speak with a doctor or therapist, there are similar confidentiality protections. But when you type something into ChatGPT, Gemini, Grok, or any other AI chatbot, there is no legal privilege protecting what you say.
OpenAI's CEO, Sam Altman, has publicly acknowledged this gap. In a 2025 podcast interview, he warned that people are treating ChatGPT like a therapist or lawyer, but that OpenAI could be compelled to turn those conversations over in a lawsuit or criminal investigation. He called the situation "very screwed up" and said the tech industry has not yet figured out how to give AI conversations the same privacy protections that exist for human professionals.
As of now, OpenAI's terms of service state that user conversations may be reviewed for safety purposes and can be disclosed in response to legal requests such as subpoenas or court orders. Even conversations that a user deletes may still be stored on the company's servers for a period of time.
How Police Get Access to Your Chats
Law enforcement agencies obtain AI chatbot records in several ways. The most common method is through a search warrant signed by a judge. Under the Stored Communications Act, investigators can also use subpoenas to compel companies to disclose information that identifies their users, such as basic account and subscriber records. In some cases, investigators simply find the conversations on a suspect's phone during a lawful search of the device, as happened in the Missouri vandalism case.
A federal court ruling in the case of The New York Times v. OpenAI ordered OpenAI to preserve all user chat logs — including deleted and temporary conversations — as part of evidence preservation. This ruling signaled that courts are increasingly treating AI conversations as records that can be discovered and used in legal proceedings.
What This Means Going Forward
Legal experts predict that AI chatbot evidence will become routine in criminal investigations within the next few years. As hundreds of millions of people use chatbots every week, the volume of potential evidence is enormous. Some attorneys have compared this shift to the early days of social media evidence, when prosecutors first realized that people's Facebook posts and Instagram messages could be used against them in court.
However, the use of AI evidence also raises important questions that courts have not yet fully answered. Can a chatbot "confession" be taken at face value, or might someone have been joking, venting, or speaking hypothetically? Should AI conversations receive some form of legal protection, similar to the privacy protections that apply to conversations with doctors and lawyers? And who is responsible for making sure that AI companies handle user data fairly?
These are questions that lawmakers, judges, and technology companies will need to answer in the years ahead. In the meantime, legal professionals offer one consistent piece of advice: treat every conversation with an AI chatbot as if it could one day be read aloud in a courtroom. If you need truly confidential advice, talk to a licensed attorney — not a machine.
* * * * * * * * *
VanHo Law represents individuals, families and small businesses throughout Ohio and Pennsylvania in state and federal courts. If you need assistance from VanHo Law, please give us a call at 330-653-8511 or contact us here.