NYT v. OpenAI: Court Order Requires Everything You Ask ChatGPT Be Saved; Historic Copyright Case Moves Forward

Unless you’re using a special privacy mode (like ChatGPT Team or Enterprise with data controls), your prompts and the AI’s responses are stored and may be reviewed by OpenAI for system improvement. A recent court order now requires these interactions to be preserved for legal purposes. File photo: Novikov Aleksey, licensed.

NEW YORK, NY – In a development that should concern anyone who has ever typed a question into ChatGPT, a federal judge has ordered OpenAI and Microsoft to preserve all records related to their AI models – including user prompts and responses, both past and future. This means that anything you’ve ever asked ChatGPT could now be retained indefinitely, or at least for as long as the ongoing litigation lasts.

The order stems from the New York Times’ high-profile lawsuit against OpenAI and Microsoft (NYT v. OpenAI), filed in December 2023. The Times alleges that millions of its copyrighted articles were used without permission to train AI models like ChatGPT and Microsoft Copilot, and that the AI tools can reproduce its content in ways that threaten its business model and violate copyright law.

To support its case, The Times is seeking access to specific examples of outputs where AI models may have regurgitated its content. As a result, the court has issued a broad preservation mandate, requiring the defendants to retain data related to model training, internal communications, and user interactions with the AI.

In plain terms: Every question users have asked ChatGPT – and the AI’s responses – must now be preserved. The ruling applies not only to historical interactions but to future ones as well.

What’s at Stake

While data retention for internal research or debugging is not new in the tech world, this ruling formalizes the idea that user interactions with AI are potential legal evidence. That sets a precedent not just for OpenAI and Microsoft, but potentially for the entire AI industry.

Privacy advocates have raised concerns about what this means for everyday users:

  • Is your data really anonymous?
  • Can legal authorities subpoena your interactions?
  • Should you treat AI prompts like public posts or emails?

OpenAI’s own privacy policy notes that conversations may be reviewed by humans to improve the system, though users of the Pro or Enterprise tiers can opt out of having their data used for training. However, the new legal directive potentially overrides existing company policies – at least temporarily.

A Turning Point for AI and Privacy

This case could become a landmark not only for copyright law but for the future of data privacy in AI systems. If court proceedings begin to rely on user-submitted prompts as evidence, it may force AI developers to rethink how long they retain data, what they disclose to users, and how much control people have over their digital interactions.

For now, one thing is clear: What you type into ChatGPT may no longer be as private – or as temporary – as you thought.


Top Questions About ChatGPT, Data Storage & the NYT Lawsuit

1. Is everything I ask ChatGPT being saved?
Yes. Unless you’re using a special privacy mode (like ChatGPT Team or Enterprise with data controls), your prompts and the AI’s responses are stored and may be reviewed by OpenAI for system improvement. A recent court order now requires these interactions to be preserved for legal purposes.

2. Can ChatGPT conversations be used as evidence in court?
Potentially, yes. In the NYT v. OpenAI lawsuit, user prompts are being preserved as possible evidence to show that ChatGPT can reproduce copyrighted content. This means historical conversations could be reviewed or subpoenaed under certain conditions.

3. Does OpenAI delete my questions after a while?
Not by default. Chat history is retained unless you manually delete it or disable history in your settings. However, even then, OpenAI may retain some data for safety, security, or legal compliance – and the current court order halts any deletion for now.

4. Can anyone else see what I asked ChatGPT?
OpenAI staff may review conversations to improve the system unless you opt out (available in some account settings). Your chats aren’t publicly visible, but they’re not completely private either.

5. Is there any way to opt out of having my data stored?
If you’re a free or Plus user, you can turn off chat history under settings, which limits how your data is used for training – but it may still be stored for security or compliance. Business and enterprise users can opt out of data retention entirely through contracts.

6. Is ChatGPT anonymous?
Not entirely. While the system doesn’t automatically know your name, chats are tied to your account and IP address. If you use ChatGPT while logged in, your prompts can potentially be linked to you.

7. Should I be careful about what I type into ChatGPT?
Yes. Avoid sharing sensitive personal information, legal matters, passwords, or anything you wouldn’t want recorded. AI chats may feel private, but they’re still being processed and stored by a company.

8. What does the NYT lawsuit have to do with user questions?
The New York Times claims that ChatGPT can reproduce its content word-for-word. To investigate, the court has ordered OpenAI to preserve user interactions that may demonstrate this behavior – making user prompts legally significant.

9. Could this lawsuit change how AI companies store data?
Yes. If courts decide that storing or reproducing copyrighted or personal content violates law, it could lead to stricter data handling rules across the industry.

10. Where can I read more about OpenAI’s data practices?
Visit OpenAI’s privacy policy or check your ChatGPT account settings for more control options.
