Your employees' seemingly private ChatGPT logs being aired in public during discovery for a random court case you aren't even involved in is absolutely a business risk.
I get where it's historically coming from, but the combination of American courts' nearly unlimited discovery rights (with costs borne by the losing party, no less, which greatly increases legal risk even for people and companies not looking to litigate) and those discoveries ending up on the public record seems like a growing problem.
There's a qualitative difference that results from quantitatively much easier access (querying a database vs. physically combing through court records) and processing capability (an army of lawyers reading millions of pages vs. anyone with an LLM), and that difference doesn't seem to be accounted for.
I assume the folks who are concerned about their privacy could petition the court to keep their data confidential.
I occasionally use ChatGPT, and I strongly object to the court forcing the collection of my data in a lawsuit I am not named in, merely because of the possibility of copyright infringement. If I'm interested in petitioning the court to keep my data private, as you say is possible, how would I go about that?
Of course I haven't sent anything actually sensitive to ChatGPT, but using copyright law to enforce a stricter surveillance regime gives off very strong "Right to Read" vibes.
> each book had a copyright monitor that reported when and where it was read, and by whom, to Central Licensing. (They used this information to catch reading pirates, but also to sell personal interest profiles to retailers.)
> It didn’t matter whether you did anything harmful—the offense was making it hard for the administrators to check on you. They assumed this meant you were doing something else forbidden, and they did not need to know what it was.
People need to read up on the LIBOR scandal. There was a lot of "wait, why are my chat logs suddenly being read out as evidence of a criminal conspiracy?"