

Large Language Models vs. The Law: A Challenge

Large language models like ChatGPT are here to stay. There is too much money to be made by both buyers and sellers for them to go away.

But that doesn’t mean that problems with them will magically disappear.

Likely it will take a decade or more for both companies and the courts to figure out how to deal with them.

One case:

OpenAI, the maker of ChatGPT, is facing a lawsuit in Europe. While Europe is a little ahead of the United States, I see no reason why this lawsuit model would not work here in states with second-generation privacy laws like Texas and California.

Here is the premise.

A person asked OpenAI to correct an erroneous birth date in its output, as required by European law (the GDPR). EU residents have a RIGHT to require companies to correct erroneous data about them.

OpenAI said it did not know how to do that.

So, it offered an alternative: removing all of the person's personal data from its system's output.

However, EU law does not allow a company to selectively choose which rights it wants to grant a person.

OpenAI could be fined up to 4 percent of its global revenue over this.

Similarly, U.S. state privacy laws do not allow businesses to pick and choose which rights out of those enumerated in the law they want to give their residents.

NOYB (Max Schrems) is filing and litigating the lawsuit on behalf of the anonymous user. He has a pretty impressive track record in privacy lawsuits, so if I were OpenAI, I would be concerned.

OpenAI's privacy policy says that users can submit a correction request through privacy.openai.com, but it then notes that OpenAI may not actually be able to correct the data in question.

U.S. state privacy laws and the EU's GDPR also allow a person to make a Subject Access Request, or SAR, asking for a copy of any information the company holds about them. In this case, OpenAI ignored the request.

Some states allow residents to sue companies directly; others require that lawsuits originate from a district attorney or attorney general. The first category, a private right of action, is more problematic for companies.

As companies continue to integrate AI into their products and systems, conundrums like this will continue to come up, and companies will get sued. Plan for it. Do what you can to mitigate it. There is no magic here, but if you have questions, please contact us.

Credit: TechCrunch