Who Owns Your AI Code, and Who Is Liable for It?
As more code is written by AI tools such as Microsoft’s GitHub Copilot and OpenAI’s ChatGPT, questions are starting to be asked:
- Who owns that code?
- Can it be copyrighted?
- Who is liable for damages if the code causes serious harm?
Those are only some of the questions that the courts and Congress will need to sort out.
Do we have any answers yet? Not really, but here are some possibilities from knowledgeable attorneys.
What does seem clear, at this point at least, is that you cannot copyright the code that an AI writes. Depending on how much of the code was written by an AI or AIs, the overall work may or may not be copyrightable.
That doesn’t mean it can’t be treated as a trade secret, but going that route requires a completely different set of security controls.
It does mean that if anyone gets hold of that code, you can’t go after them for theft of intellectual property, though other laws may still come into play, such as the Computer Fraud and Abuse Act (CFAA).
From a contract law standpoint, you MAY own the code that the AI generates, depending on the terms and conditions of your agreement with the vendor who runs the AI. OpenAI, for example, does not claim ownership of ChatGPT’s output, but that doesn’t mean other vendors take the same approach.
And worldwide, companies and courts are wrestling with this bear.
And when it comes to AI-generated images, you may be out of luck entirely. According to the US Copyright Office, they cannot be copyrighted, and once published they cannot be trade secrets either, so they are, effectively, in the public domain.
Now let’s move on to liability. That is another can of worms. Until the cases grind through the courts to a definitive answer, it is best to cover your backside.
Even before AI, developers rarely wrote all of their own code. Libraries, GitHub repositories, SDKs, and other shared tools were and are very common.
One assumes that AI-generated code will land in a similar bucket. Mostly, that means you need to read license agreements and make sure the agreements covering your systems protect you.
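Regardless of how the legal questions settle, a practical first step is knowing what licenses already sit in your dependency tree. Below is a minimal Python sketch, using only the standard library’s importlib.metadata, that prints the license each installed package declares in its own metadata. That metadata is self-reported and often missing, so treat the output as a starting point for an audit, not a legal review.

```python
# List the license each installed Python package declares in its own
# metadata. Self-reported, so it can be wrong or absent; a real audit
# should also check the license files shipped with each package.
from importlib import metadata

for dist in sorted(metadata.distributions(),
                   key=lambda d: d.metadata.get("Name", "")):
    name = dist.metadata.get("Name", "unknown")
    declared = dist.metadata.get("License") or "not declared"
    print(f"{name}: {declared}")
```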
Another thing to consider comes from Sean O’Brien, a lecturer at Yale Law School:
“The chances that AI prompts might output proprietary code are very high, if we’re talking about tools such as ChatGPT and Copilot, which have been trained on a massive trove of code of both the open source and proprietary variety.”
O’Brien says that, just as with patent trolls, an entire industry of trolls is going to emerge, going after developers for using proprietary code.
How does that make you feel?
If you are nervous, please contact us. We are not attorneys and can’t give you legal advice, but we can give you our decades of practical knowledge and experience.
Credit: ZDNet (part 1) and ZDNet (part 2)