Artificial intelligence is no longer some futuristic idea we only talk about in hypothetical terms. It’s here, and it’s changing how businesses operate in real time. But as exciting as AI is, it also comes with a fair share of risks — especially for those of us tasked with drafting the contracts that underpin these new technologies.
If you’re an in-house lawyer or advising tech companies, you know that AI contracts aren’t your standard software agreements. They require a deeper dive into how the technology works and more precise language to address issues like bias, IP ownership, liability, and compliance.
Let’s walk through some key clauses you should include in AI contracts, along with the pitfalls to avoid. Think of this as a mix of lessons learned and advice I’d give if we were having coffee and swapping stories about the quirks of tech law.
1. Understand the Technology
Before you can draft a solid contract, you need to know what you’re dealing with. AI systems can feel like a black box, so your contract should force vendors to open that box and show you what’s inside.
Good Clause:
“The vendor will provide documentation detailing the AI system’s architecture, algorithms, data sources, and training methodologies. Updates will be provided on an annual basis or as material changes are made.”
Bad Clause:
“The vendor will make the AI system available as-is, without any obligation to disclose its functionality or provide technical documentation.”
Why it matters: Without insight into how the system works, you’re flying blind. Skipping this clause is like agreeing to buy a car without knowing whether it has brakes.
2. Define Ownership of Data and Outputs
One of the biggest areas for disputes is ownership — who owns the data you feed into the AI, and who owns what comes out of it?
Good Clause:
“The client retains exclusive ownership of all data provided to the vendor, as well as all outputs derived from such data. The vendor will not use client data or outputs for any purpose other than delivering the agreed-upon services without prior written authorization.”
Bad Clause:
“Outputs from the AI system and data used to train the system are subject to shared ownership unless otherwise specified.”
Why it matters: If you’re providing proprietary data, you don’t want the vendor turning around and using it to improve their system — or worse, sharing it with your competitors.
3. Set Performance Standards
Let’s be honest: “reasonably well” is a phrase that belongs in vague promises, not in contracts. Be specific about what success looks like.
Good Clause:
“The AI system must maintain an accuracy rate of at least 95 percent in detecting fraudulent transactions, with false positives limited to 2 percent or less. Failure to meet these metrics for two consecutive months will entitle the client to a refund or termination without penalty.”
Bad Clause:
“The vendor will ensure the AI system performs reasonably well under typical operating conditions.”
Why it matters: If the AI fails to deliver, vague terms leave you with little recourse. You need a way to hold the vendor accountable.
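To see what those numbers actually measure, here’s a minimal sketch, assuming the vendor hands over raw predictions alongside ground-truth labels and that “false positives” means the share of legitimate transactions flagged as fraud (the clause should spell out which reading applies).

```python
# Minimal sketch: how the contractual metrics above might be computed.
# Assumes y_true and y_pred are parallel lists where 1 = fraudulent, 0 = legitimate.

def accuracy(y_true, y_pred):
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

def false_positive_rate(y_true, y_pred):
    # Share of legitimate transactions (label 0) incorrectly flagged as fraud.
    false_positives = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    legitimate = sum(1 for t in y_true if t == 0)
    return false_positives / legitimate if legitimate else 0.0

# Example: check one month of predictions against the contractual thresholds.
y_true = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 0, 1, 0, 0]
print(accuracy(y_true, y_pred) >= 0.95)             # accuracy threshold
print(false_positive_rate(y_true, y_pred) <= 0.02)  # false-positive threshold
```

The point isn’t the code; it’s that the contract should pin down exactly how these numbers are measured and over what window, so neither side can argue about the math later.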
4. Address Liability and Indemnification
When AI makes mistakes — and it will — someone has to clean up the mess. Make sure it’s not your company footing the bill.
Good Clause:
“The vendor will indemnify and hold harmless the client against all claims, fines, and damages arising from system errors, discriminatory outcomes, or violations of applicable laws caused by the AI system.”
Bad Clause:
“The vendor is not responsible for any outcomes resulting from the use of the AI system, including errors, biases, or legal violations.”
Why it matters: If the AI produces biased results or breaks the law, you don’t want to be left holding the bag. This clause ensures the vendor shares the responsibility.
5. Prioritize Audit and Transparency Rights
It’s not enough to trust the system. You need the ability to verify it’s doing what it’s supposed to.
Good Clause:
“The vendor will allow the client to perform audits of the AI system on a semi-annual basis. These audits may include reviewing training data, algorithms, and compliance with agreed-upon standards.”
Bad Clause:
“The vendor’s systems and operations are proprietary and are not subject to client audits.”
Why it matters: Audit rights give you the power to verify the system’s performance and compliance. Without them, you’re just hoping the vendor’s word is good enough.
6. Require Proactive Management of Bias
Bias in AI isn’t just a buzzword — it’s a real risk that can hurt your company’s reputation and bottom line. Make sure your contract forces the vendor to take bias seriously.
Good Clause:
“The vendor will conduct bias assessments of the AI system on a quarterly basis, provide results to the client, and implement corrective measures within 30 days of identifying bias.”
Bad Clause:
“The vendor makes no warranty regarding the system’s compliance with ethical or fairness standards.”
Why it matters: Bias isn’t always obvious until something goes wrong. This clause ensures it’s being monitored and corrected before it becomes a problem.
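What a “bias assessment” looks like will vary, but as one hedged illustration, here’s a sketch of a demographic parity check: compare approval rates across groups and flag any gap above a chosen tolerance. The groups, threshold, and metric here are all assumptions; the contract should name the fairness metrics the vendor will actually use.

```python
# Illustrative sketch of one common bias check (demographic parity).
# Assumes each record carries a group label and a binary model decision (1 = approved).
from collections import defaultdict

def selection_rates(records):
    """Return the approval rate per group."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        approvals[group] += decision
    return {g: approvals[g] / totals[g] for g in totals}

def parity_gap(records):
    """Largest difference in approval rates between any two groups."""
    rates = selection_rates(records)
    return max(rates.values()) - min(rates.values())

records = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
THRESHOLD = 0.10  # hypothetical tolerance; the contract should set the real one
if parity_gap(records) > THRESHOLD:
    print("Flag for corrective measures within 30 days")
```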
7. Include Termination and Data Portability Provisions
When the relationship ends, you need a way to walk away with everything that belongs to you.
Good Clause:
“Upon termination of this agreement, the vendor will export all client data and outputs in a commonly used, machine-readable format within 14 days.”
Bad Clause:
“Upon termination, the vendor reserves the right to retain and archive all client data and outputs for unspecified purposes.”
Why it matters: If you don’t address this, you might find yourself unable to switch vendors or access your own data.
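In practice, “commonly used, machine-readable format” usually means something like CSV or JSON. A minimal sketch of what an export could look like, with hypothetical field names:

```python
# Minimal sketch: writing client records out as JSON, one "commonly used,
# machine-readable format". The field names here are made up for illustration.
import json

client_records = [
    {"id": 1, "input": "transaction-123", "output": "flagged"},
    {"id": 2, "input": "transaction-456", "output": "cleared"},
]

with open("client_export.json", "w") as f:
    json.dump(client_records, f, indent=2)
```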
8. Specify Security Obligations
AI systems process sensitive data, which means security has to be a top priority.
Good Clause:
“The vendor will implement encryption for all data at rest and in transit. In the event of a data breach, the vendor will notify the client within 24 hours and indemnify the client for all associated costs, including regulatory fines and damages.”
Bad Clause:
“The vendor will implement reasonable security measures and is not liable for breaches unless caused by gross negligence.”
Why it matters: A data breach can cost your company millions. Strong security provisions protect your organization and keep vendors accountable.
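On the technical side, “encryption at rest” can be as simple as the sketch below, which assumes the widely used third-party cryptography package; real deployments also need key management, TLS for data in transit, and breach-notification plumbing that no snippet captures.

```python
# Minimal sketch of encrypting client data at rest, assuming the third-party
# "cryptography" package (pip install cryptography). Key management is out of scope.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, store this in a key management service
fernet = Fernet(key)

plaintext = b"client transaction data"
ciphertext = fernet.encrypt(plaintext)   # what actually sits on disk
restored = fernet.decrypt(ciphertext)

assert restored == plaintext
```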
Final Thoughts
AI is one of the most exciting developments in tech today, but it also comes with risks that demand careful planning. Crafting a contract for AI isn’t just about ticking boxes — it’s about ensuring your organization is protected and ready to thrive in this rapidly evolving space.
Have you run into challenges negotiating AI contracts? Let’s compare notes. I’d love to hear what’s worked for you and how you’ve handled tricky situations. Reach out or drop a comment below.
Let’s make AI contracts as smart as the technology they govern.