The AI Clauses Your Vendors Are Sneaking Into Contracts

Vendors are adding AI-related clauses to contracts, and most libraries aren't noticing. Here's what to watch for and how to push back.

TL;DR
  • Vendors are inserting AI training clauses that let them use your patron data to improve their AI products, with no compensation to you.
  • Many vendor contracts shift all AI liability to your library while vendors keep control; review indemnification and "as-is" clauses carefully.
  • Demand audit rights, explicit opt-in (not opt-out) for AI training, and the right to terminate if core AI features disappear.
  • Five critical AI clauses to watch: data training rights, as-is disclaimers, unilateral feature changes, no audit rights, and indemnification carve-outs.

I've been reviewing library vendor contracts for the past year, specifically looking at how they handle AI.

Here's what I've learned: Vendors are adding AI-related clauses to contracts, and most libraries aren't noticing.

Some of these clauses are reasonable. Some are outrageous. And some are so vague that you won't know what you agreed to until it's too late.

Let me show you what to watch for.

The "We Can Train AI on Your Data" Clause

This is the big one. And it's showing up in more contracts than you'd think.

Here's an example (paraphrased from a real contract):

"Customer grants Vendor a worldwide, non-exclusive, royalty-free license to use, reproduce, and create derivative works from Customer Data for purposes of improving Vendor's services, including but not limited to machine learning and artificial intelligence development."

Translation: The vendor can use your patron data, usage patterns, and search queries to train their AI. Forever. For free. And you can't stop them.

Now, they'll say "we anonymize the data" or "we aggregate usage patterns." But do you actually know how they're anonymizing it? Can you audit their process? Do you trust them?

Let's say you're using a discovery system. It's tracking:

  • Every search query patrons run
  • Which results they click, and in what order
  • Session length, device, and IP address
  • Request and borrowing activity tied to the session

All of that is "usage data." And if your contract has a clause like this, the vendor can feed that data into their AI to "improve the product."
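To make "usage data" concrete, here is a hypothetical example of the kind of event record a discovery system might log for a single search. The field names are invented for illustration; real vendors use their own schemas.

```python
# Hypothetical example of one usage-data record a discovery system
# might log. Field names are invented for illustration only.
search_event = {
    "session_id": "a91f3c",           # links multiple actions by one patron
    "patron_group": "undergraduate",  # demographic bucket
    "query": "depression treatment options",
    "clicked_results": [2, 5],        # rank positions the patron opened
    "timestamp": "2026-01-14T22:37:05Z",
    "ip_address": "192.0.2.44",       # can narrow a patron to a building
}

# Individually these fields look harmless; combined, they can identify
# a person and reveal sensitive interests, which is exactly what makes
# them valuable as AI training data.
sensitive_fields = [k for k in search_event
                    if k in ("session_id", "query", "ip_address")]
print(sensitive_fields)
```

The point: any one field might pass as "anonymized," but the record as a whole is precisely what clauses like the one above let the vendor keep and train on.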

But here's what they're not telling you:

  1. That AI will be sold to other customers. Your patrons' search patterns are now training an AI that benefits your vendor's entire customer base, or worse, gets sold as a standalone product.
  2. The anonymization isn't as good as they claim. Research shows that "anonymized" datasets can be re-identified, especially when combined with other data sources.
  3. You're violating patron privacy. If your library has a privacy policy that says "We don't share your data with third parties," but your vendor contract says they can use patron data for AI training... you've got a problem.

What to Do

Look for this clause in your contracts. If you find it, push back with specific counter-language:

Model Counter-Clause (copy this):

"Vendor shall not use Customer Data, including but not limited to usage patterns, search queries, patron activity logs, or any data generated by Customer's use of the Service, to train, develop, improve, or create machine learning models, artificial intelligence systems, or derivative products without Customer's prior written consent. Any such use requires a separate written agreement specifying the scope, duration, compensation (if any), and audit rights."

Also demand:

  • Explicit opt-in (not opt-out) for any AI training use
  • Audit rights so you can verify how your data is actually being used
  • Deletion of your data from any models or training sets when the contract ends

If the vendor won't budge, consider whether the AI features are worth giving up control of your data.

The "AI Is As-Is" Clause

Another common one:

"Vendor provides AI-powered features on an 'as-is' basis. Vendor makes no warranties regarding the accuracy, reliability, or performance of AI-generated content. Customer is solely responsible for verifying any AI output before use."

Translation: If the AI screws up, that's your problem, not theirs.

Why This Matters

Let's say your library uses an AI chatbot to answer patron questions. A patron asks about a sensitive health topic, and the AI gives them dangerously wrong information. The patron follows that advice and gets hurt.

Who's liable? According to this clause: You are.

The vendor is saying "We're not responsible for what the AI says." Which means if you're using AI in patron-facing services, you're taking on liability without the vendor sharing any of the risk.

What to Do

Negotiate a shared liability clause. The vendor should have some responsibility for AI failures, especially if:

  • They marketed the AI feature as accurate or reliable
  • The failure stems from defects in their design, training, or implementation
  • The AI output violates applicable law

Model Counter-Clause (copy this):

"Notwithstanding any 'as-is' provisions, Vendor shall be liable for damages arising from AI features that: (a) produce outputs that violate applicable law, (b) fail to perform materially as documented, or (c) result from defects in Vendor's AI design, training, or implementation. Vendor shall maintain errors and omissions insurance covering AI-related claims of at least $[X] million."

Also demand:

  • Written documentation of known AI limitations and failure modes
  • Prompt notification when the vendor discovers defects in AI features
  • Proof of errors and omissions insurance covering AI-related claims

The "We Can Change AI Features Anytime" Clause

Here's a sneaky one:

"Vendor reserves the right to modify, update, or discontinue AI features at any time without notice. Customer's continued use of the service constitutes acceptance of any changes."

Translation: The AI you're paying for today might disappear tomorrow, and you have no recourse.

Why This Matters

Remember when I talked about the Internet Archive lawsuit and how AI training might not be fair use? If a vendor loses a lawsuit or gets regulatory pressure, they might suddenly yank AI features.

Or they'll put AI features behind a more expensive tier.

Or they'll replace the current AI with a worse version.

And according to this clause, you just have to live with it.

What to Do

Push for:

  • Advance written notice (60-90 days) before AI features are modified or discontinued
  • The right to terminate, with a pro-rated refund, if core AI features are removed
  • Price protection so existing AI features can't be moved to a more expensive tier mid-term

If AI features are a key reason you're buying the product, get that in writing. Make sure the contract protects you if those features disappear.

The "No AI Audit Rights" Clause

This one's often buried in the "Confidentiality" or "Proprietary Information" section:

"Customer agrees not to reverse-engineer, decompile, or attempt to discover the underlying algorithms, models, or training data used in Vendor's AI features."

Translation: You're not allowed to figure out how the AI works or what data it was trained on.

Why This Matters

If you can't audit the AI, you can't verify:

  • Whether it's biased against certain patron groups
  • What data it was trained on, and whether that data was lawfully obtained
  • How it processes your patrons' data
  • Whether it complies with privacy and accessibility regulations

You're flying blind.

And if a regulator asks "How do you know this AI complies with the law?" your answer is "We don't. The vendor won't let us see."

That's not a great position to be in.

What to Do

Demand audit rights with specific contractual language:

Model Counter-Clause (copy this):

"Upon reasonable notice, Customer shall have the right to: (a) receive annual reports on AI system performance, bias testing results, and training data sources; (b) request third-party audit reports (SOC 2, ISO 27001) that include AI systems; (c) receive documentation of how AI features process Customer Data; and (d) audit Vendor's compliance with this Agreement's AI provisions. Vendor's claims of proprietary information shall not prevent Customer from receiving summary-level information sufficient to assess regulatory compliance."

Specifically demand:

  • Annual reports on AI performance and bias testing
  • Third-party audit reports (SOC 2, ISO 27001) that cover the AI systems
  • Documentation of how AI features process your data

If the vendor says "That's proprietary," counter with "Then how can I trust it?" Good vendors will find a middle ground. Maybe they share audit results without revealing trade secrets.

The "Indemnification Carve-Out" for AI

This one's particularly nasty:

"Vendor's indemnification obligations under Section [X] do not apply to any claims arising from or related to the use of AI-powered features."

Translation: If someone sues you because of the vendor's AI, the vendor won't help you.

Why This Matters

Standard vendor contracts usually include "indemnification": the vendor agrees to defend you if someone sues you over the vendor's product.

But AI is new and legally uncertain. So vendors are carving AI out of indemnification clauses.

Which means if a patron sues you because the AI violated their privacy, or gave them bad information, or discriminated against them... the vendor says "Not our problem."

What to Do

This is a deal-breaker for many libraries. If the vendor won't indemnify you for AI-related claims, you need to seriously consider whether the AI features are worth the legal risk.

Model Counter-Clause (copy this):

"Vendor shall defend, indemnify, and hold harmless Customer from any claims, damages, or liabilities arising from: (a) AI outputs that infringe third-party intellectual property rights; (b) AI outputs that violate applicable privacy laws; (c) defects in Vendor's AI design, training, or implementation; or (d) Vendor's failure to disclose known AI limitations. This indemnification shall survive termination of this Agreement."

At minimum, demand:

  • Indemnification for AI outputs that infringe third-party intellectual property
  • Indemnification for AI outputs that violate privacy law
  • Coverage for defects in the vendor's AI design, training, or implementation

If the vendor won't budge, get your own legal advice before signing.

The "AI Training Partnership" Upsell

Some vendors are now offering "AI training partnerships" as an add-on. They'll pitch it like this:

"For an additional fee, Vendor will customize our AI using your library's unique data, creating a tailored experience for your patrons."

Sounds great, right? Custom AI trained on your data!

The Catch

You're paying them to train an AI on your data. Data you already gave them. And the trained AI model? They own it. Not you.

So you're funding their AI development, and they're turning around and selling that AI to other customers.

What to Do

If you're going to pay extra for AI customization, make sure the contract specifies:

  • Who owns the resulting trained model (it should be you, or at least jointly)
  • Whether the vendor can resell or reuse the model with other customers
  • What compensation you receive if they do
  • What happens to the model and your data when the contract ends

Otherwise, you're just subsidizing the vendor's product development.

What This Looks Like in Practice

Some library vendors have added explicit AI usage restrictions to their license agreements. These typically prohibit libraries from using licensed content to train third-party AI models or integrating content into library-developed AI tools without permission.

Meanwhile, vendors are incorporating AI into their own products—creating an asymmetrical situation where vendors can use AI with your licensed content, but you cannot.

Note: Vendor license terms vary and change frequently. Check your specific vendor agreements for AI-related usage restrictions. Contact your vendor directly to understand current terms before making any decisions.

The message is clear: Vendors are watching how libraries use AI, and they're setting firm boundaries while reserving AI rights for themselves.

What You Need to Do Before Your Next Contract Renewal

Step 1: Inventory your contracts.
Which vendors are providing AI features (or might in the future)? Make a list.
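One lightweight way to build that inventory is a short script that sorts AI-relevant contracts by renewal date, so you know which to review first. The vendor names and dates below are placeholders, not real products.

```python
from datetime import date

# Hypothetical contract inventory; replace with your actual vendors.
contracts = [
    {"vendor": "Discovery Platform A", "renews": date(2026, 7, 1),  "ai_features": True},
    {"vendor": "Ebook Aggregator B",   "renews": date(2026, 4, 15), "ai_features": True},
    {"vendor": "ILS Vendor C",         "renews": date(2027, 1, 1),  "ai_features": False},
]

# Review AI-relevant contracts first, in renewal order, so you can
# start negotiating 6+ months before each deadline.
review_queue = sorted(
    (c for c in contracts if c["ai_features"]),
    key=lambda c: c["renews"],
)
for c in review_queue:
    print(f'{c["renews"]}  {c["vendor"]}')
```

A spreadsheet works just as well; the point is having renewal dates and AI status in one sortable place.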

Step 2: Review AI clauses.
Look for the issues I've outlined above. Use the search function in your contract PDFs: search for "AI," "machine learning," "training data," "artificial intelligence," "algorithms."
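If you have many contracts as PDFs, you can automate that keyword pass. The sketch below assumes the `pypdf` package is installed and your contracts live in a `contracts/` folder (both are assumptions; adjust for your setup). The keyword scan itself works on any extracted text.

```python
import re
from pathlib import Path

# Keywords from the step above.
KEYWORDS = ["AI", "machine learning", "training data",
            "artificial intelligence", "algorithm"]

def flag_ai_language(text: str) -> dict[str, int]:
    """Count case-insensitive keyword hits in contract text.
    'AI' is matched as a whole word so it doesn't hit inside
    words like 'said' or 'maintain'."""
    hits = {}
    for kw in KEYWORDS:
        pattern = r"\bAI\b" if kw == "AI" else re.escape(kw)
        count = len(re.findall(pattern, text, re.IGNORECASE))
        if count:
            hits[kw] = count
    return hits

if __name__ == "__main__":
    # Assumes pypdf is installed (pip install pypdf) and contracts
    # are stored as PDFs under contracts/ (both are assumptions).
    from pypdf import PdfReader
    for pdf in Path("contracts").glob("*.pdf"):
        text = " ".join(page.extract_text() or ""
                        for page in PdfReader(pdf).pages)
        if hits := flag_ai_language(text):
            print(pdf.name, hits)
```

A keyword hit isn't a verdict, just a flag telling you which contracts deserve a close manual read against the clauses above.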

Step 3: Create a negotiation checklist.
For each vendor, decide:

  • Which of the five clauses above appear in the contract
  • Which terms are deal-breakers and which are negotiable
  • What counter-language you'll propose (use the model clauses above)
  • What your alternative is if the vendor won't budge

Step 4: Start negotiating early.
Don't wait until the week before your contract expires. Start 6+ months out. Vendors are more willing to negotiate when they're not under time pressure, and you'll negotiate better without it too.

Step 5: Consult legal counsel.
If your library has access to legal advice, use it. AI contracts raise novel legal issues, and a lawyer can spot problems you might miss.

The Question You Should Ask Yourself

Here's the big one: If your vendor's AI fails or causes harm, who's responsible?

If the answer is "We are, and we have no recourse against the vendor," you're taking on risk without getting anything in return.

Good vendor relationships are partnerships. Both sides share risk and reward. If your vendor is pushing all the AI risk onto you while keeping all the control, that's not a partnership. It's exploitation.

Don't sign those contracts.


Need help reviewing AI clauses in your vendor contracts? Get in touch.

Published February 1, 2026. Category: Vendor Management.

Renewal or addendum in your inbox?

Get new posts by email, or book a free 30-minute call if you’re facing a contract, AI policy, or vendor decision.
