The AI Clauses Your Vendors Are Sneaking Into Contracts
Vendors are adding AI clauses to contracts. Some are reasonable. Some are outrageous. Most libraries aren't noticing.
I’ve been reviewing library vendor contracts for the past year, specifically looking at how they handle AI.
Here’s what I’ve learned: Vendors are adding AI-related clauses to contracts, and most libraries aren’t noticing.
(If you need general contract reading advice first, check out the field guide to reading vendor contracts.)
Some of these clauses are reasonable. Some are outrageous. And some are so vague that you won’t know what you agreed to until it’s too late.
The “We Can Train AI on Your Data” Clause
This is the big one. And it’s showing up in more contracts than you’d think.
Here’s an example (paraphrased from a real contract):
“Customer grants Vendor a worldwide, non-exclusive, royalty-free license to use, reproduce, and create derivative works from Customer Data for purposes of improving Vendor’s services, including but not limited to machine learning and artificial intelligence development.”
Translation: The vendor can use your patron data, usage patterns, and search queries to train their AI. Forever. For free. And you can’t stop them.
Now, they’ll say “we anonymize the data” or “we aggregate usage patterns.” But do you actually know how they’re anonymizing it? Can you audit their process?
(Research shows that “anonymized” datasets can often be re-identified with surprisingly little additional information.)
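To make that concrete, here is a toy sketch of a linkage attack. Every row and column name below is made up, but the mechanism is real: if an “anonymized” export keeps quasi-identifiers like timestamps, branch, and device details, anyone holding a second dataset with those same fields can join the two and put names back on the searches.

```python
# Toy linkage attack: made-up rows, hypothetical column names.
import pandas as pd

# "Anonymized" usage export from a vendor: patron IDs stripped, but
# timestamp, branch, and device details left in.
usage = pd.DataFrame({
    "search":    ["hiv treatment options", "eviction help"],
    "timestamp": ["2024-03-01 14:02", "2024-03-01 14:07"],
    "branch":    ["Main", "Main"],
    "device":    ["iPad 9th gen", "Windows 11 / Firefox"],
})

# A second dataset someone holds: who was signed in on which device,
# at which branch, at what time.
sessions = pd.DataFrame({
    "patron":    ["J. Doe", "A. Smith"],
    "timestamp": ["2024-03-01 14:02", "2024-03-01 14:07"],
    "branch":    ["Main", "Main"],
    "device":    ["iPad 9th gen", "Windows 11 / Firefox"],
})

# One merge on the shared quasi-identifiers and the "anonymous"
# searches have names attached again.
print(usage.merge(sessions, on=["timestamp", "branch", "device"]))
```

The point isn’t that your vendor does exactly this. The point is that “we anonymize it” is a claim you can’t evaluate without knowing what fields survive.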
Why This Matters
Your discovery system is tracking what patrons search for, what results they click on, what resources they access, how long they spend on each page, what device they’re using, and what time of day they search.
All of that is “usage data.” And if your contract has a clause like this, the vendor can feed that data into their AI to “improve the product.”
But here’s what they’re not telling you:
- That AI might be sold to other customers. Your patrons’ search patterns are now training an AI that benefits your vendor’s entire customer base—or worse, gets sold as a standalone product.
- The anonymization might not be as good as they claim. Research has shown that “anonymized” datasets can often be re-identified.
- You might be violating patron privacy. If your library has a privacy policy that says “We don’t share your data with third parties,” but your vendor contract says they can use patron data for AI training… you’ve got a problem.
What to Do
Look for this clause in your contracts. If you find it, push back. Demand:
- Explicit opt-in (not opt-out) for AI training
- The ability to audit how your data is being used
- Limits on what types of data can be used
- The right to revoke consent at any time
If the vendor won’t budge, consider whether the AI features are worth giving up control of your data.
The “AI Is As-Is” Clause
Another common one:
“Vendor provides AI-powered features on an ‘as-is’ basis. Vendor makes no warranties regarding the accuracy, reliability, or performance of AI-generated content. Customer is solely responsible for verifying any AI output before use.”
Translation: If the AI screws up, that’s your problem, not theirs.
Why This Matters
Say your library uses an AI chatbot to answer patron questions. A patron asks about a sensitive health topic, and the AI gives them dangerously wrong information. The patron follows that advice and gets hurt.
Who’s liable? According to this clause: You are.
The vendor is saying “We’re not responsible for what the AI says.” Which means if you’re using AI in patron-facing services, you’re taking on liability without the vendor sharing any of the risk.
What to Do
Negotiate a shared liability clause. The vendor should have some responsibility for AI failures, especially if:
- The AI is core to the product (not just a bonus feature)
- The AI is making recommendations or providing information to patrons
- The vendor is charging extra for AI features
At minimum, demand:
- Clear documentation of what the AI can and can’t do
- Warnings about known limitations or failure modes
- The ability to disable AI features if they’re causing problems
The “We Can Change AI Features Anytime” Clause
Here’s a sneaky one:
“Vendor reserves the right to modify, update, or discontinue AI features at any time without notice. Customer’s continued use of the service constitutes acceptance of any changes.”
Translation: The AI you’re paying for today might disappear tomorrow, and you have no recourse.
Why This Matters
Remember the Internet Archive lawsuit? Legal pressure can reshape a service overnight. If a vendor loses a lawsuit or comes under regulatory pressure, they might suddenly yank AI features.
Or they might decide to put AI features behind a more expensive tier.
Or they might replace the current AI with a worse version.
And according to this clause, you just have to live with it.
What to Do
Push for:
- Advance notice of AI feature changes (30-90 days minimum)
- The right to terminate the contract without penalty if major AI features are removed
- Price protection if AI features are moved to a higher pricing tier
If AI features are a key reason you’re buying the product, get that in writing—and make sure the contract protects you if those features disappear.
The “No AI Audit Rights” Clause
This one’s often buried in the “Confidentiality” or “Proprietary Information” section:
“Customer agrees not to reverse-engineer, decompile, or attempt to discover the underlying algorithms, models, or training data used in Vendor’s AI features.”
Translation: You’re not allowed to figure out how the AI works or what data it was trained on.
Why This Matters
If you can’t audit the AI, you can’t verify:
- Whether it’s biased
- Whether it’s using your data inappropriately
- Whether it’s complying with regulations
- Whether it’s actually doing what the vendor claims
You’re flying blind.
And if a regulator asks “How do you know this AI complies with the law?” your answer is “We don’t. The vendor won’t let us see.”
That’s not a great position to be in.
What to Do
Demand audit rights. Specifically:
- The right to review AI training data sources
- Access to bias testing results
- Third-party audit reports (like SOC 2, with the AI systems in scope)
- Transparency about how the AI works (at least at a high level)
If the vendor says “That’s proprietary,” counter with “Then how can I trust it?” Good vendors will find a middle ground—maybe sharing audit results without revealing trade secrets.
The “Indemnification Carve-Out” for AI
This one’s particularly nasty:
“Vendor’s indemnification obligations under Section [X] do not apply to any claims arising from or related to the use of AI-powered features.”
Translation: If someone sues you because of the vendor’s AI, the vendor won’t help you.
Why This Matters
Standard vendor contracts usually include “indemnification”—the vendor agrees to defend you if someone sues you over the vendor’s product.
But AI is new and legally uncertain. So vendors are carving AI out of indemnification clauses.
Which means if a patron sues you because the AI violated their privacy, or gave them bad information, or discriminated against them… the vendor says “Not our problem.”
What to Do
This is a deal-breaker for many libraries. If the vendor won’t indemnify you for AI-related claims, you need to seriously consider whether the AI features are worth the legal risk.
At minimum, demand:
- Indemnification for AI failures caused by the vendor (e.g., bugs, security flaws, design defects)
- Insurance coverage that includes AI-related claims
- A clear process for handling AI-related complaints or lawsuits
If the vendor won’t budge, get your own legal advice before signing.
The “AI Training Partnership” Upsell
Some vendors are now offering “AI training partnerships” as an add-on:
“For an additional fee, Vendor will customize our AI using your library’s unique data, creating a tailored experience for your patrons.”
Sounds great, right? Custom AI trained on your data.
The Catch
You’re paying them to train an AI on your data. Data you already gave them. And the trained AI model? They own it—not you.
So you’re funding their AI development, and they’re turning around and selling that AI to other customers.
What to Do
If you’re going to pay extra for AI customization, make sure the contract specifies:
- You own the custom AI model (or at least have exclusive rights to it)
- The vendor can’t use your data to train AI for other customers without compensation
- You can take the custom AI model with you if you leave the vendor
Otherwise, you’re just subsidizing the vendor’s product development.
Real Example: EBSCO’s AI Restrictions
In 2024, EBSCO added restrictions to some contracts that limited how libraries could use AI with EBSCO content. Specifically, they prohibited:
- Using EBSCO content to train third-party AI models
- Scraping EBSCO databases for AI training purposes
- Integrating EBSCO content into library-developed AI tools without permission
This caused friction with libraries that wanted to build their own AI research assistants or use AI to improve discovery.
EBSCO’s position: “Our content is licensed for human use, not AI training.”
Libraries’ position: “We paid for this content. We should be able to use it however we want.”
Throughout 2025, contract renegotiations happened across the industry. Some libraries got carve-outs for specific AI uses. Others backed down. ProQuest, JSTOR, and other major vendors followed EBSCO’s lead with similar restrictions.
The message is clear: Vendors are watching how libraries use AI—and they’re setting firm boundaries.
What You Need to Do Before Your Next Contract Renewal
Step 1: Inventory your contracts. Which vendors are providing AI features (or might in the future)? Make a list.
Step 2: Review AI clauses. Look for the issues I’ve outlined above. Use the search function in your contract PDFs: search for “AI,” “machine learning,” “training data,” “artificial intelligence,” “algorithms.” (If you have a big stack of contracts, a short script can do that keyword sweep for you; there’s a sketch after these steps.)
Step 3: Create a negotiation checklist. For each vendor, decide:
- What AI clauses are deal-breakers?
- What clauses are negotiable?
- What information do we need from them?
Step 4: Start negotiating early. Don’t wait until the week before your contract expires. Start 6+ months out. Vendors are more willing to negotiate when they’re not under time pressure—and neither are you.
Step 5: Consult legal counsel. If your library has access to legal advice, use it. AI contracts raise novel legal issues, and a lawyer can spot problems you might miss.
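As promised in Step 2, here’s a minimal sketch of that keyword sweep. It assumes the pypdf library and a hypothetical contracts/ folder of text-based (not scanned) PDFs; adjust the keyword list to match your vendors’ language.

```python
# Minimal sketch: flag AI-related language across a folder of contract PDFs.
# Assumes `pip install pypdf`; won't catch scanned PDFs without OCR.
from pathlib import Path
from pypdf import PdfReader

KEYWORDS = ["artificial intelligence", "machine learning", "training data",
            "derivative works", "algorithm", "large language model"]

def flag_ai_clauses(folder: str) -> None:
    for pdf_path in sorted(Path(folder).glob("*.pdf")):
        reader = PdfReader(pdf_path)
        for page_num, page in enumerate(reader.pages, start=1):
            text = (page.extract_text() or "").lower()
            hits = [kw for kw in KEYWORDS if kw in text]
            if hits:
                print(f"{pdf_path.name} p.{page_num}: {', '.join(hits)}")

flag_ai_clauses("contracts/")  # hypothetical folder of vendor contracts
```

A flagged page is not a substitute for actually reading the clause, but it tells you which contracts to pull first.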
The Question You Should Ask Yourself
If your vendor’s AI fails or causes harm, who’s responsible?
If the answer is “We are, and we have no recourse against the vendor,” you’re taking on risk without getting anything in return.
Good vendor relationships are partnerships. Both sides share risk and reward. If your vendor is pushing all the AI risk onto you while keeping all the control, that’s not a partnership—it’s exploitation.
Don’t sign those contracts.
Go read yours. Right now. You might be surprised what you already agreed to.
Authenticity note: With the exception of images, this post was not created with the aid of any LLM product for prose or description. It is original writing by a human librarian with opinions.