Does your library’s privacy policy say you “don’t share patron data with third parties”?

Your AI tools might be doing exactly that. (And your vendor contracts might be allowing it.)

Do you use an AI-powered discovery system? A chatbot? Automated recommendations? Cloud-based databases?

If your policy makes that promise and you answered yes to any of those questions, your privacy policy is lying to your patrons. And that’s a legal problem under FERPA, COPPA, and state privacy laws.

The Problem: What Your Privacy Policy Says vs. What Your AI Does

What most library privacy policies say:

“We do not sell, rent, or share your personal information with third parties except as required by law or with your explicit consent.”

What’s actually happening when you use AI tools:

  • Discovery systems send search queries to vendor servers for AI processing
  • Chatbots transmit patron questions to third-party AI services (OpenAI, Google, Microsoft)
  • Recommendation engines analyze borrowing patterns on vendor-hosted infrastructure
  • Database vendors use anonymized usage patterns to train AI models
  • Digital collections process user interactions through cloud AI services

That’s data sharing with third parties. And your privacy policy just said you don’t do that.

Real Example: The EBSCO Discovery Scenario

Say you use EBSCO Discovery Service with AI-powered search recommendations. Here’s what happens when a patron searches for “anxiety treatment”:

  1. Patron enters search query
  2. Query is sent to EBSCO’s servers (third party)
  3. EBSCO’s AI processes the search using usage patterns from millions of library users
  4. AI ranks results based on what similar users found useful
  5. Results are displayed to patron

At minimum, you’ve shared:

  • The search query (“anxiety treatment”)
  • Usage patterns (what they clicked, how long they stayed)
  • Session metadata (time, location if using IP-based location services)

EBSCO likely uses that data—anonymized—to improve their AI for all customers.

Is this “sharing personal information with third parties”? Legally, it might be.

Did your privacy policy disclose this? Almost certainly not.

Could this create liability under state privacy laws? Absolutely.

California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA)

Applies to: Libraries serving California residents (yes, even if you’re not in California). Note that purely governmental entities may fall outside the CCPA’s definition of a “business,” so check your status with counsel.

Requires:

  • Clear disclosure of what personal information you collect
  • Clear disclosure of what third parties receive that information
  • Notice of AI use when automated decision-making affects individuals
  • Right to opt out of data sharing for AI training

Penalties: Up to $7,500 per intentional violation. If 1,000 patrons are affected, that’s $7.5 million.
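The exposure math is simple multiplication, but it's worth making concrete. A minimal sketch, using only the figures stated above (the $7,500 intentional-violation cap and a hypothetical 1,000 affected patrons; whether each patron counts as a separate violation is a legal question, not a given):

```python
# Rough CCPA exposure estimate. Assumes one violation per affected
# patron -- an assumption for illustration, not a legal conclusion.
INTENTIONAL_PENALTY = 7_500  # USD per intentional violation (statutory cap)

def ccpa_exposure(affected_patrons: int,
                  per_violation: int = INTENTIONAL_PENALTY) -> int:
    """Worst-case dollar exposure if each affected patron is one violation."""
    return affected_patrons * per_violation

print(ccpa_exposure(1_000))  # → 7500000, the $7.5 million figure above
```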

Virginia, Colorado, Connecticut, Utah (and more)

As of 2026, more than a dozen states have comprehensive consumer privacy laws. They all require substantially similar disclosures about:

  • What data you collect
  • Who you share it with
  • How automated systems use it
  • How individuals can opt out

If your privacy policy was last updated in 2018 and says “we don’t share data with third parties,” you’re non-compliant in multiple jurisdictions.

The Special Problem: Student Data (FERPA)

If you’re an academic library, you’ve got FERPA (Family Educational Rights and Privacy Act).

FERPA restricts disclosure of “education records” without consent. Courts have ruled that library records can be education records if they’re tied to students and used for educational purposes.

The FERPA-AI problem:

If your discovery system uses AI to recommend resources to students based on their search history, borrowing patterns, or course enrollment data, and that AI processing happens on a vendor’s server, you’ve potentially disclosed education records to a third party without proper FERPA safeguards.

FERPA requires:

  • Written agreements with vendors (not just contracts—specific FERPA-compliant language)
  • Restrictions on further disclosure
  • Assurance that AI training doesn’t violate FERPA

Most library-vendor contracts from 2020 or earlier don’t have this language.

The Children’s Problem: COPPA

Public libraries serving children under 13 need to comply with COPPA (Children’s Online Privacy Protection Act).

If your library’s AI systems collect data from children under 13—including search queries, website browsing history, app usage, or any “persistent identifiers” (cookies, IP addresses, device IDs)—you need verifiable parental consent before collecting that data.

The AI chatbot example:

You implement an AI homework helper chatbot on your website. A 10-year-old asks, “How do I solve 2x + 5 = 15?”

Under COPPA:

  • That’s personal information (the question was collected from a child along with persistent identifiers like IP address and device ID)
  • You need parental consent before the AI processes it
  • You need to disclose what data the AI collects and how it’s used
  • You need to allow parents to review and delete that data

Most libraries implementing AI chatbots haven’t done any of this.

How to Audit Your Privacy Policy for AI

Step 1: List every system that uses AI

Make a spreadsheet:

  • System name (e.g., EBSCO Discovery, LibraryH3lp chatbot, Alma analytics)
  • Vendor
  • What data it collects (search queries, borrowing history, user interactions)
  • Where that data goes (vendor servers, cloud AI services, third-party processors)
  • What the vendor does with it (real-time processing only? Stores it? Uses for AI training?)
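If a spreadsheet app isn't handy, the same inventory can be started from a short script. A minimal sketch: the column names mirror the checklist above, and the example row is entirely hypothetical (replace it with your actual systems and vendors):

```python
import csv

# Columns mirror the Step 1 audit checklist.
COLUMNS = [
    "system_name", "vendor", "data_collected",
    "where_data_goes", "vendor_use_of_data",
]

def write_audit_template(path: str, rows: list[dict]) -> None:
    """Write the AI-system inventory to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical example entry -- not a claim about any real vendor.
example = [{
    "system_name": "Discovery layer",
    "vendor": "ExampleVendor",
    "data_collected": "search queries; click-through data",
    "where_data_goes": "vendor cloud servers",
    "vendor_use_of_data": "real-time processing; aggregate AI training (unconfirmed)",
}]

write_audit_template("ai_audit.csv", example)
```

One row per AI-touching system; the "unconfirmed" note in the last column is the kind of flag that feeds directly into the Week 2 vendor questions below.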

Step 2: Compare to your privacy policy

For each system, ask:

  • Does your policy disclose this data collection?
  • Does your policy disclose sharing with this vendor?
  • Does your policy explain AI processing?
  • Does your policy offer opt-out for AI profiling?

If the answer is “no” to any of these, you have a gap.
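Once the inventory exists, the four questions above can be tracked per system. A toy sketch, assuming each question reduces to a yes/no answer (the key names and the sample chatbot answers are hypothetical):

```python
# The four disclosure questions from Step 2, as checklist keys.
QUESTIONS = [
    "discloses_data_collection",
    "discloses_vendor_sharing",
    "explains_ai_processing",
    "offers_ai_opt_out",
]

def policy_gaps(answers: dict) -> list:
    """Return every question answered 'no' -- each one is a policy gap."""
    return [q for q in QUESTIONS if not answers.get(q, False)]

# Hypothetical audit result for a single chatbot system.
chatbot_answers = {
    "discloses_data_collection": True,
    "discloses_vendor_sharing": False,
    "explains_ai_processing": False,
    "offers_ai_opt_out": False,
}
print(policy_gaps(chatbot_answers))
# → ['discloses_vendor_sharing', 'explains_ai_processing', 'offers_ai_opt_out']
```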

Step 3: Check vendor contracts

For each AI vendor, review the contract for:

  • Data use restrictions: Can they use your library’s data to train AI for other customers?
  • FERPA compliance (if academic library)
  • COPPA compliance (if serving children)
  • Breach notification requirements
  • Data deletion procedures

If your contract is silent on these, or if it allows vendor AI training on your data, you need to renegotiate or update your privacy policy to disclose this.

Model Language: How to Fix Your Privacy Policy

Section 1: AI Systems and Data Processing

**Artificial Intelligence and Automated Decision-Making**

Some of our library systems use artificial intelligence (AI) and machine learning to improve services. This includes:

- **Discovery and Search Systems**: Our catalog and database search tools use AI to rank results, suggest related materials, and improve search accuracy. When you search, your query and usage patterns (what you click, how long you view items) are processed by our vendors using AI algorithms. This processing may occur on vendor servers located outside our library.

- **Recommendation Systems**: AI analyzes anonymized borrowing and browsing patterns to suggest materials you might find interesting. Individual recommendations are based on aggregated usage data from all library users.

- **Chatbots and Virtual Assistants**: Our automated chat services use AI to answer common questions. Your questions are processed by third-party AI providers [list specific providers: OpenAI, Google, Microsoft, etc.]. Conversations may be stored temporarily for AI training and service improvement.

**Your Rights Regarding AI:**
- You can opt out of personalized recommendations (see "Your Privacy Choices" below)
- You can request information about how AI systems process your data
- You have the right to human review of any automated decision that affects your library privileges

For technical questions about our AI systems, contact [privacy officer email].

Section 2: Third-Party Data Sharing

**Third-Party Service Providers**

We share certain information with third-party vendors who provide library services:

- **Library System Vendors**: [List vendors: EBSCO, ProQuest, OCLC, etc.] receive your search queries, borrowing records, and usage data to provide catalog, database, and discovery services. These vendors process data on our behalf and are contractually restricted from using it for other purposes, except for anonymized aggregate analytics to improve their services.

- **AI Service Providers**: Some library tools use third-party AI services [list: OpenAI for ChatGPT, Google Cloud AI, Microsoft Azure AI]. When you interact with these tools, your queries and usage data are transmitted to these providers for processing. We have agreements requiring these providers to:
  - Process data only for library services
  - Not use individual patron data for AI training without anonymization
  - Delete or return data upon request
  - Maintain appropriate security safeguards

**Your data is NOT:**
- Sold to third parties
- Used for targeted advertising
- Shared with social media companies
- Included in AI training datasets in identifiable form without your consent

For a complete list of our third-party service providers and their privacy practices, see [link to vendor list].

Section 3: Children’s Privacy (if applicable)

**Children's Privacy (COPPA Compliance)**

We take children's privacy seriously and comply with the Children's Online Privacy Protection Act (COPPA).

**For children under 13:**
- We do not knowingly collect personal information from children under 13 through our website or digital services without verifiable parental consent
- If AI-powered tools (chatbots, recommendation systems) are available to children, we use them only in COPPA-compliant ways:
  - Parental consent is obtained before data collection
  - Parents can review and delete children's data
  - Children's data is not used for AI training or profiling

**Parents' Rights:**
- Review what information we've collected from your child
- Request deletion of your child's information
- Refuse to allow further collection
- Contact us with privacy questions: [privacy officer contact]

Section 4: Your Privacy Choices

**Your Privacy Choices and Rights**

Depending on your state of residence, you may have additional privacy rights under laws like the California Consumer Privacy Act (CCPA), Virginia Consumer Data Protection Act (VCDPA), and similar state laws.

**Your Rights May Include:**
- **Right to Know**: Request information about what personal data we collect, how we use it, and who we share it with
- **Right to Delete**: Request deletion of your personal information (subject to legal retention requirements)
- **Right to Opt Out**: Opt out of:
  - AI-powered personalized recommendations
  - Data sharing with third parties for AI training
  - Profiling and automated decision-making
- **Right to Correct**: Request correction of inaccurate personal information
- **Right to Non-Discrimination**: Exercise privacy rights without denial of library services

**How to Exercise Your Rights:**
- Email: [privacy officer email]
- Phone: [privacy officer phone]
- In-person: Visit any library location and ask for the Privacy Officer
- Online form: [link to privacy request form]

We will respond to your request within [30/45 days depending on state law] and will not charge a fee unless your request is excessive or repetitive.

What to Do This Month

Week 1: Audit current state

  • List all systems using AI
  • Review existing privacy policy
  • Identify gaps

Week 2: Vendor communications

  • Email all vendors asking:
    • “Does your system use AI?”
    • “What data does it collect and process?”
    • “Do you use our data for AI training?”
    • “Are you FERPA/COPPA compliant?”
  • Document all responses

Week 3: Legal review

  • If you have legal counsel, have them review state privacy law obligations
  • If you don’t, use free resources: State attorney general privacy guidance, ALA privacy toolkit, state library agency resources

Week 4: Update privacy policy

  • Add AI disclosure sections
  • Add third-party sharing details
  • Add opt-out mechanisms
  • Get board/administration approval
  • Post updated policy prominently
  • Notify patrons of changes

Frequently Asked Questions

Does my library privacy policy need to mention AI?

Yes, if you use any AI-powered systems (discovery tools, chatbots, recommendation engines, automated cataloging). If your privacy policy says “we don’t share data with third parties” but your AI systems send data to cloud providers or use patron data for training, that’s a legal problem under FERPA, COPPA, and state privacy laws.

What should a library privacy policy say about AI?

Your privacy policy should disclose: (1) Which systems use AI, (2) What data is collected and processed, (3) Whether patron data is used for AI training, (4) Which third parties receive data, (5) How patrons can opt out, and (6) Special protections for children under FERPA and COPPA.

Is library AI use regulated by FERPA?

Yes. FERPA applies to libraries in schools and universities. If your AI systems process student education records (checkout history, database usage, research queries) and share that data with vendors without consent, you’re violating FERPA. You need data processing agreements and updated privacy policies.

How do I make my library privacy policy COPPA compliant for AI?

For children under 13, your privacy policy must: (1) Obtain verifiable parental consent before collecting data through AI systems, (2) Allow parents to review and delete children’s data, (3) Never use children’s data for AI training or profiling, and (4) Provide clear opt-out mechanisms. Many library chatbots and recommendation engines are not COPPA compliant.

Can patrons opt out of library AI systems?

They should be able to. Many state privacy laws (California CCPA, Virginia VCDPA, Colorado CPA) give patrons the right to opt out of AI-powered profiling, automated recommendations, and data sharing for AI training. Your privacy policy should explain how to opt out and ensure library services remain accessible to those who opt out.

The Bottom Line

Your privacy policy is a legal document. It’s not marketing fluff. It’s a binding representation of your data practices.

If your policy says you don’t share data with third parties, but your AI systems do, you have three options:

  1. Stop using AI until you can update your policy
  2. Update your policy immediately to accurately describe what you’re doing
  3. Change your AI practices to match your policy

Option 2 is usually the most realistic.

Most patrons aren’t going to flee the library because you disclose AI use. But they will lose trust if they discover you’ve been misleading them. Transparency builds trust. Deception destroys it.

Fix your privacy policy. Do it this month.

Your patrons are trusting you. Don’t betray that trust with outdated policies.


Authenticity note: With the exception of images, this post was not created with the aid of any LLM product for prose or description. It is original writing by a human librarian with opinions.