New Lawsuit Highlights Concerns About AI Notetakers: 7 Steps Businesses Should Take

Sep 5, 2025

A new lawsuit against Otter.ai spotlights the legal risks of AI notetakers. Here's what businesses need to know and how to protect themselves.

A recently filed class action lawsuit against Otter.ai underscores the legal and compliance risks organizations face when using AI-driven notetaking tools. The case, brought on August 15 in the U.S. District Court for the Northern District of California (5:25-cv-06911) before Judge Eumi K. Lee, alleges that Otter’s transcription service secretly records conversations without proper consent and then uses that data to train its models.

While the convenience of automated notetaking is appealing, this case illustrates the significant privacy, security, and compliance concerns that accompany these technologies.

Case Summary: Brewer v. Otter.ai

The complaint raises several key allegations:

  • Unauthorized Interception: Otter allegedly records conversations involving not only its account holders but also third parties who did not consent, potentially violating federal and California wiretap laws.

  • Training AI Models Without Permission: The suit claims that Otter retains conversational data indefinitely and uses it to refine its systems without explicit authorization.

  • Shifting Compliance Burdens: Plaintiffs argue that Otter tells its customers to secure permissions themselves, effectively outsourcing legal compliance obligations.

  • Multiple Legal Violations: Alleged violations include the Electronic Communications Privacy Act (ECPA), Computer Fraud and Abuse Act (CFAA), California Invasion of Privacy Act (CIPA), privacy torts, and the state’s Unfair Competition Law.

It is important to note that these are early-stage allegations. No findings of fact or law have been made, and Otter will have the opportunity to contest the claims in court.

The Risks With AI Notetakers

Even if organizations prohibit the use of these apps, employees often adopt them informally, making governance difficult. Key risks include:

  • Consent and Privacy: Many jurisdictions require all-party consent before recording. Failure to secure this could result in liability.

  • Data Ownership and Use: Vendors may retain transcripts, metadata, or recordings indefinitely, sometimes for AI training purposes.

  • Security: Cloud-based storage can expose data to unauthorized access or cross-border transfer risks.

  • Workplace Compliance: Meetings may include discussions of health conditions, union activity, or complaints of harassment; recording these can heighten legal exposure.

  • Privilege and Confidentiality: If notetakers capture attorney-client communications, their presence may complicate or even waive privilege claims.

  • Reputation: Even if legal, silent recordings may be viewed as a breach of trust by clients or employees.

7 Steps Businesses Should Take

If your organization uses, or is considering using, AI notetakers, here are seven practical steps to mitigate risks:

1. Update Consent Protocols
Secure consent from all meeting participants each time a notetaker is used, whether internal or external. Include clear notice in meeting invitations and policies.

2. Carefully Vet Vendors
Ask about data storage, retention, and use. Seek contractual assurances that sensitive information will not be repurposed or used for training.

3. Establish a Company Policy
Define when and how AI notetakers may be deployed. Clarify employee responsibilities for notification, consent, and data handling.

4. Limit Use in Sensitive Contexts
Avoid using notetakers in meetings involving privileged legal strategy, HR investigations, or confidential client matters.

5. Review Security Safeguards
Confirm that vendors use encryption, access controls, and compliant storage practices. Understand where data is stored geographically.

6. Train Employees and Managers
Educate staff on appropriate and inappropriate use. Provide scripts for informing clients or third parties.

7. Develop a Governance Framework
Integrate AI notetaker policies into your broader AI governance strategy. Align with FTC, EEOC, and NIST guidance to stay ahead of evolving regulations.

Conclusion

The Brewer v. Otter.ai lawsuit is an early but important test of how courts may view AI transcription services under federal and state privacy laws. Regardless of the outcome, businesses should not wait to act. By updating consent protocols, strengthening policies, and embedding AI governance, organizations can reduce liability while still benefiting from productivity-enhancing tools.
