AI Note Taking Tools and GDPR: Do You Need a New Lawful Basis?

Scott Dooley
6 min read · Feb 3, 2026

AI-powered meeting tools like Otter.ai, Fireflies, and Microsoft Copilot are now standard in many workplaces. They transcribe conversations, summarise discussions, and generate action items automatically.

But if you’re responsible for data protection, you’ve probably asked: do we need to identify a new lawful basis to use these tools?

The short answer is no – not if your purposes haven’t changed. But there are still compliance steps you need to follow. This guide explains what the ICO says and what you need to do.

Do You Need a New Lawful Basis for AI Note Taking?

Under UK GDPR, you must identify a lawful basis for each purpose for which you process personal data. The key word here is purpose.

When Your Existing Lawful Basis Still Applies

If your organisation has always recorded meetings, taken notes, or created documents from discussions, you already have a lawful basis for those activities. Using an AI tool to do the same thing doesn’t change why you’re processing the data – it just changes how.

According to ICO guidance, if you’re not using the AI tool for any new purposes or processing activities, you can continue to rely on the lawful bases you’ve already identified.

This applies across the board:

  • Meeting recordings
  • Note-taking and summarisation
  • Creating draft documents from discussions

The same principle applies to special category data. If you’re already processing sensitive information under an appropriate condition, you don’t need to identify a new one just because you’re now using AI to help.

When You DO Need to Identify a New Basis

You will need a new lawful basis if you’re using AI to do something genuinely different. This includes:

  • Using AI outputs to make decisions about people (performance assessments, hiring decisions, customer profiling)
  • Processing data for purposes you weren’t doing before
  • Sharing personal data with the AI provider for their own purposes, such as training their models

Each distinct processing activity needs its own lawful basis. If you’re doing something new, you need to work through the analysis again.

Compliance Steps When Using AI Note Taking Tools

Even if your lawful basis stays the same, you still have obligations under UK GDPR. Here’s what you need to do.

Update Your Privacy Notices

People have a right to know how and why their personal data is used. When you introduce AI tools, your privacy notices should explain this.

Tell people that AI is being used for transcription, summarisation, or document creation. Explain what happens to the data and who has access to the outputs. Do this before you start using the tool – not after.

If meeting participants include external parties (clients, suppliers, candidates), make sure they’re informed too. A brief statement at the start of a meeting is good practice: “This meeting is being recorded and transcribed using AI.”

Check AI Outputs for Accuracy

UK GDPR requires personal data to be accurate and, where necessary, kept up to date. AI transcription tools make mistakes. They mishear names, confuse speakers, and get technical terms wrong.

Before you share or store AI-generated notes, someone should review them. This doesn’t need to be exhaustive – a quick check for obvious errors is often enough. But you need a process in place.

If you spot inaccuracies, correct them promptly. Don’t let incorrect information sit in your records or get shared more widely.

Consider Data Protection Impact Assessments

The ICO says most AI applications trigger DPIA requirements because they involve high-risk processing. This is particularly true where there’s “systematic and extensive evaluation of personal aspects based on automated processing.”

Even if a full DPIA isn’t strictly required, documenting your assessment is good practice. Record what risks you’ve identified, what measures you’re taking to address them, and what residual risk remains.

If you’re processing large volumes of meeting data, or if the AI tool is making inferences about individuals, a formal DPIA is almost certainly needed.
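The screening logic above lends itself to a short checklist. This is an illustrative sketch only: the questions paraphrase the risk indicators discussed in this section and are assumptions, not the ICO's screening criteria:

```python
# Hypothetical DPIA screening checklist. The questions are illustrative,
# not exhaustive, and do not replace the ICO's own screening criteria.
SCREENING_QUESTIONS = {
    "large_scale": "Are you processing large volumes of meeting data?",
    "inference": "Does the AI tool make inferences about individuals?",
    "special_category": "Could recordings capture special category data?",
}


def dpia_recommended(answers: dict[str, bool]) -> bool:
    """A 'yes' to any screening question suggests a formal DPIA is needed."""
    return any(answers.get(key, False) for key in SCREENING_QUESTIONS)


print(dpia_recommended({"large_scale": True, "inference": False}))   # True
print(dpia_recommended({"large_scale": False, "inference": False}))  # False
```

Recording the answers alongside the outcome also gives you the documented assessment mentioned above, even where the conclusion is that a full DPIA isn't required.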

Watch Out for Automated Decision-Making Rules

If you’re using AI note-taking tools purely for transcription and summarisation, Article 22 restrictions on automated decision-making probably don’t apply. But the picture changes if AI outputs start influencing decisions about people.

When Article 22 Applies

Article 22 restricts decisions that are “solely automated” and have legal or similarly significant effects on individuals. Think HR decisions, customer credit assessments, or performance evaluations.

If your AI meeting tool is flagging employee performance issues, identifying “negative sentiment,” or feeding into appraisal systems, you’re moving into territory where these rules apply.

When Article 22 is triggered, you need additional safeguards: human oversight, the right to contest decisions, and clear explanations of the logic involved.

Note that ICO guidance on automated decision-making is currently under review following the Data (Use and Access) Act 2025. The rules in this area may change.

Sharing Data with AI Providers

Check what happens to your data once it reaches the AI provider. This is where many organisations trip up.

If your AI tool provider uses your meeting data to improve their models or for any other purpose beyond providing the service to you, that’s data sharing – not just data processing. You need a lawful basis for that sharing, separate from your basis for using the tool.

Enterprise versions of AI tools often include “no training” commitments, meaning your data won’t be used to improve the model. Check your contract carefully. If you’re on a consumer or basic business plan, the terms may be different.

If data is being shared for training purposes, consider whether legitimate interests applies (you’ll need to complete a legitimate interests assessment), whether contractual necessity covers it, or whether consent from meeting participants is required.
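The decision path above can be sketched as a simple triage function. The ordering and basis names here are illustrative assumptions, not legal advice; which basis actually applies depends on your specific circumstances and your legitimate interests assessment:

```python
# Illustrative triage for the data-sharing question above.
# Not legal advice: the real analysis is fact-specific.
def sharing_basis(no_training_commitment: bool,
                  lia_completed: bool,
                  participant_consent: bool) -> str:
    if no_training_commitment:
        # Enterprise "no training" terms: data isn't shared for the
        # provider's own purposes, so no separate basis is needed.
        return "no separate basis needed"
    if lia_completed:
        return "legitimate interests"
    if participant_consent:
        return "consent"
    return "no lawful basis identified - do not share"


print(sharing_basis(True, False, False))   # "no separate basis needed"
print(sharing_basis(False, True, False))   # "legitimate interests"
```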

Key Takeaways

Using AI for meeting notes doesn’t automatically require a new lawful basis – if your purposes haven’t changed, your existing basis still applies.

But you still need to:

  • Update privacy notices to explain AI is being used
  • Review AI outputs for accuracy before sharing or storing
  • Consider whether a DPIA is needed
  • Check your contract for data sharing terms

The ICO has made clear that AI doesn’t get special treatment under data protection law. The same principles apply – you just need to think through how they apply to your specific situation.

Author

Scott Dooley is a seasoned entrepreneur and data protection expert with over 15 years of experience in the tech industry. As the founder of Measured Collective and Kahunam, Scott has dedicated his career to helping businesses navigate the complex landscape of data privacy and GDPR compliance.

With a background in marketing and web development, Scott brings a unique perspective to data protection issues, understanding both the technical and business implications of privacy regulations. His expertise spans from cookie compliance to implementing privacy-by-design principles in software development.

Scott is passionate about demystifying GDPR and making data protection accessible to businesses of all sizes. Through his blog, he shares practical insights, best practices, and the latest developments in data privacy law, helping readers stay informed and compliant in an ever-changing regulatory environment.
