GDPR Legitimate Interests Assessment: The Complete Guide

Scott Dooley
Dec 27, 2022 · Last updated: January 4, 2026

Legitimate interests is the most flexible lawful basis under GDPR, but that flexibility comes with responsibility. To rely on legitimate interests, you need to demonstrate that you are pursuing a genuine purpose, that the processing is necessary for that purpose, and that your interest is not overridden by the interests, rights and freedoms of the people whose data you are processing.

This guide explains how to conduct a Legitimate Interests Assessment (LIA), when legitimate interests is and is not appropriate, and includes worked examples to help you apply the framework to your own processing activities.

What Is a Legitimate Interests Assessment?

A Legitimate Interests Assessment is a documented analysis that demonstrates your processing meets the requirements for relying on legitimate interests as your lawful basis under Article 6(1)(f) of UK GDPR.

The assessment follows a three-part test:

  1. Purpose test – Is there a legitimate interest behind the processing?
  2. Necessity test – Is the processing necessary for that purpose?
  3. Balancing test – Is the legitimate interest overridden by the individual’s interests, rights or freedoms?

While conducting an LIA is not a strict legal requirement, it is considered best practice, and it is difficult to demonstrate compliance with the accountability principle without one. If your reliance on legitimate interests is ever challenged by a data subject or the ICO, your documented LIA will be your primary evidence.

You must complete the assessment before you begin processing.

UK Update: Recognised Legitimate Interests

The UK Data (Use and Access) Act 2025, which received Royal Assent in June 2025, introduced a significant change to the legitimate interests landscape. A new seventh lawful basis called “recognised legitimate interests” now exists under Article 6(1)(ea) of UK GDPR, alongside the traditional legitimate interests basis under Article 6(1)(f).

For certain specified purposes, organisations can now rely on legitimate interests without conducting the full balancing test. These recognised purposes include:

  • Preventing crime, including fraud
  • Safeguarding national security
  • Safeguarding vulnerable individuals
  • Responding to emergencies affecting life or health
  • Democratic engagement (political parties contacting voters)

If your processing falls within one of these recognised categories, you still need to pass the purpose and necessity tests, but the balancing test is not required. This removes the need to weigh the impact on individuals against the benefits of the processing for these specific activities.

For all other legitimate interests processing, the full three-part test still applies. The ICO has published updated guidance on both recognised legitimate interests and the traditional legitimate interests basis. Organisations have until June 2026 to ensure full compliance with the DUA Act changes.

When Legitimate Interests Is Appropriate

Legitimate interests works well when:

  • You have a genuine business reason for processing
  • Processing has minimal privacy impact
  • The individual would reasonably expect the processing
  • You have an existing relationship with the individual
  • There is little risk of harm to the individual

Common examples where legitimate interests is often appropriate include:

  • Fraud prevention and detection
  • Network and information security
  • Internal administrative purposes
  • Direct marketing to existing customers (with opt-out)
  • Processing necessary for legal claims
  • Intra-group data transfers for business purposes

When Legitimate Interests Is NOT Appropriate

Legitimate interests should not be your default choice. It is not appropriate when:

  • Processing is high risk – If your processing could cause significant harm, distress, or disadvantage, legitimate interests is unlikely to apply
  • Processing involves children’s data – The balancing test gives extra weight to children’s interests and rights
  • Processing involves special category data – You cannot rely on legitimate interests alone for sensitive data such as health, political opinions, or biometric data; you also need a separate Article 9 condition
  • Individuals would not expect it – If people would be surprised or concerned by your processing, legitimate interests probably does not apply
  • You have a power imbalance – Employers processing employee data, or public authorities exercising official functions, face a higher bar for legitimate interests
  • Another lawful basis is more appropriate – If consent or contract would be more transparent and straightforward, use those instead

Public authorities cannot rely on legitimate interests for processing carried out in the performance of their official tasks.

The Three-Part Test: Detailed Guidance

Part 1: Purpose Test

In this first part, you identify your purpose and confirm it constitutes a legitimate interest.

Checklist:

  • Have you clearly defined your specific purpose for processing?
  • Is the purpose lawful and not in conflict with other laws?
  • Is there a real, tangible benefit to you or a third party?
  • Have you identified any wider public benefits?
  • Have you considered any ethical issues with your purpose?
  • Is your stated purpose specific rather than vague?

A vague purpose like “for marketing” is not sufficient. You need to specify what marketing activities, to whom, through which channels, and for what outcome.

Questions to answer:

  • Why do you want to process the data?
  • What benefit do you expect to get from the processing?
  • Do any third parties benefit from the processing?
  • Are there any wider public benefits?
  • What would the impact be if you could not proceed?
  • Are you complying with relevant industry guidelines or codes of practice?

Part 2: Necessity Test

The necessity test asks whether your proposed processing is actually required to achieve your purpose.

Checklist:

  • Will the processing actually help achieve your purpose?
  • Is the processing proportionate to your purpose?
  • Could you achieve the same purpose without processing personal data?
  • Could you achieve the same purpose by processing less data?
  • Is there a less intrusive way to achieve the same outcome?
  • Have you minimised the data processed to what is genuinely required?

“It’s the most convenient way” is not a valid justification. If you can achieve the same outcome with less data or without processing personal data at all, you should pursue that approach instead.

Questions to answer:

  • Is there a reasonable alternative that does not involve personal data?
  • Are you collecting only the data fields you actually need?
  • Can you use anonymised or pseudonymised data instead?
  • Is the scope of processing limited to what is required?
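
For the data minimisation point above, a short illustration can help. The Python sketch below is hypothetical (the purposes and field names are invented): it keeps an explicit allow-list of fields for each documented purpose and drops everything else before processing.

```python
# Hypothetical sketch: the purposes and field names are invented for illustration.
REQUIRED_FIELDS_BY_PURPOSE = {
    "order_fulfilment": {"order_id", "delivery_address", "items"},
    "product_recommendations": {"customer_id", "purchase_history"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields
    needed for the stated, documented purpose."""
    allowed = REQUIRED_FIELDS_BY_PURPOSE[purpose]
    return {key: value for key, value in record.items() if key in allowed}

customer = {
    "customer_id": "c-123",
    "purchase_history": ["sku-1", "sku-2"],
    "date_of_birth": "1990-01-01",   # not needed for recommendations
    "phone_number": "+44 7700 900123",
}

# Anything not on the allow-list never enters the recommendation pipeline.
print(minimise(customer, "product_recommendations"))
# {'customer_id': 'c-123', 'purchase_history': ['sku-1', 'sku-2']}
```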

Part 3: Balancing Test

The balancing test weighs your legitimate interests against the rights, interests, and freedoms of the individuals whose data you process.

This is often the most complex part of the assessment and requires you to consider multiple factors.

Nature of the data:

  • Is any of the data special category (health, politics, religion, etc.)?
  • Does it include financial information, location data, or communications content?
  • Does it relate to children or vulnerable individuals?
  • Is the data publicly available or private?
  • How sensitive would individuals consider this data to be?

Reasonable expectations:

  • Would individuals expect this type of processing?
  • Do you have an existing relationship with them?
  • What did you tell them when you collected their data?
  • How long ago was the data collected?
  • Is your intended use obvious and widely understood?
  • Do you have evidence about expectations (surveys, market research)?

Potential impact:

  • Could the processing cause any damage or distress?
  • Could it affect someone’s financial position?
  • Could it affect someone’s reputation?
  • Could it result in discrimination?
  • Could it limit someone’s ability to exercise other rights?
  • What is the severity of potential impact?
  • How likely is any harm to occur?

Safeguards:

  • What measures can you implement to reduce risks?
  • Can you offer an opt-out mechanism?
  • Can you restrict who has access to the data?
  • Can you pseudonymise the data?
  • Have you implemented appropriate security measures?

Safeguards and Risk Mitigation

Even when the balancing test is favourable, you should implement safeguards to minimise any remaining risks to individuals.

Common safeguards include:

  • Opt-out mechanisms – Allow individuals to object to the processing easily
  • Transparency measures – Clear privacy notices explaining the processing
  • Data minimisation – Processing only the minimum data required
  • Pseudonymisation – Replacing identifiers with pseudonyms where possible
  • Access controls – Limiting who can access the data
  • Retention limits – Deleting data when no longer needed
  • Security measures – Encryption, access logging, regular reviews
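
To make the pseudonymisation safeguard listed above concrete, here is a minimal Python sketch using a keyed hash from the standard library. It is illustrative only: the key shown is a placeholder, the real key must be stored separately from the pseudonymised dataset, and pseudonymised data is still personal data under GDPR.

```python
import hashlib
import hmac

# Placeholder only: in practice the key is generated securely and stored
# separately from the pseudonymised dataset, since whoever holds it can
# re-identify individuals.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable
    pseudonym derived from a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so records can still be
# linked for analysis without exposing the underlying identifier.
print(pseudonymise("jane.doe@example.com"))
```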

Document the safeguards you will implement as part of your LIA.

When to Conduct a DPIA Instead

A Legitimate Interests Assessment is a relatively light-touch analysis. However, if your assessment reveals significant risks to individuals, you may need to conduct a full Data Protection Impact Assessment (DPIA).

Consider a DPIA if your LIA reveals:

  • Processing of special category data at scale
  • Systematic monitoring of individuals
  • Processing that could produce legal effects or similarly significant effects on individuals
  • Use of new technologies with unknown privacy implications
  • Processing of children’s data
  • Processing that involves automated decision-making with significant effects

If in doubt, err on the side of conducting a DPIA. It provides stronger evidence of your accountability.
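
If your LIA findings are recorded as structured flags, the DPIA screening step can be a simple check against the triggers above. The sketch below is hypothetical (the flag names are invented) and should prompt a human decision rather than replace one.

```python
# Hypothetical screening helper: the flag names are invented for illustration
# and would come from the answers recorded in your LIA.
DPIA_TRIGGERS = {
    "special_category_at_scale",
    "systematic_monitoring",
    "legal_or_significant_effects",
    "novel_technology",
    "childrens_data",
    "automated_decision_making",
}

def dpia_recommended(lia_flags: set) -> bool:
    """Return True if any DPIA trigger identified in the LIA is present."""
    return bool(lia_flags & DPIA_TRIGGERS)

print(dpia_recommended({"systematic_monitoring"}))  # True
print(dpia_recommended(set()))                      # False
```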

Worked Examples

Example 1: Product Recommendations (Retail)

Scenario: An online retailer wants to analyse customer purchase history to display personalised product recommendations when customers log into their account.

Purpose test: The purpose is to provide relevant product suggestions to customers and increase sales. This is a legitimate commercial interest with a clear benefit to the business and potential benefit to customers who receive relevant recommendations. The purpose is specific and lawful. Passed.

Necessity test: To generate relevant recommendations, some analysis of purchase history is necessary. The retailer will use only purchase data from the last 24 months and will not enhance it with external data sources. Anonymous recommendations would not be personalised and therefore would not achieve the purpose. Passed.

Balancing test: The data is not sensitive. Customers would reasonably expect an online retailer to make recommendations based on their shopping history, as this is standard practice in e-commerce. The impact is minimal – customers simply see different product suggestions. The retailer will implement an opt-out option in account settings. Passed.

Outcome: Legitimate interests is appropriate. The retailer documents the assessment and implements an opt-out mechanism.
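
As a rough illustration of how the two safeguards in this example (the 24-month window and the account-level opt-out) might be enforced in code, here is a hypothetical Python sketch; the data model is invented for the example.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(days=365 * 2)  # the 24-month window documented in the LIA

def recommendation_input(customer: dict, now: datetime) -> list | None:
    """Return the purchase history eligible for recommendations,
    or None if the customer has opted out in their account settings."""
    if customer.get("recommendations_opt_out"):
        return None  # respect the documented opt-out safeguard
    return [
        purchase for purchase in customer["purchases"]
        if now - purchase["date"] <= WINDOW  # only the last 24 months
    ]

# Invented data model, for illustration only.
customer = {
    "recommendations_opt_out": False,
    "purchases": [
        {"sku": "sku-1", "date": datetime(2025, 6, 1)},
        {"sku": "sku-2", "date": datetime(2019, 1, 1)},  # outside the window
    ],
}
print(recommendation_input(customer, now=datetime(2026, 1, 1)))
```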

Example 2: Fraud Prevention (Financial Services)

Scenario: A payment processor wants to analyse transaction patterns to detect and prevent fraudulent transactions.

Purpose test: Preventing fraud is a legitimate interest explicitly mentioned in GDPR Recital 47. It protects both the business and customers from financial harm. Under the UK DUA Act 2025, this is also a “recognised legitimate interest.” Passed.

Necessity test: Fraud detection requires analysis of transaction data including amounts, times, locations, and patterns. This processing is necessary – without it, fraudulent transactions would go undetected. Passed.

Balancing test: Customers would expect their payment provider to protect them from fraud. The processing protects customer interests. While transaction monitoring is extensive, the alternative (no fraud protection) would be worse for customers. Security measures protect the data from misuse. Passed.

Outcome: Legitimate interests is appropriate. As fraud prevention is also a recognised legitimate interest under UK law, the organisation could instead rely on Article 6(1)(ea), in which case the balancing test is not required, though safeguards remain essential.
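
As a toy illustration of what “analysing transaction patterns” can mean in practice, the hypothetical Python sketch below flags bursts of transactions in a short window; the threshold, window, and data model are invented, and real fraud detection uses many more signals.

```python
from datetime import datetime, timedelta

# Invented parameters for illustration; real systems combine many signals.
MAX_TRANSACTIONS = 3
WINDOW = timedelta(minutes=10)

def velocity_flag(timestamps: list) -> bool:
    """Return True if more than MAX_TRANSACTIONS occur within WINDOW,
    a simple 'velocity' pattern often associated with card fraud."""
    timestamps = sorted(timestamps)
    for i in range(len(timestamps)):
        window_count = sum(
            1 for t in timestamps[i:] if t - timestamps[i] <= WINDOW
        )
        if window_count > MAX_TRANSACTIONS:
            return True
    return False

payments = [datetime(2026, 1, 1, 12, m) for m in (0, 2, 4, 6)]
print(velocity_flag(payments))  # True: four payments within ten minutes
```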

Example 3: Employee Monitoring (Workplace)

Scenario: An employer wants to install software that monitors employee computer activity including screenshots, keystroke logging, and website visits.

Purpose test: The employer states the purpose is to measure productivity and protect company assets. These are legitimate business interests. Passed with caution.

Necessity test: The proposed monitoring is extensive. Less intrusive alternatives exist, such as measuring output rather than activity, or monitoring only access to sensitive systems. Keystroke logging and continuous screenshots go beyond what is necessary for productivity measurement. Failed.

Balancing test: Even if necessity were established, the balancing test presents problems. Employees have a reasonable expectation of some privacy at work. Continuous monitoring is highly intrusive and could cause significant distress. The power imbalance between employer and employee also weighs against the processing, and means that consent would not be a freely given alternative. Failed.

Outcome: Legitimate interests is not appropriate for comprehensive employee monitoring. The employer should consider less intrusive alternatives and, if monitoring is essential, use consent with genuine choice, or limit monitoring to specific high-risk activities with clear justification.

Example 4: AI Model Training

Scenario: A software company wants to use customer support chat transcripts to train an AI model that will improve automated responses.

Purpose test: Improving customer service is a legitimate business interest. The purpose is specific and lawful. Passed.

Necessity test: Training AI models requires example data. However, the company should consider whether anonymised or synthetic data could achieve similar results. If real transcripts are necessary, the company should use only the minimum data required. Requires careful analysis.

Balancing test: Customers may not expect their support conversations to train AI systems. The transcripts may contain personal details shared during support interactions. The company needs strong safeguards: anonymisation where possible, clear privacy notices, opt-out for future interactions, strict access controls on training data. Requires strong safeguards.

Outcome: Legitimate interests may be appropriate if strong safeguards are implemented and processing is clearly disclosed. Consider updating privacy notices before proceeding and offering opt-out options.
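
One safeguard mentioned above, reducing the personal data in transcripts before training, can be sketched as follows. This is a hypothetical Python example: pattern-based redaction catches only obvious identifiers, so it supports data minimisation and pseudonymisation rather than true anonymisation.

```python
import re

# Illustrative patterns only; free-text transcripts can contain identifiers
# these will not catch, so redaction reduces rather than removes personal data.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_phone": re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b"),
}

def redact(transcript: str) -> str:
    """Replace matched identifiers with typed placeholders before the
    transcript is added to a training dataset."""
    for label, pattern in PATTERNS.items():
        transcript = pattern.sub(f"[{label.upper()}]", transcript)
    return transcript

print(redact("My email is jane@example.com and my number is 07700 900123."))
# My email is [EMAIL] and my number is [UK_PHONE].
```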

Documenting Your Assessment

Your LIA should be a written document that records:

  1. The processing activity – What data, for what purpose
  2. Your analysis of each test – Purpose, necessity, balancing
  3. Evidence considered – Customer expectations research, risk analysis
  4. Safeguards implemented – Opt-out, security measures, retention limits
  5. Your conclusion – Whether legitimate interests applies
  6. Date and review schedule – When to reassess
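
One way to keep these records consistent across processing activities is a simple structured template. The Python dataclass below is a hypothetical sketch whose fields mirror the six elements above; it is one possible shape, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LegitimateInterestsAssessment:
    """Minimal structured record mirroring the six elements listed above."""
    processing_activity: str                 # what data, for what purpose
    purpose_test: str                        # analysis and outcome
    necessity_test: str
    balancing_test: str
    evidence_considered: list = field(default_factory=list)
    safeguards: list = field(default_factory=list)
    conclusion: str = ""
    completed_on: date = field(default_factory=date.today)
    review_by: date | None = None            # set a review date when concluding

# Example record based on the retail recommendations example above.
lia = LegitimateInterestsAssessment(
    processing_activity="Personalised recommendations from 24 months of purchase history",
    purpose_test="Specific, lawful commercial interest - passed",
    necessity_test="Personalisation requires purchase history; data minimised - passed",
    balancing_test="Low impact, within reasonable expectations, opt-out offered - passed",
    evidence_considered=["Customer survey on expectations of personalisation"],
    safeguards=["Opt-out in account settings", "24-month retention limit"],
    conclusion="Legitimate interests applies",
    review_by=date(2027, 1, 1),
)
print(lia.conclusion)
```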

Keep the assessment proportionate to the risk. A simple, low-risk processing activity needs only a brief assessment. Complex or higher-risk processing requires more detailed analysis.

Review your LIA if circumstances change, such as new data being collected, changes in how individuals might perceive the processing, or regulatory guidance updates.

Key Takeaways

  • Legitimate interests requires documented justification through a three-part test
  • Complete your LIA before you start processing
  • The UK now has “recognised legitimate interests” for specified purposes under the DUA Act 2025
  • Legitimate interests is not appropriate for high-risk processing, children’s data, or where people would not expect it
  • Implement safeguards to minimise risks even when the balancing test is favourable
  • If your LIA reveals significant risks, consider conducting a full DPIA instead

A Note on This Guide

This article provides general information about conducting Legitimate Interests Assessments under UK GDPR. It does not constitute legal advice. The UK Data (Use and Access) Act 2025 is now in force, with full compliance required by June 2026. For processing activities that are complex, high-risk, or sector-specific, consider seeking professional legal advice.

Author

Scott Dooley is a seasoned entrepreneur and data protection expert with over 15 years of experience in the tech industry. As the founder of Measured Collective and Kahunam, Scott has dedicated his career to helping businesses navigate the complex landscape of data privacy and GDPR compliance.

With a background in marketing and web development, Scott brings a unique perspective to data protection issues, understanding both the technical and business implications of privacy regulations. His expertise spans from cookie compliance to implementing privacy-by-design principles in software development.

Scott is passionate about demystifying GDPR and making data protection accessible to businesses of all sizes. Through his blog, he shares practical insights, best practices, and the latest developments in data privacy law, helping readers stay informed and compliant in an ever-changing regulatory environment.
