CNIL Publishes Recommendations on AI and GDPR
On February 7, 2025, the French Data Protection Authority (CNIL) released two significant recommendations designed to guide organizations in the responsible development and deployment of artificial intelligence (AI) systems, in accordance with the EU’s General Data Protection Regulation (GDPR). These recommendations, building upon the CNIL’s 2023 four-pillar AI action plan, provide crucial insights for organizations navigating the evolving landscape of AI and data privacy.

These recommendations are titled “AI: Informing Data Subjects” (the “Recommendation on Informing Individuals”) and “AI: Complying and Facilitating Individuals’ Rights” (the “Recommendation on Individual Rights”).
General Principles
The CNIL clarified in its press release:
- The purpose limitation principle can be flexibly applied to general-purpose AI systems. Organizations unable to define all future applications during training may describe the system type and potential key functionalities.
- The data minimization principle does not prevent the use of large training datasets. Data should be selected and cleaned to optimize algorithm training while preventing unnecessary personal data usage.
- Training data can be retained for a longer time if security measures are in place.
- The reuse of databases, including those available online, is possible, given the data was collected lawfully and that its reuse is compatible with the original collection purpose.
Recommendation on Informing Individuals
The CNIL emphasizes the importance of transparency in AI systems that process personal data. Organizations must convey clear, accessible, and understandable information to data subjects regarding the processing of their data.
Key points from this recommendation include:
- Timing: Information should be provided at the time of data collection. If data is obtained indirectly, individuals must be informed as soon as possible, and no later than at the first contact or data sharing event. In any case, this must occur within one month.
- How to provide information: Information should be concise, transparent, and easily understood, using clear language. It should be easily accessible and separate from other content. CNIL suggests a layered approach, offering essential information upfront with links to more detailed explanations.
- Derogations to individual information: The CNIL analyzes cases where exemption from individual notification is permitted; for example, if individuals already have the information under Article 14 of the GDPR. Organizations must ensure these exemptions are applied judiciously.
- Information Content: As a general rule, provision of details as required by Articles 13 and 14 of the GDPR will be necessary. When individual notification is exempt, organizations must use general privacy notices, like a website, containing similar relevant information. If the organization cannot identify individuals then they must explicitly state this in the notice. The organization must provide specific details on data sources. However, a general disclosure is acceptable if the data comes from numerous publicly available sources.
- AI Models Subject to GDPR: Not all AI models are subject to GDPR since some are anonymous and do not process personal data. Certain AI models may retain parts of the training data and therefore fall under GDPR.
Recommendation on Individual Rights
The CNIL’s guidelines aim to ensure that individual rights are respected and facilitated when their personal data is used in developing AI systems or models.
General Principles:
- Individuals must be able to exercise their GDPR rights with respect to training datasets and AI models (unless the AI models are anonymized). They must be able to exercise their data protection rights.
- The CNIL notes that while the rights of access, rectification, or erasure for training datasets present challenges similar to large databases, exercising these rights directly with respect to the AI model raises unique and complex issues.
Exercising Rights in AI model or System Development:
- The CNIL highlights that responses depend on whether requests regard training datasets or the AI model itself. When rights requests relate to training datasets, identifying individuals can be challenging.
- Organizations must also inform individuals of how their request is interpreted and handled.
- If an organization no longer requires data to identify individuals and can prove this, it may reflect this in its response to requests. AI providers do not need to identify individuals in their training datasets.
- Organizations are not required to retain identifiers solely to facilitate rights requests if data minimization principles justify their deletion. If the individual provides additional details, the organization can use this to verify and facilitate rights requests. Individuals have to get copies of their personal data from training datasets, including annotations and metadata in an understandable format. When complying with the right of access, organizations must provide details on data recipients and sources.
- Further, when complying with the right of access, organizations must provide details on data recipients and sources. If the original source is known, this information must be disclosed.
- With respect to the rectification, erasure, and objection rights, the CNIL clarifies that, among others: Individuals can request correction of inaccurate annotations in training datasets. When processing is based on legitimate interest or public interest, individuals may object, if the circumstances justify it.
- Article 19 of the GDPR provides that a controller must notify each data recipient with whom it has shared personal data of a rectification, restriction, or deletion request. When a dataset is shared, updates must be communicated via APIs or contractual obligations.
Exercising rights on AI Models subject to GDPR:
- Organizations must assess if a model contains personal data. Then, the organization must identify the data included and give data subjects the opportunity to provide additional information to help verify their identity and exercise their rights.
- For generative AI models, the CNIL recommends that providers establish an internal procedure to systematically query the model using a predefined set of prompts.
- The rights to rectification and erasure are not absolute and should be assessed in light of the sensitivity of the data and the impact on the organization, including the technical feasibility and cost of retraining the model.
Exceptions to the exercise of rights:
- When relying on an exception to limit individuals’ rights via GDPR, the organization must inform individuals in advance that their rights may be restricted and explain the reasons for such restrictions.