The non-profit organization noyb has sent a cease and desist letter to Meta over the company’s plans to use EU personal data from Instagram and Facebook users to train its new AI systems. Meta announced it would begin using this data from May 27 onwards without obtaining opt-in consent from users, instead relying on a claimed ‘legitimate interest’ to process the data.
Background on Meta’s AI Training Plans
Meta’s decision to use EU user data for AI training without explicit consent has sparked significant controversy. Rather than asking users to opt in, the company is relying on ‘legitimate interest’, a legal basis that permits data processing only under certain conditions. Critics argue this approach violates the General Data Protection Regulation (GDPR), which in their view requires opt-in consent for this kind of processing.
Legal Implications and Potential Consequences
The GDPR provides several legal bases for processing personal data, including opt-in consent and legitimate interest. Companies may rely on legitimate interest, but that interest must be balanced against the rights and interests of the individuals concerned. In Meta’s case, the company is giving users only the right to object (opt-out) under Article 21 GDPR, rather than seeking prior consent.

Max Schrems, noyb’s founder, criticized Meta’s approach, stating that the European Court of Justice has already ruled against Meta’s use of ‘legitimate interest’ for targeted advertising. He argued that using this basis for AI training is equally unjustified. Schrems emphasized that a simple solution exists: asking users for opt-in consent.
Potential Legal Actions Against Meta
As a Qualified Entity under the new EU Collective Redress Directive, noyb can seek an injunction to stop Meta’s data processing practices. If successful, this could force Meta to delete any AI models trained on EU user data. The organization is also considering a class action for non-material damages on behalf of affected users. Other consumer groups, such as German consumer organisations, are also exploring legal action.
Broader Impact and Future Implications
The outcome of this dispute could have significant implications for Meta and other tech companies operating in the EU. If Meta is found to be in violation of GDPR, it could face substantial fines and be required to alter its data processing practices. The case highlights the ongoing tension between tech companies’ data needs for AI development and individual privacy rights under EU law.
The situation is being closely watched by data protection authorities and consumer rights groups across the EU, as it may set a precedent for how companies can use personal data for AI training in the future.