By Peter Alexander Earls Davis*
On 13 March 2026, Meta quietly announced that it would discontinue end-to-end encryption (E2EE) for Instagram direct messages, effective 8 May 2026. The company’s stated rationale was low user adoption. The feature, introduced in December 2023 as an opt-in toggle available in select regions, had attracted few users. Meta’s suggestion? Move to WhatsApp.
The announcement has drawn criticism from privacy advocates, but little attention has been paid to its significance under EU data protection law. The decision to remove E2EE is notable in itself, but just as important from a legal perspective is why Instagram never offered it by default in the first place. As this post argues, Meta is arguably obliged to offer E2EE, and should have done so by default, under the GDPR’s data protection by design and by default (DPbDD) obligation enshrined in Article 25. What makes this case particularly striking is that Meta’s own platforms illustrate exactly what compliance would and should look like. If Article 25 is to shake off longstanding criticism of its utility, this is the case in which to do it.
The reasons Meta never made E2EE default on Instagram are, it seems, primarily commercial and operational rather than legal. Instagram’s infrastructure is optimized for content discovery, recommendation, and advertising. Its trust and safety systems rely on server-side access to message content for detecting harmful material. E2EE by default would have blinded these systems, creating friction with both platform business models and the growing regulatory pressure around child safety, including the UK Online Safety Act, the proposed EU CSA Regulation, and state-level litigation in the US.
The removal of opt-in E2EE is best understood against the backdrop of this complex (or “wicked”) policy arena. Maintaining parallel encrypted and unencrypted messaging infrastructures on a single platform generates engineering complexity and complicates content moderation. With negligible uptake, the cost-benefit analysis might have tipped toward removal. Meta’s concurrent decision to preserve E2EE on WhatsApp, where encryption is a core product feature, suggests that Meta’s once-emphatic commitment to E2EE across its platforms is in retreat.
There is also a data access dimension. In 2016, a research group at Harvard predicted that market forces would limit the adoption of encryption that obscured user data from companies themselves. That prediction initially proved wrong as platforms embraced E2EE as a competitive credential, but it may be proving prescient now. Without E2EE, Meta regains the technical capability to scan, analyze, and process DM content for moderation, AI training, and advertising. Given Meta’s December 2025 announcement that interactions with its AI tools within private conversations may be used for targeted advertising, the commercial value of access to unencrypted user communications may simply have overtaken the value of offering E2EE.
The EU’s legal framework, however, points in a different direction. Article 25 of the GDPR imposes two distinct obligations on controllers. Article 25(1) requires the implementation of appropriate technical and organizational measures designed to give effect to data protection principles. Article 25(2), on data protection by default, requires that default settings ensure only personal data necessary for each specific purpose of processing are processed, and that personal data are not made accessible to an indefinite number of persons without the individual’s intervention. As commentators have observed, the “by default” requirements of Article 25(2) are, in certain respects, more concrete and stringent than the broader “by design” obligation.
For messaging services processing the content of private communications, both limbs point to E2EE. It represents the paradigmatic technical measure for ensuring confidentiality by design, as made clear in guidance from European authorities, and it is difficult to argue that E2EE is not the “state of the art” when Meta has itself implemented it by default on WhatsApp (which uses the Signal Protocol) and on Messenger (where Meta rolled out default E2EE in late 2023). Offering E2EE only as an opt-in feature, meanwhile, sits uncomfortably with Article 25(2)’s insistence on privacy-protective defaults. The European Data Protection Board’s Guidelines on DPbDD reinforce this reading, clarifying that the mere existence of a privacy-protective setting does not satisfy the Article 25 obligations unless that setting is activated by default.
Instagram’s architecture inverted this logic by requiring users who wanted E2EE to discover and manually enable it, individually for each conversation. The vast majority, unsurprisingly, did not. Article 25(2) was designed to prevent this very outcome, and it can be cogently argued that EU law does require a messaging platform like Instagram to implement E2EE by default. Article 32 GDPR, which requires security measures appropriate to the risk, further supports this conclusion, as does Article 5(1)(f)’s integrity and confidentiality principle and the CJEU’s recognition that mass surveillance of communications engages the essence of the right to privacy under Article 7 of the EU Charter of Fundamental Rights.
This connects, moreover, to the argument that encryption protections are anchored in fundamental rights. As I have argued elsewhere, legal mandates that systemically weaken cryptographic protections go to the very essence of the right to privacy in Article 7 of the Charter. European courts are increasingly tying encryption to the protection of fundamental rights. The European Court of Human Rights’ recent judgment in Podchasov v Russia categorically rejected laws requiring the systemic weakening of encryption as incompatible with Article 8 ECHR.
This jurisprudence does more than impose negative obligations on states not to undermine encryption; fundamental rights under the ECHR and the EU Charter also generate positive obligations to secure the conditions for privacy and cybersecurity, which apply horizontally to private actors through instruments like the GDPR. The CJEU confirmed as much in Russmedia at para 90, holding that controllers must, when determining what measures are appropriate under Article 25, take account of the severity of the interference with the fundamental rights guaranteed by the Charter. Article 25, on this reading, requires controllers to take affirmative steps to protect the confidentiality of communications they process, including through state-of-the-art, judicially recognized measures such as E2EE.
Whether this doctrinal framework translates into enforcement is, however, another matter. The fact that Meta can withdraw E2EE from Instagram while simultaneously maintaining it on WhatsApp suggests that DPbDD is treated as commercially negotiable rather than as a binding legal norm. Article 25 has long been criticized as vague, abstract, and difficult to enforce. Commentators have described it as a “hollow norm” and potentially an “empty abstraction” that regulated entities can satisfy through purely procedural window-dressing. If that critique is correct, Meta’s decision is not an aberration but a predictable consequence of a legal framework under which “appropriate measures” can be quietly abandoned once they cease to be commercially convenient.
The CJEU’s recent Grand Chamber judgment in Russmedia offers reason to think otherwise. The Court held that an online marketplace operator, as controller, is required to implement proactive technical and organizational measures before the publication of content containing personal data. In the Instagram context, the case for enforcement may be stronger still. The usual objection to enforcing Article 25, that its requirements are too indeterminate to ground a finding of breach, loses much of its force when the controller’s own product portfolio provides evidence of the state of the art. Meta’s decision offers Data Protection Authorities an unusually clean test case. If DPbDD is to amount to more than an aspirational standard, this may be its moment.
*This work was supported by the Novo Nordisk Foundation (NNF) via a grant for the scientifically independent Collaborative Research Program in Bioscience Innovation Law (Inter-CeBIL Program – Grant No. NNF23SA0087056).