Privacy Washing: Why is there a need for an evaluation framework for Privacy Enhancing Technologies?
Article 12 of the Universal Declaration of Human Rights refers to an individual’s right to be protected against interference with their privacy. The increasing adoption of AI in all disciplines threatens that right. Last month, privacy experts from around the world were joined by legal scholars, policy experts, and regulators at a Dagstuhl seminar to develop an evaluation framework for Privacy Enhancing Technologies that can be adopted worldwide.
Training an AI model requires significant amounts of data, and the rapid increase in AI adoption has come at the expense of privacy protection. To mitigate this risk, experts develop Privacy Enhancing Technologies (PETs): for example, tools designed to enable data processing by machine learning algorithms without allowing unnecessary access to personal information.
While PETs show great promise in protecting sensitive data from malicious access, scientists also note the potential pitfalls of these technologies: the lack of a standardized framework for evaluating their privacy guarantees leads to the adoption of solutions that either do not protect privacy or still have the potential to harm users. The seminar discussed “privacy washing”, a practice in which businesses declare that they employ PETs to boost their profits without ensuring that the technologies employed fully protect their users. “Detection of privacy washing is a challenging task, as the lead causes may occur at different stages of complex system pipelines, from system purpose specification, through algorithms, all the way to communication about what the system does,” says Asia Biega, faculty member at MPI-SP and participant at the Dagstuhl seminar.
At Dagstuhl, the participants set out to develop an evaluation framework for Privacy Enhancing Technologies that can be adopted worldwide. “Protecting the data is only a first step, and it is insufficient in many cases. […] Limitations of PETs should be well documented so that privacy washing through PETs is stopped,” says Carmela Troncoso, Scientific Director at MPI-SP and co-organizer of the seminar. The participants also worked on drafting a position paper on privacy washing through PETs and came up with ideas on how this issue can be avoided in the future.
About the Dagstuhl Seminars
Dagstuhl Castle, the home of the Leibniz Center for Informatics, provides a unique environment designed to foster and facilitate scientific exchange. The review process for seminar proposals is highly competitive, ensuring the best scientific quality for the participants at these events. The Dagstuhl Seminar titled “PETs and AI: Privacy Washing and the Need for a PETs Evaluation Framework”, organized by Carmela Troncoso (MPI-SP), Emiliano De Cristofaro (University of California, Riverside), Kris Shrishak (Irish Council for Civil Liberties), and Thorsten Strufe (Karlsruher Institut für Technologie), took place in March 2025.