The Jury of the Stefano Rodotà Award, composed of the members of the Bureau of the Committee of Convention 108, has selected the winners of the Stefano Rodotà Data Protection Award 2024. After reviewing all the exceptional works submitted by the applicants, the Jury has decided to award:
- PhD thesis category, Konrad Kollnig for his thesis “Regulatory Technologies for the Study of Data and Platform Power in the App Economy”
- Article Category, Lin Kyi for her co-authored article “Investigating Deceptive Design in GDPR’s Legitimate Interest”
WINNER OF THE PhD THESIS CATEGORY
“Regulatory Technologies for the Study of Data and Platform Power in the App Economy”
About the PhD thesis
Konrad Kollnig’s PhD thesis conducted a technological and legal study of mobile apps, with the aim of improving data protection in practice. After all, mobile apps are our primary means of accessing the internet. For example, 82% of Facebook users access the social network only through its app, not its website. At the same time, crucial questions at the intersection of data protection law and practice have long remained unanswered: Do apps properly seek consent? How does data protection compare between iOS and Android? Has the GDPR changed apps’ data practices? Kollnig’s thesis delivered answers to these and other important data protection questions, and revealed critical challenges in the enforcement of applicable data protection laws. The key innovations of this work lie in bridging the gap between law and technology, and in developing new technical methods to assist others in academia, industry, and authorities with the study of data protection in apps. This methodology is shared freely and openly, and is already used – in a simplified form – by millions of individuals every day. Building on this work, Konrad Kollnig is now writing a book on how to regulate the app economy (a topic that has, due to its technical nature, received rather limited attention) and is applying his expertise from the PhD work to the upcoming EU Artificial Intelligence Act.
About the Author - Konrad Kollnig
Konrad Kollnig is an Assistant Professor at the Law&Tech Lab of Maastricht University’s Law Faculty – a unique research laboratory where lawyers and computer scientists collaborate and apply technical methods to contemporary legal challenges. He is also an Associate Researcher at the Open Data Institute, where he works on data access and privacy-enhancing technologies. Previously, he was a doctoral researcher at the Department of Computer Science of the University of Oxford under the supervision of Prof. Sir Nigel Shadbolt. He holds a BSc in Computer Science and Mathematics from RWTH Aachen University (Distinction) and an MSc in Computer Science from the University of Oxford (Distinction).
WINNER OF THE ARTICLE CATEGORY
“Investigating Deceptive Design in GDPR’s Legitimate Interest”
About the Article
Drawing insights from human-computer interaction and data protection law, researchers from the Max Planck Institute, Utrecht University, and the University of Washington investigated, through two empirical studies, how legitimate interests are being used in practice and how users perceive these practices. In the first study, the researchers conducted a web crawl of 10,000 top sites to identify: i) how legitimate interests are being used in privacy notices by data controllers, and the legal implications of such practices; ii) the deceptive designs (dark patterns) data controllers use when implementing legitimate interest as a legal basis. In the second study, the researchers surveyed 400 EU participants to investigate how end-users perceive legitimate interest practices.
The researchers found that, when disclosed, legitimate interests are often difficult to object to and are designed in ways that might confuse users, thereby leading to lower engagement rates. Moreover, their findings demonstrate that IAB Europe’s Transparency and Consent Framework plays a major role in how legitimate interests are applied. Their survey found that the ways legitimate interests are used in practice do not match users’ beliefs about how their data should be used, indicating that user preferences should be taken into account when creating and revising data protection laws and defining industry standards.
About the Author(s)
- Lin Kyi (applicant): PhD student at the Max Planck Institute for Security and Privacy (website: https://www.linkyi.net/)
- Sushil Ammanaghatta Shivakumar: Research Assistant at the Max Planck Institute for Security and Privacy (website: https://sushil579.github.io/)
- Cristiana Santos: Assistant Professor at Utrecht University’s School of Law (website: https://rel-incode.github.io/cristianasantos/)
- Franziska Roesner: Associate Professor at the Paul G. Allen School of Computer Science and Engineering at the University of Washington (website: https://www.franziroesner.com/)
- Frederike Zufall: Tenure-track Professor at the Chair of Public Law and Computer Science at Karlsruhe Institute of Technology (website: https://www.zar.kit.edu/english/21_mitarbeitende_frederike.zufall.php)
- Asia Biega: Tenure-track Faculty at the Max Planck Institute for Security and Privacy (website: https://asiabiega.github.io/)