The growing presence and pervasiveness of ICT in everyday life raise new concerns about personal rights such as non-discrimination and individual privacy. Since these concerns limit the practical applicability and the broad acceptance of advanced technologies, such as decision support systems (DSS) and location-based services (LBS), the diffusion of these applications relies on the existence of technical tools to enforce personal rights. Developing such tools requires constant reference to what is legal and what is not, in measurable and formal terms. This makes an interdisciplinary approach between the legal and the computer science research areas necessary.
This is the approach we intend to follow in this project: starting from the requirements derived from current regulation and the legal debate, we intend to develop formal models and technical tools to enforce the intertwined personal rights of non-discrimination and individual privacy. The state of the art in both the legal and the computer science literature concentrates on data anonymization as a possible strategy for guaranteeing personal rights in the automated processing of personal data. Unfortunately, the issue is more complex than one would expect. According to existing regulations, data are anonymous if it is "reasonably impossible" for a malicious adversary to re-associate them with the identity of their respondents. However, it is hard, in both legal and computer science terms, to formally define this "reasonably impossible" condition. Another problem is that anonymity does not necessarily prevent the identification of a group of individuals, e.g., a minority or a group protected by law, and hence an unfair or discriminatory treatment of members of that group.
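One way the computer science literature makes the "reasonably impossible" condition measurable is k-anonymity: a dataset is k-anonymous if every combination of quasi-identifier values (attributes such as ZIP code and age band that an adversary could link to external data) is shared by at least k records. The following is only an illustrative sketch with toy, hypothetical data, not a tool proposed by this project; it also hints at the group-level limitation noted above, since even a k-anonymous group may be singled out collectively.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest equivalence-class size over the given
    quasi-identifier attributes; the dataset is k-anonymous for
    any k up to this value."""
    groups = Counter(
        tuple(r[a] for a in quasi_identifiers) for r in records
    )
    return min(groups.values())

# Toy records: names are already removed, yet the quasi-identifiers
# (zip, age band) may still single out an individual.
records = [
    {"zip": "47921", "age": "30-39", "diagnosis": "flu"},
    {"zip": "47921", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "47902", "age": "40-49", "diagnosis": "flu"},
]
print(k_anonymity(records, ["zip", "age"]))  # prints 1: one record is unique
```

Here the third record is unique on (zip, age), so the dataset is only 1-anonymous and that respondent is re-identifiable by anyone who knows their ZIP code and age band.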
Our research intends to address the protection of the intertwined personal rights of non-discrimination and privacy preservation from both a legal and a computer science perspective. From the legal perspective, our objective consists of a systematic and critical review of existing laws, regulations, codes of conduct and case law, and of the study and design of quantitative measures of anonymity, privacy and discrimination that are adequate for enforcing those personal rights in ICT systems. From the computer science perspective, our objective consists of designing legally grounded technical solutions for discovering and preventing discrimination in DSS and for preserving and enforcing privacy in LBS. We believe that the techniques applicable to the two problems share common issues and solutions.
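To illustrate what a quantitative measure of discrimination can look like, the discrimination-discovery literature often uses the risk difference: the negative-decision rate for a protected group minus the rate for everyone else. The sketch below uses hypothetical decision data and is only an example of such a measure, not the specific metric this project will adopt.

```python
def risk_difference(decisions, protected):
    """Risk difference: negative-decision rate (1 = benefit denied)
    for the protected group minus the rate for the rest.
    A value near 0 suggests no disparate treatment on this measure."""
    prot = [d for d, p in zip(decisions, protected) if p]
    rest = [d for d, p in zip(decisions, protected) if not p]
    rate = lambda xs: sum(xs) / len(xs)
    return rate(prot) - rate(rest)

# Hypothetical DSS outcomes: 1 = benefit denied, 0 = granted.
decisions = [1, 1, 0, 1, 0, 0, 0, 0]
protected = [True, True, True, True, False, False, False, False]
print(risk_difference(decisions, protected))  # prints 0.75
```

In this toy example the protected group is denied at a rate of 0.75 versus 0.0 for the rest, the kind of measurable gap a discrimination-discovery tool for DSS would flag.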