Project description
Personal data are constantly collected and shared via websites, mobile applications such as social networking and navigation apps, smart home devices such as smart TVs and voice assistants, and other IoT devices. These data are then monetized to support targeted advertising, personalized services, differential pricing, risk assessment, and influencing public opinion, at the expense of privacy and fairness for individuals and for society. To address this, governments around the world are enacting privacy laws, e.g., the GDPR (European Union) and the CCPA (California). Unfortunately, since profits can be at odds with privacy considerations, industry players have an incentive to circumvent these laws. Moreover, the technical concepts and tools on which the laws rely are neither strong enough nor broad enough in scope. Finally, users themselves are conflicted: they enjoy the plethora of personalized services but are alarmed by the loss of their privacy.

In this proposal we advocate a user-centered approach to privacy in which each user may dictate how much privacy they are willing to trade in exchange for services. We will systematically investigate the effectiveness of state-of-the-art privacy mechanisms, both formal (e.g., differential privacy and information-theoretic privacy) and data-driven (e.g., generative adversarial privacy), in terms of how well they protect data privacy while preserving some utility of the obfuscated data and of the services that depend on them. We will do so not only through analysis but also through real-world experiments on applications at the forefront of personal data privacy leaks. We will also introduce novel privacy tools for real-world use cases that allow users to select the desired level of data privacy and utility of service. Use cases of interest include smartphone data leaks, online tracking via web browsing and app usage, and user profiling within popular apps such as video-sharing platforms.
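As a minimal illustration of the privacy-utility trade-off these mechanisms navigate, the Python sketch below applies the Laplace mechanism of differential privacy to a simple counting query. It is an assumption-laden toy example, not part of any tool proposed here: the function name, query, and epsilon values are purely illustrative. A smaller privacy budget epsilon gives a stronger formal guarantee but a noisier, less useful answer, which is exactly the knob a user-centered approach would expose.

    # Illustrative sketch (not a proposed tool): Laplace mechanism from
    # differential privacy applied to a counting query.
    import numpy as np

    rng = np.random.default_rng(seed=0)

    def noisy_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
        """Release a count with Laplace noise calibrated to epsilon.

        Smaller epsilon -> stronger privacy, noisier answer (less utility);
        larger epsilon -> weaker privacy, more accurate answer.
        """
        scale = sensitivity / epsilon  # Laplace scale b = sensitivity / epsilon
        return true_count + rng.laplace(loc=0.0, scale=scale)

    true_value = 1000  # e.g., hypothetical number of users visiting a site
    for eps in (0.1, 1.0, 10.0):
        samples = [noisy_count(true_value, eps) for _ in range(1000)]
        mean_abs_err = np.mean([abs(s - true_value) for s in samples])
        print(f"epsilon={eps}: mean absolute error ~ {mean_abs_err:.1f}")

The expected absolute error of the released count equals the Laplace scale sensitivity/epsilon, so the printout makes the trade-off concrete: at epsilon = 0.1 the error is around 10 counts, while at epsilon = 10 it drops to around 0.1.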