Job description
Are you passionate about improving the digital landscape and protecting users' rights? Do you believe that the current state of web services and apps could be more user-centric and ethically sound? Then join the Digital Security group (DiS) as a PhD Candidate!
Current consumer technology comes with many risks to the digital rights and well-being of its users: Many web services, apps and even operating systems have adopted the business model of being available to end users free of monetary charges, monetising their personal data instead; large Internet companies widely employ manipulative design to keep users’ attention and lock them into their ecosystems; and AI companies mine user-generated content from the internet with disregard for privacy or intellectual property laws. Jurisdictions across the globe try to tackle these challenges through new legislation and platform regulation such as the European Union’s General Data Protection Regulation (GDPR), Digital Services Act (DSA), Digital Markets Act (DMA), and AI Act, or new privacy laws in different US states, but their effects are often limited and enforcement is slow.
In the case of privacy laws, the most visible impact has been a rise in the prevalence and complexity of privacy policies and cookie consent notices that give an illusion of control: they are often not properly implemented and overwhelm users with legal text they are unlikely to read, let alone understand the implications of. More fundamental approaches are therefore required: users’ privacy and other digital rights need to be built into systems from the very beginning of the application lifecycle, rather than being covered by a privacy notice added just before the service, app or device is rolled out. This is all the more important in the age of AI, in which it is becoming increasingly difficult to predict what inferences might be drawn from one’s personal data in the future.
As a PhD candidate in the area of online privacy and digital rights, you will work towards a future in which technology considers users’ digital rights holistically throughout the whole application lifecycle. This includes understanding what threats current and emerging technologies pose to users’ rights, how the actors involved are handling these threats, and what could be done to encourage a more holistic consideration of users’ rights. The Digital Security group studies these questions from an empirical perspective, using both technical measurements, to identify and quantify security and privacy issues at scale, and methods from human-computer interaction, such as surveys and interviews, to understand the human factors behind these observations and to find out how the situation could be improved.
The concrete focus of your PhD research will be determined based on your interests. Possible topics include, but are not limited to, (i) measurements of privacy issues on websites, in mobile apps and in smart devices, often at large scale using automation tools; (ii) analysis of deceptive design in online services and of similar practices that can be harmful to users; (iii) analysis of countermeasures against online tracking and of strategies for digital self-defence; (iv) surveys and interview studies with end users and developers about their awareness, perception and handling of risks to their digital rights; and (v) the analysis, (multilingual) comparison and improvement of privacy notices such as cookie banners or privacy policies.
You will be supervised by Assistant Professor Christine Utz (https://christineutz.net), conduct research on your selected topics of interest, and publish the results at top-ranked international academic conferences. Beyond academia, your research has the potential to identify and raise awareness of violations of users’ digital rights by online services and apps, to support the development of ethical and human-centred alternatives, and to inform national and European stakeholders about the effectiveness of past and future regulations related to user privacy and digital rights and about what could be done to address their shortcomings.
You will spend about 10% of your time (0.1 FTE) assisting with teaching at our department. This will typically include tutoring practical assignments, grading coursework, and supervising student projects.