Online advertisements range from employment, housing, and health products to insurance, loans, and gambling. Tech companies use Big Data analytics and artificial intelligence (AI) to draw non-intuitive and unverifiable inferences and predictions about the behaviours, preferences, and private lives of individuals in order to target them with products.
These inferences draw on highly diverse and feature-rich data of unpredictable value, and they create new opportunities for discriminatory, biased, and privacy-invasive decision-making, often based on sensitive attributes of individuals' private lives (e.g. sexual orientation, ethnicity, religion). EU non-discrimination law and data protection law afford greater protection to sensitive attributes, or 'special category data', describing characteristics such as health, ethnicity, or political beliefs. However, this talk will show that neither law sufficiently guards against privacy and discrimination risks.
Please join us for a special lunch lecture with Sandra Wachter, an Associate Professor and Senior Research Fellow in Law and Ethics of AI, Big Data, and Robotics as well as Internet Regulation at the Oxford Internet Institute at the University of Oxford. She will explore privacy law and EU non-discrimination law and show their limits, both in the areas they apply to and the people they protect. She will conclude by arguing that a 'right to reasonable inferences' could provide a remedy against new forms of discrimination and privacy violations.
The event will be hosted by SAILS and co-sponsored by SPICE and LST. Lunch will be provided.