Privacy and Inferences: Measurement, Detection and Protection
The frequency of credit card leaks attests to the difficulty of protecting even declared user attributes. If we also consider sensitive information that can be inferred about a user, protection becomes even harder, yet many well-publicized privacy breaches involve inference.
In this talk, I'll begin two steps before protection and first discuss how to determine what users want to keep private. I'll focus on self-reported data, one of the most commonly used sources for understanding user privacy concerns, and on methods for reducing self-report bias. Building on this, I'll present a simple method for identifying the inference channels through which a user may reveal private information, and I'll discuss how encryption can be used to protect against the completion of sensitive inference channels.
Jessica is a research scientist and privacy product lead at Google, working on leveraging data for better security and privacy. Her interests include the usability of security and privacy technology, trends in privacy-related attitudes, and methods for measuring and predicting privacy-related behaviors, attitudes, and risks. Prior to Google, she was an area manager at Xerox PARC and a research scientist at Bell Labs and RSA Labs. She serves regularly on the program committees of ACM- and IEEE-sponsored security and privacy conferences, is on the editorial boards of the Journal of Computer Security and the International Journal of Information and Computer Security, and serves on the advisory board of the Association for Women in Mathematics. Jessica holds a PhD in Mathematics from UC Berkeley.