Listen to the Voices of Everyday Users: Democratizing Privacy Ratings for Sensitive Data Access in Mobile Apps
Liu Wang, Tianshu Zhou, Haoyu Wang, Yi Wang
TLDR
This paper introduces DePRa, a system that democratizes mobile app privacy ratings by involving everyday users in assessing sensitive data access.
Key contributions
- Proposes a novel paradigm: democratizing privacy assessment by repositioning users as active evaluators.
- Introduces DePRa, a prototype system for user-driven privacy ratings with contextual explanations.
- Evaluates DePRa with 200 everyday mobile app users, demonstrating its feasibility in capturing user opinions on sensitive data access.
- Shows democratized assessment can complement expert audits for inclusive privacy evaluation.
Why it matters
This paper moves mobile app privacy assessment beyond expert-only audits by empowering users to evaluate data access themselves. This supports privacy enforcement that scales better and aligns more closely with user expectations, while building trust in how apps use data.
Original Abstract
Mobile apps frequently request excessive data access, raising significant privacy concerns. While regulations like GDPR emphasize data minimization, they provide limited guidance on concretely defining and enforcing necessary data access. Existing regulatory mechanisms primarily rely on expert-driven audits that face challenges in scalability, neutrality, and alignment with user expectations. In this paper, we propose a novel paradigm--democratizing privacy assessment, inspired by prior work on user-centric privacy perceptions--which repositions users as active evaluators in the privacy auditing process, recognizing that user perceptions of data usage play a crucial role in assessing the appropriateness and necessity of data access. To operationalize this paradigm, we introduce DePRa, a prototype system developed through participatory design, featuring contextual explanation provision, category-based representative selection, an intuitive rating interface, and preference-based rating adjustment. We evaluated DePRa with 200 everyday mobile app users, analyzing how effectively it captures user opinions on sensitive data access, comparing their privacy ratings with expert assessments, and exploring risk preference-based score calibration. Our findings show the feasibility and promise of democratized privacy assessment, highlighting its potential to complement expert auditing and support inclusive privacy evaluation.
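The abstract mentions "preference-based rating adjustment" and "risk preference-based score calibration" but does not specify the mechanism. As a minimal illustrative sketch (not DePRa's actual method, which the summary does not disclose), one plausible scheme shifts a crowd-sourced average rating toward the strictest or most lenient user opinion depending on an individual's risk preference:

```python
# Hypothetical sketch of preference-based score calibration.
# The function name, the 0-1 risk_preference scale, and the weighting
# scheme are all assumptions for illustration, not DePRa's published design.
from statistics import mean

def calibrate_rating(user_ratings, risk_preference, neutral=0.5):
    """Shift an aggregated privacy rating toward a user's risk preference.

    user_ratings: crowd-sourced ratings (e.g. on a 1-5 scale).
    risk_preference: 0.0 (risk-tolerant) to 1.0 (risk-averse); 0.5 = neutral.
    """
    base = mean(user_ratings)
    strictest, most_lenient = max(user_ratings), min(user_ratings)
    if risk_preference >= neutral:
        # Risk-averse users: interpolate from the mean toward the strictest rating.
        w = (risk_preference - neutral) / (1 - neutral)
        return base + w * (strictest - base)
    # Risk-tolerant users: interpolate from the mean toward the most lenient rating.
    w = (neutral - risk_preference) / neutral
    return base + w * (most_lenient - base)

ratings = [3, 4, 2, 5, 4]
print(calibrate_rating(ratings, risk_preference=0.5))  # neutral: plain mean, 3.6
print(calibrate_rating(ratings, risk_preference=1.0))  # fully risk-averse: 5.0
```

A neutral preference reproduces the plain mean, while extreme preferences recover the strictest or most lenient individual rating; intermediate values interpolate smoothly between them.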