P-SGD: A Stochastic Gradient Descent Solution for Privacy-preserving During Protection Transitions

Figure: P-SGD overview

Approach overview

Advances in privacy-enhancing technologies, such as context-aware and personalized privacy models, have paved the way for successfully managing the data utility-privacy trade-off. However, when balancing utility and privacy to meet an individual's needs, significantly lowering the level of data protection makes the subsequently released protected data more precise. This increases an adversary's ability to reveal the real values of earlier, correlated data that required stronger protection, leaving existing privacy models vulnerable to inference attacks.

To overcome this problem, we propose a stochastic gradient descent solution for privacy preservation during protection transitions, denoted P-SGD. The goal of this solution is to minimize the precision gap between sequential data when the privacy model downshifts the protection level. P-SGD intervenes at the protection descent phase and performs an iterative process that measures data dependencies and gradually reduces protection accordingly until the desired protection level is reached. It also considers possible changes in protection functions and studies their impact on the protection descent rate. The proposed solution is generic and compatible with numerous existing privacy models across different application domains. It can be plugged into the privacy model to provide an additional layer of protection against inference attacks (see figure above). We validated our proposal and evaluated its performance. Results show that P-SGD is fast, scalable, and maintains low computational and storage complexity.
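To make the descent phase concrete, below is a minimal sketch of a gradient-descent-style protection descent, assuming a scalar protection level in [0, 1], a simple quadratic objective pulling it toward the target level, and a dependency score in [0, 1] that damps each step. The function name p_sgd_descent and these modelling choices are illustrative assumptions, not the exact published algorithm.

```python
def p_sgd_descent(current_level, target_level, dependency,
                  learning_rate=0.1, max_iters=1000, tol=1e-3):
    """Gradually lower a protection level toward a target level.

    Each step is damped by the measured dependency between consecutive
    data items, so the precision gap between sequential releases stays
    small. All names and the damping scheme are illustrative assumptions.
    """
    level = current_level
    trajectory = [level]
    for _ in range(max_iters):
        # Gradient of the quadratic objective 0.5 * (level - target)^2.
        grad = level - target_level
        # Stronger correlation with past data -> smaller descent step.
        step = learning_rate * (1.0 - dependency) * grad
        level -= step
        trajectory.append(level)
        if abs(level - target_level) < tol:
            break
    return trajectory

# Example: descend from a high protection level (0.9) to a lower one (0.3)
# when consecutive data items are strongly correlated (dependency = 0.7).
levels = p_sgd_descent(current_level=0.9, target_level=0.3, dependency=0.7)
print(f"{len(levels)} iterations, final level = {levels[-1]:.3f}")
```

In this sketch, a higher dependency score slows the descent, mirroring the idea that strongly correlated sequential data should not see an abrupt drop in protection.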

Figure: Illustrating example

Research Areas

  • Privacy Enhancing Technologies
  • Data Privacy
  • Stochastic Gradient Descent Methods
  • Context-awareness
  • Internet of Things

PhD Student

Karam Bou Chaaya

Project Members

  • Richard Chbeir
  • Philippe Arnould
  • Mahmoud Barhamgi
  • Djamal Benslimane

P-SGD Prototype