Sharing data in exchange for goods and services gives users an opportunity to improve their quality of life; however, it also exposes them to many privacy risks. Indeed, processing and analyzing collected sensor data (e.g., individuals' locations, patients' vital signs), which are spatio-temporal in nature, can disclose a wide variety of privacy-sensitive information about users, such as health conditions, daily activities, habits, and preferences. This disclosure may be intentional when users are aware of it and have entered into agreements with the relevant service providers. However, it can be harmful if users' data is misused by providers, sold to interested third parties without user consent, or stolen by cybercriminals, as providers are frequent victims of cyber-attacks that lead to data breaches.
Consequently, involving users in the control and protection of their privacy is currently receiving extensive attention from both legal and technical perspectives. Nonetheless, existing legal frameworks for data protection (e.g., GDPR) do not necessarily deter data consumers from abusing, intentionally or unintentionally, users' data. The Facebook-Cambridge Analytica and Exactis scandals are only a few examples in a long series of data misuse and breach incidents that occurred despite the existence of appropriate data protection laws. In addition, privacy laws vary among countries, some providing more protection than others (e.g., GDPR for the European Union, CCPA for the state of California). This increases the difficulty and complexity of managing and preserving users' privacy, especially when users, service providers, and third parties are located in different countries governed by different data protection laws. All these constraints emphasize the need for user-centric technical solutions that maintain the same level of data privacy protection across all countries.
Current user-centric privacy-preserving approaches mainly rely on preference specification and policy enforcement, where users specify their privacy preferences and accept policies that enforce them. However, these approaches share two main limitations:
(1) Lack of user awareness. The user may not be fully aware of the direct and indirect privacy risks involved in exchanging her data with providers, and thus cannot correctly specify her preferences in the first place. She may simply not know what sensitive information might be revealed from her data when data pieces are analyzed in isolation, combined with each other, and/or combined with side information acquired from external data sources (e.g., social networks).
(2) Lack of context-based privacy decision making. Data sharing and protection decisions are often made or accepted by the user in a static way, meaning they remain unchanged regardless of context changes. However, the sensitivity of data may vary from one context to another: new privacy risks may emerge while others lose their significance. This makes static decisions over-protective in some contexts, causing an unnecessary loss of data quality that may degrade the accuracy of associated services, and under-protective in others, leading to privacy violations. Therefore, the user must be able to adjust her privacy decisions dynamically to cope with the dynamicity of her context.
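To make the contrast with static decisions concrete, the toy Python sketch below picks a different protection level for the same location data depending on the user's context. All rules, place names, and radii here are illustrative assumptions, not part of any particular framework:

```python
def protection_level(context):
    """Return a generalization radius (in meters) for a shared location.

    A static, context-blind policy would return one radius for every case;
    here the protection adapts to the sensitivity of the current context.
    """
    if context.get("place") == "hospital":  # visits may reveal health conditions
        return 5000
    if context.get("time") == "night":      # night locations expose the home address
        return 1000
    return 100                              # low-sensitivity context: preserve utility


# The same data type receives different protection in different contexts:
print(protection_level({"place": "hospital", "time": "day"}))  # 5000
print(protection_level({"place": "park", "time": "night"}))    # 1000
print(protection_level({"place": "park", "time": "day"}))      # 100
```

A static policy would have to commit to one of these radii for all contexts, being either over- or under-protective in the other two.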
The objectives of our research project are to design suitable solutions that overcome the two aforementioned limitations, and to provide a complete context-aware privacy framework that meets the guidelines of current privacy standards (i.e., Privacy by Design and ISO/IEC 27701). Specifically, the framework needs to address the following:
- Raising users' awareness of the privacy risks associated with their data sharing and/or imposed by their surrounding environments, by providing them with a dynamic, contextual overview of risks tailored to their level of expertise.
- Assisting users in optimizing their utility-privacy decisions according to their situations, needs, and preferences, by identifying the best data protection strategies that can be applied in each situation.
- Ensuring appropriate protection of the collected data, in accordance with user decisions, before it is transmitted to data consumers.
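The second objective — assisting the user in balancing utility against privacy — can be sketched as a small selection routine: among the candidate protection strategies available in a given situation, keep those whose estimated risk stays below the user's threshold, then pick the one preserving the most utility. The strategy names and utility/risk scores below are invented for illustration and do not come from the project itself:

```python
def best_strategy(strategies, risk_threshold):
    """Pick the utility-maximizing strategy whose risk is acceptable.

    strategies: list of (name, utility, risk) tuples with scores in [0, 1].
    """
    admissible = [s for s in strategies if s[2] <= risk_threshold]
    if not admissible:
        # No strategy meets the threshold: fall back to the safest option.
        return min(strategies, key=lambda s: s[2])
    return max(admissible, key=lambda s: s[1])


candidates = [
    ("share raw data",     1.00, 0.90),
    ("generalize to city", 0.70, 0.30),
    ("add noise",          0.50, 0.20),
    ("suppress attribute", 0.10, 0.05),
]
print(best_strategy(candidates, risk_threshold=0.4))  # ('generalize to city', 0.7, 0.3)
```

Because the risk estimates themselves are context-dependent, rerunning the same selection in a new context can yield a different strategy, which is exactly the dynamic adjustment the framework aims to support.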
Keywords:
- Privacy Engineering
- Data Privacy
- Context-aware Computing
- Semantic Modeling
- Semantic Reasoning
- Privacy Risk Management
- Internet of Things
- Big Data
Contributions:
- uCSN: An Ontology for user-Context Modeling in Sensor Networks
- CaSPI: A Context-aware Semantic Reasoning Approach For Dynamic Privacy Risk Inference
- δ-Risk: Toward Context-aware Multi-objective Privacy Management in Connected Environments
- P-SGD: A Stochastic Gradient Descent Solution for Privacy-preserving During Protection Transitions
Publications:
- Bou-Chaaya, K., Barhamgi, M., Chbeir, R., Arnould, P., & Benslimane, D. (2019). Context-aware system for dynamic privacy risk inference: Application to smart IoT environments. Future Generation Computer Systems, 101, 1096-1111.
- Bou-Chaaya, K., Chbeir, R., Alraja, M. N., Arnould, P., Perera, C., Barhamgi, M., & Benslimane, D. (2021). δ-Risk: Toward Context-aware Multi-objective Privacy Management in Connected Environments. ACM Transactions on Internet Technology (TOIT), 21(2), 1-31.
- Bou-Chaaya, K., Chbeir, R., Barhamgi, M., Arnould, P., & Benslimane, D. (2021, June). P-SGD: A Stochastic Gradient Descent Solution for Privacy-Preserving During Protection Transitions. In International Conference on Advanced Information Systems Engineering (pp. 37-53). Springer, Cham.