The P in IoT stands for Privacy

by Narseo Vallina and Aniketh Girish (IMDEA Networks Institute)

Can dynamic and static analysis techniques be used to audit and certify the privacy behaviours of IoT products and their regulatory compliance? What are the technical and methodological barriers that must be overcome in order to do this?

The Internet of Things (IoT), commonly referred to as smart devices, is an interconnected collection of physical objects such as sensors, actuators and other devices that can interact with each other and with other systems over the Internet. Recent statistics [1] reveal the tremendous adoption of consumer IoT devices, which are projected to grow to 75 billion by 2025. Today, even our coffee machines and vacuum cleaners are connected to the Internet through our phones. This hyper-connectivity of cyber-physical systems has created a world of extreme convenience, but it also has its downside.

The sheer growth of the IoT comes at the cost of users’ privacy in our smart homes [1, 2, 3]. Smart devices can collect, process and analyse personal information to learn users’ online behaviours and track them. Furthermore, the large volume of data collected could be sold, without consumer awareness, to tracking companies such as data brokers or advertisers. Such companies can use the data to infer personal information about users and to reveal a consumer’s lifestyle and even socio-economic status. This misuse of personal information can have negative consequences for consumers’ privacy, enabling broader harms such as polarisation and ideological manipulation. At the same time, due to tight product cycles and a lack of privacy and security expertise, IoT vendors and developers may not implement the required privacy and security practices correctly, introducing privacy and security misconfigurations. Another source of privacy risk in smart homes lies in the use of unencrypted or poorly configured communication protocols. A recent study found that real IoT devices use deprecated versions of the TLS (Transport Layer Security) protocol or insecure cipher suites in a significant fraction of their connections, and that many of them are vulnerable to TLS interception attacks [2]. The same study showed that devices are slow to adopt new TLS versions and to harden the set of cipher suites they support. This calls for regular auditing and updating to ensure that consumer devices’ connections remain secure.
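As an illustration of what part of such an audit could look like in practice, the following is a minimal sketch (not part of the studies cited above) of how an auditor might actively probe which TLS versions a device on the local network still accepts. The device address and port are hypothetical, and whether deprecated TLS versions can even be offered depends on the local OpenSSL build.

```python
# Minimal sketch: probe which TLS versions a (hypothetical) smart device accepts.
import socket
import ssl

DEVICE = ("192.168.1.50", 443)  # hypothetical smart-device TLS endpoint

def accepts(version: ssl.TLSVersion) -> bool:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False      # consumer devices rarely present valid certificates
    ctx.verify_mode = ssl.CERT_NONE
    ctx.minimum_version = version
    ctx.maximum_version = version   # force the handshake to exactly this version
    try:
        with socket.create_connection(DEVICE, timeout=5) as sock:
            with ctx.wrap_socket(sock):
                return True
    except (ssl.SSLError, OSError):
        return False

for v in (ssl.TLSVersion.TLSv1, ssl.TLSVersion.TLSv1_1,
          ssl.TLSVersion.TLSv1_2, ssl.TLSVersion.TLSv1_3):
    print(v.name, "accepted" if accepts(v) else "rejected")
```

A device that still completes handshakes with TLS 1.0 or 1.1, or only with weak cipher suites, would be a candidate for the kind of auditing and updating described above.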

Legislative measures such as the General Data Protection Regulation (GDPR) in the EU or the California Consumer Privacy Act (CCPA) in the USA have been introduced to protect consumers from such privacy risks. Regulations such as the GDPR mandate that smart products collecting personal data publish a privacy policy that clearly explains the purposes of data collection, and they prohibit the collection of personal data without informed user consent. In parallel, various consortia have been formed to define privacy and security standards for smart products. To enhance the guarantees and trustworthiness of IoT systems, the EU Cybersecurity Act and Cyber Resilience Act, and industry alliances such as ioXt, have put forward programs for labelling and certifying digital products. However, smart home consumer privacy is still at risk, as illustrated by the 2017 FTC settlement with Vizio for violating user privacy and collecting data without consent [4]. Unfortunately, regulators lack tools for exhaustively and automatically auditing smart platforms.

The lack of practical and effective privacy testing tools in the state of the art is partly due to the heterogeneity of smart products. Smart devices are typically composed of various sub-systems, including diverse connectivity protocols, vendor cloud services and platform apps. This varied architecture makes it difficult to develop universal tools for privacy auditing. For example, some IoT devices support a wide range of network interfaces and short-range protocols such as WiFi, Bluetooth or ZigBee, all of which must be actively monitored to detect (even unintended) privacy leaks. The privacy risk is further aggravated by the hyper-connectivity of modern systems and by data distribution, which facilitate seamless device-to-device connectivity. For example, a smart bulb can be controlled by a smart speaker with a simple voice command. This hyper-connectivity can have negative consequences for users’ privacy by enabling data leaks between devices (e.g., side and covert channels) without user awareness. Auditing smart products in combination with other products and services therefore requires combining multiple testing techniques and vantage points to gain visibility into them, but the state of the art has not yet developed such methods in an effective and scalable manner.
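To illustrate one such vantage point, the sketch below shows how an auditor who controls the home Wi-Fi access point might passively log which Internet endpoints each device under test contacts. The interface name and device MAC address are assumptions, and short-range protocols such as Bluetooth or ZigBee would require separate capture hardware and tooling.

```python
# Minimal sketch: passively log remote endpoints contacted by watched devices
# on a Wi-Fi access point controlled by the auditor (requires scapy and root).
from collections import defaultdict
from scapy.all import sniff, Ether, IP

WATCHED = {"aa:bb:cc:dd:ee:ff"}   # hypothetical MAC addresses of devices under test
contacts = defaultdict(set)       # device MAC -> set of remote IP addresses

def record(pkt):
    if Ether in pkt and IP in pkt and pkt[Ether].src.lower() in WATCHED:
        contacts[pkt[Ether].src.lower()].add(pkt[IP].dst)
        print(pkt[Ether].src, "->", pkt[IP].dst)

sniff(iface="wlan0", prn=record, store=False)  # "wlan0" is an assumed interface name
```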

This is partly because smart products and their associated supply chains rarely open their code for independent auditing, so auditors and users have to trust blindly that vendors protect users’ privacy and only collect data as specified in the privacy policy. For example, our initial work in this area has already revealed that privacy-intrusive third-party libraries embedded in mobile apps silently scan the entire home network address space to profile users’ households and social structures without consent [5].
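A simple heuristic can surface this kind of behaviour from a traffic capture: flag any source that contacts an unusually large number of distinct addresses in the home subnet. The sketch below is illustrative only; the capture file, subnet and threshold are assumptions rather than the methodology used in [5].

```python
# Minimal sketch: flag sources in a packet capture that contact many distinct
# local addresses, a typical signature of silent home-network scanning.
import ipaddress
from collections import defaultdict
from scapy.all import rdpcap, IP

HOME_NET = ipaddress.ip_network("192.168.1.0/24")  # hypothetical home subnet
THRESHOLD = 50                                     # assumed number of distinct local targets

local_targets = defaultdict(set)                   # source IP -> local destinations
for pkt in rdpcap("capture.pcap"):                 # hypothetical capture file
    if IP in pkt:
        src, dst = pkt[IP].src, pkt[IP].dst
        if ipaddress.ip_address(dst) in HOME_NET:
            local_targets[src].add(dst)

for src, dsts in local_targets.items():
    if len(dsts) >= THRESHOLD:
        print(f"{src} contacted {len(dsts)} local hosts: possible network scan")
```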

Modern smart products are not isolated entities. Ignoring the variety of network interfaces, integration capabilities and protocols present in the home IoT ecosystem therefore overlooks serious privacy risks to the end user, introduced either intentionally or by mistake, through covert and side channels between IoT devices and applications. This makes traditional black-box testing approaches fundamentally inadequate to capture the dynamic, unexpected and privacy-intrusive behaviours of interconnected smart devices. There is an urgent need for methodologies and a robust testing framework to independently audit the security and privacy status of smart products deployed in consumer homes. The closed and proprietary nature of smart products makes the development of a scalable privacy analysis framework challenging, but the static and dynamic analysis methodologies built in TRUST aWARE overcome these challenges to infer potentially privacy-intrusive behaviours of smart products. The combination of static and dynamic analysis techniques developed in TRUST aWARE will enable auditors (independent testers) to model and monitor smart products at scale. For instance, an auditor can first carry out static analysis to identify interesting code branches, which can then be instrumented for dynamic analysis to confirm the findings. In particular, the dynamic analysis framework enables auditors to observe the runtime behaviour of smart products in realistic environments, automatically characterise that behaviour and identify their security and privacy risks (including those attributed to third-party components).
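To make the static step of such a pipeline concrete, the sketch below scans the decompiled sources of a device’s companion app for API calls that commonly expose household information, producing a list of candidate locations to instrument during dynamic analysis. It is a simplified illustration under assumed file layouts and API names, not the TRUST aWARE toolchain itself.

```python
# Minimal sketch: find suspicious Android API calls in decompiled companion-app
# sources, as candidates for dynamic instrumentation.
import pathlib
import re

SOURCES = pathlib.Path("decompiled_app/")   # hypothetical jadx/apktool output directory
SUSPECT_APIS = re.compile(
    r"\b(getSSID|getBSSID|getMacAddress|getDhcpInfo|getScanResults)\b"
)

candidates = []   # (file, line number, matched call) to confirm at runtime
for src in SOURCES.rglob("*.java"):
    for lineno, line in enumerate(src.read_text(errors="ignore").splitlines(), 1):
        match = SUSPECT_APIS.search(line)
        if match:
            candidates.append((src, lineno, match.group(0)))
            print(f"{src}:{lineno}: {match.group(0)}")
```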


References

[1] Jingjing Ren, Daniel J. Dubois, David R. Choffnes, Anna Maria Mandalari, Roman Kolcun, and Hamed Haddadi. 2019. Information Exposure From Consumer IoT Devices: A Multidimensional, Network-Informed Measurement Approach. In Proceedings of the Internet Measurement Conference (IMC).

[2] Muhammad Talha Paracha, Daniel J. Dubois, Narseo Vallina-Rodriguez, and David R. Choffnes. 2021. IoTLS: Understanding TLS Usage in Consumer IoT Devices. In Proceedings of the Internet Measurement Conference (IMC).

[3] Hooman Mohajeri Moghaddam, Gunes Acar, Ben Burgess, Arunesh Mathur, Danny Yuxing Huang, Nick Feamster, Edward W. Felten, Prateek Mittal, and Arvind Narayanan. 2019. Watching You Watch: The Tracking Ecosystem of Over-the-Top TV Streaming Devices. In Conference on Computer and Communications Security (CCS).

[4] VIZIO to Pay $2.2 Million to FTC, State of New Jersey to Settle Charges It Collected Viewing Histories on 11 Million Smart Televisions without Users’ Consent. 2017. https://www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-new-jersey-settle-charges-it.

[5] Aniketh Girish, Vijay Prakash, Serge Egelman, Joel Reardon, Juan Tapiador, Danny Yuxing Huang, Srdjan Matic, and Narseo Vallina-Rodriguez. 2022. Challenges in Inferring Privacy Properties of Smart Devices: Towards Scalable Multi-Vantage Point Testing Methods. In ACM International Conference on Emerging Networking Experiments and Technologies Student Workshop (CoNEXT ’22 SW).
