Poster: PrivaSee: Augmented Reality-Enabled Privacy Perception Visualization for Internet of Things

The Internet of Things (IoT) provides a wide range of services to improve convenience and comfort in our daily lives. However, the various sensors equipped on IoT devices often raise privacy concerns. Prior works on privacy focus on passive protection from the data and device perspective, such as data encryption and communication protocol design. In this work, we introduce PrivaSee, an augmented reality (AR)-enabled privacy visualization platform that empowers users with proactive privacy protection by enhancing their perception of privacy risks from multimodal sensors.


INTRODUCTION
With the rise of smart devices, various sensors have entered our daily lives. Surveys show that the number of US households with smart home devices will reach 93.59 million by 2027 [1]. These smart devices, equipped with various sensors, can pick up abundant human-induced signals for information inference, which poses challenges to user privacy. Prior works on privacy mainly focus on data protection, such as cryptography for personal data [4] and communications [2]. However, these methods mainly provide device-level protection, a passive approach that overlooks the user's perspective. Without the user in the loop for privacy protection, there is limited transparency and trust between humans and machines. Furthermore, it is a one-size-fits-all solution that does not consider users' preferences based on their perceived privacy.
To enable interactive and intuitive control related to privacy, we propose to "show" end users the privacy risks associated with various IoT sensors. We introduce PrivaSee, an augmented reality (AR)-enabled privacy perception platform that allows users to visualize and understand the privacy risks of multimodal IoT sensors through an intuitive and immersive experience. PrivaSee focuses on visualizing two properties of a system's sensing capability, sensing range and inference granularity, to reveal potential privacy leakage. The system first models the IoT sensing system deployment based on the ambient signals generated by human activities of daily living [5, 6], then visualizes the two properties of the sensing system derived from the deployment model. Based on this visualization of the sensing capability, the user can proactively make configuration choices, such as system deployment location and device settings, for privacy protection tailored to the user's preferences.

PRIVASEE DESIGN
PrivaSee focuses on two tasks: sensing-privacy interpretation and privacy visualization.
Sensing-Privacy Interpretation. We consider the system's sensing capability (sensing range and inference granularity) a key factor affecting users' perceived privacy risks. Visualizing the sensing range of an IoT sensor helps users understand spatial detection risks, while visualizing the inference granularity helps them understand contextual leakage risks. We assess three inference tasks to illustrate a sensor's capability: localization, identification, and speech recognition. The assessments can come from expert knowledge or data-driven system modeling [6].
Privacy Visualization via AR. We visualize this sensing-privacy interpretation by overlaying (1) the IoT devices' sensing range on the user's view, and (2) the inference granularity of the selected sensor. As shown in Figure 2, the green masks depict the sensing range of the selected sensor from the user's viewpoint, and the spider chart in the right corner illustrates the sensor's inference granularity. Different sensors (by type and location) exhibit different sensing capability properties.
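The capability representation behind this interface can be sketched with a minimal data model. The class and field names below are illustrative assumptions for exposition, not PrivaSee's actual implementation; the granularity scores are made up for the example.

```python
from dataclasses import dataclass, field

# Illustrative sketch (not PrivaSee's actual code): each sensor carries a
# sensing range (a 2-D polygon in room coordinates) and per-task inference
# granularity scores in [0, 1], which drive the spider-chart axes.
@dataclass
class SensorCapability:
    sensor_type: str                # e.g. "vibration", "camera"
    sensing_range: list             # polygon vertices [(x, y), ...]
    granularity: dict = field(default_factory=dict)  # task -> score in [0, 1]

    def spider_axes(self, tasks=("localization", "identification", "speech")):
        """Return spider-chart values for the three assessed inference
        tasks, defaulting missing tasks to 0."""
        return [self.granularity.get(t, 0.0) for t in tasks]

# Example: a tabletop vibration sensor that localizes well but recovers
# little speech content (scores are hypothetical).
cap = SensorCapability(
    sensor_type="vibration",
    sensing_range=[(0, 0), (2, 0), (2, 1.5), (0, 1.5)],
    granularity={"localization": 0.9, "identification": 0.6, "speech": 0.1},
)
print(cap.spider_axes())  # [0.9, 0.6, 0.1]
```

Keeping the range polygon and the per-task scores in one record lets the AR front end render both the green mask and the spider chart from a single sensor selection.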

FEASIBILITY CHECK
The implementation challenge here is to ensure continuous tracking of the sensing range and real-time rendering within the AR camera view as the user moves around, especially on edge devices (i.e., headsets) where computational resources and power supply are constrained. To verify the feasibility of PrivaSee on resource-constrained devices, we implemented a proof-of-concept visualization platform and measured real-time rendering performance. Our setup uses an Asus ROG Flow Z13 tablet as the edge device, capturing the user's viewpoint and displaying the privacy risk visualization. The AR information overlay in the visualization is rendered on a desktop server with an NVIDIA GeForce 3090 GPU. The captured view from the edge device to the server and the AR visualization from the server back to the edge device are transmitted in real time. We use LightGlue [3] for local feature matching between the anchor image (from automatic sensing capability estimation) and the current frame. Through the keypoint matches between the two images, PrivaSee projects the sensing range from the anchor image to the current frame. In our experiment, we achieve 10 FPS rendering with a 100 ms delay on the edge device.
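The projection step above can be sketched as follows. In the real system the keypoint matches come from LightGlue; here they are synthesized so the example is self-contained, and the homography is estimated with the standard Direct Linear Transform (DLT) rather than PrivaSee's actual pipeline.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct Linear Transform: estimate a 3x3 homography from >= 4
    point correspondences (src[i] -> dst[i])."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of the stacked system,
    # i.e. the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def project(H, pts):
    """Warp 2-D points through homography H (homogeneous coordinates)."""
    pts_h = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    out = (H @ pts_h.T).T
    return out[:, :2] / out[:, 2:3]

# Synthetic matches: the current frame is the anchor view scaled by 2
# and shifted by (30, 40) pixels.
anchor_kpts = [(10, 10), (200, 15), (210, 180), (5, 170), (100, 90)]
frame_kpts = [(2 * x + 30, 2 * y + 40) for x, y in anchor_kpts]
H = estimate_homography(anchor_kpts, frame_kpts)

# Sensing-range polygon annotated in the anchor image, warped into the frame.
range_poly = [(50, 50), (150, 50), (150, 120), (50, 120)]
print(np.round(project(H, range_poly), 1))
# [[130. 140.] [330. 140.] [330. 280.] [130. 280.]]
```

In practice the per-frame cost is dominated by feature matching, not this projection, which is why offloading the heavy rendering to a server while streaming frames fits the 10 FPS budget reported above.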

FUTURE WORK AND CONCLUSION
We plan to explore this direction from the following aspects.
(1) Comprehensive interpretation between sensing capability and privacy; (2) multimodal system sensing capability characterization beyond vibration and vision; (3) efficient continuous sensing range tracking and rendering; (4) intuitive visualization schemes for different sensing modalities.
In conclusion, PrivaSee is a proactive privacy protection solution that provides an intuitive and immersive visualization of privacy risks for multimodal IoT sensors. It harnesses the power of visualization to close the knowledge gap between users' understanding of sensors' capabilities and the associated privacy risks.

Figure 1: PrivaSee intuition. PrivaSee provides an intuitive and immersive privacy perception solution for users to visualize multimodal IoT sensors' capabilities.

Figure 2: Illustration of PrivaSee's interface for different sensors with screen interaction. (a) Vibration sensor on the table. (b) Vibration sensor on the floor. (c) Camera on the monitor. When the user selects a target sensor, its sensing area is highlighted and its inference granularity is depicted with the spider chart.