AMBIENT SENSING INFRASTRUCTURE

Introduction

The Ambient Sensing infrastructure (AS) consists of a collection of mobile (e.g. wearable, robot-mounted) and environmental (e.g. fixed to walls or ceilings) sensors intended to record the workplace and its active components (e.g. human workers, robotic agents, obstacles, and tools). Our setup includes both multiple instances of the same sensor type (typically 2 to 5) and different sensor types capturing distinct data modalities (e.g. RGB, RGB-D, thermal, infrared). The use of multiple instances of the same sensor type serves two main purposes: (1) it enables coverage from different viewpoints, which is particularly useful for minimizing instances where objects or events are obscured by obstacles, and (2) it provides redundancy in case of sensor failures, ensuring uninterrupted data collection. The inclusion of different sensor types enables the collection of complementary information, thereby enhancing the overall understanding of the workplace environment.
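For illustration only, a sensor inventory of this kind could be described with a minimal registry such as the sketch below; the SensorSpec fields, identifiers, and example entries are assumptions and not part of the actual infrastructure.

from dataclasses import dataclass

# Minimal sketch of a sensor registry for AS; all values are illustrative.
@dataclass
class SensorSpec:
    sensor_id: str
    modality: str    # e.g. "rgb", "rgb-d", "thermal", "infrared"
    mounting: str    # "wearable", "robot-mounted", or "environmental"
    viewpoint: str   # free-text description of the covered area

# Two RGB-D cameras covering the same cell from different viewpoints plus one
# wearable thermal sensor, mirroring the redundancy and multi-modality above.
REGISTRY = [
    SensorSpec("cam-ceiling-1", "rgb-d", "environmental", "assembly cell, top-down"),
    SensorSpec("cam-wall-2", "rgb-d", "environmental", "assembly cell, lateral"),
    SensorSpec("thermal-vest-1", "thermal", "wearable", "operator-centric"),
]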

Technical Specifications

Software and Hardware Requirements

The hardware components required by AS can be divided into three categories: (1) the sensors themselves, (2) the computational devices of the Fog cluster used to collect data from these sensors, and (3) the Realtime Communications Network for transmitting the collected data to Ambient Digitalization. The collected data are also stored in a database to ensure redundancy and persistence.
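For illustration, the sketch below shows how these three categories might be described in a single deployment record; all hostnames, ports, and field names are assumptions rather than the actual AS deployment.

from dataclasses import dataclass

@dataclass
class FogNode:
    host: str
    attached_sensor_ids: list   # sensors whose data this node collects

@dataclass
class ASDeployment:
    fog_nodes: list             # computational devices of the Fog cluster
    rtc_endpoint: str           # Realtime Communications Network endpoint towards Ambient Digitalization
    database_url: str           # persistent store ensuring redundancy and persistence

DEPLOYMENT = ASDeployment(
    fog_nodes=[
        FogNode("fog-node-01.local", ["cam-ceiling-1", "cam-wall-2"]),
        FogNode("fog-node-02.local", ["thermal-vest-1"]),
    ],
    rtc_endpoint="rtc://ambient-digitalization.local:5000",
    database_url="postgresql://as-archive.local/recordings",
)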

The software components required by AS consist of: (1) a module that allows operators to configure the sensor network by selecting which sensors to use, (2) a module that checks the health status of the sensors and sends alert notifications if any sensor is faulty, to prevent missing or incomplete footage, (3) a module that calibrates the healthy sensors to ensure that data collected from different sensors conform to a consistent reference system, and (4) a module that records data from each calibrated, healthy sensor.
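A possible shape for these modules is sketched below as Python interfaces; the class and method names are assumptions intended only to show how the modules hand data to one another, not the actual AS API.

from abc import ABC, abstractmethod

class ConfigurationModule(ABC):
    @abstractmethod
    def selected_sensors(self) -> list:
        """Return the IDs of the sensors the operator chose to activate."""

class FaultDetectionModule(ABC):
    @abstractmethod
    def check(self, sensor_ids: list) -> tuple:
        """Return (healthy, faulty) sensor IDs, raising alert notifications
        for the faulty ones."""

class SensorCalibrationModule(ABC):
    @abstractmethod
    def calibrate(self, sensor_ids: list) -> None:
        """Bring each healthy sensor into the common reference system."""

class RecordingModule(ABC):
    @abstractmethod
    def record_frame(self, sensor_ids: list) -> dict:
        """Capture one time frame from every calibrated, healthy sensor."""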

You can find a block diagram representing the AS technical specifications below.

AS technical block diagram

Usage Manual

The software components of AS consist of the following four modules.

The Configuration module enables operators to select which sensors to activate from the list of available options through the user interface. For example, operators can choose to activate only environmental sensors or exclude sensors recording specific areas due to privacy concerns.
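The selection logic could look like the sketch below, which builds on the hypothetical SensorSpec registry from the Introduction; the privacy-excluded area is an illustrative operator setting, not an AS default.

PRIVACY_EXCLUDED_AREAS = {"break room"}   # illustrative operator setting

def select_sensors(registry, environmental_only=True):
    """Return the IDs of the sensors to activate; `registry` entries are
    assumed to look like the SensorSpec records sketched in the Introduction."""
    selected = []
    for spec in registry:
        if environmental_only and spec.mounting != "environmental":
            continue
        if spec.viewpoint in PRIVACY_EXCLUDED_AREAS:
            continue
        selected.append(spec.sensor_id)
    return selected

# e.g. select_sensors(REGISTRY) activates only the two environmental RGB-D cameras.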

The Fault Detection module examines the health status of the sensors selected via the Configuration module and sends alert notifications if any of them are identified as faulty, to prevent missing or incomplete footage.
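A minimal sketch of this health check is shown below; the probe callable, logger name, and return convention are assumptions used purely for illustration.

import logging

logger = logging.getLogger("as.fault_detection")

def check_health(sensor_ids, probe):
    """Split sensors into healthy and faulty; `probe` is an assumed callable
    that returns True when a sensor responds correctly."""
    healthy, faulty = [], []
    for sensor_id in sensor_ids:
        (healthy if probe(sensor_id) else faulty).append(sensor_id)
    for sensor_id in faulty:
        # Alert the operator so that missing or incomplete footage is avoided.
        logger.warning("Sensor %s is faulty; footage may be incomplete", sensor_id)
    return healthy, faulty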

The Sensor Calibration module calibrates the healthy sensors, guaranteeing that data collected from different sensors adhere to a consistent reference system.
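The calibration algorithm itself is not prescribed here; as a minimal sketch, the snippet below only shows how a calibrated extrinsic transform (an assumed 4x4 matrix per sensor, stored in a hypothetical EXTRINSICS table) could be used to map sensor data into the common reference system.

import numpy as np

# Illustrative use of a calibration result: one 4x4 rigid transform per sensor
# mapping points from that sensor's frame into the shared workplace frame.
EXTRINSICS = {}   # sensor_id -> 4x4 numpy array, filled in by the calibration step

def to_reference_frame(sensor_id, points_sensor):
    """Map an (N, 3) point cloud from the sensor frame to the common frame."""
    T = EXTRINSICS[sensor_id]                                             # 4x4 homogeneous transform
    homogeneous = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    return (homogeneous @ T.T)[:, :3]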

The Recording module employs the sensors' APIs to capture and record their data.
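As a concrete example of using a sensor's API, the snippet below shows how one frame pair could be captured from an Intel RealSense RGB-D camera (mentioned as an example sensor in the Functional Specifications) through the pyrealsense2 SDK; the chosen resolutions and frame rate are illustrative, and other sensor types require their own vendor APIs.

import pyrealsense2 as rs

# Start an RGB-D stream on a RealSense camera (illustrative settings).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()      # one time frame
    depth_frame = frames.get_depth_frame()   # geometric information
    color_frame = frames.get_color_frame()   # photometric information
finally:
    pipeline.stop()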

You can find the usage model of AS below.

AS usage model

Functional Specifications

Functional Block Diagram

The functional block diagram of AS is reported below.

AS functional block diagram

Upon initiating data acquisition, we verify the proper functionality of the sensors. If any sensors are found to be faulty, an alert notification is raised. We then check whether the healthy sensors have been calibrated. If not, a calibration procedure is executed to ensure that data collected from different sensors conform to a consistent reference system.

Once the sensors are calibrated, the data recording process begins. For each time frame, a sensor captures a representation of the environment from a specific viewpoint. For example, an RGB-D sensor (e.g. Intel RealSense) captures an RGB image encoding photometric information and a depth image encoding geometric information. Captured data are transmitted to Ambient Digitalization for real-time processing and stored in a database to ensure redundancy and persistence. Storing data in a database also enables offline data annotation by experts and software inspection in the event of unexpected behavior.

Each data acquisition session comprises several checkpoints, defined either as a number of frames or as a time interval. After each checkpoint, an assessment is conducted to determine whether to conclude the acquisition. If not, we repeat the procedure described so far. In cases involving solely environmental sensors, calibrating the sensors once per acquisition session suffices. Conversely, when mobile sensors are used, calibration must be repeated every time a mobile sensor assumes a new position.
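A minimal Python sketch of this control flow is given below, assuming hypothetical module handles (modules.fault_detection, modules.calibration, modules.recording) and callbacks (transmit, store, should_stop) that stand in for the actual AS components; none of these names come from the AS codebase.

import logging

logger = logging.getLogger("as.acquisition")

def run_acquisition_session(sensors, modules, checkpoint_frames,
                            mobile_sensors_present, should_stop):
    """Illustrative control flow of one acquisition session."""
    healthy, faulty = modules.fault_detection.check(sensors)
    if faulty:
        logger.warning("Faulty sensors detected: %s", faulty)  # alert notification
    modules.calibration.calibrate(healthy)                     # calibrate once up front
    while True:
        # One checkpoint, here defined as a fixed number of frames
        # (a time interval could be used instead).
        for _ in range(checkpoint_frames):
            frame = modules.recording.record_frame(healthy)
            modules.transmit(frame)  # real-time processing by Ambient Digitalization
            modules.store(frame)     # database copy for redundancy and annotation
        if should_stop():            # checkpoint assessment: conclude the acquisition?
            break
        if mobile_sensors_present:   # mobile sensors may have moved since the last
            modules.calibration.calibrate(healthy)   # checkpoint: re-calibrate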