AMBIENT DIGITALISATION MODULES

Introduction

The Ambient Digitalization (AD) component creates a digital reconstruction of the working environment and the entities within it (such as human workers, robotic agents, and tools) by utilizing the data collected by the Ambient Sensing (AS) infrastructure. The goal of AD is to generate a virtual model that accurately replicates both the static and dynamic components of the real-world scenario, including their features and interactions. In the default case where the AS infrastructure incorporates RGB-D sensors, the resulting digital reconstruction contains both photometric and geometric information. If sensors of different types are also considered, AD produces a multimodal digital representation that coherently merges the diverse sensor modalities.

Technical Specifications

Software and Hardware Requirements

The AD component relies on hardware that can be divided into three categories: (1) the AS infrastructure, (2) the computational devices of the Fog cluster responsible for processing the data collected by the AS infrastructure, and (3) the Realtime Communications Network, which receives the data collected by the AS infrastructure and transmits the data processed by AD to the Perception Enhancement (PE) component. Data is also stored in a database to ensure redundancy and persistence.

The AD component requires the following software components: (1) a module that synchronizes data acquired from AS sensors to create time-aligned representations, (2) a module that processes the photometric and geometric information collected by a sensor to create a textured 3D point cloud representation, (3) a module that aligns these point cloud fragments (e.g. by performing point cloud registration) and creates a unified and cohesive point cloud representation.

You can find a block diagram representing AD technical specifications below.

AD technical block diagram

Usage Manual

The AD software components consist of the following three modules.

Synchronization module: It ensures the time alignment of the data streams coming from the AS sensors. It is required only if time alignment is not already enforced by dedicated hardware equipment, such as clock cables.

Point Cloud Generation module: It fuses the photometric and geometric information collected by a sensor, provided as image representations, to create a textured 3D point cloud representation.

Point Cloud Registration module: It aligns the point cloud fragments generated by the Point Cloud Generation module and creates a unified and cohesive point cloud representation.

You can find the usage model of AD below.

AD usage model

Functional Specifications

Functional Block Diagram

The AD functional block diagram is reported below.

AD functional block diagram

Ambient Digitalization begins by collecting data acquired within a specific time frame using the AS sensors. In cases where the AS infrastructure incorporates RGB-D sensors, this results in a list of RGB-D images, with each image corresponding to a sensor in the AS infrastructure.

If the data streams collected by the AS sensors are not synchronized, we proceed to synchronize them, e.g., by performing timestamp alignment.
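The following is a minimal sketch of how such timestamp alignment could be implemented in software; the per-stream data layout (lists of (timestamp, frame) pairs sorted by time), the choice of a reference stream, and the matching tolerance are illustrative assumptions rather than interfaces prescribed by AD.

from bisect import bisect_left

def nearest_frame(stream, t, tolerance=0.05):
    # Return the frame in `stream` whose timestamp is closest to `t`,
    # or None if the best candidate is farther away than `tolerance` seconds.
    times = [ts for ts, _ in stream]
    i = bisect_left(times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(stream)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(times[j] - t))
    return stream[best][1] if abs(times[best] - t) <= tolerance else None

def synchronize(reference_stream, other_streams, tolerance=0.05):
    # For every frame of the reference stream, pick the temporally closest
    # frame from each of the other streams; drop time frames for which a
    # sensor has no sufficiently close sample.
    aligned = []
    for t, ref_frame in reference_stream:
        matches = [nearest_frame(s, t, tolerance) for s in other_streams]
        if all(m is not None for m in matches):
            aligned.append((t, ref_frame, *matches))
    return aligned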

Given an RGB-D image and the intrinsic parameters of the sensor that captured it, we proceed to extract a textured 3D point cloud capturing the fragment of the working environment visible from the sensor's viewpoint.
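As an illustration, the sketch below uses the open-source Open3D library to fuse one colour image and one depth image into a textured point cloud expressed in the sensor's reference frame; the file names, depth scale, truncation distance, and intrinsic values are placeholders that depend on the actual AS sensors, and any library offering equivalent functionality could be used instead.

import open3d as o3d

# Intrinsic parameters of the capturing sensor: width, height, fx, fy, cx, cy
# (placeholder values for a VGA-resolution RGB-D sensor).
intrinsic = o3d.camera.PinholeCameraIntrinsic(640, 480, 525.0, 525.0, 319.5, 239.5)

color = o3d.io.read_image("frame_color.png")   # photometric information
depth = o3d.io.read_image("frame_depth.png")   # geometric information

# Fuse colour and depth into a single RGB-D image; here depth is assumed to be
# expressed in millimetres and is truncated beyond 3 m.
rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
    color, depth, depth_scale=1000.0, depth_trunc=3.0,
    convert_rgb_to_intensity=False)

# Back-project the RGB-D image into a textured 3D point cloud (one fragment).
fragment = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsic)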

Following this, point cloud registration is performed on the array of textured point clouds obtained from the sensors. This process aligns the point cloud fragments with one another, producing a digital reconstruction of the working environment that seamlessly integrates information from the various viewpoints.
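One possible realization of this step, sketched below with Open3D, aligns every fragment to an arbitrarily chosen reference fragment via pairwise ICP and merges the results; the voxel size, correspondence threshold, identity initialization, and point-to-plane ICP variant are illustrative choices rather than requirements of the AD specification.

import numpy as np
import open3d as o3d

def register_fragments(fragments, voxel_size=0.02, threshold=0.05):
    # Align every fragment to the first one (used as the common reference
    # frame) and merge the aligned fragments into a single point cloud.
    reference = fragments[0]
    merged = o3d.geometry.PointCloud()
    merged += reference
    for fragment in fragments[1:]:
        # Downsample and estimate normals so point-to-plane ICP can be applied.
        src = fragment.voxel_down_sample(voxel_size)
        tgt = reference.voxel_down_sample(voxel_size)
        src.estimate_normals()
        tgt.estimate_normals()
        result = o3d.pipelines.registration.registration_icp(
            src, tgt, threshold, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPlane())
        # Apply the estimated rigid transformation to the full-resolution
        # fragment and accumulate it into the unified reconstruction.
        fragment.transform(result.transformation)
        merged += fragment
    return merged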

The resulting 3D point cloud serves as an effective reconstruction of the working environment at a specific time frame. This entire process is repeated for each time frame for which the AS sensors provide data, continuing until the data streaming reaches its conclusion.
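Putting the previous sketches together, the per-time-frame loop could be organized as follows; stream_frames, intrinsics, and rgbd_to_point_cloud are hypothetical names standing in for the synchronized data streams, the per-sensor calibration, and the Point Cloud Generation step, while register_fragments refers to the registration sketch above.

def digitalize(stream_frames, intrinsics):
    # Yield one unified point cloud per time frame until the streams end.
    # `stream_frames` iterates over time-aligned sets of RGB-D images and
    # `intrinsics` holds the calibration of each AS sensor (hypothetical names).
    for frames in stream_frames:
        fragments = [rgbd_to_point_cloud(image, K)   # Point Cloud Generation
                     for image, K in zip(frames, intrinsics)]
        yield register_fragments(fragments)          # Point Cloud Registration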