Functional Specifications
Introduction
This section of the documentation serves as a comprehensive overview of the platform's functional architecture and the core interactions between its components. The section begins by presenting a high-level functional diagram that depicts the main interactions of the AI-PRISM components, offering a general overview of the system's operation. Following the diagram, we delve into the specifics of the platform's interfaces through a table that provides high-level details of their functionalities and purposes. This breakdown serves as an essential reference point for understanding how the various elements of the platform interact and collaborate to achieve its overarching objectives. Next, the high-level features of the platform are presented, followed by a mapping of the platform's components to the high-level features they support. Finally, we provide links to detailed descriptions of the functional components of the platform.
Functional Component Diagram
The following figure depicts a holistic view of the main functional components of the AI-PRISM platform, their relations to the work packages and tasks in the project plan, and their interactions.
From left to right, first, the diagram shows the components which represent the physical devices that are used to collect data from the environment and perform actions in the physical world. These are grouped into:
- T3.1 Base Hardware (BH) components, which represent the hardware components of the robot, such as sensors, actuators, and other peripherals.
- T3.2 Ambient Sensing (AS) components, which represent the hardware components used to collect data from the environment, such as cameras, LiDARs, and other sensors.
- T3.3 Real Time Communications (RC) components, which represent the network infrastructure required to ensure real-time communications in the ambient.
T3.3 Communications Modules (CM) connect the hardware devices to the cyber world. These modules are connected to the T3.1 ROS Framework (RF), the main framework used to control the actions of robots and ambient sensing devices. This modularity ensures that AI-PRISM can easily adapt to different environments and work with a variety of hardware solutions. Communication Modules are also connected to the T3.1 IIoT Platform, which provides a single entry point for device management and communication management. The IIoT Platform is also connected to the Real Time Communications network components to manage the Quality of Service (QoS) of real-time communications. In this way, the IIoT Platform manages all devices and connectivity in the collaboration ambient. Summarizing, communication between the cyber world and the physical world:
- Is implemented through communication modules that manage the connection with hardware devices.
- All communication modules are connected to the ROS Framework.
- Communication modules are connected to the IIoT Platform, which provides central device management functions and manages QoS and communication with higher layers.
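The adapter role of the communication modules described above can be sketched as follows. This is a minimal illustration, not part of the AI-PRISM codebase: all class and method names (`CommunicationModule`, `read`, `command`) are hypothetical, and a real module would wrap vendor drivers and ROS/IIoT bindings.

```python
from abc import ABC, abstractmethod

class CommunicationModule(ABC):
    """Hypothetical adapter: wraps a hardware driver and exposes a
    uniform interface towards the ROS Framework and the IIoT Platform."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest measurement or status from the device."""

    @abstractmethod
    def command(self, action: str) -> bool:
        """Forward an actuation command to the underlying driver."""

class CameraModule(CommunicationModule):
    """Illustrative module for an ambient-sensing camera."""

    def read(self) -> dict:
        # A real module would call the vendor driver here.
        return {"device": "camera-01", "frame_id": 42}

    def command(self, action: str) -> bool:
        # Only commands supported by this device are accepted.
        return action in {"start", "stop"}

# The IIoT Platform can manage heterogeneous devices through the shared
# interface without knowing the underlying hardware:
modules = [CameraModule()]
statuses = [m.read() for m in modules]
```

Because every module implements the same interface, new hardware can be integrated by adding a module, without changes to the ROS Framework or the IIoT Platform.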
T3.2 Ambient Sensing (AS) components are also connected to the IIoT Platform through Communication Modules to keep all components of the ambient connected to the cyber world in a unified way. These modules process the raw data collected from the environment to create digital representations of the ambient (e.g. point clouds) and enable higher-level perception functionalities.
The T3.3 Data Platform (DP) is used to store and stream data from the ambient and the robots. It implements the data streaming and data storage services required to enable higher cognitive functionalities, which are grouped into:
- T4.2 AI based Perception Enhancing Modules (PE) which are used to process the data collected from the ambient and the robots and extract high-level features from it (e.g. object recognition, object tracking, etc.). Each module is specialized in a specific perception task to foster modularity and reusability.
- T4.3 AI-based Agent Level Reasoning Enhancing Modules (DR), which are used to enable smooth control of the actions the robot needs to perform, adapting to the changing conditions of the surrounding ambient and the human. Agent level modules learn the current status of the collaborative task the robot is performing (which task the robot is executing within the planned sequence of tasks, and what the situation in the ambient is) and, based on this information, choose the best action to ensure smooth collaboration. AI-based Agent Level Reasoning modules are connected directly to the robot controller and the ROS Framework to minimize latency.
- T4.4 Ambient Level Reasoning, Acting and Control (CR) modules, which are specialized in higher-level reasoning problems that focus on larger-scale time horizons and physical environments. Ambient level reasoning is concerned with the definition and configuration of the collaborative ambient (what resources are available, what tasks are to be performed, with what resources, and when) and with the control and coordination of operations (monitoring the execution of the manufacturing process and adjusting the allocation of resources to ensure optimal performance). AI-based Ambient Level Reasoning modules connect to the Data Platform to access the data collected from the ambient and the robots, and to the IIoT Platform to send updated programming instructions to the ambient (e.g. updated task queues, or new reasoning rules or constraints) to ensure overall performance.
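The interplay between the Data Platform's streaming services and a perception-enhancing module can be sketched as below. This is a toy stand-in, not the actual Data Platform API: `DataPlatform`, `Frame`, and `perception_module` are hypothetical names, and the "detection" logic is a placeholder for a real AI model.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Frame:
    """Simplified stand-in for a unit of streamed sensor data."""
    sensor_id: str
    payload: list  # e.g. a point cloud, reduced here to a list of points

class DataPlatform:
    """Toy model of the Data Platform's streaming and storage services."""
    def __init__(self):
        self.subscribers: list[Callable[[Frame], None]] = []
        self.features: list[dict] = []

    def subscribe(self, callback: Callable[[Frame], None]) -> None:
        # Streaming access: modules register for incoming frames.
        self.subscribers.append(callback)

    def publish(self, frame: Frame) -> None:
        for cb in self.subscribers:
            cb(frame)

    def store(self, feature: dict) -> None:
        # Batch storage: extracted features are kept for higher layers.
        self.features.append(feature)

def perception_module(platform: DataPlatform) -> None:
    """Illustrative PE module: extracts a high-level feature from each
    frame and stores it back in the platform."""
    def on_frame(frame: Frame) -> None:
        platform.store({"sensor": frame.sensor_id,
                        "objects": len(frame.payload)})
    platform.subscribe(on_frame)

dp = DataPlatform()
perception_module(dp)
dp.publish(Frame("lidar-01", [[0, 0, 1], [1, 2, 0]]))
```

The same publish/subscribe pattern lets several specialized perception modules consume one stream independently, which is what makes the modules reusable across scenarios.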
Finally, the T4.5 Human-Robot Interaction (HRI) components are used to enable interaction between the human and the robot. These components are connected to agent level reasoning modules to control the robot actions.
Note that the AI-PRISM platform is designed to be modular and extensible. This means that the platform can be easily extended to support new functionalities and components. As mentioned above, the platform can be extended with new hardware (sensing, robots) and communication modules to adapt to new environments. Similarly, it can be extended to support new perception tasks by adding new Perception Enhancing Modules (PE), or new reasoning tasks by adding new Agent or Ambient Level Reasoning Modules.
Note as well that the functional architecture of AI-PRISM enables the implementation of hierarchical reasoning and control mechanisms that implement complementary control loops at different levels of abstraction. This way, the platform can be used to implement a wide range of collaborative scenarios, from simple human-robot collaboration to more complex scenarios involving multiple robots and humans. For instance, in a typical scenario, agent level reasoning modules control the actions of a single robot, using sensors to detect changes in its surroundings and adapting the execution of planned actions accordingly. This low-level control loop uses low-latency communications to ensure that the robot can react to changes in its surroundings or to human interaction commands in real time. At the same time, ambient level reasoning modules monitor the execution of the manufacturing process, possibly coordinating the actions of different robots, and adjust the allocation of resources to ensure optimal performance. This high-level control loop works at larger time scales and consequently uses Data Platform services and higher-level perception and reasoning modules which are not subject to real-time constraints.
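The two complementary control loops can be sketched as two decision functions operating at different time scales. The logic is deliberately trivial and hypothetical (function names and rules are illustrative only); it shows the division of responsibilities, not actual AI-PRISM reasoning.

```python
def agent_level_step(obstacle_detected: bool, planned_action: str) -> str:
    """Fast, low-latency loop for a single robot: adapt the planned
    action to the immediate situation (placeholder rule)."""
    return "pause" if obstacle_detected else planned_action

def ambient_level_step(queue_lengths: dict) -> str:
    """Slow loop over the whole ambient: assign the next job to the
    least-loaded robot (placeholder rule)."""
    return min(queue_lengths, key=queue_lengths.get)

# The low-level loop reacts immediately, e.g. pausing when a human
# enters the workspace:
action = agent_level_step(obstacle_detected=True, planned_action="pick_part")

# The high-level loop rebalances work across robots at a larger
# time scale:
next_robot = ambient_level_step({"robot-a": 5, "robot-b": 2})
```

In a deployment, the first function would run close to the robot controller over the ROS Framework, while the second would run against Data Platform services without real-time constraints.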
Additionally, the AI-PRISM Platform implements the following cross-cutting functionalities:
- T3.4 Simulation Environment (SE), which is used to simulate the behavior of the robots and the ambient. This component is used to test to-be scenarios in a virtual environment before deploying them in the real world. Data-driven simulation and digital twinning are enabled by connecting the simulation environment to the Data Platform services. Additionally, other modules can connect to the simulation environment, for instance to enable simulation-based reasoning or to generate data to train the AI models.
- T4.5 Programming by Demonstration Environment (PD), which is used to enable the human to teach the robot new tasks by demonstrating them. This component is connected to the Human Machine Interaction Modules to enable the human to interact with the robot, and to the Data Platform to store the learned tasks and the data collected during the demonstration.
- T5.1 Human Safety Management Procedures (SP), which are used to ensure that the robots are safe to operate in the presence of humans. This component acts as a central repository of safety procedures and rules, implemented through hierarchical control loops involving different components, and it is connected to the Data Platform to ensure that these control mechanisms are configured to operate in a safe way.
Main interfaces
The table below provides a brief description of each of the interfaces shown in the functional diagram.
ID | Component | Name | Description |
---|---|---|---|
I_BH_1 | AI-PRISM Base Hardware | Hardware Drivers | Interfaces to control base hardware. Communication modules will use these interfaces to manage and interact with hardware devices |
I_AS_1 | AI-PRISM Ambient sensing infrastructure | Sensor Interfaces | Interfaces to control sensing infrastructure equipment. Communication modules will manage and exchange information with sensing hardware devices using these interfaces |
I_RC_1 | AI-PRISM Real Time Communications Network | Network Management Interfaces | Interfaces to manage and control communications network equipment. The IIoT Platform will use these (northbound) interfaces of real time communications infrastructure to manage QoS in the network |
I_AD_1 | AI-PRISM Ambient Digitalisation Modules | Ambient Digitalization Interfaces | Interfaces to integrate ambient digitalisation modules into the ROS framework |
I_CM_1 | AI-PRISM Communications Modules | Device Management | Device Management interfaces to manage robots and ambient sensing infrastructure devices in the ambient from AI-PRISM. The IIoT Platform will use these interfaces to manage devices and communications with higher layers in a centralized way |
I_IP_1 | AI-PRISM IIoT Platform | Equipment Control | Interfaces to control equipment from higher-level reasoning modules. Higher-level reasoning modules will use these interfaces to define or update scheduled tasks, control rules or constraints |
I_DS_1 | AI-PRISM Data Platform | Data Interfaces | The Data Platform provides centralised batch and streaming services to access data through these interfaces |
I_RF_1 | AI-PRISM ROS Framework | ROS Interfaces | ROS communication interfaces used in the ROS framework |
I_PE_1 | AI-PRISM AI-based Perception Enhancing Modules | Perception Interfaces | Interfaces to store extracted features in the data platform |
I_HI_1 | AI-PRISM Human - Machine Interaction (HMI) Modules | HMI Interfaces | Interfaces to enable human robot interaction and collaboration |
I_PD_1 | AI-PRISM Programming by Demonstration Environment | Programming by Demonstration Interfaces | Interfaces to store learned programs in the Data Platform |
I_DR_1 | AI-PRISM AI-based Agent Level Reasoning Enhancing Modules | Agent Level Reasoning Interfaces | Interfaces to integrate agent level reasoning and robot control into the platform |
I_CR_1 | AI-PRISM AI-based Ambient Level Reasoning Enhancing Modules | Ambient Level Reasoning Interfaces | Interfaces to store the results of ambient level reasoning modules in the Data Platform |
I_SE_1 | AI-PRISM Simulation Environment | Simulation Interfaces | Interfaces to request simulations to the simulation environment |
High Level Features
High-level features of AI-PRISM as a whole.
Feature 1. Introduce collaborative robotics in manufacturing environments
AI-PRISM facilitates the introduction of collaborative robotics in manufacturing scenarios. The modular platform, which requires minimal programming skills, can automate tasks that traditionally require human perception and manipulation. The robotic solutions delivered are robust, easy to use, require minimal learning and can be configured without requiring highly skilled personnel.
Feature 2. Digitalize collaborative workplaces
Sensors on the robots and around the workspace (the collaboration ambient) collect data about the environment and the activities taking place. This includes visual data from RGBD cameras, spatial data from on-board LiDAR or radar sensors, and even data from sensors installed in manufacturing equipment. This data can then be processed by AI algorithms to create a digital representation, or model, of the environment: identifying and tracking objects and people, understanding the tasks being performed, and predicting future actions or changes in the environment.
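The idea of fusing successive detections into a digital model of the workspace can be sketched as a simple state registry. The schema and function name are hypothetical illustrations; a real digitalization pipeline would involve calibration, tracking filters, and richer object models.

```python
def update_digital_model(model: dict, detections: list) -> dict:
    """Merge new sensor detections into a digital model of the
    workspace, keyed by object id (hypothetical schema)."""
    for det in detections:
        # The model keeps the latest observed state of each entity.
        model[det["id"]] = {"cls": det["cls"], "pos": det["pos"]}
    return model

model = {}
# First observation: a person is detected at one position...
update_digital_model(model, [{"id": "p1", "cls": "person", "pos": (2.0, 1.5)}])
# ...a later observation updates the tracked position.
update_digital_model(model, [{"id": "p1", "cls": "person", "pos": (2.4, 1.5)}])
```

Higher-level features (task recognition, action prediction) would then be computed over such a model rather than over raw sensor data.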
Feature 3. Capture tacit knowledge from workers
Capture expert knowledge that is difficult to transfer to other team members verbally or on paper. AI-PRISM captures knowledge as workers interact with collaborative robots. Through its "Programming by Demonstration" capability, the digital models are enhanced and cobots learn tasks by observing human actions.
Feature 4. Enhance reasoning capabilities of collaborative robotics
The high-level features extracted by AI based perception can help robots better understand human actions and intentions, allowing robots to anticipate human actions and adjust their own actions accordingly, leading to smoother and more efficient collaboration, better communication between humans and robots, and better planning of future actions.
Feature 5. Enhanced manufacturing operations management
The digital information extracted can also be used for optimization purposes at higher management levels, for instance to improve manufacturing operations scheduling, workforce planning, AMR navigation and routing, or workspace layout optimization.
Solutions Map
Mapping of solutions to high-level features.
Detailed Functional Specs
The following table contains links to the detailed specifications of each component in the reference architecture.
Acronym | Link |
---|---|
TM | Template |
RF | RF Documentation |
BH | BH Documentation |
CM | CM Documentation |
AS | AS Documentation |
RC | RC Documentation |
IP | IP Documentation |
DS | DS Documentation |
SE | SE Documentation |
AD | AD Documentation |
CD | CD Documentation |
PE | PE Documentation |
DR | DR Documentation |
CR | CR Documentation |
HI | HI Documentation |
PD | PD Documentation |
SP | SP Documentation |
NS | NS Documentation |