HUMAN-MACHINE INTERACTION (HMI) MODULES

Introduction

An HMI module is part of the eTaSL controller explained in the programming-by-demonstration module. It basically consists of two parts:

The architectural diagram depicts how these parts interact:

Architectural diagram

As [1] describes, the robot behaviour can be constructed from a number of building blocks (i.e. groups of constraints), of which there are basically two types:

- RAM (reactive action model): describes nominal behaviour and how to deviate from this nominal behaviour in case of disturbances
- SIM (sensor interaction model): describes a set of constraints that define the interaction with sensor information; HMI-related constraints are of this type

In other words, the constraint-based approach allows us to decouple the specification of what to do with the HMI from the other constraints of the robot system. For a particular execution of the robot application or demonstration, these building blocks are assembled into one eTaSL specification.

Within AI-Prism, we can distinguish between the following types of constraints (building blocks):

[1] Iregui, Santiago, Joris De Schutter, and Erwin Aertbeliën. "Reconfigurable constraint-based reactive framework for assistive robotics with adaptable levels of autonomy." IEEE Robotics and Automation Letters 6.4 (2021): 7397-7405.

Technical Specifications

Software and Hardware Requirements

A ROS2 Humble environment is assumed, and therefore an Ubuntu Linux Jammy Jellyfish (22.04) operating system is recommended. This module closely interacts with the programming-by-demonstration module.

Usage Manual

Most of the HMI developments are defined within the eTaSL task specification framework. For installation instructions and a usage manual, we refer to the programming-by-demonstration component.

Use Case 1.

Use Case Diagram

The main objective of the Human-Machine Interaction (HMI) module is to provide the necessary interfaces between the human workers and the robotic agents during two important phases of human-robot collaborative tasks.

The first phase, referred to as the demonstration phase, consists of allowing the human to teach the robot, through demonstration, the task (or part of the task) that the robot should execute. If only motion is important to specify the task, the demonstrator can opt for kinesthetic teaching (hand-guiding the robot while performing the task) or for passive observation. The latter refers to the robotic system “observing” what the human does to perform the task while the robot itself remains still. This can be done by designing specific tools that facilitate the demonstration while tracking the motion (i.e. position and orientation). On the other hand, if not only motion but also interaction forces and moments are important (e.g. in assembly operations or other operations that require contact), then forces and moments are also recorded and later on provided to the learning model. More details on programming by demonstration are given in the programming-by-demonstration module.

The second phase, the execution phase, executes a skill that allows human-robot interaction. HMIs must, therefore, allow this interaction by communicating information from the human to the robot. For example, if the human wants to disturb the trajectory of the robot (e.g. to hold a tool or to stop a robotic arm), this can be done via force interaction (by measuring end-effector forces/moments or joint torques). This HMI/sensor information can be mapped accordingly and interpreted as setpoints that are sent to a continuous robot controller. These setpoints can be a desired force, torque, velocity, or rotational velocity. The exact interpretation in terms of behaviour is specified in the constraint-based specification language of the robot controller (see the programming-by-demonstration component). The HMI/sensor information can also be mapped at the discrete level, for example by pressing a button to indicate a pause; in that case, the information is sent instead to a state coordinator (e.g. the rFSM finite state machine).
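To make the continuous-level mapping concrete, the following minimal sketch shows a ROS 2 node that turns a measured end-effector wrench into a velocity setpoint through a simple admittance gain. The topic names (`ft_sensor/wrench`, `velocity_setpoint`), the gain, and the deadband are illustrative assumptions; in the actual system, the interpretation of HMI/sensor information is specified in eTaSL constraints rather than hard-coded in a node.

```python
# Minimal sketch: continuous-level mapping from a measured wrench to a
# velocity setpoint. Topic names, gain, and deadband are illustrative
# assumptions; the real behaviour is specified in eTaSL constraints.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import WrenchStamped, TwistStamped

class WrenchToSetpoint(Node):
    def __init__(self):
        super().__init__('wrench_to_setpoint')
        self.gain = 0.005     # [m/s per N], illustrative admittance gain
        self.deadband = 2.0   # [N], ignore sensor noise below this force
        self.pub = self.create_publisher(TwistStamped, 'velocity_setpoint', 10)
        self.create_subscription(
            WrenchStamped, 'ft_sensor/wrench', self.on_wrench, 10)

    def on_wrench(self, msg: WrenchStamped):
        twist = TwistStamped()
        twist.header = msg.header
        # Map each force component to a linear velocity, with a deadband
        for axis in ('x', 'y', 'z'):
            f = getattr(msg.wrench.force, axis)
            v = self.gain * f if abs(f) > self.deadband else 0.0
            setattr(twist.twist.linear, axis, v)
        self.pub.publish(twist)

def main():
    rclpy.init()
    rclpy.spin(WrenchToSetpoint())

if __name__ == '__main__':
    main()
```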

The transfer of information is not only from human to machine, but also from machine to human. HMIs inform the operator about the states of the robot to improve situation awareness and avoid mode confusion. Different modalities exist and can be used depending on the application and the context. For example, red LED lights can be used to indicate problems with the task execution or the machine. Likewise, during demonstration, when there is physical interaction between human and robot, force interaction can be used to communicate implicitly with the operator. Graphical user interfaces also belong to this category.
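As a minimal sketch of such machine-to-human feedback, the node below maps a robot status string to a red warning LED. The topic names (`robot_status`, `led/red`) and the set of error-like states are assumptions for illustration only.

```python
# Minimal sketch: machine-to-human feedback. A status string (hypothetical
# 'robot_status' topic) is mapped to a red warning LED (hypothetical
# 'led/red' topic, driven by e.g. a GPIO bridge node).
import rclpy
from rclpy.node import Node
from std_msgs.msg import String, Bool

class StatusToLed(Node):
    def __init__(self):
        super().__init__('status_to_led')
        self.led = self.create_publisher(Bool, 'led/red', 10)
        self.create_subscription(String, 'robot_status', self.on_status, 10)

    def on_status(self, msg: String):
        # Turn the red LED on for any error-like state, off otherwise
        self.led.publish(Bool(data=msg.data in ('error', 'fault', 'estop')))

def main():
    rclpy.init()
    rclpy.spin(StatusToLed())

if __name__ == '__main__':
    main()
```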

This component and the programming-by-demonstration component are heavily interlinked. The programming-by-demonstration component uses the HMI to interact with the operator during both the demonstration phase and the execution phase. Compared to the use case diagram in the programming-by-demonstration component, this diagram offers more detail on the demonstration phase and execution phase.

During demonstration, the demonstrator interacts with the HMI in order to demonstrate the subtask at hand (right side of the use case diagram). Two approaches to demonstration are envisioned:

When interaction forces are also of importance to the application, a passive observation system is preferred because such a system introduces fewer force disturbances into the demonstration. In these cases, the tracking device is extended with a force/torque sensor.
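A minimal sketch of such a recording setup is given below, assuming the tracked tool pose and the force/torque measurements arrive on separate topics (`tool_tracker/pose` and `tool_ft/wrench`, both hypothetical names) and are approximately time-synchronized before being buffered.

```python
# Minimal sketch: passive observation with a tracked demonstration tool
# extended with a force/torque sensor. Topic names are assumptions; the
# pose and wrench streams are approximately time-synchronized.
import rclpy
from rclpy.node import Node
import message_filters
from geometry_msgs.msg import PoseStamped, WrenchStamped

class DemoRecorder(Node):
    def __init__(self):
        super().__init__('demo_recorder')
        self.samples = []  # list of (stamp, pose, wrench) tuples
        pose_sub = message_filters.Subscriber(self, PoseStamped, 'tool_tracker/pose')
        wrench_sub = message_filters.Subscriber(self, WrenchStamped, 'tool_ft/wrench')
        sync = message_filters.ApproximateTimeSynchronizer(
            [pose_sub, wrench_sub], queue_size=50, slop=0.01)
        sync.registerCallback(self.on_sample)
        self._sync = sync  # keep a reference so the synchronizer stays alive

    def on_sample(self, pose: PoseStamped, wrench: WrenchStamped):
        # Store one synchronized motion/force sample of the demonstration
        self.samples.append((pose.header.stamp, pose, wrench))

def main():
    rclpy.init()
    rclpy.spin(DemoRecorder())

if __name__ == '__main__':
    main()
```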

Additionally, the demonstrations also include data segmentation and interaction with the learning algorithm; this is handled by a discrete-event HMI. This discrete-event type of HMI can be a button/light system or a graphical user interface.
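The sketch below illustrates such a button-based discrete-event HMI: each button press toggles between starting and stopping a demonstration segment and emits a corresponding event. The topic names `hmi/button` and `demonstration/events` are hypothetical.

```python
# Minimal sketch: a button/light style discrete-event HMI. A button press
# (hypothetical 'hmi/button' topic) is turned into segmentation events for
# the learning pipeline (hypothetical 'demonstration/events' topic).
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool, String

class ButtonHmi(Node):
    def __init__(self):
        super().__init__('button_hmi')
        self.recording = False
        self.events = self.create_publisher(String, 'demonstration/events', 10)
        self.create_subscription(Bool, 'hmi/button', self.on_button, 10)

    def on_button(self, msg: Bool):
        if not msg.data:
            return  # react on press only, ignore release
        # Each press toggles between starting and stopping a segment
        self.recording = not self.recording
        event = 'segment_start' if self.recording else 'segment_stop'
        self.events.publish(String(data=event))

def main():
    rclpy.init()
    rclpy.spin(ButtonHmi())

if __name__ == '__main__':
    main()
```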

During execution of the nominal application (left side of the use case diagram), the operator can still interact with the system. An example of such an interaction is indicating to the robot system that the operator is back and wants to take over the task at hand.


Use Case Mock-ups

This is a backend/hardware development component; hence, it does not require front-end interfaces. A mock-up scenario is explained in the programming-by-demonstration module.

Functional Specifications

Functional Block Diagram

The functional block diagram shows a typical flow chart for the demonstration phase and the execution phase.

In the demonstration phase, nominal parts of the application are running until a subtask is reached where the force/motion interaction needs to be adapted to the site/location/task at hand. The robot application pauses until the demonstrator signals a start, for example by pressing a button, and the force/motion interaction is recorded. This continues until a set of demonstrations of sufficient quality is obtained.
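The following self-contained sketch mirrors this flow; the console prompts are hypothetical stand-ins for the actual HMI (e.g. a button press) and for the quality assessment performed by the learning pipeline.

```python
# Minimal, self-contained sketch of the demonstration-phase flow described
# above. The console stand-ins below are hypothetical placeholders for the
# actual HMI (button press) and the learning pipeline (quality check).
def wait_for_start_signal():
    input('Application paused. Press <Enter> to start a demonstration... ')

def record_demonstration():
    # Placeholder: in the real system, force/motion samples are recorded here
    return {'samples': []}

def quality_ok(demo) -> bool:
    return input('Keep this demonstration? [y/n] ').strip().lower() == 'y'

def demonstration_phase(min_demos: int = 3):
    demonstrations = []
    while len(demonstrations) < min_demos:
        wait_for_start_signal()          # demonstrator signals a start
        demo = record_demonstration()    # force/motion interaction is recorded
        if quality_ok(demo):             # keep only sufficient-quality demos
            demonstrations.append(demo)
    return demonstrations

if __name__ == '__main__':
    demonstration_phase()
```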

During the execution phase, the demonstrations and the extracted model of the force/motion interaction are combined with the constraint-based specification that the application developer has made. This allows the system to execute the task at hand. During execution, there can still be interactions with the operator. The appropriate reaction to disturbances or operator input is described using the constraint-based task specification language explained in the programming-by-demonstration module. This is combined with discrete-event coordination to handle starting/stopping/resuming/recovering. The flow chart on the right gives a simple example of such coordination.
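As an illustration of such discrete-event coordination, the sketch below implements a minimal table-driven state machine in the spirit of an rFSM-style coordinator; the states and events are illustrative, not those of the actual coordinator.

```python
# Minimal sketch of discrete-event coordination for starting/stopping/
# resuming/recovering. States and events are illustrative assumptions.
TRANSITIONS = {
    ('idle',       'start'):   'running',
    ('running',    'stop'):    'paused',
    ('paused',     'resume'):  'running',
    ('running',    'error'):   'recovering',
    ('recovering', 'recover'): 'paused',
}

def coordinate(events):
    state = 'idle'
    for event in events:
        # Unknown (state, event) pairs leave the state unchanged
        state = TRANSITIONS.get((state, event), state)
        print(f'event={event!r:12} -> state={state!r}')
    return state

if __name__ == '__main__':
    # Example trace: operator starts, a disturbance triggers recovery,
    # then the operator resumes and finally stops the task.
    coordinate(['start', 'error', 'recover', 'resume', 'stop'])
```

In the real system, such events would originate from the HMI sketched earlier, and the coordinator would in turn activate or deactivate the corresponding eTaSL building blocks.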

functional block diagram

Main interfaces

Due to the tight coupling between this HMI module and the programming-by-demonstration module, we do not repeat the interfaces listed in the programming-by-demonstration module and limit ourselves to the specific eTaSL descriptions of the HMI's behaviour:

| ID | Component | Name | Description | Sense |
|----|-----------|------|-------------|-------|
| 1 | eTaSL controller during PbD | eTaSL HMI specification | A set of constraints that describe how the input/output of the HMI should be handled. | In |
| 2 | eTaSL controller during execution | eTaSL HMI specification | A set of constraints that describe how the input/output of the HMI should be handled. | In |

Sequence Maps

For the sequence map, we refer to the programming-by-demonstration module, which illustrates how the HMI modules are used in a robot application that involves programming by demonstration.