Underwater Human Arm Manipulator – System Calibration

Controlling a remotely operated underwater vehicle (ROV) is an extremely challenging task that requires precise maneuvering and navigation in complex and often unpredictable environments. The operator faces numerous difficulties, including limited visibility and communication constraints, in addition to the need to interpret data from various sensors. This paper describes a method for calibration of a wearable system equipped with inertial measurement unit (IMU) sensors that controls the underwater manipulators. To implement a solution that allows the robot to be controlled by the operator's hand movements, it is necessary to measure the movement of the arm. This task is carried out using the IMU sensors, which are mounted in appropriate places on the ROV operator's suit to allow mapping the movement of his/her upper limbs. These movements are transferred to the manipulator's arms on the ROV, making it possible to interact with the environment by manipulating objects underwater.


INTRODUCTION
In the field of robotics, the methods for controlling a robotic arm have remained unchanged for decades. The main applications of robotic arms are repetitive and continuous operations. Thus, the main programming methods are divided into on-line and off-line planning [1]. As a consequence of the recent global pandemic, another branch emerged, virtual reality programming [2], which is a connection between on-line and off-line programming.
Less frequent, but no less important, is the application of manipulators in teleoperation. In this case, robotic arms are controlled in real time by teleoperators to perform difficult and precise tasks. Teleoperation is used in medicine [3,4], bomb disposal [5], space exploration [6,7], and remotely operated underwater vehicles [8,9]. The development of devices facilitating the control of manipulators has significantly relieved operators and reduced the occurrence of errors [10]. The literature confirms that the implementation of assistive devices, such as haptic feedback systems [11], augmented reality interfaces [12], and advanced automation algorithms [13], has significantly enhanced the control of underwater robots. When creating a teleoperated manipulator, developers face the problem of designing a human-robot interface (HRI), which is a set of controls and state displays that ease the manipulation process. Most ready-to-use solutions taken from on-line programming are good, but lack intuitiveness and require operator training. In the teleoperation of a remotely operated underwater vehicle (ROV), at least two operators are typically used, one for platform stability and another for manipulator operation [13,14]. The operator uses a controller, which is a model of the manipulator that imitates its shape [15,16]. The operator can change the positions of the model elements, and these movements are translated into the movements of the real machine. In this case, the system is called a master-slave control, where the manipulator is a slave that follows the movement of the master (the controller). This type of controller offers joint-by-joint control of the manipulator, which requires additional training.
In contrast, the film and animation industry commonly utilizes motion capture technology to accurately capture and track body movements [17]. In this research, several inertial measurement units (IMUs) were used to track the hand motion, which controlled the manipulator in an ROV. Using this approach, an operator can manipulate a robot arm using his/her own hands. Similar solutions were investigated in [18,19,20]; however, these were usually limited to a direct mapping of one IMU channel to one joint, or to control with the use of predefined commands.
The contribution of this study is a calibration procedure that maps a set of rotation angles obtained from four IMUs, describing the position of the operator's hand as the bend angles of the wrist, elbow, and arm, to the angles of the joints of the manipulator. This enables the robot to be operated by the user's hand movements. The procedure is divided into two main steps, presented in Figure 1.

PROBLEM FORMULATION
Owing to the nature of the human arm, the proposed solution uses four inertial sensors, one for each movable part of the hand. Accordingly, the sensors (S0, S1, S2, S3) measure the movement and rotation of different body segments (Seg): the reference sensor for the body (Seg0), arm (Seg1), forearm (Seg2), and hand (Seg3). Because of the need to determine the movement of the operator's hand (arm-forearm-hand) regardless of the operator's movement, the values of the hand movement parameters are defined relative to the operator, that is, to his/her body, whose movement is measured by the reference sensor S0. Thus, the proposed solution is independent of the operator's movements, and only the rotational movements of the members of the kinematic chain of the operator's hand are defined. This is especially important if the ROV operator is on a rocking vessel. All IMU sensors measure their orientation relative to a reference system, hereinafter referred to as the navigation frame (nf) [21,22,23]. An example of the arrangement of the measurement sensors on the body of the operator in the proposed solution is shown in Figure 2. An important requirement during the installation of the sensors is the selection of places that guarantee their stable and unchanging position on the surface of the measured segment during its movement. The Xsens IMU system [24] was selected as the source of the movement and rotation measurements.
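The independence from whole-body motion follows from expressing each sensor orientation relative to the reference sensor S0 rather than the navigation frame. A minimal numpy sketch (function names are ours, not the authors') shows that a common rotation of the operator, such as the deck rocking, cancels out of the relative orientation:

```python
import numpy as np

def rot_z(a):
    """Rotation matrix about the Z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def relative_orientation(R_s0, R_si):
    """Orientation of sensor S_i relative to the reference sensor S_0
    instead of the navigation frame: R_rel = R_s0^T @ R_si."""
    return R_s0.T @ R_si

R_arm = rot_z(np.deg2rad(50.0))   # arm sensor in the navigation frame
R_body = np.eye(3)                # body reference sensor
R_rel_before = relative_orientation(R_body, R_arm)

# The whole operator (body + arm) turns by 30 degrees on deck:
R_turn = rot_z(np.deg2rad(30.0))
R_rel_after = relative_orientation(R_turn @ R_body, R_turn @ R_arm)

# The common rotation cancels; the relative orientation is unchanged.
assert np.allclose(R_rel_before, R_rel_after)
```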
For an unambiguous description of the rotational movement of the individual parts of the operator's hand, appropriately oriented reference coordinate systems should be associated with them. Figure 3 shows the proposed orientation of the reference systems with respect to the orientation of the individual parts of the operator's hand. For the neutral position of the operator's hand, a posture with the arm hanging freely along the body in an upright standing position was selected (Figure 2a). Figure 2b presents the operator in different body positions; however, in both configurations, all the frames of reference have the same configuration:
• the X axis (angle φ), represented in red, is directed perpendicularly from the operator's chest, parallel to the ground;
• the Y axis (angle θ), represented in green, is directed perpendicularly from the ground along the straight arm in the neutral position or, in the case of Seg1,2,3, is directed toward the previous joint;
• the Z axis (angle ψ), represented in blue, is directed perpendicularly from the surface determined by the XY axes, outwards from the operator's side and parallel to the surface of the earth.
The XYZ axes of the reference systems of the IMU sensors were marked similarly; their arrangement in relation to sensor S results from its construction, as shown in Figures 1 and 2. The arrangement of the axes of the IMU reference system relative to the operator depends on the mounting on the operator's body. It is important to note that the frame of segment Seg_n is, in general, not aligned with that of sensor S_n. The model of the operator's hand in the presented solution was constructed as a kinematic chain consisting of three movable members with rotational connections and nine degrees of freedom (9DoF). The rotation of a given member is always determined in relation to the directly preceding member, that is, in the local variant, which facilitates the interpretation of the obtained results, as opposed to the global description (relative to the body).
Rotation is often represented in one of three forms: a rotation matrix, rotation angles (e.g., Euler angles), or quaternions [25]. In the proposed solution, the form based on the rotation matrix is used as the most convenient description. One of the twelve possible sequences of Euler rotations was selected, namely the sequence of rotations about the X, Y, and Z axes:

R(φ,θ,ψ) = R_X(φ) R_Y(θ) R_Z(ψ) =
[ c_θ c_ψ                  −c_θ s_ψ                  s_θ      ]
[ c_φ s_ψ + s_φ s_θ c_ψ    c_φ c_ψ − s_φ s_θ s_ψ    −s_φ c_θ ]
[ s_φ s_ψ − c_φ s_θ c_ψ    s_φ c_ψ + c_φ s_θ s_ψ    c_φ c_θ  ]     (1)

where s_α = sin(α) and c_α = cos(α) for α ∈ {φ, θ, ψ}.
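Assuming the XYZ rotation sequence described above, the composition can be checked numerically. A small numpy sketch verifies that the product is a proper rotation matrix and that the pitch angle θ appears directly as one of its entries:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], dtype=float)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

def euler_xyz(phi, theta, psi):
    """Rotation composed in the XYZ sequence: R = R_X(phi) R_Y(theta) R_Z(psi)."""
    return rot_x(phi) @ rot_y(theta) @ rot_z(psi)

phi, theta, psi = 0.3, -0.7, 1.1
R = euler_xyz(phi, theta, psi)

# A proper rotation matrix is orthogonal with determinant +1.
assert np.allclose(R @ R.T, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)

# In the XYZ composition, the (0, 2) entry equals sin(theta).
assert np.isclose(R[0, 2], np.sin(theta))
```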
For the purpose of calibrating the IMU measurement sensors with the kinematic chain of the operator's hand, three groups of rotation matrices describing the relationships between the sensors and the members of the hand model were defined (Figure 3). The first group (marked in blue) concerns the rotation matrices describing the orientations of the sensors (S0–S3) in relation to their reference/navigation system (N). The second group (marked in green) concerns the mutual orientation matrices of the members of the kinematic chain model of the operator's hand (joints J0–J3). The last group (marked in red) describes the rotation of each IMU measuring sensor in relation to the frame of reference of the part of the operator's hand on which it is mounted. These matrices were determined during the calibration process.
In the presented solution, the following symbols of rotation matrix types were adopted:
• R^j_k, where j > k and j, k ∈ {0, 1, 2, 3} - for the j-th member in the kinematic chain of the operator's hand, the rotation of the j-th member relative to the k-th member;
• X, Y, Z, V - the calibration matrices associating the appropriate IMU sensor with the member on which it is mounted.
A graphical presentation of the respective rotation matrices is shown in Figure 3.

CALIBRATION
The calibration process consists of finding the unknown X, Y, Z, V rotation matrices describing the mutual orientation of each IMU sensor in relation to the frame of reference of the member of the kinematic chain of the hand on which it is mounted. These matrices must satisfy a system of matrix equations (2) that, for each member, ties the sensor orientations measured in the navigation frame to the joint rotations of the model through the corresponding calibration matrices. The solution of this system of equations is precisely the process of determining the calibration matrices of the operator's hand.
To solve the calibration problem (determining the X, Y, Z, V rotation matrices), calibration poses that solve the above system of Equations (2) have been proposed. The calibration process with the proposed static poses must be carried out each time the operator puts on the suit with the IMU sensors installed. An illustration of the position of the operator's hand in the calibration poses is schematically shown in Figure 4.
These poses can be characterized as follows:
• pose "I" - the operator in a standing position holds his/her arm straight down vertically, adjacent to the side of the body;
• pose "T" - starting from the "I" pose, the operator raises the extended arm to a horizontal position at the side of the body;
• pose "F" - starting from the "I" pose, the operator raises his/her extended arm to a horizontal position in front of him/her.
The proposed definition of the I-T-F calibration poses simplifies the rotation matrices of the kinematic chain model in the system of equations to known constant matrices: R_I for the "I" pose, R_T for the "T" pose, and R_F for the "F" pose, where R_pose denotes the rotation matrix of the model for a specific calibration pose. Consequently, for each pose, the measurement sequence {R_{S_i}} = {R_{S_i,1}, R_{S_i,2}, ..., R_{S_i,n}} of the rotation matrices of a given sensor S_i was recorded.
For simplicity, it was assumed that the number of measurements/points in each calibration pose was the same and equaled n. The process of solving the above system of equations was carried out using an iterative algorithm, in which each iteration consisted of two steps:
• a "prediction" of the Y, Z, and V matrices describing the rotations of the sensors mounted on the <arm>, <forearm>, and <hand> members, respectively;
• a "correction" of the matrix X describing the rotation of the reference sensor mounted on the reference member <body>.
The "prediction" and "correction" steps can be implemented by utilizing a modified algorithm for the rotation matrix that solves the well-known "Wahba's problem" [26]. Alternatively, as in the implementation presented later, a modified least-squares method adapted to rotation matrices [27,28] can be employed.
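Wahba's problem, finding the rotation that best aligns a set of vector observations in one frame with the same vectors in another, has a closed-form solution via the SVD (Markley's method). The sketch below is a generic textbook implementation, not the authors' exact modified algorithm:

```python
import numpy as np

def solve_wahba(body_vecs, ref_vecs, weights=None):
    """Markley's SVD solution to Wahba's problem: find the rotation R
    minimizing sum_i w_i * ||ref_i - R @ body_i||^2."""
    if weights is None:
        weights = np.ones(len(body_vecs))
    # Attitude profile matrix B = sum_i w_i * ref_i @ body_i^T
    B = sum(w * np.outer(r, b) for w, r, b in zip(weights, ref_vecs, body_vecs))
    U, _, Vt = np.linalg.svd(B)
    # Force det(R) = +1 so the result is a rotation, not a reflection.
    M = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ M @ Vt

# Recover a known rotation from noiseless vector observations.
rng = np.random.default_rng(0)
angle = np.deg2rad(40.0)
c, s = np.cos(angle), np.sin(angle)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
body = [rng.standard_normal(3) for _ in range(6)]
ref = [R_true @ v for v in body]

R_est = solve_wahba(body, ref)
assert np.allclose(R_est, R_true)
```

With noisy measurements the same call returns the least-squares optimal rotation, which is what makes it suitable inside an iterative prediction/correction loop.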

RESULTS
To verify the correct operation of the calibration process, several measurement and calibration experiments were conducted, and a visual assessment of the correspondence between the movements of the virtual arm recreated after the calibration process and the hand movements made by the tester was carried out. An example of the results of one test is presented below. The test sequence in the first phase consisted of the sequential calibration poses. Then, the subject performed several characteristic movements similar to the calibration positions.
Figure 5 shows the values of the rotation angles (φ, θ, ψ) (expressed in degrees) for the IMU measuring sensors (S0–S3). The calibration sequences were marked at time intervals of 3 s each ("I" is the pink area, "T" is the lime area, and "F" is the purple area).
For the measurement data collected in this way, the previously described calibration process was carried out. As a result of the "IMU-arm" calibration, the following calibration matrices (4) were determined.
Taking into account the definition of the operator's hand model and the obtained calibration matrices (4), it was possible to determine the change in the orientation of the individual members of the kinematic chain. The rotation angles (φ, θ, ψ) were determined for the <body-shoulder>, <shoulder-forearm>, and <forearm-hand> pairs, respectively.
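Given the XYZ Euler sequence used here, the angles (φ, θ, ψ) of a segment pair can be read back from its rotation matrix. A sketch of this inverse step, valid away from the gimbal-lock case |θ| = 90°:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], dtype=float)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

def euler_from_matrix_xyz(R):
    """Recover (phi, theta, psi) from R = R_X(phi) @ R_Y(theta) @ R_Z(psi).
    Uses R[0,2] = sin(theta), R[1,2] = -sin(phi)cos(theta),
    R[0,1] = -cos(theta)sin(psi); valid away from gimbal lock."""
    theta = np.arcsin(np.clip(R[0, 2], -1.0, 1.0))
    phi = np.arctan2(-R[1, 2], R[2, 2])
    psi = np.arctan2(-R[0, 1], R[0, 0])
    return phi, theta, psi

# Round trip: compose a rotation, then read the angles back.
phi, theta, psi = 0.3, -0.7, 1.1
R = rot_x(phi) @ rot_y(theta) @ rot_z(psi)
angles = euler_from_matrix_xyz(R)
assert np.allclose(angles, (phi, theta, psi))
```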
The courses of the appropriate angles for a given segment of the hand model are shown in Figure 6. The first three vertical lines (pink, lime, purple) shown in Figure 6 correspond to the selected time instants of the corresponding poses (I, T, and F) in the calibration algorithm: 5 s for the "I" pose, 10 s for the "T" pose, and 16 s for the "F" pose. To increase the readability of the obtained results, a visualization of the arm movements was prepared for the previously taken measurements from the IMU sensors after the "IMU-arm" calibration process. Examples of selected moments and the corresponding views of the visualizations "from the front", "from the right side", and "from the top" are shown in Figure 7.

ROV MANIPULATOR
The constructed manipulator attached to the ROV has five degrees of freedom (5DoF): a base that can be moved up or down, four motors that rotate in the horizontal plane, a rotary effector, and a closing gripper. The manipulator and its kinematic diagram are shown in Figure 8. For transportation purposes and owing to construction constraints, there is a redundancy between joints θ2, θ3, and θ4. The closing of the gripper is controlled in a different manner, and thus, only five joints were used during the procedures described here. After the calibration procedure, the solution was tested and proven to function with the designed manipulator. The user attached the four IMU sensors and performed the calibration procedure, which required standing in positions "I", "T", and "F". After the hand movement was mapped, the selected movements of the user's joints were mapped onto the movements of the manipulator joints. Using the presented IMU-based system, the manipulator can imitate the movement of a human arm. Owing to the typical tasks performed by underwater robots, the manipulator is intended to perform typical movements such as closing the gripper, manipulating the grasped object, and avoiding obstacles based on the video signal. The current control system does not provide for on-line or off-line trajectory planning. Figures 10 and 11 show frames from the recording taken during testing of the equipment. The markings in these figures represent the user's movements that were performed. On the basis of the real robot presented in Figure 8, the kinematic scheme shown in Figure 9 can be determined.
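The final step, sending the calibrated human joint angles to the manipulator, can be sketched as a simple clamp into each joint's admissible range. The joint names and limits below are illustrative placeholders, not the actual parameters of the constructed 5DoF manipulator:

```python
import numpy as np

# Illustrative joint limits in radians; the real limits of the
# constructed manipulator are not specified in the paper.
JOINT_LIMITS = {
    "theta1": (-np.pi / 2, np.pi / 2),
    "theta2": (-np.pi / 2, np.pi / 2),
    "theta3": (0.0, np.pi),
}

def map_human_to_manipulator(human_angles):
    """Clamp each calibrated human joint angle into the corresponding
    manipulator joint range before sending it as a command."""
    commands = {}
    for joint, angle in human_angles.items():
        lo, hi = JOINT_LIMITS[joint]
        commands[joint] = float(np.clip(angle, lo, hi))
    return commands

# A bend beyond the joint range is saturated at the limit.
cmd = map_human_to_manipulator({"theta1": 0.3, "theta2": -2.0, "theta3": 1.0})
assert cmd["theta2"] == -np.pi / 2
```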
After checking the operation of the manipulator control system using the IMU sensors mounted on the operator, tests on the ROV began.

CONCLUSIONS
Despite the significant similarity between the kinematic chain of the applied manipulator and the structure of the human arm, the existing differences led to difficulties in directly converting the positions of the arm parts and the angles between them into the angles of the joints of the manipulator. The implementation of the proposed calibration procedure allowed assigning appropriate conversion coefficients to translate the movements during positioning of the manipulator's gripper. As a result, the user could successfully control the manipulator regardless of the user's position. Owing to the calculation of positions with respect to the reference sensor S0, the user could walk freely without causing manipulator movement as long as the hands were kept in a constant position. The solution worked slowly and with a slight delay; however, the method of controlling the manipulator was intuitive. Figure 13 shows the tests of the manipulator attached to the ROV in a natural environment. After tracking the operator's movement, a problem of steering delay due to the use of hydraulic motors was detected. As part of further work, the decision was made to rebuild the manipulator drive system and replace the hydraulic motors with their electric equivalents. This solution should eliminate the problems with delayed reaction to the operator's movements.

Figure 1.
Figure 1. Idea of a human-machine interface that allows the robot to be controlled by the operator's hand using IMU sensors

Figure 2.
Figure 2. Proposed placement of the IMU sensors on the operator's body in the system modeling the movement/orientation of hand parts

Figure 3.
Figure 3. Schematic model of the association of the IMU measuring sensors and members of the kinematic chain of the operator's arm

Figure 5.
Figure 5. Courses of rotation angles of individual measuring sensors in the "IMU-arm" calibration experiment

Figure 11.
Figure 11. Testing of elbow movement