With their human-like morphology and locomotor capabilities, humanoid robots are expected to take on a wide variety of operation and locomotion tasks in the future, supporting or replacing human labor.
However, to work in cramped and confined environments, humanoid robots must be able to move robustly using multi-contact motion.
Multi-contact motion involves contact not only at the ends of the robot's limbs, such as the hands and feet, but also at the middle of the limbs, such as the knees and elbows.
Although the planning and control of multi-contact motion for humanoid robots has been actively researched in recent years, most humanoid robots that achieve multi-contact motion are limited to contacts at the hands and feet, and cannot make contact with arbitrary areas of the body the way humans can.
▍A new control method for whole-body multi-contact motion
In this work, a motion that may involve contact with any part of the robot's body is defined as whole-body multi-contact motion.
There are two main challenges in achieving this complex form of motion: perceiving contact over the whole body, and controlling balance in a multi-contact state.
Recently, researchers from the CNRS-AIST JRL and Tokyo University of Science have worked together to conduct an in-depth study of this and develop a control method.
This method realizes whole-body multi-contact motion through distributed tactile sensors installed on the surface of the robot's body. Compared with traditional force/torque sensors, these thin, flexible, distributed tactile sensors can measure contact over the whole body without drastically changing the shape of the robot. As a result, the robot can not only make contact at the ends of its limbs, but also support itself with the middle of its limbs, such as the knees and elbows, which greatly improves its stability in the face of disturbances and environmental errors.
Although there is a wide range of research on tactile measurement for humanoid robots, from sensor development to sensor-based motion generation, tactile sensors have rarely been used for balance control, apart from computing the center of pressure (CoP) and the support area of the soles during bipedal walking. In the present study, the team explicitly used tactile sensors to control the robot's balance during whole-body multi-contact motion.
By extending their previously developed multi-contact motion control framework, the research team equipped the robot with distributed tactile sensors, extended the contact measurement range to the middle of the limbs, and stabilized the robot's motion and balance through feedback control based on both force/torque sensors and the distributed tactile sensors. Dynamic simulation results show that the tactile feedback developed by the team greatly improves the robustness of whole-body multi-contact motion against disturbances and environmental errors.
In addition, the research team conducted experiments in both simulation and the real world. In these tests, RHP Kaleido, a life-size humanoid robot with distributed tactile sensors on its limbs, demonstrated a variety of whole-body multi-contact motions, such as stepping forward while supporting the body with forearm contact and maintaining seated balance with thigh contact. This suggests that, with the control method developed by the team, the humanoid robot can perform whole-body multi-contact motion with better robustness.
The related paper, titled "Whole-Body Multi-Contact Motion Control for Humanoid Robots Based on Distributed Tactile Sensors", was published in IEEE Robotics and Automation Letters.
Next, let's explore this research result in depth with the Robotics Lecture Hall!
▍What are the two major implementation difficulties? How does the new control method work?
The control system proposed by the research team consists of two parts: center-of-mass (CoM) motion control and limb motion control. Compared with the previously developed control system, the new system adds tactile-sensor-based extensions to these modules to accommodate whole-body contact. Specifically:
CoM motion control
CoM motion control is achieved through the resultant moment acting on the robot's center of mass. This resultant moment is composed of the contact moments distributed over the robot's contact areas, which are updated online based on the actual contact polygons measured by the distributed tactile sensors. Model predictive control (MPC) plans the CoM motion to minimize the error between the CoM state and its reference, and proportional-derivative (PD) feedback stabilizes the CoM state.
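As a rough illustration of the PD part of this idea (not the authors' implementation), the sketch below computes a desired resultant contact force from the CoM error under a simple point-mass model; the function name, gains, and mass are assumptions, and the MPC planning step is omitted.

```python
import numpy as np

# Hypothetical sketch: PD feedback on the CoM state under a point-mass model.
GRAVITY = np.array([0.0, 0.0, -9.81])

def com_pd_feedback(com, com_vel, com_ref, com_vel_ref, com_acc_ref,
                    kp=50.0, kd=14.0, mass=60.0):
    """Return the desired resultant contact force acting on the CoM."""
    # PD law: desired CoM acceleration from reference feedforward plus error feedback.
    com_acc_des = (com_acc_ref
                   + kp * (com_ref - com)
                   + kd * (com_vel_ref - com_vel))
    # Newton's equation: sum of contact forces = m * (a_des - g).
    return mass * (com_acc_des - GRAVITY)

# Example: the robot is slightly behind and below its reference CoM.
f_des = com_pd_feedback(com=np.array([0.00, 0.0, 0.78]),
                        com_vel=np.zeros(3),
                        com_ref=np.array([0.02, 0.0, 0.80]),
                        com_vel_ref=np.zeros(3),
                        com_acc_ref=np.zeros(3))
print(f_des)  # desired resultant force [N], to be distributed over the contacts
```

This desired resultant is what the limb motion control described next must realize by distributing moments to the individual contact areas.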
Limb motion control
Limb motion control realizes the desired resultant moment by distributing contact moments to the individual contact areas. The distribution is computed by solving a quadratic programming (QP) problem that enforces unilateral and friction constraints. Damping control is then used to track the desired contact moment at each contact area, with the distributed tactile sensors providing the measured contact moment.
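As a simplified illustration of such a distribution step (not the paper's exact QP), the sketch below splits a desired total contact force between two contact areas using SciPy's SLSQP solver, with unilateral and linearized friction-cone constraints; the function name, number of contacts, and friction coefficient are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def distribute_forces(f_total, n_contacts=2, mu=0.5):
    """Split f_total among n_contacts contact forces (returns an (n_contacts, 3) array)."""
    x0 = np.tile(f_total / n_contacts, n_contacts)       # warm start: equal split

    def cost(x):                                          # prefer small, evenly shared forces
        return np.sum(x ** 2)

    # Equality: the contact forces must sum to the desired resultant.
    cons = [{"type": "eq",
             "fun": lambda x: x.reshape(-1, 3).sum(axis=0) - f_total}]
    for i in range(n_contacts):
        # Unilateral constraint: normal (z) component must be non-negative.
        cons.append({"type": "ineq", "fun": lambda x, i=i: x[3 * i + 2]})
        for a in (0, 1):
            # Linearized friction cone: -mu * fz <= f_tangential <= mu * fz.
            cons.append({"type": "ineq",
                         "fun": lambda x, i=i, a=a: mu * x[3 * i + 2] - x[3 * i + a]})
            cons.append({"type": "ineq",
                         "fun": lambda x, i=i, a=a: mu * x[3 * i + 2] + x[3 * i + a]})

    res = minimize(cost, x0, constraints=cons, method="SLSQP")
    return res.x.reshape(-1, 3) if res.success else None

# Example: share roughly 600 N of support between a foot and a forearm contact.
print(distribute_forces(np.array([30.0, 0.0, 600.0])))
```

The paper distributes contact moments over the measured contact polygons rather than point forces; the sketch only mirrors the structure of the constraints.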
Distributed tactile sensing
To achieve whole-body multi-contact motion control, the research team also installed distributed tactile sensors on the surface of the robot's limbs to obtain the key contact information. These sensors consist of many cells that each measure only the normal contact response, and their measurements are converted into contact moments for damping control. To update the vertices of the contact polygon online, the system estimates the contact area from the tactile measurements and takes the smallest rectangle containing all the cells that detected contact as the contact polygon.
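The sketch below illustrates this polygon update under simplifying assumptions (a fixed activation threshold and an axis-aligned rectangle in the contact-surface frame); the names and data layout are hypothetical.

```python
import numpy as np

def contact_polygon(cell_positions, cell_pressures, threshold=0.1):
    """cell_positions: (N, 2) cell coordinates in the contact-surface frame.
    Returns the 4 vertices of the bounding rectangle of the active cells, or None."""
    active = cell_positions[cell_pressures > threshold]   # cells that detected contact
    if len(active) == 0:
        return None                                       # no contact on this surface
    (xmin, ymin), (xmax, ymax) = active.min(axis=0), active.max(axis=0)
    return np.array([[xmin, ymin], [xmax, ymin], [xmax, ymax], [xmin, ymax]])

# Example: a 4 x 4 grid of cells spaced 2 cm apart, with only the lower-left patch in contact.
grid = np.array([[x, y] for x in range(4) for y in range(4)], dtype=float) * 0.02
pressure = np.where((grid[:, 0] < 0.05) & (grid[:, 1] < 0.05), 1.0, 0.0)
print(contact_polygon(grid, pressure))
```

The resulting vertices can then be fed to the CoM and limb controllers as the online-updated contact polygon.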
▍Simulation and real-world experiments prove the control method is feasible!
To verify the feasibility of the control method, the research team installed distributed tactile sensor e-skins on the forearms and legs of the life-size humanoid robot RHP Kaleido, and conducted experiments in both simulated and real-world environments.
Simulation experiments
In the MuJoCo dynamics simulator, the research team verified the effectiveness of whole-body multi-contact motion with the virtual humanoid robot JVRC-1. The simulation experiments covered three motions: elbow-contact walking, knee-contact standing, and thigh-contact sitting. The results show that using tactile feedback significantly improves the robustness of the robot's motion compared with no tactile feedback.
Elbow-contact walking: when walking along a sloped wall with height errors, the robot with tactile feedback walks stably over a wider range of wall-height errors and shows improved tracking of the zero moment point (ZMP) at the feet.
Knee-contact standing: with tactile feedback, the robot can withstand larger disturbance forces applied from the front and back while maintaining balance.
Thigh-contact sitting: when sitting on a rotating seat panel, the robot using tactile feedback and online contact-area updates successfully maintains balance and avoids tipping backwards.
Real-world experiments
In the real world, the research team used RHP Kaleido, a humanoid robot equipped with distributed tactile sensor e-skins, to demonstrate whole-body multi-contact motion.
Preliminary experiments verified the effectiveness of tactile feedback through physical interaction between a human and the robot's forearm. The robot then successfully performed a walking motion with its forearm in contact with the environment and a seated balancing motion with only its thighs touching the seat.
The experimental results show that the robot remains stable during whole-body multi-contact motion despite environmental and model errors.