
ARUCO / VISP Hand-Eye Calibration

The calibration package used in this experiment is https://github.com/jhu-lcsr/aruco_hand_eye — git clone it into ~/catkin_ws/src/ and build it.

Alternatively, use the version I have tested with a RealSense SR300 and a Kinova j2s7s300: https://github.com/FeiYuejiao/aruco_hand_eye

Generate a marker with the online ArUco generator: http://chev.me/arucogen/


After printing the generated marker, measure its real side length yourself: the printed size may deviate from the nominal 100 mm. Note that `markersize` is given in meters, so a marker that measures, say, 97 mm would be passed as 0.097.


Launch the RealSense and check which frames are available. I chose `camera_color_optical_frame` (under `camera_aligned_depth_to_color_frame`), because ArUco uses the RGB image and the depth frame is aligned to the color frame here:

roslaunch realsense2_camera rs_rgbd.launch

rosrun rqt_tf_tree rqt_tf_tree


The upstream README covers two cases, eye-in-hand and eye-on-base. This experiment uses eye-on-base: the camera is fixed in place and the marker is mounted on the robot's end-effector. Following the eye-on-base example, create a kinova.launch file:

<launch>
  <!-- The end-effector frame_id; depends on which robot you use -->
  <arg name="ee_frame" value="/j2s7s300_end_effector"/>

  <!-- Bring up a realsense -->
  <include file="$(find realsense2_camera)/launch/rs_rgbd.launch">
  </include>

  <!-- Calibrate the extrinsics for a realsense mounted to a robot base -->
  <!-- The user needs to specify markerid and markersize -->
  <include file="$(find aruco_hand_eye)/launch/aruco_hand_eye.launch">
    <arg name="markerid"   value="571"/>
    <arg name="markersize" value="0.100"/>
    <arg name="publish_tf" value="true"/>
    
  <!-- In the eye-on-base case, marker_parent_frame is the end-effector frame and camera_parent_frame is the world or base frame -->
    <arg name="marker_parent_frame" value="$(arg ee_frame)"/>
    <arg name="camera_parent_frame" value="/world"/>
    <arg name="camera" value="/camera/color/"/>
    <!-- Here I use the realsense's camera_color_optical_frame because aruco uses rgb image and depth frame is aligned to color frame here -->
    <arg name="camera_frame" value="/camera_color_optical_frame"/>
  </include>

</launch>
           

Launch the Kinova arm:

roslaunch kinova_bringup kinova_robot.launch kinova_robotType:=j2s7s300

Launch aruco_hand_eye:

roslaunch aruco_hand_eye kinova.launch

Open RViz:

rosrun rviz rviz

You can control the Kinova with MoveIt! (that is how easy_hand_eye does it), but after a long time tinkering I never got it to work, which is why I use aruco_hand_eye instead: it only needs the frame information of /j2s7s300_end_effector.

The aruco_tracker result in RViz looks like this. At some viewing angles no coordinate frame is produced, and those poses cannot be used.


If everything goes well, the terminal running

roslaunch aruco_hand_eye kinova.launch

should show an interactive command-line prompt. After each captured sample, move the robot's end-effector by some angle or offset; since I am not yet very familiar with MoveIt!, I simply use a joystick for convenience. The translation and rotation computed after capturing three images were already quite close to the values I measured by hand, but you can capture more; I usually take around 20. Samples from some poses give poor results and should be discarded.
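Once the solver has converged, the result is a translation and a rotation (a quaternion) from the world frame to the camera frame. As a minimal numpy sketch of how that result can be used — with made-up numbers standing in for the solved values — the snippet below assembles the 4x4 homogeneous transform and maps a point detected in the camera frame into the world frame:

```python
import numpy as np

def quat_to_rot(x, y, z, w):
    """Rotation matrix from a quaternion in (x, y, z, w) order (tf convention)."""
    n = np.linalg.norm([x, y, z, w])
    x, y, z, w = x / n, y / n, z / n, w / n
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Hypothetical solved extrinsics: translation (meters) and quaternion (x, y, z, w)
t_world_cam = np.array([0.52, -0.31, 0.44])
q_world_cam = (0.01, 0.70, -0.02, 0.71)

# Homogeneous world <- camera transform
T = np.eye(4)
T[:3, :3] = quat_to_rot(*q_world_cam)
T[:3, 3] = t_world_cam

# A point observed in the camera frame (e.g. from the aligned depth image)
p_cam = np.array([0.05, -0.02, 0.60, 1.0])
p_world = T @ p_cam
print("point in world frame:", p_world[:3])
```

In practice the same numbers would be fed to a `static_transform_publisher` so the rest of the system sees the camera frame in TF.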


A photo of the actual experimental setup:


Below is the upstream aruco_hand_eye README for reference.

Use Cases

This package uses the ARUCO planar target tracker from aruco_ros and the VISP hand-eye calibration from visp_hand2eye_calibration to provide a simple camera pose estimation package.

If you’re unfamiliar with Tsai’s hand-eye calibration [1], it can be used in two ways:

  • eye-in-hand – To compute the static transform from a robot's end-effector to the optical frame of a camera. In this case, the camera is mounted on the end-effector, and you place the visual target so that it is fixed relative to the base of the robot.

  • eye-on-base – To compute the static transform from a robot's base to the optical frame of a camera. In this case, the camera is mounted to the base of the robot (or kinematic chain), and you place the visual target so that it is fixed relative to the end-effector of the robot.
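Both cases solve the same underlying problem: for each pair of robot poses, the relative end-effector motion A and the relative motion B seen through the camera are related to the unknown fixed transform X by AX = XB, which is the equation the VISP solver estimates X from. A small numpy sketch of this constraint, using a made-up ground-truth X and a single synthetic motion:

```python
import numpy as np

def rot_z(theta):
    """Homogeneous 4x4 rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def translate(t):
    """Homogeneous 4x4 pure translation."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Hypothetical ground-truth X: the fixed, unknown camera transform being solved for
X = rot_z(0.3) @ translate((0.1, -0.05, 0.2))

# A: relative end-effector motion between two robot poses (from forward kinematics)
A = rot_z(0.7) @ translate((0.02, 0.0, -0.01))

# B: the corresponding relative motion observed by the camera via the marker.
# By construction B = X^-1 A X, so this pair satisfies the constraint exactly.
B = np.linalg.inv(X) @ A @ X

print("AX = XB holds:", np.allclose(A @ X, X @ B))
```

Each accepted sample adds one (A, B) pair; the solver needs several motions with distinct rotation axes before X is well constrained, which is why capturing many varied poses improves the result.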

Usage

For both use cases, you can either launch the aruco_hand_eye.launch launchfile, or you can include it in another launchfile as shown below. Either way, the launchfile will bring up the aruco_ros tracker and the visp_hand2eye_calibration solver, along with an integration script. By default, the integration script will interactively ask you to accept or discard each sample.

eye-in-hand

<launch>
  <include file="$(find aruco_hand_eye)/launch/aruco_hand_eye.launch">
    <arg name="markerid"   value="582"/>
    <arg name="markersize" value="0.141"/>
    <arg name="publish_tf" value="true"/>

    <arg name="marker_parent_frame" value="/base_link"/>
    <arg name="camera_parent_frame" value="/ee_link"/>

    <arg name="camera" value="/camera/rgb"/>
    <arg name="camera_frame" value="/camera_rgb_optical_frame"/>
  </include>
</launch>
           

eye-on-base

<launch>
  <include file="$(find aruco_hand_eye)/launch/aruco_hand_eye.launch">
    <arg name="markerid"   value="582"/>
    <arg name="markersize" value="0.141"/>
    <arg name="publish_tf" value="true"/>

    <arg name="marker_parent_frame" value="/ee_link"/>
    <arg name="camera_parent_frame" value="/base_link"/>

    <arg name="camera" value="/camera/rgb"/>
    <arg name="camera_frame" value="/camera_rgb_optical_frame"/>
  </include>
</launch>
           

Examples

For calibrating a kinect mounted to the base of a manipulator that can grasp the target:

roslaunch aruco_hand_eye kinect.launch ee_frame:=/my/robot/ee_link
           

References

[1] Tsai, Roger Y., and Reimar K. Lenz. "A new technique for fully autonomous and efficient 3D robotics hand/eye calibration." IEEE Transactions on Robotics and Automation 5.3 (1989): 345-358.
