
[Question]: About getting cartesian coordinates #206

Open
barsm42 opened this issue Jul 22, 2024 · 8 comments
Labels: manipulators, question

Comments

@barsm42

barsm42 commented Jul 22, 2024

Question

Hello,
I want to do a pick-and-place process with the vx300s arm. I have an ALOHA setup. I will move the puppet robot with the master arm. Then, I want to read the Cartesian coordinates of the puppet robot to do a calibration process with the camera.

Before working on the real robots, I am working in simulation. In simulation, I move the robot with bot.arm.set_ee_cartesian_trajectory(x=-0.016, y=-0.2, z=0.038), and the robot moves to the position. I then try to get the Cartesian position from the robot with the pos = bot.arm.get_ee_pose() function, which gives me a 4x4 matrix. When I extract the Cartesian position from the matrix with pos = pos[0:3,3], it gives a different position.

The commanded position is x=-0.016, y=-0.2, z=0.038.
The result is x=0.3435, y=-0.2, z=0.288.

I couldn't find a solution for this. What should I do to get the robot's position as a Cartesian position? I also tried pos = bot.arm.get_ee_pose_command(), but it didn't give the same results either.

Robot Model

vx300s

Operating System

Ubuntu 20.04

ROS Version

ROS 1 Noetic

Additional Info

No response

@barsm42 added the manipulators and question labels on Jul 22, 2024
@lukeschmitt-tr
Member

Could you provide a full script that reproduces this issue in the simulator?

@barsm42
Author

barsm42 commented Jul 23, 2024

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from interbotix_xs_modules.arm import InterbotixManipulatorXS
import numpy as np
import sys
import random
from time import sleep


def main():
    bot = InterbotixManipulatorXS("vx300s", "arm", "gripper")
    if bot.arm.group_info.num_joints < 5:
        print('This demo requires the robot to have at least 5 joints!')
        sys.exit()
    while True:
        # Return to the home pose and print the commanded joint positions.
        bot.arm.go_to_home_pose()
        pos = bot.arm.get_joint_commands()
        print("==========================")
        print("Home Coordinate:", pos)
        print("==========================")
        bot.arm.set_single_joint_position("wrist_angle", np.pi/2.0)

        # Sampling limits for the random relative displacement, in mm.
        limit1_forward = +50
        limit2_backward = -100
        limit1_right = 200
        limit2_left = -200
        limit1_up = -150
        limit2_down = +50

        # Random relative displacement, converted to meters.
        x1 = random.randrange(limit2_backward, limit1_forward) / 1000
        y1 = random.randrange(limit2_left, limit1_right) / 1000
        z1 = random.randrange(limit1_up, limit2_down) / 1000
        print("Target Coordinate:", x1, y1, z1)
        print("==========================")

        # Move the end effector by the sampled displacement.
        bot.arm.set_ee_cartesian_trajectory(x=x1, y=y1, z=z1, moving_time=5)
        sleep(0.1)
        # Read the end-effector pose and extract its translation column.
        pos = bot.arm.get_ee_pose()
        pos = pos[0:3, 3]
        print("Current Coordinate:", pos)
        print("==========================")
        bot.gripper.open()
        bot.arm.set_ee_cartesian_trajectory(z=0.02, moving_time=5)
        bot.gripper.close()


if __name__ == '__main__':
    main()

This is my full script that I run in the simulator.

@swiz23
Contributor

swiz23 commented Jul 26, 2024

I think I might be able to help out here.

bot.arm.set_ee_cartesian_trajectory is a function that allows you to specify relative motion of the end effector w.r.t. a frame known as T_sy ('T_sy' is the exact same as the base_link frame of the robot, but it contains the yaw of the end-effector frame relative to its base_link frame). In your scenario, no matter what the starting pose of the robot is, it will try to move along the x-axis of T_sy by -0.016m, the y-axis of T_sy by -0.2m, and the z-axis of T_sy by 0.038m.

See this link for the function description.

bot.arm.get_ee_pose, on the other hand, gives you the absolute pose of the end effector relative to the base_link frame (a.k.a. T_sb, the transform from the 'space' frame [base_link] to the 'body' frame [ee_gripper_link]).

See frame terminology here, but note that the definition for T_sy is wrong. Instead of

The transform from the Body Frame to a virtual frame with the exact same x, y, z, roll, and pitch as the Space Frame. However, it contains the ‘yaw’ of the Body Frame.

it should be...

The transform from the Space Frame to a virtual frame with the exact same x, y, z, roll, and pitch as the Space Frame. However, it contains the ‘yaw’ of the Body Frame.

Here's the documentation for get_ee_pose

If you want to command the robot's end effector to an absolute pose relative to the base_link frame of the robot, then use the set_ee_pose_components function instead.
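To make the distinction concrete, here is a minimal sketch, assuming the same vx300s simulation as the script above; the numeric targets are only illustrative, and the absolute target reuses the position reported earlier in this thread.

from interbotix_xs_modules.arm import InterbotixManipulatorXS

bot = InterbotixManipulatorXS("vx300s", "arm", "gripper")
bot.arm.go_to_home_pose()

# Relative motion: displace the end effector along T_sy by these offsets.
bot.arm.set_ee_cartesian_trajectory(x=-0.016, y=-0.2, z=0.038)

# Absolute pose: T_sb, the end-effector pose expressed in the base_link frame.
# Its translation column is the absolute position, not the displacement above.
T_sb = bot.arm.get_ee_pose()
print("absolute position in base_link:", T_sb[0:3, 3])

# Absolute command: drive the end effector to a pose given in base_link.
# Illustrative values; orientation keywords (roll, pitch) can be added if
# the default orientation has no IK solution at this position.
bot.arm.set_ee_pose_components(x=0.3435, y=-0.2, z=0.288)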

@barsm42
Author

barsm42 commented Jul 29, 2024

@swiz23
Thank you for your detailed reply.
I used the set_ee_pose_components function, but it didn't move the way I want, like the set_ee_cartesian_trajectory function does.
Is there any function that calculates the end effector's relative pose with respect to the base_link frame of the robot?

For example, is it possible to get the robot's end-effector position relative to the default home position? The home position is [0,0,0,0,0,0] in joint mode. For example, if I move the robot with x=0.05, y=0.02 in Cartesian mode, is there any function to get how many mm the robot has moved along each axis relative to the home position?

@swiz23
Contributor

swiz23 commented Jul 29, 2024

set_ee_pose_components will not move the robot in cartesian space. There is currently no function that will give you the end effector's current pose relative to the end effector's pose in the home position. But you could get it yourself by doing the following:

Command the arm to go to the home pose. Then call the get_ee_pose_command function. This will return the end effector's pose relative to the base_link frame when all joints are 0, as a 4x4 transformation matrix. We can call this transformation matrix T_sh (the transformation from the 'space' frame, a.k.a. the base_link of the robot, to the end effector frame when in the 'home' pose).

Then you can call the set_ee_cartesian_trajectory function to move the robot relative to its current pose. Now call the get_ee_pose function to get the current end_effector pose relative to the base_link frame as a 4x4 transformation matrix (let's call this T_sc - the transformation matrix from the 'space' frame to the 'current' end effector position). Now you can calculate the displacement of the current end-effector's pose relative to the 'home' end-effector's pose by taking the inverse of T_sh and multiplying by T_sc => (T_sh)^-1 * T_sc = T_hs * T_sc = T_hc (a.k.a the transformation matrix from the end effector's 'home' pose to its 'current' pose).
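A minimal sketch of that procedure, assuming the same vx300s setup as the earlier script (the displacement values are just the ones mentioned above):

import numpy as np
from interbotix_xs_modules.arm import InterbotixManipulatorXS

bot = InterbotixManipulatorXS("vx300s", "arm", "gripper")

# T_sh: base_link -> end-effector pose at the home position (all joints 0).
bot.arm.go_to_home_pose()
T_sh = bot.arm.get_ee_pose_command()

# Move relative to the current pose, then read the new absolute pose T_sc.
bot.arm.set_ee_cartesian_trajectory(x=0.05, y=0.02)
T_sc = bot.arm.get_ee_pose()

# T_hc = (T_sh)^-1 * T_sc: the current end-effector pose expressed in the
# frame of the home end-effector pose; its last column is the displacement.
T_hc = np.linalg.inv(T_sh) @ T_sc
print("displacement from home (m):", T_hc[0:3, 3])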

@barsm42
Author

barsm42 commented Jul 31, 2024

@swiz23
Thank you for your detailed reply. However, because all joints are 0, the 4x4 transformation matrix gives an error (singular matrix) during the inverse transform.
I solved my problem for the real-time process by doing a manual calibration.

Step 1. Move the robot from the home position to another position along x and y, where x and y are displacement values.
Step 2. Manually move the object to x and y under the gripper, which is in the camera view.
Step 3. Move the robot to the home position.
Step 4. Get the object's x and y pixel coordinates from the camera.

Repeat the first four steps at least four times along a diagonal line, and save the robot displacements and camera pixel values.

Step 5. Obtain the mapping between the robot x axis and the camera x axis by doing a regression in Excel.
Step 6. Obtain the mapping between the robot y axis and the camera y axis by doing a regression in Excel.

I calibrated the left robot with the high camera in the ALOHA setup. The right robot can also be calibrated.
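As a sketch, the regression in steps 5 and 6 could also be done with numpy instead of Excel; the sample values below are placeholders, not measured data.

import numpy as np

# Robot x displacements (m) and the matching object pixel x coordinates,
# collected by repeating steps 1-4 along a diagonal (placeholder values).
robot_x = np.array([0.00, 0.02, 0.04, 0.06])
pixel_x = np.array([320.0, 352.0, 386.0, 419.0])

# First-order (linear) fit mapping a pixel coordinate to a robot displacement.
slope, intercept = np.polyfit(pixel_x, robot_x, 1)

def pixel_to_robot_x(px):
    # Convert an object's pixel x coordinate into a robot x displacement (m).
    return slope * px + intercept

print(pixel_to_robot_x(400.0))

# The same fit is repeated for the y axis with robot_y / pixel_y samples.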

@swiz23
Contributor

swiz23 commented Jul 31, 2024

Hmmm, strange. I don't think taking the inverse of that transformation matrix should result in an error. T_sh for the vx300s arm is equal to...

[[1.0, 0.0, 0.0, 0.536494],
  [0.0, 1.0, 0.0, 0.0],
  [0.0, 0.0, 1.0, 0.42705],
  [0.0, 0.0, 0.0, 1.0]]

The inverse of this matrix (done by using the function here) is:

[[1.0, 0.0, 0.0, -0.536494],
  [0.0, 1.0, 0.0, 0.0],
  [0.0, 0.0, 1.0, -0.42705],
  [0.0, 0.0, 0.0, 1.0]]
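As a quick check (a sketch using numpy rather than the linked helper), this transform has determinant 1 and inverts without a singular-matrix error:

import numpy as np

T_sh = np.array([[1.0, 0.0, 0.0, 0.536494],
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0, 0.42705],
                 [0.0, 0.0, 0.0, 1.0]])

print(np.linalg.det(T_sh))   # 1.0, so the matrix is invertible
print(np.linalg.inv(T_sh))   # matches the inverse shown above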

@barsm42
Author

barsm42 commented Jul 31, 2024

Thank you for the matrix. I tried it in simulation, but it didn't give the results that I want.

I am moving the robot in real time according to my plan for now.
Thank you for your help. It helped me understand the robot's movement.
