[Question]: About getting cartesian coordinates #206
Comments
Could you provide a full script that reproduces this issue in the simulator?
This is my full script that I run in the simulator.
I think I might be able to help out here.
See this link for the function description.
See frame terminology here, but note that the definition given for T_sy is wrong; it should instead be...
Here's the documentation for that function. If you want to command the robot's end effector to an absolute pose relative to the base_link frame of the robot, then use the absolute pose-setting function instead.
@swiz23 For example, is it possible to get the robot's end-effector position relative to the default home position?
Command the arm to go to the home pose. Then call the get_ee_pose function to record the starting pose. Next, call the set_ee_cartesian_trajectory function to move the robot relative to its current pose. Finally, call get_ee_pose again and compare the two poses.
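A minimal sketch of that sequence with NumPy. The two matrices below are stand-ins for the values that `bot.arm.get_ee_pose()` would return before and after the move (the numbers are taken from the poses reported later in this thread, and the zero-yaw assumption is mine):

```python
import numpy as np

# Stand-ins for two bot.arm.get_ee_pose() readings: one at the home pose,
# one after a relative set_ee_cartesian_trajectory move.
T_before = np.eye(4)
T_before[:3, 3] = [0.3595, 0.0, 0.25]

T_after = np.eye(4)
T_after[:3, 3] = [0.3435, -0.2, 0.288]

# With no change in yaw, the relative move is just the difference of the
# translation columns (the last column) of the two 4x4 pose matrices.
relative = T_after[:3, 3] - T_before[:3, 3]
print(relative)  # approximately [-0.016, -0.2, 0.038]
```

If the end effector's yaw changes between the two readings, the displacement would additionally need to be rotated into the starting frame before comparing it to the commanded values.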
@swiz23 Step 1: Move the robot from the home position to another position along x and y, where x and y are displacement values. Repeat the first four steps at least four times along a diagonal line, saving the robot displacements and the camera pixel values. Step 5: Obtain the mapping between the robot's x axis and the camera's x axis by running a regression in Excel. I calibrated the left robot with the high camera in the ALOHA setup; the right robot can be calibrated the same way.
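The regression step can also be done in code instead of Excel; here is a minimal sketch using NumPy's least-squares polynomial fit (the displacement and pixel values below are made-up placeholders, not real calibration data):

```python
import numpy as np

# Robot x displacements (meters) and the camera x pixel observed at each
# saved calibration point -- placeholder values for illustration only.
robot_x = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
pixel_x = np.array([320.0, 365.0, 410.0, 455.0, 500.0])

# Fit pixel_x = a * robot_x + b (degree-1 least-squares fit).
a, b = np.polyfit(robot_x, pixel_x, 1)
print(f"pixel_x = {a:.1f} * robot_x + {b:.1f}")

def pixel_to_robot(px):
    """Invert the fit: map a detected camera pixel back to a robot
    displacement along x."""
    return (px - b) / a
```

The same fit would be repeated for the y axis, and the fitted line can be checked against a held-out calibration point before trusting it on the robot.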
Hmmm, strange. I don't think taking the inverse of that transformation matrix should result in an error. T_sh for the vx300s arm is equal to...
The inverse of this matrix (computed using the function here) is:
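For reference, the inverse of a 4x4 homogeneous transform has a closed form and never needs a general matrix inverse; a short sketch of that computation (this is the same calculation the modern_robotics package performs, though the example transform below is my own):

```python
import numpy as np

def trans_inv(T):
    """Invert a 4x4 homogeneous transform [[R, p], [0, 1]] using the
    closed form inv(T) = [[R.T, -R.T @ p], [0, 1]]."""
    R, p = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ p
    return T_inv

# Example: a transform with a 90-degree yaw and a small translation.
T = np.array([[0.0, -1.0, 0.0, 0.3],
              [1.0,  0.0, 0.0, 0.1],
              [0.0,  0.0, 1.0, 0.2],
              [0.0,  0.0, 0.0, 1.0]])

print(trans_inv(T) @ T)  # identity, up to floating point
```

Composing the inverse with the original transform recovering the identity is a quick sanity check that the inverse was computed correctly.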
Thank you for the matrix. I tried it in simulation, but it didn't give the results that I want. For now, I am moving the robot in real time according to my plan.
Question
Hello,
I want to do a pick-and-place process with the vx300s arm. I have the ALOHA setup. I will move the puppet robot with the master arm. Then I want to read the Cartesian coordinates of the puppet robot to do a calibration process with the camera.
Before working on the real robots, I am studying in simulation. In simulation, I move the robot with bot.arm.set_ee_cartesian_trajectory(x=-0.016, y=-0.2, z=0.038), and the robot moves to a position. I then try to get the Cartesian position from the robot with the pos = bot.arm.get_ee_pose() function. It gives me a 4x4 matrix. When I extract the Cartesian position from the matrix with the pos = pos[0:3,3] command, it gives a different position.
The commanded position is x=-0.016, y=-0.2, z=0.038.
The result is x=0.3435, y=-0.2, z=0.288.
I couldn't find a solution for this. What should I do to get the robot's position as a Cartesian position? I also tried pos = bot.arm.get_ee_pose_command(), but it didn't give the same results either.
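Note that the 4x4 matrix from get_ee_pose is the absolute pose of the end effector in the base_link frame, while set_ee_cartesian_trajectory takes a relative displacement, so the two sets of numbers are not expected to match. A small sketch of the extraction, using the result reported above as the sample matrix (the identity rotation is a placeholder):

```python
import numpy as np

# A sample 4x4 pose like the one returned by bot.arm.get_ee_pose();
# translation numbers copied from the result above, rotation assumed
# to be identity for illustration.
T = np.array([[1.0, 0.0, 0.0,  0.3435],
              [0.0, 1.0, 0.0, -0.2],
              [0.0, 0.0, 1.0,  0.288],
              [0.0, 0.0, 0.0,  1.0]])

pos = T[0:3, 3]    # absolute Cartesian position in the base_link frame
R = T[0:3, 0:3]    # orientation of the end effector as a rotation matrix

print(pos)  # approximately [0.3435, -0.2, 0.288]
```

To compare against the relative command, the pose would have to be read once before and once after the move and the two positions differenced.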
Robot Model
vx300s
Operating System
Ubuntu 20.04
ROS Version
ROS 1 Noetic
Additional Info
No response