
Inverse Kinematics

We have a large number of points (Cartesian coordinates), around 100,000, and we want to store the robot joints that reach these positions.

We start from home (known joint positions), and right now (with the C# API) we move the robot to each Cartesian position and read the robot's joints in a while loop. The process is slow because the robot has to move.

Our idea is to solve the inverse kinematics without moving the virtual robot, something like:

last_joints = home_joints;
new_joints[i] = solve_ik(last_joints, cartesian_pose[i]);


You should be able to calculate the inverse kinematics of a point without moving the robot using SolveIK. The code you provided looks invalid: you should provide poses as the input to SolveIK, not joints.

More information here:

If you just need to do the conversion, you should be able to easily convert Cartesian coordinates to joint values with a small Python script:
# assumes `robot` is a robot item retrieved from the RoboDK station
# read list_poses as a list of lists (array of arrays) of KUKA-style XYZABC values
for xyzwpr in list_poses:
    pose = KUKA_2_Pose(xyzwpr)  # convert the XYZABC values to a 4x4 pose
    joints = robot.SolveIK(pose)  # IK solution closest to the current joints
This example can give you a better idea:
My doubt is: which configuration does SolveIK assume the robot is in? I mean, does it return an axis configuration depending on the previous pose of the robot, or always starting from home?
You can provide the joint values of the previous point to help the algorithm.
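To illustrate why seeding matters, here is a self-contained sketch with a toy 2-link planar arm (this is not RoboDK's solver; `solve_ik`, the link lengths and the points are all made up for the example). The analytic IK has two branches (elbow-up and elbow-down); picking the branch closest to the previous joints keeps the configuration consistent along the list of points, which is what passing the previous joints to SolveIK achieves:

```python
import math

L1, L2 = 1.0, 1.0  # toy link lengths

def solve_ik(seed, target):
    """Return the (q1, q2) IK solution for point `target` closest to `seed`."""
    x, y = target
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))  # clamp for numerical safety
    solutions = []
    for q2 in (math.acos(c2), -math.acos(c2)):  # elbow-down / elbow-up branches
        q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2),
                                           L1 + L2 * math.cos(q2))
        solutions.append((q1, q2))
    # keep the branch nearest the seed joints (the previous configuration)
    return min(solutions,
               key=lambda q: (q[0] - seed[0]) ** 2 + (q[1] - seed[1]) ** 2)

home = (0.1, 0.5)  # known starting joints
targets = [(1.2, 0.8), (1.1, 0.9), (1.0, 1.0)]  # Cartesian points to convert
joints = []
last = home
for p in targets:
    last = solve_ik(last, p)  # seed with the previous solution, as in the thread
    joints.append(last)
```

No robot is moved here: each solve only uses the previous joints as a starting guess, which is the behavior described above.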

Find useful information about RoboDK and its features by visiting our Online Documentation and by watching tutorials on our YouTube Channel.

