08-20-2023, 03:57 PM
(This post was last modified: 09-06-2023, 10:22 PM by Albert.)
I've created a program in RoboDK using the Yaskawa Motoman GP8 robot, which I have been able to transfer over to the real robot using the Ethernet cable. When the program is run, the real robot carries out the same movements as the simulation.
I have included a SICK WL4S Laser Sensor in my program (The sensor found in your online library).
Is it possible to transfer the laser sensor part of the simulation to a real sensor? If so, do you have any examples showing that this can actually be done, i.e. not just the laser working in a simulation, but the setup being transferred over to a real-life sensor and robot?
If it can be transferred over to a real robot, does the sensor need to be connected to the robot to work, or can it be connected directly to the laptop/computer running the simulation?
To make a real laser sensor work with the RoboDK program, does the real sensor have to be the exact same model as the SICK WL4S, or can any laser sensor be used, with the Python code and program created for the SICK WL4S in RoboDK transferred over to it?
Thanks for your reply Albert.
In the RoboDK simulation, when an object passes the laser sensor, the robot goes and picks up the object.
So I need to replicate this on the real robot. That means getting a sensor and, I assume, connecting it to the RoboDK program somehow so that it behaves like the sensor in the simulation: when a real object passes the real sensor, the real robot moves into position and picks the real object up.
I am open to suggestions on how best to do this. The sensor in the simulation works using a standard Python script, in which the only part that identifies the type of sensor is the following:
SENSOR_NAME = 'Sensor SICK WL4S'
SENSOR_VARIABLE = 'SENSOR'
Then the main program just has a corresponding step in the code that uses this sensor.
This is the reason why I also asked if there are any examples of this working on a real robot setup and not just in a simulation.
I am carrying out some work for which I need to order a sensor so I can use it in September. This is why I need to know whether it would have to be the exact same sensor as the one in the simulation, or whether any laser sensor could be used and would still work with the Python code from the simulation.
I have already contacted RoboDK by email about a separate issue: the gripper not opening and closing even though it does in the simulation. In my most recent reply to those emails I asked the same question about the sensor. But I thought it might be worthwhile posting it on the forum as well, since people outside RoboDK might have had a similar experience.
We don't have an example integrating a real sensor, but we can help you create one.
Is it possible to connect to the sensor through a socket connection? Or let us know if you have a datasheet or a place where we can find examples on how to read the state of the sensor.
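As an illustration of the socket approach, here is a minimal sketch of reading a sensor state over TCP with Python's standard library. The one-byte "0"/"1" reply is an assumption made purely for this example (a real sensor or IO gateway defines its own protocol, so the datasheet is the reference), and a fake server thread stands in for the device so the sketch runs on its own:

```python
import socket
import threading

def read_sensor_state(host, port, timeout=2.0):
    """Open a socket, read one byte, and return True if the beam is blocked.
    The one-byte "0"/"1" reply is an assumed protocol for this sketch only."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        data = s.recv(1)
    return data == b"1"

# --- Fake sensor server standing in for the real device (demo only) ---
def fake_sensor(server_sock, blocked):
    conn, _ = server_sock.accept()
    conn.sendall(b"1" if blocked else b"0")
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))       # pick a free local port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=fake_sensor, args=(server, True), daemon=True).start()

state = read_sensor_state("127.0.0.1", port)
print(state)  # True
```

In a real setup, the value returned here could be forwarded to the same station variable the simulation script already uses (the 'SENSOR' variable mentioned above), so the main program would not need to change.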
I am currently trying to purchase a sensor, so I will look into your suggestion once I have one.
Could you advise on how I can save a RoboDK program to a USB drive and keep it in the .jbi format? The reason I ask is that I have created a gripper program which won't transfer over to the real robot, even though the movement commands do. So I want to upload the gripper commands directly to the YRC1000 teach pendant via USB, save them as a job, and run them that way to prove that the program created in RoboDK works.
However, when I generate the program it is in the .jbi format, but when I save it to the USB drive the code ends up in Notepad, which I think is why it doesn't show up when I connect the USB. There doesn't seem to be any option to keep the .jbi format when saving to a USB drive.
Could anyone advise on this please?
You should make sure you don't change the extension of the program files when you transfer programs to the robot controller.
For example, if you generate a program for a Yaskawa/Motoman controller you'll obtain a JBI file. You can open it with Notepad to see the contents, but you should not edit the file, and you should make sure you transfer it to the robot as a JBI file.
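As a quick illustration (the file name and contents below are placeholders, not a real job), copying the generated file programmatically keeps the .JBI name intact, whereas saving from Notepad can silently append a .txt extension:

```python
import os
import shutil
import tempfile

# A generated .jbi job is a plain-text file. Opening it in Notepad is fine;
# what breaks the transfer is saving it under a different name or extension.
src_dir = tempfile.mkdtemp()
usb_dir = tempfile.mkdtemp()        # stand-in for the USB drive

job_path = os.path.join(src_dir, "GRIPPER.JBI")
with open(job_path, "w") as f:
    f.write("NOP\nEND\n")           # placeholder contents, not a real job

shutil.copy(job_path, usb_dir)      # copy without renaming
print(sorted(os.listdir(usb_dir)))  # ['GRIPPER.JBI']
```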
You can load the 3D model in RoboDK to simulate the sensor. I believe the official SICK models include the field of view of the laser, so you can split the model between the sensor body and the laser beam (field of view). With some scripting you can convert the collision state between the sensor beam and your parts into a digital input.
RoboDK supports common 3D formats such as STEP, IGES, STL, PLY, etc.
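The collision-state scripting mentioned above could be sketched as a small polling loop. In this sketch the two callables are placeholders: in RoboDK they would be backed by the Python API (checking collisions between the beam item and the parts, and writing a station variable such as 'SENSOR'); here fakes are injected so the loop runs standalone:

```python
import time

def sensor_loop(beam_collides, set_input, cycles, delay=0.0):
    """Poll the beam/part collision state and mirror it into a digital input.
    beam_collides() -> bool and set_input(state) are placeholders for the
    RoboDK API calls; only changes are written, to avoid redundant updates."""
    last = None
    for _ in range(cycles):
        state = 1 if beam_collides() else 0
        if state != last:           # only write on a change of state
            set_input(state)
            last = state
        time.sleep(delay)

# Demo with fakes: the "part" crosses the beam on the 3rd poll.
readings = iter([False, False, True, True, False])
written = []
sensor_loop(lambda: next(readings), written.append, cycles=5)
print(written)  # [0, 1, 0]
```

A real script would run this loop continuously while the station is active, with a short delay between polls to keep CPU usage low.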