Controlling real laser sensor with RoboDK

#1
Hi, 

I've created a program in RoboDK using the Yaskawa Motoman GP8 robot, which I have been able to transfer over to the real robot using the Ethernet cable. When the program is run, the real robot carries out the same movements as the simulation.

I have included a SICK WL4S Laser Sensor in my program (the sensor from your online library).

Is it possible to transfer the laser sensor part of the simulation to a real sensor? If so, do you have any examples showing that this can actually be done, i.e. not just the laser working in a simulation, but the setup being transferred over to a real-life sensor and robot?

If it can be transferred over to a real robot, does the sensor need to be connected to the robot to work, or can it be connected directly to the laptop/computer running the simulation?

To make a real laser sensor work from the RoboDK program, does the real sensor have to be exactly the same as the SICK WL4S, or can any laser sensor be used while still running the Python code and program created for the SICK WL4S in RoboDK?
#2
What do you mean by transferring the laser sensor part of the simulation to a real sensor? Do you mean the algorithm? Or the digital input of the sensor?

Or are you asking for information about deploying your solution for production? You should be able to deploy your solution on embedded systems running RoboDK in the background. It is best if you contact us by email so we can help you further:
https://robodk.com/contact
#3
Thanks for your reply Albert.

The RoboDK simulation works like this: when an object passes the laser sensor, the robot goes and picks up the object.

So I need to replicate this on the real robot. This means getting a sensor, which I assume would need to be connected to the RoboDK program somehow so that it acts like the sensor in the simulation. Then, when a real object passes the real sensor, the real robot will move into position and pick the real object up.

I am open to suggestions on how best to do this. The sensor in the simulation works using a standard Python script, in which the only part that identifies the type of sensor is the following:

SENSOR_NAME = 'Sensor SICK WL4S'
SENSOR_VARIABLE = 'SENSOR'

Then the main program just has a step in the coding that says:

WaitSensor()

This is the reason why I also asked if there are any examples of this working on a real robot setup and not just in a simulation.

I am carrying out some work for which I need to order a sensor so it can be used in September. This is why I could do with knowing whether it would need to be exactly the same sensor as the one in the simulation, or whether any laser sensor can be used and would still work with the Python code in the simulation.

I have already contacted RoboDK by email about a separate issue regarding the gripper not opening and closing even though it does in the simulation. In my most recent reply to those emails I asked the same question about the sensor, but I thought it might be worthwhile posting it on the forum as well, since people outside RoboDK might have had a similar experience.
#4
We don't have an example integrating a real sensor but we can help you create one.

Is it possible to connect to the sensor through a socket connection? Or let us know if you have a datasheet or a place where we can find examples on how to read the state of the sensor.
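For example, if the sensor (or an IO module it is wired to) exposes its state over TCP, reading it from Python could look like the sketch below. The IP address, port, and wire protocol here are assumptions, not the WL4S's actual interface; check the device documentation and adapt the parsing:

```python
import socket

# Assumed values -- replace with your sensor's or IO module's real address.
SENSOR_HOST = '192.168.0.50'
SENSOR_PORT = 2112

def read_sensor_state(host=SENSOR_HOST, port=SENSOR_PORT, timeout=2.0):
    """Open a TCP connection, read a few bytes, and interpret a leading
    b'1' as "object detected". The wire protocol is an assumption:
    adapt this to whatever your device actually sends."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        data = s.recv(16)
    return 1 if data.strip().startswith(b'1') else 0
```

A script in the station could then poll `read_sensor_state()` and push the result into the `SENSOR` station variable with `RDK.setParam('SENSOR', state)`, so the existing simulation logic keeps working unchanged.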
#5
Thanks Albert,
I am currently trying to purchase a sensor, so I will look into your suggestion once I have one.
Could you advise on how I can save a RoboDK program to a USB stick and keep it in the JBI format?

The reason I ask is that I have created a gripper program which won't transfer over to the real robot, even though the movement commands do. So I want to upload the gripper commands directly to the YRC1000 teach pendant via USB, save them as a job, and run them that way to prove that the RoboDK program works.

However, when I generate the program it is in the JBI format, but when I save it to the USB stick the code ends up in Notepad, so I think this is the reason it doesn't show up when I connect the USB stick. There doesn't seem to be any option to keep it in JBI format when saving it to a USB stick.

Could anyone advise on this please?
#6
You should make sure you don't change the extension of the program files when you transfer programs to the robot controller.

For example, if you generate a program for a Yaskawa/Motoman controller you'll obtain a JBI file. You can open it with Notepad to see the contents, but you should not edit the file, and you should make sure you transfer it to the robot as a JBI file.
#7

.pdf   Sesnor Datasheet.pdf (Size: 379.57 KB / Downloads: 141)
(08-24-2023, 10:56 PM)Albert Wrote: We don't have an example integrating a real sensor but we can help you create one.

Is it possible to connect to the sensor through a socket connection? Or let us know if you have a datasheet or a place where we can find examples on how to read the state of the sensor.

Hi,
I’ve identified a SICK WL4S Laser Sensor that looks like the one in the RoboDK online library.
The sensor identified is WL4S-3V1132. I’ve attached the datasheet. Based on the attached information, is this something that can be integrated with RoboDK so I can replicate my sensing simulation with the real robot and sensor?
Regarding your question about the socket connection and the datasheet: there seem to be multiple variants of this sensor. The different variants and their datasheets can be found on cdn.sick.com or on rs-online.com.
#8
You can load the 3D model in RoboDK to simulate the sensor. I believe the official SICK models include the field of view of the laser so you can split the model between the sensor and the laser (field of view). With some scripting you can convert the collision state between the sensor line and your parts into a digital input.

RoboDK supports common 3D formats such as STEP, IGES, STL, PLY, etc.
#9
(09-06-2023, 03:43 PM)Albert Wrote: You can load the 3D model in RoboDK to simulate the sensor. I believe the official SICK models include the field of view of the laser so you can split the model between the sensor and the laser (field of view). With some scripting you can convert the collision state between the sensor line and your parts into a digital input.

RoboDK supports common 3D formats such as STEP, IGES, STL, PLY, etc.

Thanks Albert,
Does the 3D model actually need to be loaded in, or can the SICK WL4S you already have in the RoboDK online library just be used?
When you are saying that “with some scripting you can convert the collision state between the sensor line and your parts into a digital input”, is this just to get an imported 3D model to work on the simulation or is this to get the sensor in the simulation to control the real sensor?
If this is to get the simulation sensor to control the real sensor / get the real sensor to work like the simulated sensor, have you any examples of the type of script that would be required?
#10
The attached RoboDK project shows an example to simulate a laser sensor.

You can find the SensorActivate script, which allows you to turn the contact between the laser and an object into a virtual digital input.

Code:
# Type help("robolink") or help("robodk") for more information
# Press F5 to run the script
# Note: you do not need to keep a copy of this file, your python script is saved with the station
from robolink import *    # API to communicate with RoboDK
from robodk import *      # basic matrix operations
RDK = Robolink()

# This script simulates a laser sensor that
# detects the presence of a part in a plane,
# such as a part crossing a laser sensor on a
# conveyor belt

# Name of the sensor object in the station and the station variable to update:
SENSOR_NAME = 'Sensor SICK WL4S'
SENSOR_VARIABLE = 'SENSOR'

# Look for parts with the keyword "Part"
PART_KEYWORD = 'Part '

# Update status every 1 ms
RECHECK_PERIOD = 0.001

# Get the sensor object from the station
SENSOR = RDK.Item(SENSOR_NAME, ITEM_TYPE_OBJECT)

# retrieve all objects
all_objects = RDK.ItemList(ITEM_TYPE_OBJECT, False)
part_objects = []
for obj in all_objects:
    if obj.Name().count(PART_KEYWORD) > 0:
        part_objects.append(obj)

# Loop forever to detect parts
detected_status = -1
while True:
    detected = 0
    for obj in part_objects:
        # check if an object has the part keyword and is detected by the laser
        if SENSOR.Collision(obj) > 0:
            detected = 1
            break

    # update the current status of the sensor
    if detected_status != detected:
        detected_status = detected
        RDK.setParam(SENSOR_VARIABLE, detected_status)
        print('Object detected status --> %i' % detected_status)

    # Wait some time...
    pause(RECHECK_PERIOD)
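For reference, the WaitSensor() step in the robot program boils down to polling that same station variable. Below is a minimal, self-contained sketch of that polling logic; it is not the implementation RoboDK generates. The read_state callable stands in for RDK.getParam(SENSOR_VARIABLE):

```python
import time

SENSOR_VARIABLE = 'SENSOR'  # same station variable the script above writes

def wait_sensor(read_state, timeout=10.0, period=0.01):
    """Poll read_state() until it returns 1 (part detected) or the
    timeout expires. In a real station, read_state would wrap
    RDK.getParam(SENSOR_VARIABLE)."""
    elapsed = 0.0
    while elapsed < timeout:
        if read_state() == 1:
            return True
        time.sleep(period)
        elapsed += period
    return False

# Demo with a fake sensor that triggers on the third poll:
calls = {'n': 0}
def fake_read():
    calls['n'] += 1
    return 1 if calls['n'] >= 3 else 0

print(wait_sensor(fake_read, timeout=1.0))  # prints True
```

If you later wire in a real sensor (e.g. over a socket), only read_state needs to change; the waiting logic stays the same.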


Attached Files
.rdk   Example-06.d-Pick and place with laser sensor.rdk (Size: 2.7 MB / Downloads: 160)
  



