I am using RoboDK for a relatively large-scale simulation. To reduce the load on my PC, I currently use the following configuration:

Code:
from robodk.robolink import *  # RoboDK API

RDK = Robolink()  # connect to the running RoboDK instance

RDK.Render(False)                      # disable rendering to reduce load
RDK.setRunMode(RUNMODE_SIMULATE)
RDK.setRunMode(RUNMODE_QUICKVALIDATE)  # Movements
RDK.setSimulationSpeed(100)

However, the program sometimes seems to run into issues. For example, the optimal cost graph appears completely flat, and I cannot understand why, especially when rendering is enabled (Render(True)). Could there be some kind of conflict between these commands?

Additionally, I am trying to measure simulation time, but I want the real execution time, not the scaled simulation time. For example, if the real execution time is 4 seconds and I set the simulation speed to 100x, I don't want to get 0.04 seconds; I still want 4 seconds.

Is there a function in RoboDK that allows me to retrieve the real execution time?
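For reference, this is the kind of manual workaround I could use: time the move with Python's wall clock and multiply by the simulation speed (this assumes SimulationSpeed() returns the current speed factor; the robot and target names are placeholders):

Code:
import time
from robodk.robolink import *

RDK = Robolink()
robot = RDK.Item('', ITEM_TYPE_ROBOT)  # first robot in the station (placeholder)
target = RDK.Item('Target 1')          # placeholder target name

RDK.setRunMode(RUNMODE_SIMULATE)  # wall-clock timing only makes sense in simulate mode
RDK.setSimulationSpeed(100)

t0 = time.perf_counter()
robot.MoveJ(target)                 # blocks until the simulated move finishes
elapsed = time.perf_counter() - t0  # wall-clock time on the PC (~0.04 s at 100x)

real_time = elapsed * RDK.SimulationSpeed()  # scale back up: ~4 s
print("Estimated real execution time: %.2f seconds" % real_time)

But I would prefer a built-in function over this kind of manual scaling, if one exists.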
Thank you!