VISOR Robotic URCap Reference Manual
SensoPart Industriesensorik GmbH and its subsidiaries are not liable for printing errors and mistakes that occurred in
drafting this document. This document is subject to availability and technical alterations without notice.
Under no circumstances can SensoPart Industriesensorik GmbH or its subsidiaries be held responsible for any errors,
omissions, or problems in this information, or for any incident (losses, injuries or damages) that might result from the use
or misuse of this document or the software and hardware described therein.
Use of the VISOR® hardware, software, and its components is entirely at your own risk. Should anything prove faulty, you
assume the entire cost of all service, repair or correction.
The provided programs and examples allow using the VISOR® Robotic in combination with UR CB-series or e-series
robots. They assume a careful user who considers all necessary safety measures. Always conduct a risk assessment
before putting the robot into operation. All example programs are provided for informational purposes only. They must
be custom modified, tested and debugged to work with each unique application.
No part of this document may be reproduced, published, or stored in databases or information retrieval systems in any
form – even in part – nor may illustrations, drawings, or the layout be copied without prior written permission from
SensoPart Industriesensorik GmbH.
Document History
Version Remarks
1.0 Initial version
2 Update for VISOR® Robotic URCap Version 1.0.8
3 Update for VISOR® Robotic URCap Version 1.2.0
4 Update for VISOR® Robotic URCap Version 1.3.1
5 Update for VISOR® Robotic URCap Version 1.4.0
VISOR® Robotic URCap Step-by-Step
Content
1 Introduction
1.1 Guide to symbols
1.2 Safety instructions
1.3 Intended use
1.4 Prerequisites and supported Versions
2 VISOR® Robotic URCap: Installation Node
2.1 Installation Node: Network
2.1.1 Connection Status
2.1.2 Auto Connect
2.2 Installation Node: Jobs
2.2.1 Backing up and Restoring Jobsets from/to the VISOR®
2.2.2 Uploading the VISOR® QuickStart Template Jobsets
2.3 Installation Node: Calibration
2.4 Installation Node: Live Image
3 VISOR® Robotic URCap: Program Node “VISOR® Calibrate”
3.1 Pre-Requisites: Necessary Settings of the Robot
3.2 Calibration with the Calibration Plate
3.3 Calibration with the Point Pair List
3.4 Calibration: Hints and Good Practices
3.4.1 Requirements to the Optical Set-Up and the Usage of the Calibration Plate
4 VISOR® Robotic URCap: Program Node “VISOR® Pick”
4.1 VISOR® Pick: Program Node and Options
4.2 VISOR® Pick Method Node / Behavior if no part has been found
5 VISOR® Robotic URCap: Assignments, Program Node “VISOR® Terminal”
5.1 Variables, Assignments and Logs
5.2 VISOR® Terminal
6 SensoConfig: Choosing the Right Detector for Robotics Applications
6.1 Detecting Objects Using Contour Detector
6.2 Detecting Objects Using Contour Alignment
6.2.1 Grip Point Correction – Result Offset
6.2.2 Gripping Region
6.3 Detecting Objects Using BLOB-Detector
7 Troubleshooting
VISOR® Robotic URCap Version 1.4.0: User Manual
1 Introduction
Where is the part exactly? What is the best place to grip it? Is there sufficient space to take hold of it, or is another part in
the way? The robot needs all this information to pick up a part from a conveyor belt, a tray or a vibrating conveyor. The
SensoPart VISOR® vision sensor is a reliable, robust and easy-to-integrate solution for generating this information. Such a
machine vision solution, which identifies objects in an image and sends the information to the robot, offers several
advantages for solving robotic applications. Typical applications include locating objects for pick-and-place applications or
assembly processes of all kinds.
VISOR® is a vision sensor, i.e. it combines camera, illumination, optics and a computing unit for processing the data and
communicating the results to the robot or control, all in a small, robust housing. A PC or laptop is only necessary for
configuring it. In production, it is a stand-alone device sending data (e.g. object positions in robot coordinates) directly to
the robot. It can communicate via discrete I/O, Ethernet TCP/IP, Profinet, EtherNet/IP, or RS422/RS232.
The VISOR® URCap is a software extension for Universal Robots robots with the PolyScope controller. It makes it easy
to connect to a VISOR® vision sensor.
Attention
This symbol indicates text passages that must be observed. Non-observance may result in personal injury
or damage to property.
Please note
This symbol highlights useful tips and recommendations as well as information for efficient operation.
DANGER
Insufficient safety measures can cause death, injuries and property damage!
The correct installation, connection, and configuration of the robot and the vision sensor as well as
securing the working area of the robot are the responsibility of the user or operator. Please follow the
safety instructions in the robot’s manual.
WARNING
Failure of the sensor can occur and, if the working area of the robot is not properly secured, it can
endanger persons or machines.
Dedicated safety devices must be considered at all times when the vision system is connected to a robot
or machine. Please also follow the safety instructions in the VISOR® user manual.
Attention
The user must have read and understood all of the instructions in this manual before handling the
VISOR® Robotic URCap.
Any use of the vision sensor in noncompliance with these warnings is inappropriate and may cause injury
or damage to property.
This manual is intended to be used during the programming and maintenance of a robot vision application involving
SensoPart VISOR® vision sensors. It is aimed at robot programmers, robot application developers and maintenance
technicians. It offers instructions for implementing the SensoPart VISOR® vision sensor with a Universal Robots
UR3/UR5/UR10 robot and describes an example project for realizing a simple pick-and-place application with a robot
guided by a vision sensor. It describes how to set up the SensoPart VISOR® Robotic URCap together with a SensoPart
“VISOR® Robotic Advanced” or “VISOR® Allround Professional” vision sensor.
Please note
For the VISOR® Robotic URCap Version 1.4.0, the robot needs to run PolyScope firmware 3.12 or higher (CB-series)
or 5.6 or higher (e-series), with a UR3, UR5, UR10 or UR16 robot arm.
Please note
Robot programs / PolyScope scripts that have been written with an earlier version of the VISOR®
Robotic URCap can no longer be used. Both the robot program and the robot installation file need to
be rewritten.
In many cases, program lines that are not related to URCap commands (e.g. move commands or
waypoints) can be copied from programs made with the older URCap into a new program, but there is
no guarantee that programs written with the older version will work.
Keep this in mind when installing the VISOR® Robotic URCap Version 1.4.0 on systems on which previous versions of this
URCap were installed.
Figure 1 shows the main screen of the VISOR® Robotic URCap installation node. It can be found in the installation screen
of the UR PolyScope software. The installation node allows setting the main URCap parameters needed by the robot
programs and provides convenience functions such as the live image and jobset backup.
The URCap can be enabled and disabled by checking or unchecking the checkbox “Enable URCap”. For this setting to be
taken into account after a restart of the robot, the user must save the robot installation (Figure 2): in the “Installation” tab,
click on the “Load/Save” submenu and save the current installation (“Save” or “Save As…” button).
The Device Connected LED shows the connection status. If a VISOR® with a valid, URCap-compatible jobset is connected
to the robot, the status is green. Only in this case can the robot program be run.
If the device has been selected from the list but has not yet been connected to, its network settings can be changed:
- Sensor name
- IP address
- Subnet mask
- Settings defined by DHCP server (if available in the network)
Please note
After selecting the “Auto connect” function, the robot installation has to be saved (cf. chapter 2, Figure
2). This applies to any change in the installation node of the VISOR® Robotic URCap.
The installation node “Jobs” of the VISOR® Robotic URCap (Figure 6) provides information on the job that is currently
active on the VISOR® and allows changing the currently active job. Furthermore, backups can be performed and
the VISOR® Robotic job templates can be uploaded to the VISOR®.
The VISOR® Robotic URCap updates the list of available jobs each time the VISOR® is connected. If you have added new
jobs in SensoConfig while the URCap was connected to the VISOR®, please press “Retrieve job list” before continuing
to program the URCap. This way, the new jobs can also be selected in the program nodes.
For uploading a jobset to the VISOR®, press the button “Upload jobset” and select the jobset from the list.
The jobsets are stored on the VISOR® depending on the firmware version and hardware model (V20 Allround
Professional, V20 Robotic, etc.). If you cannot see a jobset you previously stored, please check the following points:
- Are you using the same VISOR® model (V10 / V20; monochrome / color; Robotic / Allround)?
- Are you using the same firmware version of the VISOR® as when you saved the jobset on the robot?
Please note that these backup jobsets are solely for making quick and simple backups on the robot controller.
We still recommend using SensoConfig for storing jobsets after creation, since SensoConfig jobset files can easily be
converted to other firmware versions or product models.
URCap_Job-Template_Contour_Cal_Plate_200mm:
This jobset template contains a job with the settings described in the chapter “QuickStart”.
An automatic (re-)calibration can be performed in the robot program, e.g. using the point pair list calibration
method. The necessary robot program could, for example, be stored in a subroutine of the main robot program
and be started by an event sent from the machine control to the robot.
When the camera is mounted on the robot arm, a different calibration is necessary for each image acquisition
pose and working area. By saving the calibration in the VISOR®'s job, such applications can be solved easily.
In the “Calibration” tab of the installation node (Figure 8), the calibration settings can be copied from the active job to one
specific job or to all other jobs on the VISOR®. This way, the actual calibration only has to be performed once.
To display live images, the VISOR® has to be in run mode, i.e. it cannot be operated from SensoConfig at the same time.
The image transmission rate is limited by the bandwidth of the connection and other factors, so small delays cannot be
completely avoided.
Point pair list:
The robot picks a part, places it at several positions inside the field of view and sends the world coordinates to the
VISOR®, which computes the correlation between camera and world coordinates.
Applications:
calibration of large fields of view, automatic (re-)calibration of the system.
For details on the settings, please refer to chapter 8.1.5 of the VISOR® user manual. To get started quickly, we
recommend the simpler calibration plate method.
SensoPart offers four different calibration plates with widths of 50 mm, 100 mm, 200 mm and 500 mm. They can be
ordered from SensoPart as aluminum-composite plates.
Please note
A PDF file containing the calibration pattern is included with the VISOR® installation under
“[VISOR® Installation Directory]\Documentation\Calibrationplates”.
When printing, set the printer to “actual size”.
Figure 11 displays the process for the calibration with the calibration plate. The user places the calibration plate into the
field of view of the camera and moves the robot to the four fiducial coordinates. Then the VISOR® takes an image of the
calibration pattern to actually perform the calibration. The VISOR® Robotic URCap guides the user through the process
step by step.
The step-by-step description for setting up the calibration process with the programming node “VISOR® Calibrate” and
the “Calibration Plate” can be found in chapter 3 of the quickstart manual.
The VISOR® program node “Calibrate with the Calibration Plate” follows the PolyScope programming principles and
guides you through the process step by step (Figure 12).
The upper part of the routine defines the parameters and boundary conditions for the process. These nodes require your
input, action or decision [A].
The lower part of the routine [C] contains the “Calibration Plate Process”. It runs automatically once all necessary
inputs in part [A] have been made.
Here you can integrate your own commands at any position, e.g. additional move commands or gripper controls.
The process can be started once all nodes are marked green.
To run the process, the 9 calibration poses have to be defined, as well as a start/exit pose, the image acquisition
pose, an intermediate pose that allows safely reaching the calibration poses, and the object pose from which the
reference object is picked. Figure 13 graphically displays these user-defined poses and the robot movements of the point
pair list calibration process. The robot program starts at the start/exit pose. From there, it moves to a pose where a
reference object can be picked. This object is then placed at the nine calibration poses defined by the user. At each of
these poses, the VISOR® takes an image and estimates the position of the object in image coordinates. In this way, a list,
the point pair list, is built up in the VISOR® software. For each pose, the list contains two coordinate pairs: (x, y) in world
coordinates and (x, y) in pixel coordinates. By correlating these values, the VISOR® is calibrated.
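The correlation step can be illustrated with a small sketch: given pairs of pixel and world coordinates, a 2D affine transform mapping pixel to world coordinates can be determined, and three non-collinear pairs suffice for an exact solution. This only illustrates the underlying math under simplifying assumptions; the VISOR® itself uses all nine point pairs and its own calibration model.

```python
def _solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def affine_from_pairs(pixel_pts, world_pts):
    """Fit x_w = a*x_p + b*y_p + c and y_w = d*x_p + e*y_p + f from 3 pairs."""
    A = [[px, py, 1.0] for px, py in pixel_pts]
    a, b, c = _solve3(A, [wx for wx, _ in world_pts])
    d, e, f = _solve3(A, [wy for _, wy in world_pts])
    return (a, b, c, d, e, f)

def pixel_to_world(coeffs, px, py):
    """Map a pixel coordinate to world coordinates with the fitted transform."""
    a, b, c, d, e, f = coeffs
    return (a * px + b * py + c, d * px + e * py + f)
```

With nine point pairs the same model would be fitted in a least-squares sense, which averages out small placement and detection errors.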
Figure 14 explains the structure of the calibration with the point pair list. The VISOR® program node “Calibrate with the
Point Pair List” follows the PolyScope programming principles and guides you through the process step by step. The upper
part of the routine defines the parameters and boundary conditions for the process. These nodes require your input,
action or decision [A].
The gripper controls for picking and releasing the reference part need to be added here before running the
process.
Furthermore, you can integrate your own commands, e.g. additional move commands for defining the robot's
movement more precisely.
The process can be started once all nodes are marked green.
Please note
Please make sure the calibration parameters of the jobset on the VISOR® have been configured
correctly and according to the choices made in the URCap:
- Calibration method (calibration plate / point pair list)
- Calibration plate type (50 mm, 100 mm, 200 mm, 500 mm)
- Z-offset
- Focal length
In the folder “Pose Setup”, the four initial robot poses for the calibration have to be defined.
Please note
The robot moves in a direct linear movement from the start pose to this pose. If an approach
pose above the object is necessary (e.g. because a bracket gripper is used), you can add this
waypoint in the point pair list process.
12. Once all necessary poses have been set up, the point pair list process runs automatically.
Additional move commands and waypoints can be added within the process if necessary.
Create a dedicated robot program for the calibration and save it on the robot.
If the camera needs to be re-calibrated, you can simply load and run this program.
If you are using other jobs with the same optical set-up, the calibration can be copied from the calibrated job to some or
all other jobs using the function provided in the URCap's installation node.
Attention
The image acquisition pose has to be the same as the image acquisition pose used in the VISOR® Pick
process (especially if the camera is mounted to the robot arm).
We therefore recommend storing this pose either as a feature variable in the robot installation (Figure 15 a) or as
a waypoint in the program (Figure 15 b).
Figure 15: Storing the important poses as features in the installation node.
3.4.1 Requirements to the Optical Set-Up and the Usage of the Calibration Plate
If you have problems detecting the calibration plate (calibration fails) or the deviation values are too large (typical values
are < 2 px), please consult the VISOR® manual, chapter 8.1.5.4, on how to optimize the calibration using calibration
plates. If you notice very large deviations for the fiducials, please also refer to that chapter.
When using the calibration plate, please note the following boundary conditions:
- To perform a calibration, at least one search pattern must be found. For small calibration patterns, it may be
necessary to use at least two search patterns.
- The calibration pattern should cover the entire field of view of the VISOR® vision sensor. For a successful, precise
calibration it is not necessary that the entire calibration plate is visible.
That means the fiducials should not be visible in the image – only the calibration pattern!
For detailed information, please refer to the VISOR® user manual (chapter 8.1.5.4).
Figure 16 shows the poses the VISOR® Pick Part process moves to:
Please note
The necessary settings of the VISOR® are described in chapter 4 of the quickstart manual.
The lower part of the routine [C] contains the pick process. It runs automatically once all necessary
inputs in part [A] have been made.
The process can be started once all nodes are marked green.
The gripper controls for picking and releasing the part need to be added here before running the
process.
Furthermore, you can integrate your own commands, e.g. additional move commands for defining the robot's
movement more precisely.
There are two branches, “Object found” and “Object not found”, which are executed depending on whether the
camera has found a part. The branches can be filled with the respective actions.
Step-by-step information on how to set up the VISOR® Pick process for a simple pick-and-place task can be found in
chapter 3.3.4 of the quickstart manual.
4.2 VISOR® Pick Method Node / Behavior if no part has been found
When clicking on the VISOR® Pick Method child node, the user can specify several options:
A dedicated dropdown indicates the job within which the picking process will be executed. See the
standard job configuration.
The Choose frame dropdown is described here.
Once the product is detected, the robot moves to the position returned by the VISOR® sensor. To smooth the
approach, the robot first moves to an intermediate pose before reaching the product itself. This intermediate pose is
located above the detected product at a Z offset that can be specified by the user in mm.
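As a sketch of the geometry: the intermediate pose is simply the pick pose shifted by the Z offset. The helper below is hypothetical (the URCap computes this pose internally) and assumes the offset is applied along the Z axis of the robot's base frame.

```python
def approach_pose(pick_pose, z_offset_mm):
    """Return an intermediate pose z_offset_mm above the pick pose.

    Poses are (x, y, z, rx, ry, rz) tuples in UR convention
    (metres / radians). Hypothetical helper for illustration only;
    the offset is assumed to act along the base frame's Z axis.
    """
    x, y, z, rx, ry, rz = pick_pose
    return (x, y, z + z_offset_mm / 1000.0, rx, ry, rz)
```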
The user can specify the behavior of the program if the object is not detected by clicking on the related behavior.
By clicking on the Change method button, the current template is deleted and the user can then select a different
picking method. An additional popup appears to confirm this choice.
visor_request_sending_success: indicates if the URCap has correctly sent the request to the VISOR® vision
sensor.
visor_request_success: indicates if the process requested on the VISOR® vision sensor has been successful.
visor_response: response of the VISOR® vision sensor triggered by the sent request (port 2006 or port 1998).
visor_output_data: output of the VISOR® vision sensor triggered by the sent request (port 2005). The output
format is the one specified in the telegram tab of the VISOR® SensoConfig software (see the Standard
configuration of a job). No output data is generated if the request sent to the VISOR® vision sensor failed.
visor_detector_success: indicates if the vision process, dedicated to the detection of a product, has been
successful. This variable is linked to the Overall result flag in the telegram definition (see the Standard configuration
of a job).
visor_pose: specifies in a UR pose format the 2D position of the detected object, as well as the gripping angle.
These variables are linked to the Pos. X, Pos. Y and Angle values in the telegram definition (see the Standard
configuration of a job). The returned pose has the following format:
visor_pose := p[Pos. X / 1000000, Pos.Y / 1000000, 0, 0, 0, d2r(Angle / 1000)]
visor_status_msg: indicates the status description of the communication with the VISOR® vision sensor following
the sent request.
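The visor_pose conversion above can be reproduced in a short sketch (shown in Python rather than URScript): Pos. X and Pos. Y arrive as integers that divide by 1 000 000 to give metres, and Angle as an integer that divides by 1000 to give degrees before conversion to radians.

```python
import math

def visor_pose_from_telegram(pos_x: int, pos_y: int, angle: int):
    """Rebuild the URCap's visor_pose from the raw telegram integers,
    mirroring: p[Pos. X / 1000000, Pos. Y / 1000000, 0, 0, 0,
    d2r(Angle / 1000)]."""
    return (pos_x / 1_000_000, pos_y / 1_000_000, 0.0, 0.0, 0.0,
            math.radians(angle / 1000))
```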
All poses that have been taught to the system are stored as variables and can be accessed using assignment nodes.
Figure 18: Tab Variables displaying the global variables defined by the VISOR® Robotic URCap
Six of these variables can be accessed in the UR program, thanks to the dedicated Assignment programming node and the
functions implemented by the URCap:
If the user has specified additional data in the payload of the job (see the standard configuration of a job), these data can
be retrieved via two dedicated functions implemented by the URCap. Using these functions requires that the user knows
the order in which the payload variables are organized, as well as their data types.
getVisorStringOutput(index):
accesses the payload data, as a string, at index index. Indices start at 0; the function returns the “Overall
result”, “Pos X”, “Pos Y” and “Angle” values for index 0, 1, 2 and 3 respectively.
getVisorIntegerOutput(index):
accesses the payload data, as an integer, at index index. The function returns an array of 2 integer values. The first
value can be seen as a Boolean, specifying whether the conversion of the value into an integer is valid. If the
conversion is valid, the second value is the converted integer. Indices start at 0; the function returns the “Overall
result”, “Pos X”, “Pos Y” and “Angle” values for index 0, 1, 2 and 3 respectively.
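The behavior of these accessors can be sketched as follows (in Python rather than URScript, with a plain list standing in for the sensor's payload buffer; this is purely an illustration of the [valid, value] return convention):

```python
def get_visor_integer_output(fields, index):
    """Sketch of getVisorIntegerOutput(index): returns [valid, value].

    `fields` stands in for the parsed payload of the last telegram;
    valid is 1 if fields[index] exists and converts to an integer.
    """
    try:
        return [1, int(fields[index])]
    except (ValueError, IndexError):
        return [0, 0]

def get_visor_string_output(fields, index):
    """Sketch of getVisorStringOutput(index): the raw field as a string."""
    return str(fields[index])
```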
During the execution of a program using the VISOR® Robotic URCap, every command request to the VISOR® vision
sensor and all response data from it are logged in the Log tab of the UR controller.
The VISOR® can locate objects either with a dedicated detector or with a position alignment.
Detectors:
Detectors are the basic image processing functions of the VISOR®. In each job, up to 255 detectors can be
executed, but the VISOR® Robotic URCap only accepts coordinates for one part per image.
For robot applications, a detector that returns calibrated coordinates is necessary. The detectors returning
calibrated coordinates are: Contour, Pattern matching, BLOB, and Caliper.
If no additional checks on the part are necessary, using a detector is the standard method for locating an
object.
Position Alignment:
The position alignment locates the object in the image, and this position can be used by the robot.
Additionally, it creates a new coordinate system (the so-called alignment frame):
o The origin of the alignment frame is the position of the object.
o The positions of all detectors related to the position alignment are moved accordingly.
Applications:
o The contour position alignment can be used for locating the part and for checking whether this part can be
safely picked.
o It can also be used for performing further checks on the object to be picked, e.g. for sorting applications or
for quality checks.
Methods:
o There are three methods for position alignment: edge, pattern matching and contour matching.
o The contour alignment comes with a special “gripping space” function that allows returning only objects that
can actually be picked (cf. chapter 6.2.2).
The most frequently applied detectors and alignments are briefly described in the following sections of this
chapter. For a detailed description of these detectors, please refer to the VISOR® manual.
The contour detector also offers the possibility to add a result offset (cf. chapter 6.2.1).
Figure 24 shows the telegram settings for transmitting position and angle to the robot. The features can be selected in
the output setup on the payload tab. Header, trailer and separator have to be set as shown in order to be compatible
with the URCap (as described earlier).
Attention
When deleting a detector, the telegram settings will be deleted, too.
Figure 24: Telegram settings for transmitting position and angle to the robot
If a part is to be gripped on its outer contours, a further detector can be used to check the available space around
the part.
In sorting applications, the part can be located with the contour alignment and the sorting criteria can be checked
with subsequent detectors (e.g. does the object have the right label, color mark, etc.).
There can be only one contour position alignment per job. When the part moves between triggers, the contour
alignment moves all detectors so that they stay on the configured position on the part. There can be up to 255
detectors per job.
Figure 25 shows how to use the contour alignment for detecting an object position. Just move the green box around the
object contour to be detected. The yellow box defines the search region. An object is taken into account if its center is
within the search region.
Figure 25: Setting up the contour alignment for detecting an object position
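The center-in-region rule can be sketched in a few lines of Python; the rectangle and point coordinates below are hypothetical pixel values, not taken from the figure:

```python
def center_in_search_region(cx, cy, region):
    """True if the object's center (cx, cy) lies inside the
    rectangular search region (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = region
    return x_min <= cx <= x_max and y_min <= cy <= y_max

region = (100, 80, 540, 400)                      # hypothetical yellow search box
print(center_in_search_region(320, 240, region))  # part center inside the box
```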
Figure 26 shows the telegram settings for transmitting position and angle to the URCap.
Figure 26: Telegram settings for transmitting position and angle to the robot
Figure annotations: the origin of the contour alignment’s coordinate system lies at the center of the green region of interest; the resulting position sent to the robot can be freely defined by setting up the result offset correction.
Figure 27: Result offset for freely defining the object position relative to the detected position
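Geometrically, the result offset is a 2D offset defined in the part’s own coordinate frame: the configured (dx, dy) is rotated by the detected angle before being added to the detected position. A plain-trigonometry sketch of that transform (illustration only, not VISOR® code):

```python
import math

def apply_result_offset(x, y, angle_deg, dx, dy):
    """Shift a detected position (x, y) by an offset (dx, dy) that is
    defined in the rotated coordinate frame of the detected part."""
    a = math.radians(angle_deg)
    return (x + dx * math.cos(a) - dy * math.sin(a),
            y + dx * math.sin(a) + dy * math.cos(a))

# part detected at (100, 50), rotated by 90 deg; offset 10 mm along the part's x axis:
print(apply_result_offset(100.0, 50.0, 90.0, 10.0, 0.0))
```

Because the offset rotates with the part, the gripping point stays at the same physical spot on the object for every detected orientation.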
The contour alignment identifies those objects as candidates whose contour matches the taught-in contour.
These candidates are sorted and the first one in the list is used. By default, they are sorted by the score value, i.e. the degree of agreement between the taught-in contour and the contour of the candidate.
However, when using the gripping space function, further criteria can be applied so that only objects that can actually be picked are returned. The sorting takes place according to the values of “Sorting criteria” and “Sorting order” set in the “Gripping space” tab (Figure 28).
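The candidate selection described above amounts to a sort followed by taking the first element. A hypothetical Python sketch (the key name "score" is illustrative, not the literal SensoConfig identifier):

```python
def best_candidate(candidates, criterion="score", descending=True):
    """Return the first candidate after sorting, mirroring the
    'Sorting criteria' / 'Sorting order' selection: by default the
    highest score wins; with the gripping space check, another
    criterion (e.g. free space around the part) can be used instead."""
    ranked = sorted(candidates, key=lambda c: c[criterion], reverse=descending)
    return ranked[0] if ranked else None

parts = [{"score": 0.91, "x": 10}, {"score": 0.97, "x": 42}]
print(best_candidate(parts)["x"])  # the better-matching contour wins
```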
Figure 29: Logical links for setting up the gripping space check.
The setup of this detector follows two steps: in step 1, object and background are distinguished from each other; in step 2, the features used to distinguish the BLOBs are selected.
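As a toy illustration of these two steps (threshold, then feature filter), the following pure-Python sketch binarizes a small grid and keeps only blobs above a minimum area; the real detector of course works on the camera image inside SensoConfig:

```python
def find_blobs(image, threshold, min_area):
    """Step 1: binarize (pixel > threshold counts as object).
    Step 2: group 4-connected object pixels into blobs and keep
    only those whose area feature passes the filter."""
    h, w = len(image), len(image[0])
    binary = [[image[r][c] > threshold for c in range(w)] for r in range(h)]
    seen, blob_areas = set(), []
    for r in range(h):
        for c in range(w):
            if binary[r][c] and (r, c) not in seen:
                stack, area = [(r, c)], 0
                seen.add((r, c))
                while stack:                      # flood fill one blob
                    y, x = stack.pop()
                    area += 1
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < h and 0 <= nx < w and binary[ny][nx]
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if area >= min_area:              # step 2: feature filter
                    blob_areas.append(area)
    return blob_areas

image = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 9]]
print(find_blobs(image, threshold=5, min_area=2))  # the lone corner pixel is filtered out
```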
Figure 32 shows the telegram settings for transmitting position and angle to the robot. The features can be selected in the payload tab. Header, trailer and separator have to be set this way in order to be compatible with the URCap.
Please note
The BLOB detector can be used to transmit the positions of more than one object, and it is in this setting (“No. of results” = 0) by default. However, for communication with the URCap, only one result may be transmitted. Thus, “No. of results” has to be set to “1”.
Figure 32: Telegram settings for transmitting position and angle to the robot
7 Troubleshooting
Problem: VISOR® software: all detector results are “fail” / detectors cannot be set up.
Possible reason: If calibration is active in the job settings of the VISOR® but the calibration is not valid, all detector results are automatically “fail”.
Solution: If you cannot set up your detectors, please check the calibration settings. The VISOR® Robotic template jobset comes with a job in which the calibration is active and valid, i.e. green status (although the actual calibration still needs to be performed using the robot). If you start a jobset from scratch, please set up the detectors first and select the calibration method as the last step before starting the sensor.

Problem: The VISOR® does not send any data to the robot when triggered by visor_trig() or visor_terminal().
Possible reason: The detector in the QuickStart jobset has been deleted and a new one added.
Solution: When using a different / new detector, the telegram settings need to be set up again.

Problem: The part is moved slightly by the gripper during picking or during camera calibration.
Possible reason: The robot TCP is not properly set up; the calibration z-offset is not correctly entered; or the detector is not correctly set up (center point / result offset not correctly defined).
Solution: Check
o the TCP settings (robot)
o the z-offset (VISOR® - SensoConfig calibration parameters, see the VISOR® help/manual)
o the detector settings (VISOR® - SensoConfig setup step Detector / tab “Result offset”): center point, result offset

Problem: The sensor cannot be found / difficulties with auto-connection, or the robot is physically connected to a sensor but it cannot be found.
Possible reason: Electrical and network connection; network settings.
Solution:
o Is the VISOR® connected to power (green LED on its backside is on)?
o Are the Ethernet cables connected?
o Are the network settings of the robot (see robot manual) correct?
o Are the network settings of the VISOR® correct? Can it be found with the SensoFind software?
o Are both components part of the same IP address space?
o The gateway of the robot has to be set (in Polyscope “Setup Robot” > “Network” tab): the field should not be left blank. If there is a gateway in the system, its IP address has to be entered; if not, the IP address of the robot should be written into the field.

Problem: The connection status is “yellow”.
Possible reason: No valid jobset is installed on the VISOR®, or Ethernet is not enabled.
Solution: Install the valid URCap QuickStart template (which should contain the correct settings by default) or set up the interfaces in SensoConfig > setup step Output > tab Interfaces correctly: Ethernet enabled, ports set to IN: 2006, OUT: 2005.

Problem: A jobset that was previously stored on the VISOR® is not displayed.
Possible reason / solution: Are you using the same VISOR® model (V10 / V20; monochrome / color; Robotics / Allround)? The same firmware version of the VISOR® as when you saved the jobset on the robot?

Problem: The robot moves to a wrong gripping position.
Possible reason: The result offset is not defined correctly.
Solution: In the pose setup, the offset only affects the Z direction and the angle. Please set a result offset for X and Y (see chapter 6.2.1).
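Several rows of the table boil down to the question whether the sensor is reachable on the configured ports at all. From any PC in the same network, a plain TCP connect is a quick sanity check (the IP address below is hypothetical; ports 2006 / 2005 are the defaults named above):

```python
import socket

def port_reachable(host, port, timeout=2.0):
    """True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. for a VISOR at a hypothetical address:
# port_reachable("192.168.100.10", 2005) and port_reachable("192.168.100.10", 2006)
```

If this check fails while the LEDs and cabling look fine, the network settings (IP address space, gateway) are the next thing to verify.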
Note
We reserve the right to modify, add or remove contents of this document at any time without prior notice.
Passing on and reproduction of this document, use and disclosure of its content are prohibited unless expressly permitted.

Germany
SensoPart Industriesensorik GmbH
79288 Gottenheim
Tel.: +49 7665 94769-0
info@sensopart.de
06814830 - 18.06.2020 - 01