Sample of Vision Control
MELFA ROBOT
Sample program operating manual for vision system
CR750/CR751 series controller
CRn-700 series controller
BFP-A8937-A
Safety Precautions
Always read the following precautions and the separate
"Safety Manual" before starting use of the robot to learn the
required measures to be taken.
CAUTION  Enforcement of safety training: All teaching work must be carried out by an operator who has received special training. (This also applies to maintenance work with the power source turned ON.)
CAUTION  Preparation of work plan: For teaching work, prepare a work plan related to the methods and procedures of operating the robot, and to the measures to be taken when an error occurs or when restarting. Carry out work following this plan. (This also applies to maintenance work with the power source turned ON.)
WARNING  Setting of emergency stop switch: Prepare a device that allows operation to be stopped immediately during teaching work. (This also applies to maintenance work with the power source turned ON.)
CAUTION  Indication of teaching work in progress: During teaching work, place a sign indicating that teaching work is in progress on the start switch, etc. (This also applies to maintenance work with the power source turned ON.)
WARNING  Installation of safety fence: Provide a fence or enclosure during operation to prevent contact between the operator and the robot.
CAUTION  Signaling of operation start: Establish a set signaling method among the related operators for starting work, and follow this method.
CAUTION  Indication of maintenance work in progress: As a principle, turn the power OFF during maintenance work. Place a sign indicating that maintenance work is in progress on the start switch, etc.
CAUTION  Inspection before starting work: Before starting work, inspect the robot, emergency stop switch and other related devices, and confirm that there are no errors.
The points of the precautions given in the separate "Safety Manual" are given below.
Refer to the actual "Safety Manual" for details.
CAUTION  Use the robot within the environment given in the specifications. Failure to do so could lead to a drop in reliability or faults. (Temperature, humidity, atmosphere, noise environment, etc.)
CAUTION  Transport the robot in the designated transportation posture. Transporting the robot in a non-designated posture could lead to personal injuries or faults from dropping.
CAUTION  Always use the robot installed on a secure table. Use in an unstable posture could lead to positional deviation and vibration.
CAUTION  Wire the cable as far away from noise sources as possible. If placed near a noise source, positional deviation or malfunction could occur.
CAUTION  Do not apply excessive force to the connector or excessively bend the cable. Failure to observe this could lead to contact defects or wire breakage.
CAUTION  Make sure that the workpiece weight, including the hand, does not exceed the rated load or tolerable torque. Exceeding these values could lead to alarms or faults.
WARNING  Securely install the hand and tool, and securely grasp the workpiece. Failure to observe this could lead to personal injuries or damage if the object comes off or flies off during operation.
WARNING  Securely ground the robot and controller. Failure to observe this could lead to malfunction caused by noise or to electric shock accidents.
CAUTION  Indicate the operation state during robot operation. Failure to indicate the state could lead to operators approaching the robot or to incorrect operation.
WARNING  When carrying out teaching work within the robot's movement range, always secure the priority right for robot control. Failure to observe this could lead to personal injuries or damage if the robot is started by external commands.
CAUTION  Keep the jog speed as low as possible, and always watch the robot. Failure to do so could lead to interference with the workpiece or peripheral devices.
CAUTION  After editing the program, always confirm the operation with step operation before starting automatic operation. Failure to do so could lead to interference with peripheral devices because of programming mistakes, etc.
CAUTION  Make sure that if the safety fence entrance door is opened during automatic operation, the door is locked or the robot will automatically stop. Failure to do so could lead to personal injuries.
WARNING  When the robot arm has to be moved by hand from an external area, do not place hands or fingers in the openings. Failure to observe this could lead to hands or fingers being caught, depending on the posture.
CAUTION  Do not stop the robot or apply an emergency stop by turning the robot controller's main power OFF. If the main power is turned OFF during automatic operation, the robot's accuracy could be adversely affected. Moreover, the arm may drop or move by inertia and interfere with peripheral devices.
CAUTION  Do not turn off the main power to the robot controller while internal information of the robot controller, such as a program or parameters, is being rewritten. If the main power is turned off during automatic operation or while a program or parameters are being rewritten, the internal information of the robot controller may be damaged.
Revision history
Date of print Specifications No. Details of revisions
2012-11-30 BFP-A8937-* First print
2013-01-07 BFP-A8937-A Writing errors corrected.
[Contents]
1. OVERVIEW
1.1. Overview
1.2. Method of Image Recognition
1.3. Basic Method of Positional Correction
2. CAMERA SETUP
1. Overview
1.1. Overview
Thank you for downloading this sample program for the vision system.
This sample program supports the installation of a production-ready vision system.
This manual assumes that you have knowledge of the MELFA-BASIC V programming language and of the operational procedures for the EasyBuilder software.
Before using this sample program, please read the following manuals carefully so that you can make use of their contents together with this sample program.
1) “Detailed explanations of functions and operations” (BFP-A8661/BFP-A8869)
2) “Mitsubishi Robot Tool for EasyBuilder Instruction Manual” (BFP-A8820)
1.3. Basic method of positional correction
(1) Stationary Camera Correction Method 1
[Figure: a stationary camera mounted above the workpiece; the result of image recognition is referenced to the center of the robot tool.]
(1-2) Camera Calibration
The camera is calibrated by teaching points at the tool center of the robot.
[Figure: calibration taught at the center of the robot tool.]
(1-3) When the center of the hand tool does not correspond to the center of the robot tool
When the center of the hand tool is offset from the center of the robot tool, the working position of the robot does not correspond to the result position of the vision sensor.
[Figure: camera, center of robot tool, center of hand tool, workpiece, and the result of image recognition.]
(1-4) Solutions for the offset between the center of the hand tool and the result position of the vision sensor
There are two solutions when the center of the hand tool is offset from the center of the robot tool:
1. Change the robot tool position to the hand tool position by using the “TOOL” instruction. (Perform calibration after setting the tool data.)
2. Calculate the relative position between the result position of image recognition and the teaching position. First teach the operation position of the robot, then recognize the workpiece with the camera. Using both sets of position data, you can calculate the relative position between the two positions.
Example:
PBASE=Inv(PBVISION)*PTEACH  ' corrective position from the taught pair
PSAGYO=PVISION*PBASE        ' operation position for a new recognition result
PSAGYO.Z=100                ' set the working height
Mov PSAGYO
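The position arithmetic above can be sketched numerically. The following Python snippet is an illustration only, not part of the sample program: it models a robot position as a 2D pose (X, Y, C) and composes poses the way MELFA-BASIC V multiplies position variables. All numeric values are invented.

```python
import math

def to_mat(x, y, c_deg):
    """Pose (X, Y, C in degrees) -> 3x3 homogeneous transform."""
    r = math.radians(c_deg)
    cs, sn = math.cos(r), math.sin(r)
    return [[cs, -sn, x], [sn, cs, y], [0.0, 0.0, 1.0]]

def to_pose(m):
    """3x3 homogeneous transform -> pose (X, Y, C)."""
    return (m[0][2], m[1][2], math.degrees(math.atan2(m[1][0], m[0][0])))

def mul(a, b):
    """Matrix product, i.e. MELFA-BASIC position multiplication."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(m):
    """Inverse of a rigid transform, i.e. the Inv() function."""
    (r00, r01, tx), (r10, r11, ty), _ = m
    return [[r00, r10, -(r00 * tx + r10 * ty)],
            [r01, r11, -(r01 * tx + r11 * ty)],
            [0.0, 0.0, 1.0]]

# PBASE = Inv(PBVISION) * PTEACH : corrective position from the taught pair.
PTEACH = to_mat(100.0, 50.0, 30.0)    # taught operation position
PBVISION = to_mat(90.0, 40.0, 20.0)   # recognition result at teaching time
PBASE = mul(inv(PBVISION), PTEACH)

# PSAGYO = PVISION * PBASE : operation position for a new recognition result.
# If the workpiece has not moved (PVISION equals PBVISION), PSAGYO must
# reproduce the taught position exactly.
PSAGYO = mul(PBVISION, PBASE)
print(to_pose(PSAGYO))   # -> (100.0, 50.0, 30.0) up to rounding
```

If the new recognition result is shifted, PSAGYO shifts by the same amount, which is exactly what the corrective-position method relies on.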
(1-4-1) Change the robot tool position to the hand tool position by using the “TOOL” instruction
Change the tool position of the robot from the center of the J6-axis flange to the center of the hand tool by using the “TOOL” instruction. The recognition result of the camera is then output as the robot working position.
To perform calibration, align the tip of the hand tool with the center of the calibration point after setting the tool data with the “TOOL” instruction.
(1-4-2) Calculate the corrective position
1) Teach the robot operation position on the workpiece. Then recognize the taught workpiece with the camera.
2) Calculate the relative position between the result position of image recognition (PBVISION) and the teaching position (PMGET) as the corrective position.
[Figure: hand, camera, workpiece; result position of image recognition at teaching (PBVISION) and teaching position (PMGET).]
3) Calculate the operation position of the robot by a relative calculation between the new result position of image recognition (PVISION) and the corrective position.
[Figure: result position of image recognition at teaching (PBVISION) and at operation (PVISION), relative to the workpiece and the center of the camera coordinate system.]
1) Calculate the base operation position of the robot as robot coordinate system data, from the result position of the camera and the camera mounting position.
[Figure: result of image recognition in the camera coordinate system (PVSDATA); center of the camera coordinate system and center of the robot tool.]
2) Calculate the relative position between the result position of image recognition and the teaching position (PMGET).
3) Calculate the operation position of the robot by a relative calculation between the result position of image recognition and the base corrective position.
[Figure: hand, workpiece, and the operative position PBPK=PVISION*PNFROF, which expands to PBPK=PMTCVS*PCAMTL*PVSDATA*PNFROF.]
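The two forms of PBPK are the same composition written at different levels of detail: PVISION is itself the chain PMTCVS*PCAMTL*PVSDATA. The Python below is a minimal sketch (not the sample program; all numeric values are invented) that stores a 2D rigid transform as a unit complex rotation plus a complex translation and confirms the two forms agree.

```python
import cmath
import math

def pose(x, y, c_deg):
    """2D rigid transform as (unit-complex rotation, complex translation)."""
    return (cmath.exp(1j * math.radians(c_deg)), complex(x, y))

def mul(a, b):
    """Compose a * b, like MELFA-BASIC position multiplication."""
    ra, ta = a
    rb, tb = b
    return (ra * rb, ta + ra * tb)

def chain(*ps):
    """Left-to-right composition of several transforms."""
    out = ps[0]
    for p in ps[1:]:
        out = mul(out, p)
    return out

PMTCVS = pose(400.0, 0.0, 90.0)   # camera mounting position (robot frame)
PCAMTL = pose(5.0, -3.0, 0.0)     # camera tool offset
PVSDATA = pose(12.0, 8.0, 15.0)   # recognition result (camera frame)
PNFROF = pose(0.0, 0.0, 180.0)    # grip-orientation offset

# The vision result expressed in the robot frame:
PVISION = chain(PMTCVS, PCAMTL, PVSDATA)
# Operative position, as in the manual's formula:
PBPK = chain(PVISION, PNFROF)
print(PBPK)
```

Because composition is associative, chaining PMTCVS, PCAMTL, PVSDATA and PNFROF in one step gives the same PBPK.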
2. Camera setup
2.1. Stationary Camera Correction Method 1 (Upper side)
2.1.1. Program construction
Program name / Description / Explanation:
- “1” (Operation program): Transports the workpiece.
- “C” (Workpiece coordinate system – robot coordinate system matching program): Registers the result of workpiece recognition and the taught grab orientation at the pick position.
- “TL” (Tool setting program): Calculates the tool position of the hand.
- “WK*” (Teaching and adjustment): Holds variables for teaching and adjustment of each model. (WK1 - )
- “BASE” (Base program): Defines the user-defined external variables.
Start of operation
1. Parameter setting: set the user base program (parameter name “PRGUSR”, input data “BASE”).
5. Workpiece recognition and teaching (“WK1” and “C” programs); refer to “2.1.6.”. Calculate the relationship between the position of a workpiece recognized by the vision sensor and the position at which the robot grabs the workpiece.
End of operation
2.1.3. Robot Tool Setting (“TL” program)
The tool data of the robot is calculated by aligning the tip of the hand tool with the center of the calibration point.
The “TL” program calculates the tool data of the robot.
2.1.4. Camera Calibration Setting
Mitsubishi N-Point Calibration creates the calibration data from pairs of corresponding points (from 2 to 16) on the screen and in robot coordinates. Align the tip of the robot tool with the center of each calibration marking.
[Figure: EasyBuilder view with the User-Defined Point and calibration markings (1)–(5).]
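What a point-pair calibration of this kind computes can be sketched as a least-squares affine fit from screen pixels to robot coordinates. The Python below is a hedged illustration of that idea only, not Mitsubishi's or Cognex's actual algorithm; all point values are invented.

```python
def fit_affine(pixels, robots):
    """Fit coefficients (a, b, c) per robot axis so robot ~= a*px + b*py + c."""
    # Normal equations A^T A x = A^T b, with one row [px, py, 1] per pair.
    ata = [[0.0] * 3 for _ in range(3)]
    atbx = [0.0] * 3
    atby = [0.0] * 3
    for (px, py), (rx, ry) in zip(pixels, robots):
        row = (px, py, 1.0)
        for i in range(3):
            for j in range(3):
                ata[i][j] += row[i] * row[j]
            atbx[i] += row[i] * rx
            atby[i] += row[i] * ry
    return _solve3(ata, atbx), _solve3(ata, atby)

def _solve3(a, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    m = [a[i][:] + [b[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                for k in range(col, 4):
                    m[r][k] -= f * m[col][k]
    return [m[i][3] / m[i][i] for i in range(3)]

def apply(cal, px, py):
    """Map a pixel coordinate through the fitted calibration."""
    (ax, bx, cx), (ay, by, cy) = cal
    return (ax * px + bx * py + cx, ay * px + by * py + cy)

# Four corner pairs, mirroring the four calibration points (values invented):
pixels = [(0, 0), (640, 0), (640, 480), (0, 480)]
robots = [(300.0, 100.0), (300.0, 36.0), (252.0, 36.0), (252.0, 100.0)]
cal = fit_affine(pixels, robots)
print(apply(cal, 320, 240))   # image centre mapped into robot coordinates
```

With more than the minimum number of pairs the fit averages out pointing error, which is why adding calibration points improves accuracy.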
7) Select the target points
(1) First, select [Point 0] in the [Edit Tool – Mcalib_1] pane and move the User-Defined Point onto the center of the calibration marking in the EasyBuilder view pane. Then press the [Enter] key.
(2) Repeat the same procedure for Point 1 to Point 3, moving to each calibration marker.
[Figure: screenshot of the panes with callouts (1), (2), (4), (5).]
(7) To export the calibration data, click the “Export” button after specifying the file name in the calibration screen. The “Pass: Export Complete” message is displayed in the palette after the export completes normally.
The exported file is preserved in the vision sensor. It is necessary to import the calibration data when making the Job file.
[Figure: location of the Export button (7).]
2.1.5. Make a JOB program for the vision sensor
Train the image of the target model. The vision sensor assigns pixel locations to identified workpiece features, which are then translated into robot coordinate system data. Import the calibration data when you make a new job.
[Figure: training the image of the target model.]
2.1.6. Workpiece Recognition and Teaching (“C” program and “WK1” program)
First, teach the operation position (pickup position or placement position) of the workpiece. Then recognize the taught workpiece with the vision sensor and register the result data as the base recognition data.
(3) Change the vision program name entered after “C_CPRG$=” in the program.
2.1.7. Setting of Adjustment Variables (“1” program)
This chapter explains the operations required to run the “1” program.
Note) When your controller has no operation panel, use the dedicated external signals to operate the robot.
(3) Ending
The robot does not move unless the vision sensor recognizes a workpiece. Stop the flow of workpieces from upstream and press the [STOP] button on the operation panel of the robot controller. Confirm that the [STOP] lamp is turned on.
Note) On controllers without an operation panel, stop the robot with the external signal.
2.2. Hand Camera Correction Method
2.2.1. Program construction
Program name / Description / Explanation:
- “1” (Operation program): Transports the workpiece.
Start of operation
1. Parameter setting: set the user base program (parameter name “PRGUSR”, input data “BASE”).
5. Workpiece recognition and teaching (“WK1” and “C” programs); refer to “2.2.5.”. Calculate the relationship between the position of a workpiece recognized by the vision sensor and the position at which the robot grabs the workpiece.
End of operation
2.2.3. Hand Camera Tool Setting (“TLCAM” program)
The relative position of the hand camera is calculated by aligning the center of the field of view of the hand camera with the center of the calibration point.
The “TLCAM” program calculates the relative position from the tool position of the robot to the center of the field of view of the hand camera.
2) Draw cross lines at the center of the camera's field of view.
Select [Inspect Part] in the [Application Steps] pane. Then select [User-Defined Line] in the [Add Tool] pane and press the [Add] button.
The start and end positions of the two user-defined lines depend on the number of pixels of the camera.
Horizontal line: start point (0, 240), end point (640, 240).
<Notice> When dragging and dropping the “MELFA_Calib.job” sample job program into the EasyBuilder view pane, the green cross lines are shown.
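For cross lines drawn through the image center, both endpoints follow directly from the camera resolution. The helper below is hypothetical (not an EasyBuilder function) and assumes the midline convention used by the sample values above.

```python
def cross_lines(width, height):
    """Endpoints of horizontal and vertical midlines for a width x height image."""
    horizontal = ((0, height // 2), (width, height // 2))
    vertical = ((width // 2, 0), (width // 2, height))
    return horizontal, vertical

# For a 640 x 480 camera this reproduces the horizontal line in the manual:
print(cross_lines(640, 480)[0])   # ((0, 240), (640, 240))
```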
4) Set the controller [MODE] switch to “MANUAL”. Then set the T/B [ENABLE] switch to “ENABLE”.
5) Open the robot program “TLCAM” using the T/B.
6) Direct the posture of the robot hand downward with the “Aligning the Hand” operation on the T/B.
7) Attach the Camera Tool Adjustment sheet (with the drawn cross lines) in the robot operation area.
8) Execute the steps according to the comment directions in the robot program.
8-1) Move the robot so that the center of the camera cross lines is aligned with the sheet, by JOG operation using the T/B.
8-2) At the next comment, “(3) Only X axis and …”, execute step feed as well. The robot moves to a position in which the C axis is rotated by 90 degrees.
8-3) Move the robot so that the center of the camera cross lines is aligned with the sheet, by JOG operation using the T/B.
8-4) At the next comment, “(4) Only the C axis is …”, execute step feed as well.
8-5) The tool data of the robot is changed from the center of the robot flange to the center of the camera.
*Confirm that the two centers do not shift even when the robot is operated with C-axis JOG operation.
2.2.4. Hand Camera Calibration Setting (“B” program)
The hand camera is calibrated by moving the robot from the position where the center of the camera is aligned with the adjustment mark to each of four positions, one in each direction. The moving distance of the robot is set to less than half of the camera's field of view.
(1) Start position (2) First point (3) Second point (4) Third point (5) Fourth point
(5) Input the IP address of the robot controller.
[Figure: teaching position P0 and the calibration positions (1)–(5); screenshot callouts (a), (c).]
(d) Execute step feed to “14 Mov P0*(-10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '2nd point” in robot program “B”. The robot moves to the second position for calibration setting. Then press the “Trigger” button. The camera image is displayed.
(e) Click the “Point1” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to the center of the calibration sheet. Then press the “Enter” key.
[Figure: screenshot callouts (a), (e); the same procedure repeats for callouts (g) and (i).]
(j) To export the calibration data, click the “Export” button in the calibration screen. The information is displayed in the palette pane after the export completes normally.
[Figure: location of the Export button (j).]
2.2.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)
First, teach the operation position (pickup position or placement position) of the workpiece. Then recognize the taught workpiece with the vision sensor and register the result data as the base recognition data.
<Notes when teaching the work recognition position “PCVS*”>
The focus at the camera tool adjustment position was adjusted to the upper surface of the operation table. If a workpiece is placed in the camera recognition area, the focus is no longer appropriate at the upper surface of the workpiece. Therefore, it is necessary to raise the camera position when recognizing the workpiece. Keep the camera working distance between the recognition position of the robot and the upper surface of the workpiece consistent.
[Figure: the camera working distance to PCVS* kept constant for workpieces of different heights.]
(3) Change the vision program name entered after “CPRG$=” in the program.
2.3. Stationary Camera Correction Method 2 (Lower side)
2.3.1. Program construction
Program name / Description / Explanation:
- “1” (Operation program): Transports the workpiece.
1. Parameter setting: set the user base program (parameter name “PRGUSR”, input data “BASE”).
5. Workpiece recognition and teaching (“WK1” and “C” programs); refer to “2.3.5.”. Calculate the relationship between the position of a workpiece recognized by the vision sensor and the position at which the robot grabs the workpiece.
End of operation
2.3.3. Robot Tool Setting (“TL” program)
The tool data of the robot is calculated by aligning the tip of the hand tool with the center of the camera's field of view. Move the robot to a height at which the focus is suitable.
The tool position of the robot hand is calculated by aligning the center of the camera's field of view with the tip of the robot tool.
The “TL” program calculates the tool data of the robot.
The tool data of the robot is calculated from two teaching points, as shown in the following figures.
8) Execute the steps according to the comment directions in the robot program.
8-1) Move the robot so that the center of the camera cross lines is aligned with the tip of the calibration jig of the robot, by JOG operation with the T/B.
8-2) At the next comment, “(3) Only X axis and …”, execute step feed as well. The robot moves to a position in which the C axis is rotated by 90 degrees.
8-3) Move the robot so that the center of the camera cross lines is aligned with the tip of the calibration jig of the robot, by JOG operation with the T/B.
8-4) At the next comment, “(4) Only C axis is …”, execute step feed as well.
8-5) The tool data of the robot is changed from the center of the robot flange to the center of the calibration jig.
*Confirm that the two centers do not shift even when the robot is operated with C-axis JOG operation.
2.3.4. Camera Calibration Setting (“B” program)
In this method, the camera is placed on the table surface and directed upward. Therefore, when setting the calibration, it is necessary to specify four points in space.
The calibration setting is executed at each of the four points while moving the robot within the camera's field of view. Move the robot to the height at which the top of the camera jig is in focus.
[Figure: teaching position P0 and the calibration positions (1)–(5).]
(6) Mitsubishi N-Point Calibration
Mitsubishi N-Point Calibration in this sample creates the calibration data from four points that form pairs between screen coordinates and robot coordinates.
(a) Execute step feed to “15 Mov P0*(+10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '1st point” in robot program “B”. The robot moves to the first calibration point.
(b) Click the “Point0” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to the center of the calibration jig. Then press the “Enter” key.
(c) Press the “Get position” button. The current position of the robot is input into the “World X/Y” input area.
[Figure: screenshot callouts (b), (c); the same procedure repeats for callouts (e), (f) and (h), (i).]
(j) Execute step feed to “18 Mov P0*(+0.00,-10.00,+0.00,+0.00,+0.00,+0.00) '4th point” in robot program “B”. The robot moves to the fourth calibration point.
(k) Click the “Point3” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to the center of the calibration jig. Then press the “Enter” key.
(l) Press the “Get position” button. The current position of the robot is input into the “World X/Y” input area.
[Figure: screenshot callouts (k), (l).]
(m) To export the calibration data, click the “Export” button in the calibration screen. The information is displayed in the palette pane after the export completes normally.
[Figure: location of the Export button (m).]
2.3.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)
This method corrects the deviation of the workpiece that the robot is holding.
First, teach the pickup position and the placement position of the workpiece.
Then, the workpiece held by the robot is recognized with the fixed camera and the result is registered.
(3) Change the vision program name entered after “CPRG$=” in the program.
2.3.6. Calculation procedure of the correction value for the held workpiece
The figure below shows the procedure to correct the deviation of the workpiece that the robot is holding.
<STEP 1>
PBFLNG=Inv(PTCVSDB)*P_TCVS
[Figure: the notch of the workpiece; Inv(PTCVSDB) applied to obtain PBFLNG.]
<STEP 2>
PVHND: the relative gripping position of the workpiece, calculated from the result position of vision recognition (PHVSDATA2) and the amount of relative movement from vision recognition to the gripping position of the workpiece (PBFLNG).
PHVSDATA2: the result of vision recognition.
PBFLNG: the amount of relative movement from the inverse of the recognition result of the vision sensor to the position of vision recognition. This variable is calculated in STEP 1.
<STEP 3>
[Figure: the correction value PHOSEI obtained by applying Inv(PVHND); labels show the robot position assuming the workpiece is gripped at the correct position in the robot hand, and PBFLNG (the position when the workpiece shifts).]
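The STEP 1 relation PBFLNG=Inv(PTCVSDB)*P_TCVS can be checked numerically. The Python below is a hedged sketch, not the sample program: a pose is a (x, y, c_deg) tuple composed like MELFA-BASIC position variables, and all numeric values are invented.

```python
import math

def mul(p, q):
    """Compose pose p with pose q (p * q in MELFA-BASIC terms)."""
    px, py, pc = p
    r = math.radians(pc)
    cs, sn = math.cos(r), math.sin(r)
    return (px + cs * q[0] - sn * q[1],
            py + sn * q[0] + cs * q[1],
            pc + q[2])

def inv(p):
    """Inverse pose, i.e. the Inv() function."""
    px, py, pc = p
    r = math.radians(pc)
    cs, sn = math.cos(r), math.sin(r)
    return (-(cs * px + sn * py), -(-sn * px + cs * py), -pc)

PTCVSDB = (250.0, 80.0, 10.0)   # base recognition result (taught state)
P_TCVS = (253.0, 78.0, 14.0)    # recognition result with the held workpiece shifted

# <STEP 1>  PBFLNG = Inv(PTCVSDB) * P_TCVS
PBFLNG = mul(inv(PTCVSDB), P_TCVS)

# Composing the taught recognition result with PBFLNG must reproduce the
# shifted recognition result, confirming PBFLNG captures the deviation:
print(mul(PTCVSDB, PBFLNG))   # -> approximately (253.0, 78.0, 14.0)
```

PBFLNG is thus the relative movement that carries the taught state onto the observed state, which is exactly the deviation the later steps convert into the correction value PHOSEI.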