
Mitsubishi Industrial Robot

MELFA ROBOT
Sample program operating manual for vision system
CR750/CR751 series controller
CRn-700 series controller

BFP-A8937-A
Safety Precautions
Always read the following precautions and the separate
"Safety Manual" before starting use of the robot to learn the
required measures to be taken.

CAUTION (Enforcement of safety training): All teaching work must be carried out by an operator who has received special training. (This also applies to maintenance work with the power source turned ON.)

CAUTION (Preparation of work plan): For teaching work, prepare a work plan covering the methods and procedures of operating the robot and the measures to be taken when an error occurs or when restarting. Carry out work following this plan. (This also applies to maintenance work with the power source turned ON.)

WARNING (Setting of emergency stop switch): Prepare a device that allows operation to be stopped immediately during teaching work. (This also applies to maintenance work with the power source turned ON.)

CAUTION (Indication of teaching work in progress): During teaching work, place a sign indicating that teaching work is in progress on the start switch, etc. (This also applies to maintenance work with the power source turned ON.)

WARNING (Installation of safety fence): Provide a fence or enclosure during operation to prevent contact between the operator and the robot.

CAUTION (Signaling of operation start): Establish a set signaling method with the related operators for starting work, and follow this method.

CAUTION (Indication of maintenance work in progress): As a rule, turn the power OFF during maintenance work. Place a sign indicating that maintenance work is in progress on the start switch, etc.

CAUTION (Inspection before starting work): Before starting work, inspect the robot, emergency stop switch, and other related devices, and confirm that there are no errors.
The points of the precautions given in the separate "Safety Manual" are given below.
Refer to the actual "Safety Manual" for details.

CAUTION: Use the robot within the environment given in the specifications. Failure to do so could lead to a drop in reliability or to faults. (Temperature, humidity, atmosphere, noise environment, etc.)

CAUTION: Transport the robot in the designated transportation posture. Transporting the robot in a non-designated posture could lead to personal injury or faults from dropping.

CAUTION: Always install the robot on a secure table. Use in an unstable position could lead to positional deviation and vibration.

CAUTION: Wire the cable as far away from noise sources as possible. If placed near a noise source, positional deviation or malfunction could occur.

CAUTION: Do not apply excessive force to the connector or excessively bend the cable. Failure to observe this could lead to contact defects or wire breakage.

CAUTION: Make sure that the workpiece weight, including the hand, does not exceed the rated load or tolerable torque. Exceeding these values could lead to alarms or faults.

WARNING: Securely install the hand and tool, and securely grasp the workpiece. Failure to observe this could lead to personal injury or damage if the object comes off or flies off during operation.

WARNING: Securely ground the robot and controller. Failure to observe this could lead to malfunction caused by noise or to electric shock accidents.

CAUTION: Indicate the operation state during robot operation. Failure to indicate the state could lead to operators approaching the robot or to incorrect operation.

WARNING: When carrying out teaching work in the robot's movement range, always secure priority rights for the robot control. Failure to observe this could lead to personal injury or damage if the robot is started with external commands.

CAUTION: Keep the jog speed as low as possible, and always watch the robot. Failure to do so could lead to interference with the workpiece or peripheral devices.

CAUTION: After editing the program, always confirm the operation with step operation before starting automatic operation. Failure to do so could lead to interference with peripheral devices because of programming mistakes, etc.

CAUTION: Make sure that if the safety fence entrance door is opened during automatic operation, the door is locked or the robot will automatically stop. Failure to do so could lead to personal injury.

CAUTION: Never carry out modifications based on personal judgment, and never use non-designated maintenance parts. Failure to observe this could lead to faults or failures.

WARNING: When the robot arm has to be moved by hand from an external area, do not place hands or fingers in the openings. Failure to observe this could lead to hands or fingers being caught, depending on the posture.

i
CAUTION: Do not stop the robot or apply an emergency stop by turning the robot controller's main power OFF. If the main power is turned OFF during automatic operation, robot accuracy could be adversely affected. Moreover, the arm may drop or move by inertia and interfere with peripheral devices.

CAUTION: Do not turn off the main power to the robot controller while rewriting internal information of the robot controller, such as the program or parameters. If the main power is turned off during automatic operation or while rewriting the program or parameters, the internal information of the robot controller may be damaged.

ii
Revision history
Date of print Specifications No. Details of revisions
2012-11-30 BFP-A8937-* First print
2013-01-07 BFP-A8937-A Corrected writing errors.

iii
[Contents]

1. OVERVIEW........................................................................................................................................................................................................1-2

1.1. OVERVIEW.................................................................................................................................................................................................1-2
1.2. METHOD OF IMAGE RECOGNITION....................................................................................................................................................1-2
1.3. BASIC METHOD OF POSITIONAL CORRECTION .............................................................................................................................1-3

2. CAMERA SETUP............................................................................................................................................................................................2-9

2.1. STATIONARY CAMERA CORRECTION METHOD 1 (UPPER SIDE) ..........................................................................................2-9


2.1.1. Program construction..............................................................................................................................................................2-9
2.1.2. Setup Procedure .......................................................................................................................................................................2-9
2.1.3. Robot Tool Setting (“TL” program).............................................................................................................................. 2-10
2.1.4. Camera Calibration Setting .............................................................................................................................................. 2-11
2.1.5. Make JOB program for vision sensor.......................................................................................................................... 2-14
2.1.6. Workpiece Recognition and Teaching (“C” program and “WK1” program)......................................... 2-15
2.1.7. Setting of Adjustment Variables (“1” program)..................................................................................................... 2-16
2.1.8. Automatic Operation ............................................................................................................................................................ 2-16
2.2. HAND CAMERA CORRECTION METHOD ...................................................................................................................................... 2-17
2.2.1. Program construction........................................................................................................................................................... 2-17
2.2.2. Setup Procedure .................................................................................................................................................................... 2-17
2.2.3. Hand Camera Tool Setting (“TLCAM” program) ................................................................................................. 2-18
2.2.4. Hand Camera Calibration Setting (“B” program)................................................................................................. 2-21
2.2.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)......................................... 2-24
2.3. STATIONARY CAMERA CORRECTION METHOD 2 (LOWER SIDE) ....................................................................... 2-26
2.3.1. Program construction........................................................................................................................................................... 2-26
2.3.2. Setup Procedure .................................................................................................................................................................... 2-26
2.3.3. Robot Tool Setting (“TL” program).............................................................................................................................. 2-27
2.3.4. Camera Calibration Setting (“B” program).............................................................................................................. 2-29
2.3.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)......................................... 2-32
2.3.6. Calculation procedure of correction value of holding workpiece.................................................................. 2-33

iv
1. Overview
1.1. Overview
Thank you for downloading this sample program for the vision system.
This sample program supports the installation of a production-ready vision system.

This manual assumes that you have knowledge of the MELFA-BASIC V programming
language and of the operational procedures for the EasyBuilder software.
Before using this sample program, please read the following manuals carefully so that you can
make use of their contents when using this sample program.
1) "Detailed explanations of functions and operations" (BFP-A8661/BFP-A8869)
2) "Mitsubishi Robot Tool for EasyBuilder Instruction Manual" (BFP-A8820)

1.2. Method of image recognition


This manual explains the following three methods of image recognition.

(1) Stationary Camera Correction Method 1 (Upper side)


In this case, the robot holds the recognized workpiece, and the camera is mounted in a
fixed location, directed downward from above the robot.
This method is used when the workpiece positioning area is fixed.

(2) Hand Camera Correction Method


In this case, the camera is installed in the robot hand, and the robot moves to the
recognition position to identify the workpiece.
This method is used when a wide camera recognition area is required, or when there are
two or more recognition positions.

(3) Stationary Camera Correction Method 2 (Lower side)


In this case, the camera is mounted in a fixed location below the robot and directed
upward, and is used to correct the placement position of a workpiece gripped by the robot.
This method is used when high recognition accuracy is required, such as in part-mounting
systems and PCB board transport systems.

1-2
1.3. Basic method of positional correction
(1) Stationary Camera Correction Method 1

(1 - 1) Result of Image Recognition


The result position from the vision sensor is output in the robot coordinate system.

[Figure: stationary camera mounted above the workpiece. Labels: Camera, Center of robot tool, Workpiece, Result of image recognition]

(1 – 2) Camera Calibration
Camera calibration is taught at the tool center of the robot.

[Figure: calibration taught at the center of the robot tool. Label: Center of robot tool]
1-3
(1-3) When the center of the hand tool does not correspond to the center of the robot tool
When the center of the hand tool is offset from the center of the robot tool, the working
position of the robot does not correspond to the result position of the vision sensor.

[Figure: camera above the workpiece with an offset hand tool. Labels: Camera, Center of robot tool, Center of hand tool, Workpiece, Result of image recognition]

(1-4) Solutions for the difference between the center of the hand tool
and the result position of the vision sensor

There are two solutions for when the center of the hand tool is offset from the center of the
robot tool.

1. Change the robot tool position to the hand tool position by using the “TOOL” instruction.
(Perform calibration after setting the tool data.)

2. Calculate the relative position between the result position of image recognition and the
teaching position.
At first, teach the operation position of robot, then recognize the workpiece with the camera.
By using both samples of position data, you can then calculate the relative position
between both positions.

ex) PBASE=INV(PBVISION)*PTEACH
    PSAGYO=PVISION*PBASE
    PSAGYO.Z=100
    Mov PSAGYO

1-4
(1-4-1) Change the robot tool position to the hand tool position by using the "TOOL" instruction
Change the tool position of the robot from the center of the J6-axis flange to the center of the
hand tool by using the "TOOL" instruction. The recognition result of the camera is then output
as the robot working position.

[Figure: the tool position is changed from the robot flange to the center of the hand tool by the "Tool" instruction (Tool PTool1). Labels: Camera, Center of robot tool after changing the tool data, Workpiece, Result position of image recognition (PBVISION)]

To perform calibration, adjust the tip of the hand tool to the center of the calibration point after
setting the tool data by using the "TOOL" instruction.

[Figure: for calibration, the tool position is changed from the robot flange to the center of the hand tool by the "Tool" instruction (Tool PTool1)]

1-5
(1-4-2) Calculate the corrective position
1) Teach the robot operation position on the workpiece. Then recognize the taught
workpiece with the camera.
2) Calculate the relative position between the result position of image recognition and the
teaching position as the corrective position.

[Figure: relative position between the result position of image recognition and the teaching position: PBASE=INV(PBVISION)*PTEACH. Labels: Camera, Center of robot tool, Teaching position (PTEACH), Workpiece, Result position of image recognition (PBVISION)]

3) Calculate the operation position of the robot from a relative calculation between the result
position of image recognition and the corrective position.

[Figure: run-time correction. Labels: Result position of image recognition with teaching operation (PBVISION), Result position of image recognition (PVISION), Teaching position (PMGET), Workpiece, Hand. Relative position between the result position of image recognition and the teaching position: PBASE=INV(PBVISION)*PMGET. Operative position: PSAGYO=PVISION*PBASE]
1-6
(2) Hand Camera Correction Method

(2-1) Output position of the vision sensor

The result position from the hand camera is output in the camera coordinate system.

[Figure: top view and side view of the hand camera over the workpiece. Labels: Camera, Workpiece, Center of camera coordinate system, Result of image recognition in the camera coordinate system (PCAMDATA)]

1) Calculate the base operation position of the robot as robot coordinate system data from
the camera result position and the camera mounting position.

[Figure: hand camera recognition. Labels: Result of image recognition in the camera coordinate system (PVSDATA), Center of camera coordinate system, Center of robot tool, Teaching position for image recognition (PMTCVS), Camera setting position (PCAMTL). Calculated workpiece position in the robot coordinate system: PBVISION=PMTCVS*PCAMTL*PVSDATA]

1-7
2) Calculate the relative position between the result position of image recognition and the
teaching position.
3) Calculate the operation position of the robot from a relative calculation between the result
position of image recognition and the base corrective position.

[Figure: hand camera run-time correction. Labels: Result position of image recognition with teaching operation (PBVISION=PMTCVS*PCAMTL*PVSDATA), Result position of image recognition (PVISION), Workpiece, Teaching position (PMGET), Hand. Operative position: PBPK=PVISION*PNFROF (PBPK=PMTCVS*PCAMTL*PVSDATA*PNFROF). Relative position between the result position of image recognition and the teaching position: PNFROF=INV(PBVISION)*PMGET (PNFROF=INV(PVSDATA)*INV(PCAMTL)*INV(PMTCVS)*PMGET)]

1-8
2. Camera setup
2.1. Stationary Camera Correction Method 1 (Upper side)
2.1.1. Program construction
Program name (Description): Explanation
1 (Operation program): Transporting the workpiece.
C (Workpiece coordinate system - robot coordinate system matching program): This program registers the result of workpiece recognition and the taught grab orientation at the pick position.
TL (Tool setting program): This program calculates the tool position of the hand.
WK* (Teaching and adjustment): This program has variables for teaching and adjustment of each model. (WK1 - )
BASE (Base program): This program defines the user-defined external variables.

2.1.2. Setup Procedure

Start of operation

1. Parameter Setting
   Set the user base program (parameter name "PRGUSR", input data "BASE").

2. Robot Tool Setting ("TL" program)  Refer to "2.1.3."
   Set the tool coordinates of the robot.

3. Calibration of Vision Coordinate and Robot Coordinate Systems  Refer to "2.1.4."
   Convert the coordinate system from the camera coordinate system to the robot coordinate system.

4. Vision Sensor Settings  Refer to "2.1.5."
   Create the JOB for the vision sensor.

5. Workpiece Recognition and Teaching ("WK1" and "C" programs)  Refer to "2.1.6."
   Calculate the relationship between the position of a workpiece recognized by the vision sensor and
   the position at which the robot grabs the workpiece.

6. Setting of Adjustment Variables ("1" program)  Refer to "2.1.7."
   Teach the home position and the transportation destination at system start-up.

7. Automatic Operation  Refer to "2.1.8."
   In automatic operation, the robot operates via commands from the vision sensor control.

End of operation

2-9
2.1.3. Robot Tool Setting (“TL” program)
The tool data of the robot is calculated by adjusting the tip of the hand tool to the center of
the calibration point.
The "TL" program calculates the tool data of the robot.

Figure 2-1-3-1: C-axis at 0 degrees    Figure 2-1-3-2: C-axis at 90 degrees

(1) Operation procedure for tool setting


1) Mount a calibration jig on the hand of the robot.
2) Attach the calibration sheet in the robot operation area.
3) Set the controller [MODE] switch to "MANUAL". Then set the T/B [ENABLE] switch
to "ENABLE".
4) Open the robot program "TL" using the T/B.
First, initialize the tool data of the robot by executing the "Tool P_NTOOL"
instruction in program "TL".
5) Point the robot hand downward using the "Aligning the Hand" operation on the T/B.
6) Operate the step execution according to the comment directions in the robot program.
6-1) Move the robot to the position right at the center of the calibration marking by using JOG
operation of the T/B.
6-2) Next, execute step feed to "(3) Only X axis and …..".
The robot moves to the position in which the C axis is rotated by 90 degrees.
6-3) Move the robot to the position right at the center of the calibration marking by using JOG
operation of the T/B for the X axis and Y axis only.
6-4) Execute step feed to the "End".
6-5) Close the program.

2-10
2.1.4. Camera Calibration Setting
Mitsubishi N-Point Calibration creates the
calibration data by pairing points on the screen
(from 2 to 16) with robot coordinates. Adjust the
tip of the robot tool to the center of the
calibration marking.

(1) Operation procedure for calibration setting


1) Mount a calibration jig on the hand of the robot.
2) Set the controller [MODE] switch to "MANUAL".
Then set the T/B [ENABLE] switch to "ENABLE".
3) Start EasyBuilder (In-Sight Explorer) and set the vision sensor to "Offline".
Select [Live Video] in EasyBuilder and display the picture from the vision sensor in
real time. Refer to the "Mitsubishi Robot Tool for EasyBuilder Instruction Manual" for the
operation of the Mitsubishi Robot Tool for EasyBuilder.
4) Attach the calibration sheet within the field of view of the vision sensor while checking the live
image in EasyBuilder.
5) Close [Live Video] in EasyBuilder and drag & drop the "MELFA_Calib.job" file to the View Area of
EasyBuilder.
6) The operation procedure of the Mitsubishi N-Point calibration tool is explained below.
(1) Select [Inspect Part] in the [Application Steps] pane.
(2) Select [MCalib_1] in the [Palette] pane.
(3) Select the [Settings] tab in the [Edit Tool - MCalib_1] pane.
(4) Input the file name if it needs to be changed.
(5) Input the IP address of the robot controller.


2-11
7) Select the target points.
(1) First, select [Point 0] in the [Edit Tool - MCalib_1] pane and move the User-Defined
Point onto the center of the calibration marking in the EasyBuilder view pane.
Then press the [Enter] key.
(2) Repeat the same procedure for Point 1 to Point 3, moving to each calibration marker.
(3) Switch In-Sight Explorer (EasyBuilder view) to "Online".
(4) Select the camera coordinates related to the robot coordinate system.
(5) Select a point in the list, move the robot, and click the "Get position" button; the current
position is set into "World X/World Y".
Getting the current position of the robot is possible only in online mode.
(6) Repeat operations (4) and (5) for the remaining three points.


2-12
(7) To export the calibration data, click the “Export” button after specifying the file name in the
calibration screen. The “Pass: Export Complete” message is displayed in the palette after
the export is completed normally.
The exported file is preserved in the vision sensor. It is necessary to import the calibration
data when making the Job file.
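For reference, the pixel-to-robot mapping that an N-point calibration produces can be sketched as a least-squares affine fit over the point pairs. The point values below are synthetic, and this is an illustration of the idea, not the In-Sight implementation itself:

```python
import numpy as np

# Synthetic pixel <-> robot coordinate pairs (four markers, as in the procedure)
pixels = np.array([[100, 100], [500, 100], [500, 400], [100, 400]], dtype=float)
robot  = np.array([[200.0, 50.0], [280.0, 50.0], [280.0, 110.0], [200.0, 110.0]])

# Solve robot = [px, py, 1] @ A for the 3x2 affine matrix A (least squares)
P = np.hstack([pixels, np.ones((len(pixels), 1))])   # N x 3 design matrix
A, *_ = np.linalg.lstsq(P, robot, rcond=None)        # 3 x 2 solution

def pixel_to_robot(px, py):
    """Map a pixel coordinate to robot coordinates (mm) via the fitted affine."""
    return np.array([px, py, 1.0]) @ A
```

With exact affine data the fit reproduces the marker positions; with noisy real pairs, least squares averages the error over all points, which is why more calibration points improve accuracy.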


2-13
2.1.5. Make JOB program for vision sensor
Train the image of the target model. The vision sensor assigns pixel locations to identified
workpiece features, which are then translated into robot coordinate system data. Please
import the calibration data when you make a new job.

(1) Example of making a Job file

Train the image.

Select "Import" in the Calibration Type pane.

Next, select the calibration file. (Select millimeters as the unit.)
2-14
2.1.6. Workpiece Recognition and Teaching (“C” program and “WK1” program)
First, teach the operation position (pickup or placement position) of the workpiece.
Then recognize the taught workpiece with the vision sensor and register the result data as
the base recognition data.

(1) Operation procedure


1) Set the workpiece in the search area of the vision sensor.
2) Register the workpiece to be recognized by the vision sensor and create a Job program.
3) Set the controller [MODE] switch to "MANUAL". Then set the T/B [ENABLE] switch to "ENABLE".
4) Open the robot program "WK1" using the T/B.
5) Open the [Position data Edit] screen.
6) Display "PGET" in order to teach the pickup position of the workpiece.
7) Move the robot to the pickup position of the workpiece and teach the position.
8) Repeat steps 6) and 7) when teaching the other positions below.
"PPUT" is the place position of the workpiece.
"P_HOME" is the home position of the robot.
9) Specify the vision program to be started.
(1) Open the [Command Edit] screen.
<PROGRAM> WK1
1 '## Ver.A1 ########################
2 '# Data setting Program
3 '# NAME : WK1.prg
4 '# Date of creation/version :
EDIT DELETE 123 INSERT TEACH

(2) Display the command step shown in the following.


<PROGRAM> WK1
34 '##### Set the In-Sight data #####
35 C_PPRG$="Job1" 'Set the
36 C_CCOM$="COM3:" 'COM numb
37 '
EDIT DELETE 123 INSERT TEACH

(3) Change the vision program name entered after "C_PPRG$=" in the program.

10) Specify the communication line to be connected with the vision sensor.

In the same way as in step 9), set, in the variable, the line that the robot controller opens to
connect with the vision sensor.
Change the COM port name entered after "C_CCOM$=" in the program.
Example) "36 C_CCOM$="COM3:" 'COM number of communication line"
11) Using the T/B, close the opened "WK1" program.
12) Open the robot program "C" using the T/B.
13) Enter the model number in the X coordinate of the position variable "PNO" in the program.
14) Using the T/B, close the opened "C" program.
15) Switch In-Sight Explorer (EasyBuilder view) to "Online".
16) Run the "C" program automatically with the robot controller.
17) Confirmation after operation
Check the value of the following variable in program "WK1".
(1) Value of "PCVSD": recognition result of the vision sensor.

2-15
2.1.7. Setting of Adjustment Variables ("1" program)
This chapter explains the operations required to run the "1" program.

(1) Setting of variables

1) Open the robot program "1" using the T/B.
2) Enter the model number in the X coordinate of the position variable "PNO" in the program.
3) Using the T/B, close the opened "1" program.

2.1.8. Automatic Operation


This chapter explains how to prepare the robot before starting the system.

(1) Preparation and Execution


1) Check that there is no interfering object within the robot movement range.
2) Select the "1" program using the operation panel.
3) Run the program from the operation panel of the robot controller.

Note) When your controller has no operation panel, use the dedicated external signals to operate
the robot.

(2) At error occurrence


If the robot moves erroneously, refer to separate manual “Troubleshooting”.

(3) Ending
The robot does not move unless the vision sensor recognizes a workpiece. Stop the flow of
workpieces from upstream and press the [STOP] button on the operation panel of the robot
controller. Confirm that the [STOP] lamp is turned on.

Note) For controllers without an operation panel, stop the robot with the external signal.

2-16
2.2. Hand Camera Correction Method
2.2.1. Program construction
Program name (Description): Explanation
1 (Operation program): Transporting the workpiece.
B (Vision calibration program): Calibration program for the hand camera.
C (Workpiece coordinate system - robot coordinate system matching program): This program registers the result of workpiece recognition and the taught grab orientation at the pick position.
TL (Tool setting program): This program calculates the tool position of the hand.
TLCAM (Tool of hand camera setting program): This program calculates the relative position between the center of the camera coordinate system and the tool position of the robot.
WK* (Teaching and adjustment): This program has variables for teaching and adjustment of each model. (WK1 - )
BASE (Base program): This program defines the user-defined external variables.

2.2.2. Setup Procedure

Start of operation

1. Parameter Setting
   Set the user base program (parameter name "PRGUSR", input data "BASE").

2. Hand Camera Tool Setting ("TLCAM" program)  Refer to "2.2.3."
   Calculate the relative position of the camera from the tool position of the robot to the center of the camera coordinate system.

3. Hand Camera Calibration Setting ("B" program)  Refer to "2.2.4."
   Convert the vision sensor coordinates into the robot tool coordinate system.

4. Vision Sensor Settings  Refer to "2.1.5."
   Create the JOB for the vision sensor.

5. Workpiece Recognition and Teaching ("WK1" and "C" programs)  Refer to "2.2.5."
   Calculate the relationship between the position of a workpiece recognized by the vision sensor and
   the position at which the robot grabs the workpiece.

6. Setting of Adjustment Variables ("1" program)  Refer to "2.1.7."
   Teach the home position and the transportation destination at system start-up.

7. Automatic Operation  Refer to "2.1.8."
   In automatic operation, the robot operates via commands from the vision sensor control.

End of operation

2-17
2.2.3. Hand Camera Tool Setting (“TLCAM” program)
The relative position of the hand camera is calculated by adjusting the center of the field of
view of the hand camera to the center of the calibration point.
The "TLCAM" program calculates the relative position from the tool position of the robot to
the center of the field of view of the hand camera.

Figure 2-2-3-1: C-axis at 0 degrees    Figure 2-2-3-2: C-axis at 90 degrees

(1) Operation procedure


1) Start EasyBuilder (In-Sight Explorer) and set the vision sensor to "Offline".
Select [Set Up Image] in the [Application Steps] pane.
Then select [Continuous] for Trigger in the [Edit Acquisition Settings] pane.

2) Draw the cross lines at the center of the camera's image field of view.
Select [Inspect Part] in the [Application Steps] pane. Then select [User-Defined Line]
in the [Add Tool] pane, and press the [Add] button.

2-18
The starting and ending positions of the two user-defined lines depend on the number of pixels
of the camera.

<Example> In the case of 640 x 480 pixels


Vertical line
Start point : 320, 0
End point : 320, 480

Horizontal line
Start point : 0, 240
End point : 640, 240

<Notice> When dragging and dropping the "MELFA_Calib.job" sample job program into the
EasyBuilder view pane, the green cross lines are shown.
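The start and end points of the two lines follow directly from the sensor resolution; a small helper (hypothetical, for any resolution) reproduces the 640 x 480 example values:

```python
def cross_lines(width, height):
    """Return (vertical, horizontal) center lines as ((x0, y0), (x1, y1))."""
    cx, cy = width // 2, height // 2
    vertical = ((cx, 0), (cx, height))      # full-height line at half the width
    horizontal = ((0, cy), (width, cy))     # full-width line at half the height
    return vertical, horizontal

# For a 640 x 480 sensor this gives the endpoints listed in the example above.
v, h = cross_lines(640, 480)
```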

3) Set the vision sensor to “Online”. Click the online icon.


Set the camera to start the “continuous” trigger.

4) Set the controller [MODE] switch to “MANUAL”. Then set the T/B [ENABLE] switch to”ENABLE”.
5) Open the robot program “TLCAM” using T/B.
6) The posture of the robot hand is directed downward by operation of “Aligning the Hand” on T/B.
7) Attach the Camera Tool Adjustment sheet (Drawn cross line) into the robot operation area.
8) Operate the step execution according to the comment directions in the robot program.
8-1) Move the robot to the position at the center of camera cross lines and adjust to the sheet by
JOG operation using the T/B.
8-2) Execute step feed to the next comment, “(3) Only X axis and …”.
The robot moves to the position in which the C axis is rotated by 90 degrees.

8-3) Move the robot to the position at the center of the camera cross lines and adjust it to the sheet by
JOG operation using the T/B.
8-4) Execute step feed to the next comment, “(4) Only the C axis is …”.

8-5) The Tool data of the robot is changed to the center of the camera from the center of the robot
flange.

*Confirm that both centers do not shift even if the robot is operated using C axis JOG operation.

8-6) Execute step feed to “End”.


The Tool data of the robot is returned to the center of the robot flange.
8-7) Close the program.

2.2.4. Hand Camera Calibration Setting (“B” program)
The calibration of the hand camera is performed by moving the robot from the aligned position
of the camera center and the adjustment mark to each of four positions, one in each direction.
The moving distance of the robot is set to less than half of the camera's field of view.

(1) Start position (2) First point (3) Second point (4) Third point (5) Fourth point

(1) Operation procedure


1) Set the controller [MODE] switch to “MANUAL”. Then set the T/B [ENABLE] switch to “ENABLE”.
2) Start the EasyBuilder (In-Sight Explorer) and set the vision sensor to “Offline”.
3) Drag and drop the “MELFA_CalibH.job” file to the view area of EasyBuilder.
4) Change the trigger mode to [Continuous] mode. (Refer “2.2.3.” (1), 1) )
5) Set the vision sensor to “Online”. Click the online icon. (Refer “2.2.3.” (1), 3) )
Set the camera to start the “continuous” trigger.
6) Open the robot program “B” using T/B.
7) The posture of the robot hand is directed downward by operation of “Aligning the Hand” on T/B.
8) Attach the Hand Camera Calibration sheet (Drawn cross line) into the robot operation area.
9) Operate the step execution according to the comment directions in the robot program.
9-1) Move the robot to the position at the center of the camera cross lines and adjust it to the sheet by
JOG operation using the T/B.
9-2) Execute step feed to the next comment, “(3) Calibration of Hand Camera…”.
9-3) The operation procedure of the Mitsubishi N-Point calibration tool is explained below.
(1) Select the [Inspect Part] in [Application Steps] pane.
(2) Select the [MCalib_1] in [Palette] pane.
(3) Select the [Settings] Tab in [Edit Tool – Mcalib_1] pane.
(4) Input the file name if it needs to be changed.

(5) Input the IP address of the robot controller.

(Screenshot: the [Edit Tool – Mcalib_1] pane at teaching position P0, with items (1)–(5) indicated)

(6) Mitsubishi N-Point Calibration


Mitsubishi N-Point Calibration in this sample creates the calibration data by using four points,
each pairing a screen position with robot coordinates.
(a) Click a data input point line (“Point0” to “Point3”) in the “Edit Tool – Mcalib_1” pane.
Then enter the moving distance of the robot, with the sign reversed, into the “World X/Y”
input area.
(b) Execute step feed to “13 Mov P0*(+10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '1st point” in
robot program “B”.
The robot moves to the first position for calibration setting.
Then press the “Trigger” button. The camera image is displayed.
(c) Click the “Point0” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration sheet. Then press the “Enter” key.

(d) Execute step feed to “14 Mov P0*(-10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '2nd point” in
robot program “B”.
The robot moves to the second position for calibration setting.
Then press the “Trigger” button. The camera image is displayed.
(e) Click the “Point1” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration sheet. Then press the “Enter” key.


(f) Execute step feed to “15 Mov P0*(+0.00,+10.00,+0.00,+0.00,+0.00,+0.00) '3rd point” in


robot program “B”.
The robot moves to the third position for calibration setting.
Then press the “Trigger” button. The camera image is displayed.
(g) Click the “Point2” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration sheet. Then press the “Enter” key.


(h) Execute step feed to “16 Mov P0*(+0.00,-10.00,+0.00,+0.00,+0.00,+0.00) '4th point” in


robot program “B”.
The robot moves to the fourth position for calibration setting.
Then press the “Trigger” button. The camera image is displayed.
(i) Click the “Point3” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration sheet. Then press the “Enter” key.

(j) To export the calibration data, click the “Export” button in the calibration screen.
The information is displayed in the palette pane after the export is completed normally.
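The exported calibration maps image pixels to robot-world millimetres from the four collected pairs. A sketch of the kind of least-squares affine fit an N-point calibration performs (the internals of the MCalib tool itself are not documented here, and axis orientation may differ on a real setup):

```python
import numpy as np

def fit_affine(image_pts, world_pts):
    """Least-squares affine map  world = A @ image + b  from the paired
    image/world points collected by the N-Point procedure."""
    img = np.asarray(image_pts, float)
    wld = np.asarray(world_pts, float)
    # Design matrix [u, v, 1] for each image point
    M = np.hstack([img, np.ones((len(img), 1))])
    coef, *_ = np.linalg.lstsq(M, wld, rcond=None)
    A, b = coef[:2].T, coef[2]
    return A, b

def image_to_world(pt, A, b):
    """Apply the fitted calibration to one image point."""
    return A @ np.asarray(pt, float) + b
```

With synthetic pairs at 0.1 mm per pixel around an image center of (320, 240), the fit recovers the mapping exactly, since the data is exactly affine.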


2.2.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)
First, teach the operation position (pickup position or placement position) of the workpiece.
Then, recognize the taught workpiece with the vision sensor and register the result data as
the base recognition data.

(1) Operation procedure


1) Set the workpiece into the search area of vision sensor.
2) Register workpiece to be recognized by a vision sensor and create a Job program.
3) Set the controller [MODE] switch to “MANUAL”. Then set the T/B [ENABLE] switch to “ENABLE”.
4) Open the robot program “WK1” using T/B.
5) Open the [Position data Edit] screen.
6) Display “PGET” in order to teach the pickup position of workpiece.
7) Move the robot to the pickup position of workpiece and teach it the position.
8) Repeat 6) – 7) and teach the other positions below.
“PPUT” is the placement position of the workpiece.
“P_HOME” is the Home position of the robot.
“PCVS*” is the workpiece recognition position. (* = number of the search position)

< Notes when teaching the workpiece recognition position “PCVS*” >
The focus at the camera tool adjustment position has been adjusted to the upper surface of the
operation table. If a workpiece is placed in the camera recognition area, the focus is no longer
appropriate at the upper surface of the workpiece. Therefore, it is necessary to raise the camera
position when recognizing the workpiece. Keep the camera working distance between the
recognition position of the robot and the upper surface of the workpiece consistent.
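The rule in the note above amounts to simple arithmetic: raise the taught recognition height by the workpiece height so that the lens-to-surface distance is unchanged. A trivial sketch with hypothetical values:

```python
def raised_recognition_z(table_focus_z, workpiece_height):
    """Z coordinate for PCVS* that keeps the camera working distance
    to the workpiece's top surface equal to the distance used at the
    camera tool adjustment (focused on the table surface)."""
    return table_focus_z + workpiece_height
```

For example, if the focus was taught at Z = 100 mm over the table, a 25 mm tall workpiece needs a recognition height of Z = 125 mm.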

(Figure: if the workpiece is placed under the camera at the taught height PCVS*, the focus is not
appropriate; raise the camera position so that the camera working distance to the workpiece
surface stays the same.)
9) Specify a vision program to be started.


(1) Open the [Command Edit] screen.
<PROGRAM> WK1
1 '## Ver.A1 ########################
2 '# Data setting Program
3 '# NAME : WK1.prg
4 '# Date of creation/version :
EDIT DELETE 123 INSERT TEACH

(2) Display the command step shown in the following.


<PROGRAM> WK1
36 '##### Set the In-Sight data #####
37 C_PPRG$="Job1" 'Set the
38 C_CCOM$="COM3:" 'COM nuum
39 '
EDIT DELETE 123 INSERT TEACH

(3) Change the vision program name entered after “C_PPRG$=” in the program.

10) Specify a communication line to be connected with vision sensor.


In the same way as in step 9), assign to the variable the communication line opened on the
robot controller for connecting the vision sensor.
Change the COM Port name entered after “C_CCOM$=” in the program.
Example) “38 C_CCOM$="COM3:" 'COM number of communication line”
11) Using T/B, close the opened “WK1” program.
12) Open the robot program “C” using T/B.
13) Enter the model number in the X coordinate of the position variable “PNO” in the program.
14) Using T/B, close the opened “C” program.
15) Switch the In-Sight Explorer (EasyBuilder View) to “Online”.
16) Run the “C” program automatically with the robot controller.
17) Confirmation after operation
Check the value of the following variable in program “WK1”.
(1) Value of “PCVSD” : Recognized result of Vision sensor.

2.3. Stationary Camera Correction Method 2 (Lower side)
2.3.1. Program construction

Program name | Description | Explanation
1    | Operation program          | Transports the workpiece.
B    | Vision calibration program | Calibration program for the hand camera.
C    | Workpiece coordinate system – robot coordinate system matching program | Registers the result of workpiece recognition and the taught grab orientation at the pick position.
TL   | Tool setting program       | Calculates the tool position of the hand.
WK*  | Teaching and adjustment    | Holds the variables for teaching and adjustment of each model. (WK1 - )
BASE | Base program               | Defines the user-defined external variables.

2.3.2. Setup Procedure


Start of operation

1. Parameter Setting
Set the user base program. (Parameter name “PRGUSR”, input data “BASE”)

2. Robot Tool Setting ( “TL” program) Refer to “2.3.3.”


Calculate the tool data of the robot from the center of the flange to the top of the calibration jig.

3. Camera Calibration Setting ( “B” program) Refer to “2.3.4.”


The vision sensor coordinates are converted into the robot coordinate system.

4. Vision Sensor Settings Refer to “2.1.5.”


Create the job of the vision sensor.

5. Workpiece recognition and Teaching ( “WK1” and “C” program) Refer to “2.3.5.”
Calculate the relationship between the position of a workpiece recognized by the vision sensor and
the position at which the robot grabs the workpiece.

6. Setting of Adjustment Variables ( “1*”program) Refer to “2.3.6.”


Teach the Home position and transportation destination at system start-up.

7. Automatic Operation Refer to “2.3.7.”


In automatic operation, the robot operates via commands from the vision sensor control.

End of operation

2.3.3. Robot Tool Setting (“TL” program)
The tool data of the robot is calculated by aligning the top of the hand tool with the center of the
camera's field of view. Move the robot to the height at which the focus is suitable.
The tool position of the robot hand is calculated by aligning the center of the camera's field of
view with the top of the robot tool.
The “TL” program calculates the tool data of the robot.
The tool data of the robot is calculated from two teaching points, as shown in the following
figures.

Figure 2-3-3-1: C axis at 0 degrees    Figure 2-3-3-2: C axis at 90 degrees

(1) Operation procedure


1) Mount a calibration jig on the hand of the robot.
2) Set the controller [MODE] switch to “MANUAL”.
Then set the T/B [ENABLE] switch to “ENABLE”.
3) Open the robot program “TL” using the T/B.
First, it is necessary to initialize the tool data of the robot by executing the “Tool P_NTOOL”
instruction in program “TL”.
4) The posture of the robot hand is directed downward by operation of “Aligning the Hand” on T/B.
5) Start the EasyBuilder (In-Sight Explorer) and set the vision sensor to “Offline”.
Select the [Set Up Image] in [Application Steps] pane.
Then select the [Continuous] for Trigger in [Edit Acquisition Settings] pane.
6) Drag and drop the ”MELFA_CalibH.job“ sample job program onto the EasyBuilder view pane.
The green cross lines are shown.
7) Set the vision sensor to “Online”. Click the online icon.
Set the camera to start the “continuous” trigger.

8) Operate the step execution according to the comment directions in the robot program.
8-1) Move the robot to the position that aligns the center of the camera cross lines with the top of the
calibration jig of the robot by JOG operation using the T/B.

(Screenshot: “MELFA_CalibH.job” dropped onto the EasyBuilder view pane; teaching position P0)

8-2) Execute step feed to the next comment, “(3) Only X axis and …”.
The robot moves to the position in which the C axis is rotated by 90 degrees.
8-3) Move the robot to the position that aligns the center of the camera cross lines with the
top of the calibration jig of the robot by JOG operation using the T/B.
8-4) Execute step feed to the next comment, “(4) Only C axis is …”.

P90 (Teaching position)

8-5) The tool data of the robot is changed from the center of the robot flange to the top of the
calibration jig.

*Confirm that both centers do not shift even if the robot is operated by using the C axis JOG
operation.

8-6) Execute step feed to the “End”.


8-7) Close the program.

2.3.4. Camera Calibration Setting (“B” program)
In this method, the camera is placed on the table
surface and directed upward. Therefore, when setting
the calibration, it is necessary to specify four points
in space.
The calibration setting is executed at each of the
four points while moving the robot within the
camera's field of view. Move the robot to the height
at which the top of the calibration jig is in focus.

(1) Operation procedure


1) Mount a calibration jig on the hand of the robot.
2) Set the controller [MODE] switch to “MANUAL”.
Then set the T/B [ENABLE] switch to “ENABLE”.
3) Start the EasyBuilder (In-Sight Explorer) and set the
vision sensor to “Offline”.
4) Drag&drop the “MELFA_CalibH.job” file to View Area
of EasyBuilder.
5) Change the trigger mode to [Continuous] mode. (Refer “2.2.3.” (1), 1) )
6) Set the vision sensor to “Online”. Click the online icon. (Refer “2.2.3.” (1), 3) )
Set the camera to start the “continuous” trigger.
7) Open the robot program “B” using T/B.
8) The posture of the robot hand is directed downward by operation of “Aligning the Hand” on T/B.
9) Operate the step execution according to the comment directions in the robot program.
9-1) Move the robot to the position that aligns the center of the camera cross lines with the
top of the calibration jig of the robot by JOG operation using the T/B.
9-2) Execute step feed to the next comment, “(3) Calibration of Hand Camera…”.
9-3) The operation procedure of the Mitsubishi N-Point calibration tool is explained below.
(1) Select the [Inspect Part] in [Application Steps] pane.
(2) Select the [MCalib_1] in [Palette] pane.
(3) Select the [Settings] Tab in [Edit Tool – Mcalib_1] pane.
(4) Input the file name if it needs to be changed.
(5) Input the IP address of the robot controller.

(Screenshot: the [Edit Tool – Mcalib_1] pane at teaching position P0, with items (1)–(5) indicated)

(6) Mitsubishi N-Point Calibration
Mitsubishi N-Point Calibration in this sample creates the calibration data by using four points,
each pairing a screen position with robot coordinates.
(a) Execute step feed to “15 Mov P0*(+10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '1st point” in
robot program “B”. The robot moves to the first calibration point.
(b) Click the “Point0” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration jig. Then press the “Enter” key.
(c) Press the “Get position” button. The current position of the robot is entered into the “World X/Y” input area.


(d) Execute step feed to “16 Mov P0*(-10.00,+0.00,+0.00,+0.00,+0.00,+0.00) '2nd point” in


robot program “B”. The robot moves to the second calibration point.
(e) Click the “Point1” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration jig. Then press the “Enter” key.
(f) Press the “Get position” button. The current position of the robot is entered into the “World X/Y” input area.


(g) Execute step feed to “17 Mov P0*(+0.00,+10.00,+0.00,+0.00,+0.00,+0.00) '3rd point” in


robot program “B”. The robot moves to the third calibration point.
(h) Click the “Point2” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration jig. Then press the “Enter” key.
(i) Press the “Get position” button. The current position of the robot is entered into the “World X/Y” input area.


(j) Execute step feed to “18 Mov P0*(+0.00,-10.00,+0.00,+0.00,+0.00,+0.00) '4th point” in
robot program “B”. The robot moves to the fourth calibration point.
(k) Click the “Point3” line in the “Edit Tool – Mcalib_1” pane and move the “User-Defined Point” to
the center of the calibration jig. Then press the “Enter” key.
(l) Press the “Get position” button. The current position of the robot is entered into the “World X/Y” input area.


(m) To export the calibration data, click the “Export” button in the calibration screen.
The information is displayed in the palette pane after the export is completed normally.


2.3.5. Workpiece Recognition and Teaching (“C” program and “WK1” program)
This method is used to correct the positional deviation of the workpiece that the robot is holding.
First, teach the pickup position and the placement position of the workpiece.
Then, the workpiece that the robot is holding is recognized with the fixed camera and the result is
registered.

(1) Operation procedure


1) Set the controller [MODE] switch to “MANUAL”. Then set the T/B [ENABLE] switch to “ENABLE”.
2) Open the robot program “WK1” using T/B.
3) Open the [Position data Edit] screen.
4) Display “PGET” in order to teach the pickup position of workpiece.
5) Move the robot to the pickup position of workpiece and teach it the position.
6) Repeat 4) – 5) and teach the other positions below.
“PPUT” is the placement position of the workpiece.
“P_HOME” is the Home position of the robot.
“PCVS” is the workpiece recognition position.
7) Set the workpiece into the placement position.
8) Specify a vision program to be started.
(1) Open the [Command Edit] screen.
<PROGRAM> WK1
1 '## Ver.A1 ########################
2 '# Data setting Program
3 '# NAME : WK1.prg
4 '# Date of creation/version :
EDIT DELETE 123 INSERT TEACH

(2) Display the command step shown in the following.


<PROGRAM> WK1
33 '##### Set the In-Sight data #####
34 C_PPRG$="Job1" 'Set the
35 C_CCOM$="COM3:" 'COM nuum
36 '
EDIT DELETE 123 INSERT TEACH

(3) Change the vision program name entered after “C_PPRG$=” in the program.

9) Specify a communication line to be connected with vision sensor.


In the same way as in step 8), assign to the variable the communication line opened on the
robot controller for connecting the vision sensor.
Change the COM Port name entered after “C_CCOM$=” in the program.
Example) “35 C_CCOM$="COM3:" 'COM number of communication line”
10) Using T/B, close the opened “WK1” program.
11) Open the robot program “C” using T/B.
12) Enter the model number in the X coordinate of the position variable “PNO” in the program.
13) Using T/B, close the opened “C” program.
14) Switch the In-Sight Explorer (EasyBuilder View) to “Online”.
15) Run the “C” program automatically with the robot controller.
16) Confirmation after operation
Check the value of the following variable in program “WK1”.
(1) Value of “PCVSD” : Recognized result of Vision sensor.

2.3.6. Calculation procedure of correction value of holding workpiece
The figure below shows the procedure for correcting the deviation of the workpiece that the
robot is holding.

(Figure: 1. Gripping the workpiece (a notch is shown on the workpiece) → 2. Recognizing the
workpiece → 3. Moving the workpiece to the corrected position)

<STEP 1>

Calculate the relative position (PBFLNG) from the position of the recognition result of the
vision sensor to the position of vision recognition.

PBFLNG = Inv(PTCVSDB) * P_TCVS

P_TCVS : The position of vision recognition. (Teaching position)
PTCVSDB : Recognition result of the vision sensor, registered as base data
when the placement position of the workpiece is taught.
Inv(PTCVSDB) : Inverse matrix of PTCVSDB.
PBFLNG : The amount of relative movement from the inverse of the recognition
result of the vision sensor (Inv(PTCVSDB)) to the position of vision
recognition (P_TCVS).

<STEP 2>

From the result of vision recognition, calculate the robot position (PVHND), assuming the
workpiece is gripped at the correct position in the robot hand.

PVHND = PHVSDATA2 * PBFLNG

PVHND : The robot position obtained by applying the relative movement PBFLNG
(the grip offset determined at vision recognition) to the result position
of vision recognition (PHVSDATA2).
PHVSDATA2 : The result of vision recognition. (Position when the workpiece shifts)
PBFLNG : The amount of relative movement from the inverse of the recognition
result of the vision sensor to the position of vision recognition.
This variable is calculated in STEP 1.

<STEP 3>

In this step, calculate the amount of correction (PHOSEI).

PHOSEI = Inv(PVHND) * P_TCVS

The correct placement position of the workpiece is obtained by a relative calculation from the
taught placement position (PPUT) with the amount of correction (PHOSEI).

ex) PPUT2=PPUT*PHOSEI
    Mov PPUT2

P_TCVS : The position of vision recognition. (Teaching position)
PHVSDATA2 : The result of vision recognition.
PBFLNG : The amount of relative movement from the inverse of the recognition
result of the vision sensor (Inv(PTCVSDB)) to the position of vision
recognition (P_TCVS).
PTCVSDB : Recognition result of the vision sensor, registered as base data
when the placement position of the workpiece is taught.
PHOSEI : The amount of relative movement for correcting the destination position.
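The three steps can be sketched with 2-D homogeneous transforms, with `@` standing in for the MELFA relative-position operator `*` and `np.linalg.inv` for `Inv()`. This is a simplified X/Y/C-only illustration with hypothetical values, not the controller's actual position arithmetic:

```python
import numpy as np

def pose(x, y, c):
    """Homogeneous 2-D transform for position (x, y) and C-axis angle c
    in radians -- a simplified stand-in for a MELFA position variable."""
    return np.array([[np.cos(c), -np.sin(c), x],
                     [np.sin(c),  np.cos(c), y],
                     [0.0,        0.0,       1.0]])

def correction(P_TCVS, PTCVSDB, PHVSDATA2):
    """Return PHOSEI following STEP 1 to STEP 3 above."""
    inv = np.linalg.inv
    PBFLNG = inv(PTCVSDB) @ P_TCVS    # STEP 1: grip offset at teaching
    PVHND = PHVSDATA2 @ PBFLNG        # STEP 2: pose for a correct grip
    return inv(PVHND) @ P_TCVS        # STEP 3: amount of correction

# Hypothetical teaching-time data and a run-time recognition result.
P_TCVS = pose(400.0, 100.0, 0.0)      # taught recognition position
PTCVSDB = pose(402.0, 101.0, 0.05)    # base vision result at teaching
PHVSDATA2 = pose(405.0, 99.0, 0.10)   # vision result for a shifted grip

PHOSEI = correction(P_TCVS, PTCVSDB, PHVSDATA2)
PPUT = pose(300.0, 200.0, 0.0)        # taught placement position
PPUT2 = PPUT @ PHOSEI                 # corrected placement ("Mov PPUT2")
```

When the run-time result equals the base result (PHVSDATA2 = PTCVSDB), PHOSEI reduces to the identity transform and PPUT2 equals PPUT, i.e. no correction is applied.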
