Vision-Based Assembly and Inspection System for Golf Club
Heads: VAIS-GCH
Submitted to: 2nd International Conference on Advanced Design
and Manufacturing Engineering (ADME 2012)
Chih-chung Lin1,a, Chen Cheng-yen2, Che-Han Su3, John Y. Chiang4,b, Chengqi Chen5,c
1,2 Intelligent Systems Group, Metal Industries Research & Development Centre, Taiwan
3,4 Department of Computer Science and Engineering, National Sun Yat-Sen University, Taiwan
5 Office of International Exchange and Cooperation, Zhejiang Shuren University, China
a[email protected], b[email protected], c[email protected]
Keywords: vision-based, golf club heads, 3D space correspondence, manufacturing process.
Abstract. This study proposes a vision-based assembly and inspection system for golf club heads
(VAIS-GCH) to mitigate the time- and labor-consuming golf club head production process.
Cameras capture images of the striking plate and casting body for visual processing. Binarization,
boundary detection, feature extraction, and object range and center identification are performed to
determine the spatial orientation of the target. This serves as the basis for deriving the barycenter shift (Δx, Δy) and shifting angle θ of the striking plate, the barycenter shift (Δx, Δy) and shifting angle θ of the casting body, the XY plane shifting angles (θx, θy), and the insertion depth d.
The mechanical arm is then directed to grasp the yet-to-be-assembled objects and adjust them to the canonical orientation according to the measured striking plate barycenter shift and shifting angle.
Three-dimensional (3D) rotation is then performed on the striking plate according to the shifting angle of the casting body and the XY plane shifting angle. Following this, the striking plate is
moved to the casting body barycenter according to the barycenter shift of the casting body, thereby
facilitating the completion of the coupling and welding processes of the striking plate and casting
body. Testing is then performed on the loft angle φ of the striking plate and casting body after
coupling, thus increasing the detection efficiency and accuracy of the golf club head assembly.
1. Introduction
In industry, robotic arms are often used for automated processes. For example, robotic arms for
welding are extremely common in the automobile industry. Feature identification is performed in
the welding process after image capture. Image features are used to calculate corresponding 3D
space coordinates and angles of rotation for fast, dynamic planning of smooth paths. This allows
mechanical arms to reach the correct position accurately [1-5]. However, this process is complicated by the fact that most metal parts are curved 3D surfaces, which are substantially more complex than simple planar objects, making automated integration, welding, and assembly difficult. Visualized 3D space correspondence and coupling of objects has therefore become a key technology that 3D industries must develop.
This study presents a vision-based assembly and inspection system for golf club heads
(VAIS-GCH) to resolve these deficiencies. This system adopts automated mechanical arms matched
with cameras to capture images and image processing technology for coupling object orientation
correction, 3D space rotation, object coupling or correspondence, and coupling and loft angle
testing. This system can completely automate the golf club head manufacturing process.
2. VAIS-GCH System Processing Procedure
The VAIS-GCH system can be divided into three stages: striking plate suction, striking plate and casting body coupling, and loft angle detection. When the mechanical arm begins the
striking plate suction stage, it moves to the top of the striking plate to capture images. The convex
hull is used to calculate the minimum circumscribed rectangle of the striking plate to derive the
striking plate shifting angle θ, barycenter position, and barycenter shift (Δx, Δy). These are used as
a basis to control the suction parts of the mechanical arm in suctioning the striking plate and
correcting it to the canonical orientation. Entering the casting body coupling stage, the mechanical
arm moves to the top of the casting body to capture images. After the boundaries, barycenter
position, and barycenter shift (Δx, Δy) of the casting body are calculated, the derived casting body
boundaries and stored template database data are used for further comparison to obtain the shifting
angle θ and the XY plane shifting angles θx and θy. The mechanical arm then adjusts the striking
plate 3D space loft and angle of elevation according to the placement status of the casting body in
actual space. The striking plate barycenter is moved to the casting body barycenter, and images of
the casting body obtained from the sides undergo binarization and boundary separation. The vertical
gap between the casting body coupling opening and the striking plate is used to calculate coupling
depth d. Coupling between the striking plate and the casting body is performed using the preset
coupling depth. Finally, in the loft angle testing stage, the casting body images captured from the
side are binarized, and boundary separation is performed to derive the horizon of the casting body
bottom and the centerline of the striking plate. The loft angle φ, formed by the vertical horizon and
centerline vectors, is then tested to confirm whether specifications are satisfied.
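The three stages above can be sketched as a high-level driver. This is a hypothetical illustration only: the function and parameter names are invented for clarity, and each stage is passed in as a callable standing in for the processing described in Sections 2.1-2.3.

```python
def assemble_club_head(suction, coupling, loft_check):
    """Hypothetical driver for the three VAIS-GCH stages; each argument
    is a callable implementing one stage (names are illustrative)."""
    plate = suction()          # Stage 1: suction and canonical-orientation correction
    head = coupling(plate)     # Stage 2: 3D rotation, alignment, coupling depth d
    return loft_check(head)    # Stage 3: is loft angle phi within specification?
```

Each stage consumes the output of the previous one, mirroring the sequential pipeline described above.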
2.1 Striking Plate Suction Stage
Correcting the striking plate to the canonical orientation requires calculating the shifting angle θ
and barycenter position shift (Δx, Δy). To determine the striking plate placement angle shift, a
minimum circumscribed rectangle surrounding the striking plate must first be employed. The
minimum circumscribed rectangle and image centerline are used to derive the striking plate shifting
angle θ. The minimum circumscribed rectangle can be determined from the property that one of its sides must be collinear with an edge of the polygon formed by the convex hull. The recorded convex hull vertices are then used to compute the minimum circumscribed rectangle, which defines the striking plate range.
The captured image center point and the striking plate barycenter position are used to derive
the barycenter shift (Δx, Δy). The minimum circumscribed rectangle is used to obtain the shifting
angle θ. During the coupling steps, when the mechanical arm picks up the striking plate from the conveyor belt, the suction position of the suction parts is initially set to the image center point; the barycenter shift (Δx, Δy) must therefore be translated so that the suction position corresponds to the striking plate barycenter position. Suction is then performed. After suction, the striking plate is rotated back through the shifting angle θ to correct it to the canonical orientation.
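The computations in this stage can be sketched as follows. This is a minimal numpy-only illustration, not the system's actual implementation: `min_rect_angle` assumes the convex hull vertices are supplied in order (in practice they would come from a hull routine), and the barycenter is approximated here by the mean of the hull vertices.

```python
import numpy as np

def min_rect_angle(hull):
    """Shifting angle (degrees) of the minimum circumscribed rectangle.
    Uses the property that one side of the minimum-area rectangle is
    collinear with a convex hull edge, so every hull edge is tested."""
    pts = np.asarray(hull, dtype=float)
    best_angle, best_area = 0.0, np.inf
    for i in range(len(pts)):
        dx, dy = pts[(i + 1) % len(pts)] - pts[i]
        theta = np.arctan2(dy, dx)
        c, s = np.cos(-theta), np.sin(-theta)
        rot = pts @ np.array([[c, -s], [s, c]]).T  # align edge with x-axis
        area = np.ptp(rot[:, 0]) * np.ptp(rot[:, 1])
        if area < best_area:
            best_area, best_angle = area, np.degrees(theta) % 90.0
    return best_angle

def barycenter_shift(hull, image_center):
    """Barycenter shift (dx, dy) between the plate and the image center
    (barycenter approximated by the mean of the hull vertices)."""
    return np.asarray(hull, float).mean(axis=0) - np.asarray(image_center, float)
```

For example, a rectangular plate rotated by 20 degrees yields a shifting angle of 20 degrees; the arm would translate by (Δx, Δy), suction, and rotate back through θ.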
2.2 Striking Plate and Casting Body Coupling Stage
When the casting body is placed on the fixture, it can easily exhibit a shifting angle θ and
XY plane shifting angles θx and θy. The casting body images captured from the top and the casting
body with canonical orientation from the database are compared to derive the casting body shifting
angle θ and the XY plane shifting angles θx and θy. Spatial 3D shift adjustment is performed along
(θ, θx, and θy) using global search methods for the casting body with canonical orientation from the
database. This is then projected to the XY plane. The similarity of the coupling opening boundary
after projection and the coupling opening boundaries in the casting body image captured by the
cameras is compared. The shifting angle θ and the XY plane shifting angles θx and θy are thus
derived. Let the casting body with canonical orientation from the database be T. Each 3D space point of T has xyz coordinates, and all xyz information is influenced by the casting body shifting angle θ and the XY plane shifting angles θx and θy. Thus, the casting body can be expressed as T(x(θ, θx, θy), y(θ, θx, θy), z(θ, θx, θy)), and its projection onto the XY plane is A0(x(θ, θx, θy), y(θ, θx, θy)). Let the casting body image captured by the top camera be At(x, y). Three-dimensional rotation is performed on T over candidate angles; the (θmin, θxmin, θymin) yielding the minimum D value in Eq. (1) gives the casting body shifting angle θ and the XY plane shifting angles θx and θy.

D = Σ | At(x, y) − A0(x(θ, θx, θy), y(θ, θx, θy)) |    (1)
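The global search over (θ, θx, θy) can be sketched as follows. This is a simplified numpy illustration under two stated assumptions: the template and captured image are represented as corresponding point sets (the actual system compares coupling-opening boundary images), and rotations are composed as Rz·Ry·Rx, since the paper does not specify the rotation order.

```python
import numpy as np
from itertools import product

def project_xy(template, theta, theta_x, theta_y):
    """Rotate the 3D template T by (theta about z, theta_x about x,
    theta_y about y), in degrees, then orthographically project onto
    the XY plane (this yields A0)."""
    a, b, g = np.radians([theta_x, theta_y, theta])
    Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(g), -np.sin(g), 0], [np.sin(g), np.cos(g), 0], [0, 0, 1]])
    return (np.asarray(template, float) @ (Rz @ Ry @ Rx).T)[:, :2]

def global_search(template, A_t, candidates):
    """Exhaustive search over (theta, theta_x, theta_y): minimize
    D = sum |A_t - A0|, a discrete analogue of Eq. (1)."""
    best, best_D = None, np.inf
    for angles in product(candidates, repeat=3):
        D = np.abs(A_t - project_xy(template, *angles)).sum()
        if D < best_D:
            best, best_D = angles, D
    return best, best_D
```

The cubic growth of the candidate grid is why the conclusion notes the computational cost of this global search.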
To seal the striking plate and casting body coupling openings completely, the coupling depth d
must be calculated. Otherwise, the striking plate coupling may be too shallow or too deep. Using
the casting body coupling opening boundary groove from the side image, whether the striking plate
coupling depth is correct can be derived. When coupling is too shallow, the striking plate
boundaries exceed the coupling opening. By contrast, when coupling is too deep, the plate
boundary cannot be seen from the coupling opening. To calculate the coupling depth d, the vertical
distance between the casting body plane and the striking plate plane must be calculated. Because
the coupling opening welding beam boundary exhibits protuberance, the brightness of the coupling
opening boundary under illumination is higher than that of the other sections. Thus, binarization
processing can be performed to distinguish the casting body and striking plate coupling openings.
Boundary identification on the binarized image can then be matched to the coupling opening boundary section. The vertical distance between the casting body coupling opening groove and the striking plate is the coupling depth.
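The depth measurement can be sketched as follows; a simplified illustration under the assumption that, after binarization, the striking plate edge and the casting body coupling-opening groove each appear as a distinct bright horizontal band in the side image (the function name and threshold are illustrative).

```python
import numpy as np

def coupling_depth(side_image, threshold=128):
    """Binarize the side image and locate the bright horizontal bands
    produced by the protruding boundaries under illumination; the
    vertical gap (in rows) between the first two bands is depth d."""
    bright_rows = np.where((np.asarray(side_image) > threshold).any(axis=1))[0]
    # Split consecutive row indices into separate bright bands.
    bands = np.split(bright_rows, np.where(np.diff(bright_rows) > 1)[0] + 1)
    if len(bands) < 2:
        raise ValueError("expected two boundary bands in the side image")
    return int(bands[1][0] - bands[0][-1])
```

A too-shallow or too-deep coupling would show up here as a gap larger or smaller than the preset depth.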
Three-dimensional spatial rotation is performed on the striking plate by the mechanical arm
using the casting body shifting angle θ and the XY plane shifting angles θx and θy, obtained by
following the steps described above. The striking plate barycenter is moved to the casting body
center position, allowing the striking plate and casting body coupling openings to overlap entirely.
The mechanical arm moves down distance d to seal the striking plate and casting body entirely.
2.3 Loft Angle Detection Stage
After completing coupling of the striking plate and casting body, whether the striking plate loft
angle φ conforms to the predetermined specifications must be detected. To calculate the loft angle
φ, the body bottom horizon and striking plate centerline must first be derived.
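The loft angle computation can be sketched as follows; a minimal illustration assuming the horizon and centerline have each already been fitted as 2D direction vectors from the side image (the function names are illustrative).

```python
import numpy as np

def loft_angle(horizon, centerline):
    """Loft angle phi: angle between the striking plate centerline and
    the vertical erected on the body-bottom horizon (Sec. 2.3)."""
    hx, hy = horizon
    vertical = np.array([-hy, hx], dtype=float)   # perpendicular to horizon
    c = np.asarray(centerline, dtype=float)
    cos_phi = abs(vertical @ c) / (np.linalg.norm(vertical) * np.linalg.norm(c))
    return np.degrees(np.arccos(np.clip(cos_phi, 0.0, 1.0)))
```

The measured φ would then be compared against the predetermined specification tolerance.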
3. Conclusion and Future Prospects
VAIS-GCH provides spatial coupling and detection for 3D objects. Barycenter shift (Δx, Δy) and
the shifting angle θ are used to suction the striking plate and correct it to the canonical orientation.
Rotation in 3D space is performed according to the casting body shifting angle θ and XY plane
shifting angles θx and θy. The striking plate is then moved by the casting body barycenter shift (Δx, Δy) to align it with the casting body coupling opening. After accurately calculating the coupling depth d, 3D
component combination is performed. The correctness of the loft angle φ can be detected
automatically. In the future, an additional mechanical arm could be controlled to weld the striking plate and casting body automatically, thereby automating the entire production process.
Additionally, VAIS-GCH performs 3D spatial coupling using global searching to calculate the
casting body shifting angle θ and the XY plane shifting angles θx and θy. This calculation is
extremely complex. Thus, a system with higher computational efficiency must be employed. In the
future, the search range can be improved, and a rapid method of calculating correction angles can be
developed. This would reduce the cost of 3D component assembly in 3D industries.
References
[1] B. Rooks, “Robot welding in shipbuilding,” Industrial Robot, 24(6), pp. 413-417, 1997.
[2] M. J. Tsai, S.-D. Lin, and M. C. Chen, “Mathematic Model for Robotic Arc Welding Off-line Programming System,” Int. J. Computer Integrated Manufacturing, 5(4), pp. 300-309, 1992.
[3] J.-S. Kim, B. O. Choi, and D. W. Nnaji, “Robot arc welding operations planning with a rotating/tilting positioner,” Int. J. Production Research, 36(4), pp. 957, 1998.
[4] M. J. Tsai and N.-J. Ann, “An Automatic Golf Head Robotic Welding System Using 3D Machine Vision System,” IEEE Conf. on Advanced Robotics and its Social Impacts.
[5] G. C. Burdea, “Two Piece Jigsaw Puzzle Robot Assembly With Vision, Position and Force Feedback,” IEEE Computer Society Press, pp. 505-511, 1987.