
EP-4739469-A1 - POSITIONING METHOD AND POSITIONING SYSTEM

EP 4739469 A1

Abstract

The invention is a positioning method for positioning a reference feature (10) with respect to a target feature (11) along a positioning plane within a workspace accessible by a robotic manipulator, wherein one of said reference and target features (10, 11) is on a first object attached to the robotic manipulator and the other one of said reference and target features (10, 11) is on a second object being arranged in the workspace and being detached from the robotic manipulator. First and second positioning steps are carried out and are repeated until at least one stop condition is achieved, wherein the at least one stop condition comprises the condition that the calculated respective physical distance is within a threshold limit. The invention also relates to a positioning system, a computer program product and a computer-readable medium for carrying out the method.

Inventors

  • Erdös, Ferenc Gábor
  • Juniki, Ádám
  • Tipary, Bence

Assignees

  • HUN-REN Számítástechnikai és Automatizálási Kutatóintézet

Dates

Publication Date
2026-05-13
Application Date
2024-07-05

Claims (14)

  1. A positioning method for positioning a reference feature (10) with respect to a target feature (11) along a positioning plane within a workspace accessible by a robotic manipulator (100), wherein one of said reference and target features (10, 11) is on a first object attached to the robotic manipulator (100) and the other one of said reference and target features (10, 11) is on a second object being arranged in the workspace and being detached from the robotic manipulator (100), wherein a first imaging unit (20) with an estimated first scale factor is used for imaging, from a first viewpoint (24), the target feature (11) as well as at least one marker (21) defining a first line (22) in the image of the first imaging unit (20), the first imaging unit (20) and the at least one marker (21) imaged by the first imaging unit (20) being fixed with respect to the reference feature (10), and a second imaging unit (30) with an estimated second scale factor is used for imaging, from a second viewpoint (34) different from the first viewpoint (24), the target feature (11) as well as at least one marker (31) defining a second line (32) in the image of the second imaging unit (30), the second imaging unit (30) and the at least one marker (31) imaged by the second imaging unit (30) being fixed with respect to the reference feature (10), and wherein the viewpoints (24, 34) and the markers (21, 31) are arranged so that a first plane (23) defined by the first viewpoint (24) and the at least one marker (21) defining the first line (22), and a second plane (33) defined by the second viewpoint (34) and the at least one marker (31) defining the second line (32) intersect each other along a virtual axis (40), and the virtual axis (40) intersects the positioning plane, the positioning method comprising: a first positioning step comprising: - taking an image by the first imaging unit (20), the image containing a first imaging (111) of the target feature (11), - determining a distance within the image between the first line (22) and the first imaging (111) of the target feature (11), - calculating a respective physical distance based on the determined distance and the estimated first scale factor, - checking whether at least one stop condition is achieved, and if not, controlling the robotic manipulator (100) to a movement compensating the calculated respective physical distance, and repeating the first positioning step until at least one stop condition is achieved, the positioning method further comprising: a second positioning step comprising: - taking an image by the second imaging unit (30), the image containing a second imaging (112) of the target feature (11), - determining a distance within the image between the second line (32) and the second imaging (112) of the target feature (11), - calculating a respective physical distance based on the determined distance and the estimated second scale factor, - checking whether the at least one stop condition is achieved, and if not, controlling the robotic manipulator (100) to a movement compensating the calculated respective physical distance, and repeating the second positioning step until at least one stop condition is achieved, wherein the at least one stop condition comprises the condition that the calculated respective physical distance is within a threshold limit.
  2. The positioning method according to claim 1, characterized in that the reference feature (10) is a point, a location, an area or a portion of the first object, and the target feature (11) is a point, a location, an area or a portion of the second object.
  3. The positioning method according to claim 2, characterized in that subsequent to achieving the stop condition that the calculated respective physical distance is within a threshold limit, a movement is carried out with the first object towards the second object.
  4. The positioning method according to any of claims 1 to 3, characterized in that the same at least one marker (21, 31) is imaged by the first imaging unit (20) and the second imaging unit (30).
  5. The positioning method according to any of claims 1 to 4, characterized in that the viewpoints (24, 34) and the markers (21, 31) are arranged so that the reference feature (10) is on the virtual axis (40).
  6. The positioning method according to any of claims 1 to 5, characterized in that the robotic manipulator (100) is controlled to movements in an X direction and a Y direction of the positioning plane, and the viewpoints (24, 34) and the markers (21, 31) are arranged so that the first plane (23) is perpendicular to the second plane (33).
  7. The positioning method according to any of claims 1 to 6, characterized in that the viewpoints (24, 34) and the markers (21, 31) are arranged so that the first plane (23) is perpendicular to a first controlled movement direction of the robotic manipulator (100), and the second plane (33) is perpendicular to a second controlled movement direction of the robotic manipulator (100).
  8. The positioning method according to any of claims 1 to 7, characterized in that the at least one stop condition also comprises one or more of the following further conditions: - a number of repetitions of the first positioning step has reached an upper limit and/or a number of repetitions of the second positioning step has reached an upper limit and/or a time taken by the repeatedly performed first and second positioning steps has reached an upper limit; - an upper distance limit has been achieved from an initial distance between the reference feature (10) and the target feature (11); - a collision or a pre-collision state has been achieved by the first and/or second positioning steps.
  9. The positioning method according to any of claims 1 to 8, characterized in that said distance determining steps comprise - image processing for identifying respective positions of the first and second imagings (111, 112) of the target feature (11) within the respective images, and/or - image processing for identifying respective positions of imagings (211, 311) of the markers (21, 31) within the respective images.
  10. The positioning method according to any of claims 1 to 9, characterized in that a positioning point is defined on the virtual axis (40) by means of a third plane intersecting the virtual axis (40), wherein the third plane is defined by a third viewpoint of a third imaging unit and/or at least one further marker defining a third line in an imaging unit image.
  11. The positioning method according to any of claims 1 to 10, characterized in that the imaging units (20, 30) are - separate cameras, - a single camera combined with an optic system imaging different viewpoints into respective parts of the single camera image, - a single camera combined with an optic system alternatingly imaging different viewpoints into the single camera image, or - a combination of the above options.
  12. A positioning system for positioning a reference feature (10) with respect to a target feature (11) along a positioning plane, the positioning system comprising a robotic manipulator (100) and a workspace accessible by the robotic manipulator (100), wherein one of said reference and target features (10, 11) is on a first object attached to the robotic manipulator (100) and the other one of said reference and target features (10, 11) is on a second object being arranged in the workspace and being detached from the robotic manipulator (100), the positioning system further comprising a first imaging unit (20) with an estimated first scale factor for imaging, from a first viewpoint (24), the target feature (11) as well as at least one marker (21) defining a first line (22) in the image of the first imaging unit (20), the first imaging unit (20) and the at least one marker (21) imaged by the first imaging unit (20) being fixed with respect to the reference feature (10), and a second imaging unit (30) with an estimated second scale factor for imaging, from a second viewpoint (34) different from the first viewpoint (24), the target feature (11) as well as at least one marker (31) defining a second line (32) in the image of the second imaging unit (30), the second imaging unit (30) and the at least one marker (31) imaged by the second imaging unit (30) being fixed with respect to the reference feature (10), wherein the viewpoints (24, 34) and the markers (21, 31) are arranged so that a first plane (23) defined by the first viewpoint (24) and the at least one marker (21) defining the first line (22), and a second plane (33) defined by the second viewpoint (34) and the at least one marker (31) defining the second line (32) intersect each other along a virtual axis (40), and the virtual axis (40) intersects the positioning plane, the positioning system further comprising a processor adapted to perform the steps of the positioning method of claim 1.
  13. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the positioning method of claim 1.
  14. A computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the steps of the positioning method of claim 1.
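As an informal illustration (not part of the claims), the iterative first and second positioning steps of claim 1 can be sketched as a simple control loop. All names here (`positioning_step`, `SimulatedAxis`, the scale and threshold values) are hypothetical placeholders: a simulated one-dimensional "camera" per axis stands in for the imaging units (20, 30) and the robotic manipulator (100), and the estimated scale factors are deliberately inaccurate to show why the method tolerates scale factors that are only estimates:

```python
# Sketch of the iterative positioning loop of claim 1 (simulation only).
# The true mm-per-pixel scale of each "camera" is unknown to the controller;
# it works from an estimated scale factor, re-measuring after every move.

THRESHOLD_MM = 0.05     # stop condition: calculated physical distance within limit
MAX_ITERATIONS = 50     # further stop condition (cf. claim 8): repetition cap

def positioning_step(measure_px, move_axis, est_scale_mm_per_px):
    """One positioning step: image, measure, check stop condition, compensate, repeat."""
    for _ in range(MAX_ITERATIONS):
        distance_px = measure_px()                       # line-to-target distance in the image
        distance_mm = distance_px * est_scale_mm_per_px  # calculated physical distance
        if abs(distance_mm) <= THRESHOLD_MM:             # stop condition achieved
            return True
        move_axis(-distance_mm)                          # compensating movement
    return False                                         # repetition limit reached

class SimulatedAxis:
    """Stand-in for one manipulator axis plus its imaging unit."""
    def __init__(self, offset_mm, true_scale):
        self.offset_mm = offset_mm    # true in-plane distance from the virtual axis
        self.true_scale = true_scale  # true mm-per-pixel (hidden from the controller)
    def measure_px(self):
        return self.offset_mm / self.true_scale
    def move(self, delta_mm):
        self.offset_mm += delta_mm

x_axis = SimulatedAxis(offset_mm=12.0, true_scale=0.10)
y_axis = SimulatedAxis(offset_mm=-7.5, true_scale=0.10)

# Estimated scale factors are 30% off in opposite directions, as the method allows.
ok_x = positioning_step(x_axis.measure_px, x_axis.move, est_scale_mm_per_px=0.13)
ok_y = positioning_step(y_axis.measure_px, y_axis.move, est_scale_mm_per_px=0.07)
```

Because each iteration re-measures the remaining distance, a wrong scale estimate merely acts as a control gain: the residual error shrinks geometrically whenever the estimated-to-true scale ratio lies between 0 and 2, so both simulated axes settle near the virtual axis despite the 30% scale error.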

Description

POSITIONING METHOD AND POSITIONING SYSTEM

TECHNICAL FIELD

The subject matter disclosed herein relates to a positioning method, a positioning system, a computer program product and a computer-readable medium for positioning a reference feature with respect to a target feature. More particularly, the disclosed subject matter relates to systems and methods that use visual servoing techniques in a unique way to achieve accurate positioning in cases where exact calibrations, exact mutual positions/orientations and exact camera scale factors are not necessarily available. The subject matter disclosed herein is especially suitable for positioning a tool with respect to a workpiece to enable carrying out various operations in an accurately positioned way.

BACKGROUND ART

Automatic measurement and diagnosis of used printed circuit boards (PCBs) is a challenging task due to the need for precise fixturing of the boards in traditional measuring tools, as well as the need for the PCB schematics, including the geometry, for robot programming. Hence, PCB repair shops usually employ human operators to find and identify the root cause of the malfunctions of broken PCBs. In general, these shops face a large variety of products in small batch sizes but with many, frequently recurring product types. Even though measuring operations often contain repetitive steps - seemingly good candidates for automation - automated solutions cannot yet provide the flexibility that would make them worth the investment for repair shops.

There are known methods for automatic quality assurance and electrical testing of PCBs as part of manufacturing processes. These electrical tests ensure that the bare or populated PCBs leaving the factory are tested thoroughly regarding their electrical parameters (voltages, currents, etc.). There are two main testing approaches: in-circuit tester systems with bed-of-nails fixtures, and flying probe tests.

The bed-of-nails approach is the fastest testing method due to its massive parallelization, with the test pins in the bed connecting simultaneously to the PCB. On the other hand, this method is the least flexible, as it requires a pre-manufactured, accurate fixture, which can only be used for a single type of PCB. Accordingly, this method is in general only applicable in mass production of PCBs. There is a need to eliminate the constraints of PCB-specific fixtures and, correspondingly, to flexibly enable tests without fixturing, especially with the flexible flying probe testers. However, prior art flying probe tests require accurately set up measurement point positions. These tests can be rapidly prepared from a digital geometric representation, such as a computer-aided design (CAD) model, due to the tight manufacturing tolerances of PCBs. Therefore, flying probe tests are typically used for newly manufactured PCBs and not for repairing assembled PCB products, where geometric information may be missing. Even if the measurement point positions are defined offline, i.e., with coordinates and not through manual teaching, the points might not be accessible from the usual vertical approach direction because of obstacles (cables, heatsinks, etc.) in the assembly. Consequently, the diagnosis of used PCBs calls for even more flexible measurement methods.

Besides managing a large product range, handling the variation between different products of the same type is also challenging. These variations commonly occur when production is maintained over a longer time period, or when there are multiple manufacturers for some PCB components. Although connection point positions seldom change, varying visual appearance and dimensions in minor features such as heatsinks and mounted components are rather typical.

Therefore, automated measurement point detection, together with automated program generation or adaptation, would be highly beneficial, which calls for a flexible, robotized solution. In classical robotic applications, such problems may be tackled using an online teaching method, where the tool center point (TCP) of the robot pose is set by an operator at a setup phase and the taught points are interpolated during the measuring cycle. This method can be performed with the precision determined by the repeatability of the robot, which is usually in the order of 0.1-0.01 mm for industrial robot arms. However, taught points can only be used successfully if the robot repeatability, the fixturing precision of the workpieces, and the manufacturing and assembly precision of the workpieces are such that the accumulating errors still allow the precision required for the desired subsequent action. Otherwise, the accumulating errors also need to be taken into account during the measurement.

Visual servoing, also known as vision-based robot control, is a known technique which uses feedback information extracted from a vision sensor, i.e. visual feedback, to control the motion of a robot. There are known solutions which use visual servoing for probe positioning