EP-3772396-B1 - METHOD FOR COMPUTER-ASSISTED DETECTION AND EVALUATION OF A WORK CYCLE IN WHICH A HUMAN WORKER AND A ROBOTIC SYSTEM INTERACT
Inventors
- Ghrairi, Zied
- Heuermann, Aaron
- Kirisci, Pierre
- Kahlert, Toralf
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2020-06-30
Claims (15)
- A method for computer-assisted detection and evaluation of a workflow, in which a human worker (1) and a robotic system (3) interact, wherein: - based on a detection by means of a first sensor system (2, 4, 5), digital movement data (BD) of the worker (1) are determined, which contain a plurality of first position data sets for a plurality of successive time points, wherein a respective first position data set contains position values for describing the position and orientation of body parts (101, 102, 103) of the worker (1) in a stationary coordinate system (KS) at the respective time point; - based on a detection by means of a second sensor system (12) and/or based on kinematic control data (KD) of the robotic system (3), digital movement data (BD') of the robotic system (3) are determined, which contain a plurality of second position data sets for a plurality of successive time points, wherein a respective second position data set contains position values for describing the position and orientation of movable components (301, 302, 303, 304) of the robotic system (3) in the stationary coordinate system at the respective time point; - from the digital movement data (BD, BD') of the worker (1) and of the robotic system (3), movement patterns (BM) are determined, which are contained in the digital movement data (BD, BD') and which comprise first movement patterns (BM1) and second movement patterns (BM2), wherein the first movement patterns (BM1) are each assigned to a movement of the worker (1) from a plurality of predefined movements of the worker (1) and the second movement patterns (BM2) are each assigned to a movement of the robotic system (3) from a plurality of predefined movements of the robotic system (3), wherein the extraction of the movement patterns (BM) is based on a pattern recognition with access to pre-known movement patterns (BM'); characterized in that - the determined movement patterns (BM) are supplied as input data to a data-driven model (MO), 
which is trained via machine learning based on training data (TD), wherein the data-driven model (MO) determines as output data activities (AK), which contain first and second activities (AK1, AK2), wherein the first activities (AK1) form classes, which each specify one or more first movement patterns (BM) as a work activity of the worker (1) from a plurality of predefined work activities of the worker (1), and wherein the second activities (AK2) form classes, which each specify one or more second movement patterns (BM) as a work activity of the robotic system (3) from a plurality of predefined work activities of the robotic system (3); - based on the activities (AK), an output is generated via a user interface (11) for assisting the worker (1).
- The method according to claim 1, characterized in that during the determination of the movement patterns (BM), segments (SE) are initially extracted from the digital movement data (BD, BD') of the worker (1) and of the robotic system (3), which comprise first segments and second segments (SE1, SE2), wherein the first segments (SE1) each comprise first position data sets for several successive time points and the second segments (SE2) each comprise second position data sets for several successive time points, wherein the movement patterns (BM) are subsequently extracted from the segments (SE), wherein the first movement patterns (BM1) each comprise one or more temporally successive first segments (SE1) and the second movement patterns (BM2) each comprise one or more temporally successive second segments (SE2).
- The method according to claim 1 or 2, characterized in that the activities (AK) are subjected to process mining (PM), whereby information is obtained about one or more work processes (PR) which are represented by the activities (AK) and are contained in the workflow.
- The method according to claim 3, characterized in that based on the activities (AK) and the information about the one or more work processes (PR), the output is generated via the user interface (11) for assisting the worker (1).
- The method according to one of the preceding claims, characterized in that the data-driven model (MO) is based on one or more neural networks and/or on one or more Bayesian networks and/or on support vector machines and/or on decision trees.
- The method according to one of the preceding claims, characterized in that the first sensor system (2, 4, 5) comprises a plurality of near-body sensors (2), which are attached to the body of the worker (1), preferably one or more fiber Bragg grating sensors and/or one or more position sensors and/or one or more acceleration sensors and/or one or more inertial sensors, and/or in that the second sensor system (12) comprises a plurality of near-robot sensors, which are attached to one or more movable components (301, 302, 303, 304) of the robotic system (3), preferably one or more fiber Bragg grating sensors and/or one or more position sensors and/or one or more acceleration sensors and/or one or more inertial sensors.
- The method according to one of the preceding claims, characterized in that via the detection by means of the first sensor system (2, 4, 5) for a respective body part (101, 102, 103), local position values are determined in a local coordinate system (LK), which is assigned to the respective body part (101, 102, 103), wherein the position values of the first position data sets depend on the local position values and a position value of a reference point (RP) of the worker (1) in the stationary coordinate system (KS), wherein the reference point (RP) of the worker (1) has a fixed positional relationship to the local coordinate systems (LK), and/or in that via the detection by means of the second sensor system (12) and/or from the kinematic control data (KD) for a respective movable component (301, 302, 303, 304) of the robotic system (3), local position values are determined in a local coordinate system, which is assigned to the respective movable component (301, 302, 303, 304), wherein the position values of the second position data sets depend on the local position values and a position value of a reference point (RP') of the robotic system (3) in the stationary coordinate system (KS), wherein the reference point (RP') of the robotic system (3) has a fixed positional relationship to the local coordinate systems.
- The method according to claim 7, characterized in that the first sensor system (2, 4, 5) comprises a localization sensor system (4, 5), with which the position value of the reference point (RP) of the worker (1) in the stationary coordinate system (KS) is detected with the aid of a localization based on wirelessly transmitted signals, and/or in that the second sensor system (12) comprises a localization sensor system, with which the position value of the reference point (RP') of the robotic system (3) in the stationary coordinate system (KS) is detected with the aid of a localization based on wirelessly transmitted signals.
- The method according to claim 8, characterized in that the localization sensor system (4, 5) of the first sensor system (2, 4, 5) and/or the localization sensor system of the second sensor system (12) is based on a localization by means of ultrasound signals and/or by means of electromagnetic high-frequency signals, preferably via Bluetooth and/or UWB.
- The method according to one of the preceding claims, characterized in that at least a part of the first sensor system (2, 4, 5) is designed redundantly, so that one or more sets of redundant measured values are obtained by the first sensor system (2, 4, 5), wherein the redundant measured values of a respective set were detected independently of one another and in the case that the redundant measured values of a respective set deviate from one another by a predetermined amount, one or more predefined actions are carried out automatically, and/or in that at least a part of the second sensor system (12) is designed redundantly, so that one or more sets of redundant measured values are obtained by the second sensor system (12), wherein the redundant measured values of a respective set were detected independently of one another and in the case that the redundant measured values of a respective set deviate from one another by a predetermined amount, one or more predefined actions are carried out automatically.
- The method according to one of the preceding claims, characterized in that it is monitored whether measured values of the first sensor system (2, 4, 5) and/or of the second sensor system (12) are transmitted at regular time intervals via a data transmission link, wherein in the case that the regular time intervals are not complied with, one or more predefined actions are carried out automatically.
- The method according to one of the preceding claims, characterized in that the minimum distance (d_min) between the worker (1) and the robotic system (3) is calculated from the digital movement data (BD) of the worker (1) and the digital movement data (BD') of the robotic system (3), and the robotic system (3) is controlled as a function of the calculated minimum distance (d_min).
- The method according to claim 12, characterized in that based on the detection by means of the second sensor system (12) and/or from the kinematic control data (KD) of the robotic system (3) and/or based on a detection by means of a further sensor system, it is further derived whether one or more workpieces with predetermined dimensions are held by the robotic system (3) at the respective time point, wherein the predetermined dimensions of held workpieces are taken into account in the calculation of the minimum distance (d_min) by treating the held workpiece or workpieces as part of the robotic system (3).
- An apparatus for the computer-assisted detection and evaluation of a workflow, in which a human worker (1) and a robotic system (3) interact, wherein the apparatus comprises a computer means, wherein the computer means is configured to carry out a method in which: - based on a detection by means of a first sensor system (2, 4, 5), digital movement data (BD) of the worker (1) are determined, which contain a plurality of first position data sets for a plurality of successive time points, wherein a respective first position data set contains position values for describing the position and orientation of body parts (101, 102, 103) of the worker (1) in a stationary coordinate system (KS) at the respective time point; - based on a detection by means of a second sensor system (12) and/or based on kinematic control data (KD) of the robotic system (3), digital movement data (BD') of the robotic system (3) are determined, which contain a plurality of second position data sets for a plurality of successive time points, wherein a respective second position data set contains position values for describing the position and orientation of movable components (301, 302, 303, 304) of the robotic system (3) in the stationary coordinate system (KS) at the respective time point; - from the digital movement data (BD, BD') of the worker (1) and of the robotic system (3), movement patterns (BM) are determined, which are contained in the digital movement data (BD, BD') and which comprise first movement patterns (BM1) and second movement patterns (BM2), wherein the first movement patterns (BM1) are each assigned to a movement of the worker (1) from a plurality of predefined movements of the worker (1) and the second movement patterns (BM2) are each assigned to a movement of the robotic system (3) from a plurality of predefined movements of the robotic system (3), wherein the extraction of the movement patterns (BM) is based on a pattern recognition with access to pre-known movement 
patterns (BM'); characterized in that - the determined movement patterns (BM) are supplied as input data to a data-driven model (MO), which is trained via machine learning based on training data (TD), wherein the data-driven model (MO) determines as output data activities (AK), which contain first and second activities (AK1, AK2), wherein the first activities (AK1) each classify one or more first movement patterns (BM) as a work activity of the worker (1) from a plurality of predefined work activities of the worker (1) and wherein the second activities (AK2) each classify one or more second movement patterns (BM) as a work activity of the robotic system (3) from a plurality of predefined work activities of the robotic system (3); - based on the activities (AK), an output is generated via a user interface (11) for assisting the worker (1).
- The apparatus according to claim 14, characterized in that the apparatus is configured to carry out a method according to one of claims 2 to 13.
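By way of illustration only (not part of the claims): the claimed mapping from recognized movement patterns (BM) to activities (AK) by a data-driven model (MO) can be sketched as a minimal classifier over pattern sequences. All pattern and activity labels below are hypothetical, and the nearest-centroid rule merely stands in for the trained model families named in claim 5 (neural networks, Bayesian networks, support vector machines, decision trees):

```python
from collections import Counter

# Hypothetical movement-pattern labels (BM1 for the worker, BM2 for the robot).
WORKER_PATTERNS = ["reach", "grasp", "move", "release"]
ROBOT_PATTERNS = ["approach", "pick", "transfer", "place"]

def featurize(pattern_sequence):
    """Bag-of-patterns feature vector for a window of recognized movement patterns."""
    counts = Counter(pattern_sequence)
    return [counts[p] for p in WORKER_PATTERNS + ROBOT_PATTERNS]

# Toy training data (TD): feature vectors labeled with activities (AK).
train = [
    (featurize(["reach", "grasp", "move", "release"]), "worker_assembles_part"),
    (featurize(["approach", "pick", "transfer", "place"]), "robot_feeds_part"),
]

def classify(pattern_sequence):
    """Nearest-centroid stand-in for the trained data-driven model (MO)."""
    x = featurize(pattern_sequence)
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(train, key=lambda t: dist(t[0], x))[1]

print(classify(["reach", "grasp", "grasp", "move"]))  # → worker_assembles_part
```

In a real system the training data would consist of many labeled windows per predefined work activity, and the classifier would be one of the model families listed in claim 5.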
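The minimum-distance calculation of claims 12 and 13 can likewise be sketched: the worker and the robotic system are each represented by sample points derived from the movement data (BD, BD'), and the points of a held workpiece are appended to the robot's point set so that the workpiece is treated as part of the robotic system. The point sets below are invented for illustration:

```python
from itertools import product
from math import dist  # Euclidean distance, Python 3.8+

def min_distance(worker_points, robot_points, workpiece_points=()):
    """Minimum pairwise distance d_min between worker and robot sample points.
    Held workpieces are treated as part of the robotic system (claim 13)."""
    robot_side = list(robot_points) + list(workpiece_points)
    return min(dist(w, r) for w, r in product(worker_points, robot_side))

# Illustrative 3-D sample points in the stationary coordinate system (KS).
worker = [(0.0, 0.0, 1.5), (0.2, 0.0, 1.1)]
robot = [(2.0, 0.0, 1.0)]
workpiece = [(1.0, 0.0, 1.1)]
print(min_distance(worker, robot))             # without the held workpiece
print(min_distance(worker, robot, workpiece))  # workpiece shortens d_min
```

The robotic system would then be controlled as a function of the returned d_min, e.g. slowed or stopped below a safety threshold.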
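The monitoring steps of claims 10 and 11 (comparing redundantly detected measured values of one set, and checking that measured values arrive at regular time intervals) reduce to simple range checks. The tolerance values below are assumed for illustration:

```python
def check_redundant(values, tolerance):
    """Claim 10: redundant, independently detected measured values of one
    set must agree; a deviation beyond `tolerance` would trigger one or
    more predefined actions (here signalled only by the return value)."""
    return (max(values) - min(values)) <= tolerance

def check_interval(timestamps, expected_period, slack):
    """Claim 11: measured values must be transmitted at regular time
    intervals; a gap deviating from the expected period would likewise
    trigger a predefined action."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return all(abs(g - expected_period) <= slack for g in gaps)

print(check_redundant([1.02, 0.98, 1.00], tolerance=0.1))  # → True
print(check_interval([0.0, 0.1, 0.5], 0.1, slack=0.02))    # → False
```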
Description
The invention relates to a method and an apparatus for the computer-assisted detection and evaluation of a workflow in which a human worker and a robotic system interact. It is known from the prior art to detect the movements of the worker and of the robotic system within a workflow in which a human worker and a robotic system interact with one another. For example, document WO 2015/114089 A1 discloses a safety device for a worker, in which the worker is localized in the working area of a moving automatic machine via a mobile detection device to be worn by the worker, wherein in the event of a collision or a risk of collision with the automatic machine a warning signal is transmitted to the worker. To date, there are no approaches in the prior art that subject the movements of a worker and of a robotic system, detected with corresponding detection devices, to an analysis in order to determine, with computer assistance, the work activities carried out by the worker and by the robotic system. Document EP 2 772 811 A2 shows a method according to the preamble of claim 1. The object of the invention is to provide a method and an apparatus for the computer-assisted detection and evaluation of a workflow in which the detected movements of a worker and of a robotic system are subjected to a comprehensive analysis for the extraction of work activities. This object is achieved by the method according to claim 1 and the apparatus according to claim 14. Further developments of the invention are defined in the dependent claims. Within the scope of the method according to the invention, a workflow in which a human worker and a robotic system interact is detected and evaluated.
The term robotic system is to be understood broadly and, depending on the embodiment, can comprise one or more automatically moved machines (i.e. robots). If appropriate, the robotic system can also contain only a single robot. According to the invention, digital movement data of the worker are determined based on a detection by means of a first sensor system; these data contain a plurality of position data sets for a plurality of successive time points, wherein each first position data set contains position values for describing the position and orientation of body parts of the worker in a stationary coordinate system at the respective time point. In contrast to the first position data sets, the second position data sets mentioned further below do not relate to the position and orientation of body parts of the worker, but to the position and orientation of movable components of the robotic system. The term first and second position data sets is to be understood broadly. It merely has to be ensured that the position and orientation of body parts of the worker and of movable components of the robotic system in the stationary coordinate system can be derived from the position values of the position data sets. Preferably, the position values of the first and second position data sets describe the position and orientation in three-dimensional space. A stationary coordinate system is to be understood as a global coordinate system that follows neither the movement of the worker nor the movement of the robotic system. Preferably, the stationary coordinate system and the local coordinate systems mentioned further below are three-dimensional coordinate systems.
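The composition of local position values with the position of a reference point, as referred to above and detailed in claim 7, can be illustrated by a deliberately simplified sketch in which the worker's orientation is reduced to a single yaw angle (an assumption made here for brevity; the claims concern full position and orientation in three-dimensional space):

```python
from math import cos, sin

def to_stationary(local_point, reference_point, yaw):
    """Map a point from a body-part-local coordinate system (LK) into the
    stationary coordinate system (KS): rotate by the worker's heading
    (a single yaw angle in this simplified sketch) and translate by the
    position of the reference point (RP)."""
    x, y, z = local_point
    rx, ry, rz = reference_point
    return (rx + x * cos(yaw) - y * sin(yaw),
            ry + x * sin(yaw) + y * cos(yaw),
            rz + z)

# A hand 0.4 m in front of the worker, worker standing at (2, 3, 0)
# and facing along +x (yaw = 0):
print(to_stationary((0.4, 0.0, 1.2), (2.0, 3.0, 0.0), yaw=0.0))  # → (2.4, 3.0, 1.2)
```

The same composition applies on the robot side, where local coordinates of movable components are combined with the reference point (RP') of the robotic system.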
In the method according to the invention, digital movement data of the robotic system are furthermore determined based on a detection by means of a second sensor system and/or based on kinematic control data of the robotic system; these data contain a plurality of second position data sets for a plurality of successive time points, wherein each second position data set contains position values for describing the position and orientation of movable components of the robotic system in the stationary coordinate system at the respective time point. The movement data of the worker and of the robotic system described above represent preprocessed raw data originating from the first sensor system, the second sensor system or the controller of the robot. Depending on the embodiment, the preprocessing can comprise, in addition to the actual determination of the position data sets, various steps such as a normalization of sensor data, a conversion into standardized units, transformations and the like. Kinematic control data of the robotic system are to be understood as control data for the robotic system that determine its movement. There are known methods which, from such control data, using the CAD or kinematics model of the robotic system, the above-mentioned