DE-102018208278-B4 - Operational assistance method, control unit, operational assistance system and work device

Abstract

Operational assistance method (S) for a work device (1), in which (S1) object boxes (54) for an object (52) in a field of view of the work device (1), or data characterizing such object boxes (54), are obtained at successive points in time, wherein, during or for the step (S1) of obtaining the object boxes (54) and/or the data characterizing the object boxes (54), (S1a) a field of view (50) of the underlying work device (1) is captured optically, two-dimensionally and monocularly by recording temporally successive images, and (S1b) at least one object (52) and an object box (54) associated with the object (52) are identified in successive recorded images, (S2) from the object boxes (54) of a given object (52) in successively or directly successively recorded images, an instantaneous scaling change of the object box (54) for the respective object (52), or quantities derived therefrom, and an instantaneous lateral position change of the object box (54) in the image for the respective object (52) are determined, (S3) from the instantaneous scaling change, or the quantities derived therefrom, and the instantaneous lateral position change in the image of the object box (54) for the respective object (52), a future predicted object box (55) is determined, and (S4) the position of the predicted object box (55) and/or the ratio of a lateral extent of the predicted object box (55) to a lateral extent of the captured field of view (50) and/or of the captured images or a section thereof are determined and evaluated, and (S5) depending on the result of the evaluation, (i) it is determined whether the object (52) underlying the predicted object box (55) is critical or not with respect to a possible collision, and/or (ii) an operating state of the work device (1) is controlled or regulated, wherein an object (52) underlying a predicted object box (55) is determined to be non-critical with respect to a possible collision, in particular with a criticality value of 0%, if the predicted object box (55) lies completely outside an underlying image or a predefined section thereof.

Inventors

  • Jens Roeder
  • Michael Kessler
  • Patrick Koegel
  • Steffen Brueggert

Assignees

  • ROBERT BOSCH GMBH

Dates

Publication Date
2026-05-13
Application Date
2018-05-25

Claims (11)

  1. Operational assistance method (S) for a work device (1), in which (S1) object boxes (54) for an object (52) in a field of view of the work device (1), or data characterizing such object boxes (54), are obtained at successive points in time, wherein, during or for the step (S1) of obtaining the object boxes (54) and/or the data characterizing the object boxes (54), (S1a) a field of view (50) of the underlying work device (1) is captured optically, two-dimensionally and monocularly by recording temporally successive images, and (S1b) at least one object (52) and an object box (54) associated with the object (52) are determined in successive recorded images, (S2) from the object boxes (54) of a given object (52) in successively or directly successively recorded images, an instantaneous scaling change of the object box (54) for the respective object (52), or quantities derived therefrom, and an instantaneous lateral position change of the object box (54) in the image for the respective object (52) are determined, (S3) from the instantaneous scaling change, or the quantities derived therefrom, and the instantaneous lateral position change in the image of the object box (54) for the respective object (52), a future predicted object box (55) is determined, and (S4) the position of the predicted object box (55) and/or the ratio of a lateral extent of the predicted object box (55) to a lateral extent of the captured field of view (50) and/or of the captured images or a section thereof are determined and evaluated, and (S5) depending on the result of the evaluation, (i) it is determined whether an object (52) underlying the predicted object box (55) is critical with respect to a possible collision, and/or (ii) an operating state of the work device (1) is controlled or regulated, wherein an object (52) underlying a predicted object box (55) is determined to be non-critical with respect to a possible collision, in particular with a criticality value of 0%, if the predicted object box (55) lies completely outside an underlying image or a predefined section thereof.
  2. Operational assistance method (S) according to one of the preceding claims, in which an object box (55) predicted into the future is determined, at least for the currently most recently captured image, by iteratively determining and updating values for the scaling of a respective object box (54), for the coordinates of a respective object box (54), for the translation of a respective object box (54) and for the lateral width of a respective object box (54) over a plurality of time increments up to a prediction horizon.
  3. Operational assistance method (S) according to Claim 2, in which, for each time increment, the following steps are performed, in particular in the specified order (a code sketch of this iteration follows the claims): (I1) resetting or pre-populating the values to be calculated according to the assignment rules `Scaling_old` := `Scaling_new`, `BoxTranslationX_old` := `BoxTranslationX_new`, `BoxWidth_old` := `BoxWidth_new`, `BoxPositionLeft_old` := `BoxPositionLeft_new`, `BoxPositionRight_old` := `BoxPositionRight_new`; (I2) updating the scaling according to the assignment rule `Scaling_new` := 1 / (2 − `Scaling_old`); (I3) updating the horizontal object box translation according to the assignment rule `BoxTranslationX_new` := `BoxTranslationX_old` × `Scaling_old`; (I4) updating the horizontal object box width according to the assignment rule `BoxWidth_new` := `BoxPositionRight_old` − `BoxPositionLeft_old`; (I5) predicting the horizontal box positions according to the assignment rules `BoxPositionLeft_new` := `BoxPositionLeft_old` + `BoxTranslationX_new` − 0.5 × `BoxWidth_new` × (`Scaling_new` − 1) / `Scaling_new` and `BoxPositionRight_new` := `BoxPositionRight_old` + `BoxTranslationX_new` + 0.5 × `BoxWidth_new` × (`Scaling_new` − 1) / `Scaling_new`, where `Scaling_old` and `Scaling_new` denote the old and new scaling of an object box (54), `BoxTranslationX_old` and `BoxTranslationX_new` denote the old and new displacement of an object box (54), `BoxWidth_old` and `BoxWidth_new` denote the old and new width of an object box (54), `BoxPositionLeft_old` and `BoxPositionLeft_new` denote the old and new position of the lower left corner of an object box (54) as the first x-coordinate of the respective object box (54), and `BoxPositionRight_old` and `BoxPositionRight_new` denote the old and new position of the lower right corner of an object box (54) as the second x-coordinate of the respective object box (54), or their values.
  4. Operational assistance method (S) according to Claim 2 or 3, in which the following calculation rule is executed to determine a new box position (see also the sketch after the claims): `BoxPositionLeft_new` := (`BoxPositionLeft_current` + `BoxVelocityLeft_current` × `T_prediction`) / (1 + `NormVelocity_current` × `T_prediction`) and `BoxPositionRight_new` := (`BoxPositionRight_current` + `BoxVelocityRight_current` × `T_prediction`) / (1 + `NormVelocity_current` × `T_prediction`), where `BoxPositionLeft_new` and `BoxPositionLeft_current`, or `BoxPositionRight_new` and `BoxPositionRight_current`, denote the new and current position of the left and right box edge respectively, `BoxVelocityLeft_current` and `BoxVelocityRight_current` denote the currently measured angular velocity of the left and right box edge respectively, `NormVelocity_current` denotes the currently measured so-called normalized box velocity, and `T_prediction` denotes the prediction time associated with the prediction time step, the normalized box velocity being derived from the determined scaling change of the object box.
  5. Operational assistance method (S) according to one of the preceding claims, in which an object (52) underlying a predicted object box (55) is determined to be critical with respect to a possible collision, in particular with a criticality value of 100%, if the ratio of the width of the predicted object box (55) to the width of an underlying image, or of a predefined section thereof, exceeds a predefined first threshold value.
  6. Operational assistance method (S) according to Claim 5, in which the criticality value determined for an object (52) is reduced by the proportion of its width by which the object box (55) predicted for the object (52) lies outside the underlying image or the predefined section thereof.
  7. Operational assistance method (S) according to one of the preceding claims, in which a pedestrian (52') is identified as the object (52), the position and movement of the pedestrian (52') as the object (52) are examined and evaluated on the basis of a pedestrian model, an acceleration capability of the pedestrian (52') as the object (52) is determined on the basis of a velocity determined for the pedestrian (52'), and the criticality for the pedestrian (52') as the object (52) is determined on the basis of the velocity and the acceleration capability, wherein, in particular, on the basis of the acceleration capability, an object box (56) enclosing the predicted object box (55), or at least laterally or horizontally encompassing it, is generated and used as the basis for evaluating the criticality.
  8. Control unit (10) for an operational assistance system (100) of a work device (1), and in particular of a vehicle, which is configured to control, run and/or operate an underlying operational assistance system (100) in accordance with an operational assistance method (S) according to one of the preceding claims.
  9. Operational assistance system (100) for a work device (1), and in particular for a vehicle, which is configured to execute an operational assistance method (S) according to one of Claims 1 to 7 and which in particular comprises a control unit (10) according to Claim 8.
  10. Work device (1) which comprises an operational assistance system (100) according to Claim 9 and which is in particular designed as a vehicle, motor vehicle or passenger car.
  11. Use of the operational assistance method (S) according to one of Claims 1 to 7, of the control unit (10) according to Claim 8, of the operational assistance system (100) according to Claim 9 and/or of the work device (1) according to Claim 10 for pedestrian protection, for cyclist protection, for ACC and/or for avoidance systems or methods.
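
The update rules of Claim 3 and the closed-form rule of Claim 4 translate directly into code. The following minimal Python sketch illustrates both; the function names, argument conventions and iteration count are illustrative assumptions rather than part of the patent, and the initial values are assumed to be supplied by the tracking and measurement steps (S1) and (S2).

```python
def predict_box_iterative(scaling, translation_x, pos_left, pos_right, n_steps):
    """Iterative object-box prediction per the update rules (I1)-(I5) of Claim 3.

    Inputs are the most recently measured scaling, horizontal translation and
    horizontal box edge positions (assumed to come from steps S1/S2).
    """
    for _ in range(n_steps):
        # (I1) carry the previous "new" values over as the "old" values
        scaling_old, translation_x_old = scaling, translation_x
        pos_left_old, pos_right_old = pos_left, pos_right

        # (I2) update the scaling
        scaling = 1.0 / (2.0 - scaling_old)

        # (I3) update the horizontal object-box translation
        translation_x = translation_x_old * scaling_old

        # (I4) update the horizontal object-box width
        width = pos_right_old - pos_left_old

        # (I5) predict the horizontal box positions
        shift = 0.5 * width * (scaling - 1.0) / scaling
        pos_left = pos_left_old + translation_x - shift
        pos_right = pos_right_old + translation_x + shift

    return pos_left, pos_right


def predict_box_closed_form(pos_left, pos_right, vel_left, vel_right,
                            norm_velocity, t_prediction):
    """Single-step prediction per the calculation rule of Claim 4.

    vel_left/vel_right are the measured angular velocities of the box edges,
    norm_velocity is the normalized box velocity derived from the scaling
    change, and t_prediction is the prediction time in seconds.
    """
    denominator = 1.0 + norm_velocity * t_prediction
    new_left = (pos_left + vel_left * t_prediction) / denominator
    new_right = (pos_right + vel_right * t_prediction) / denominator
    return new_left, new_right
```

For example, `predict_box_iterative(scaling, translation_x, left, right, n_steps=10)` would advance a box over ten time increments up to the prediction horizon of Claim 2, while `predict_box_closed_form` reaches the horizon in a single evaluation.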

Description

State of the art

The present invention relates to an operational assistance method for a work device or for a vehicle, a control unit for an operational assistance system of a work device, an operational assistance system as such, and a work device, in particular a vehicle.

In work devices, and particularly in the automotive sector, operational assistance systems and methods are increasingly being used. These systems and methods analyze the environment of the respective device for potential collisions with objects and issue corresponding warnings and/or intervene in the operation of the device. Known systems and methods use comparatively complex systems and data structures, for example evaluating three-dimensional data, and/or the predictive power of the corresponding assessments of the environment is insufficient for intervening in the operation of the device, for example for a braking decision.

The document US 2005/0131646 A1 discloses a method for collision detection, comprising: detecting an object within a first operational area of an object tracker; determining a classification of the detected object with the object tracker; tracking said object with the object tracker; detecting the detected object within a second operational area of a collision detector; and activating a safety measure by means of the collision detector based on the classification.

The document WO 2016/014548 A1 discloses a method for operating a pedestrian collision avoidance system in a vehicle, the method comprising: sensing the area surrounding the vehicle with a radar sensor and a video camera; transmitting radar information from the radar sensor and video information from the video camera to an electronic control unit; detecting an object in the video information with the electronic control unit; classifying the object as a pedestrian based on a comparison of the video information with a database; determining, with the electronic control unit, a distance between an object classified as a pedestrian and the vehicle based on the radar information; determining, with the electronic control unit, a feature of the object classified as a pedestrian based on the video information, the distance, and the database; storing the feature of the object classified as a pedestrian in memory; and, when the object classified as a pedestrian is no longer detected by the radar sensor: determining, with the electronic control unit, an updated distance to the object classified as a pedestrian based on the video information and the feature of the object classified as a pedestrian; determining, with the electronic control unit, whether there is a collision potential between the vehicle and the object classified as a pedestrian, based in part on the distance to the object classified as a pedestrian; and, if the collision potential exists, activating an automatic vehicle response.

Disclosure of the invention

In contrast, the operational assistance method according to the invention with the features of claim 1 has the advantage that a particularly reliable collision prediction can be generated for the operation of a work device using comparatively simple means.
According to the invention, this is achieved with the features of claim 1 by providing an operational assistance method for a work device, and in particular for a vehicle, in which (S1) object boxes for an object in a field of view of the work device, or data characterizing such object boxes, are obtained at successive points in time, (S2) from the object boxes of a given object in successively or directly successively recorded images, an instantaneous scaling change of the object box for the respective object, or quantities derived therefrom, and an instantaneous lateral position change of the object box for the respective object are determined, (S3) from the instantaneous scaling change, or the quantities derived therefrom, and the instantaneous lateral position change of the object box, a future predicted object box is determined for the given object, and (S4) the position of the predicted object box and/or the ratio of a lateral extent of the predicted object box to a lateral extent of the captured field of view and/or of the captured images are determined and evaluated, and (S5) depending on the result of the evaluation, (i) it is determined whether an object underlying the predicted object box is critical or not with respect to a possible collision, and/or (ii) an operating state of the work device is controlled or regulated.

According to the invention, the evaluation of the environment of the work device is based on so-called object boxes and correspondingly predicted object boxes and on the development of their size in relation to the captured field of view. These data can be acquired essentially in two dimensions and determined with high accuracy. In principle, the data associated with the object boxes can be provided externally, for example by optical detection units of conventional driver assistance systems.
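
To make the evaluation of steps (S4) and (S5) concrete, the following minimal Python sketch combines the 0% rule of Claim 1 for boxes lying completely outside the image, the width-ratio criterion of Claim 5 and the proportional reduction of Claim 6; the function names, the coordinate convention and the numeric threshold are illustrative assumptions, since the patent leaves the first threshold value open. The pedestrian margin of Claim 7 is modeled with a simple 0.5·a·t² kinematic term, which is likewise an assumption; the patent states only that an enclosing object box is generated from the acceleration capability.

```python
def evaluate_criticality(pred_left, pred_right, image_left, image_right,
                         width_threshold=0.4):
    """Criticality of a predicted object box per steps (S4)/(S5).

    width_threshold stands in for the 'predefined first threshold value'
    of Claim 5; the patent does not fix a numeric value.
    """
    box_width = pred_right - pred_left
    image_width = image_right - image_left

    # Claim 1: a box completely outside the image (or the predefined
    # section thereof) is non-critical -> criticality 0 %.
    if box_width <= 0 or pred_right <= image_left or pred_left >= image_right:
        return 0.0

    # Claim 5: if the ratio of box width to image width exceeds the first
    # threshold, the object is critical -> criticality 100 %.
    criticality = 100.0 if box_width / image_width > width_threshold else 0.0

    # Claim 6: reduce the value by the proportion of the box width that
    # lies outside the underlying image or the predefined section.
    outside = (max(0.0, image_left - pred_left)
               + max(0.0, pred_right - image_right))
    return criticality * (1.0 - outside / box_width)


def expand_for_pedestrian(pred_left, pred_right, acceleration, t_prediction):
    """Enclosing box per Claim 7, using an assumed 0.5*a*t^2 lateral margin."""
    margin = 0.5 * acceleration * t_prediction ** 2
    return pred_left - margin, pred_right + margin
```

A pedestrian's predicted box would then be widened with `expand_for_pedestrian` before being passed to `evaluate_criticality`, so that the lateral room the pedestrian could still gain is reflected in the assessment.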