
CN-116324883-B - Image acquisition device


Abstract

The image acquisition device has a photoelectric image sensor, a position sensor, a graphical user output interface, and an evaluation unit. The image sensor acquires an image dataset representing an imaging of a scene located in front of the image sensor onto an image plane of the image sensor. The position sensor detects the spatial position of the image plane with respect to a reference direction and provides position data indicative of a rotation angle and a tilt angle. From the position data, the evaluation unit determines a projective transformation that maps the image dataset onto a projection plane tilted with respect to the image plane according to the rotation and the tilt. The evaluation unit determines, in the projection plane, an image portion for the image dataset mapped onto the projection plane by the projective transformation, and displays in the graphical user output interface the image portion together with at least the area of the imaged scene lying within the image portion.
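The mapping described in the abstract (rotation about the optical axis combined with a tilt about the horizontal axis, applied as a projective map between two planes) can be sketched as a plane-to-plane homography. The function names, the pinhole parameterization, and the angle conventions below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def keystone_homography(rotation_deg: float, tilt_deg: float, f: float = 1.0) -> np.ndarray:
    """Homography mapping normalized image-plane coordinates onto a projection
    plane rotated by `rotation_deg` about the optical (z) axis and tilted by
    `tilt_deg` about the horizontal (x) axis.

    `f` is a focal length in normalized units (cf. claim 7, where it is
    normalized to the image diagonal). Parameterization is illustrative.
    """
    rho, tau = np.radians(rotation_deg), np.radians(tilt_deg)
    # Rotation about the optical axis (the "rotation angle" of the claims).
    Rz = np.array([[np.cos(rho), -np.sin(rho), 0.0],
                   [np.sin(rho),  np.cos(rho), 0.0],
                   [0.0, 0.0, 1.0]])
    # Tilt about the horizontal axis (the "tilt angle" of the claims).
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(tau), -np.sin(tau)],
                   [0.0, np.sin(tau),  np.cos(tau)]])
    K = np.diag([f, f, 1.0])  # simple pinhole intrinsics
    H = K @ Rx @ Rz @ np.linalg.inv(K)
    return H / H[2, 2]

def apply_h(H: np.ndarray, x: float, y: float) -> tuple:
    """Apply a homography to a single 2D point (homogeneous divide)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With both angles zero the homography degenerates to the identity, and with tilt zero it reduces to a pure in-plane rotation, which matches the claim language that the projection plane coincides with, or is merely rotated within, the image plane in those cases.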

Inventors

  • Sylvia Schmidt
  • Patrick Tess

Assignees

  • Leica Camera AG

Dates

Publication Date
2026-05-12
Application Date
2021-10-08
Priority Date
2020-10-08

Claims (16)

  1. An image acquisition apparatus (10) comprising a photoelectric image sensor (12), a position sensor (14), a graphical user output interface (16, 17) and an evaluation unit (20), wherein the image sensor (12) is configured to acquire an image dataset (100) representing an imaging of a scene (1) located in front of the image sensor (12) onto an image area (30) of the image sensor (12), wherein the position sensor (14) is configured to detect a spatial position of the image area (30) with respect to a reference direction (50) and to provide position data indicative of both a rotation angle (52) and a tilt angle (54), wherein the rotation angle is the angle by which the image area (30) is rotated about an optical axis (43) of the image sensor (12) when the image dataset (100) is acquired, and the tilt angle is the angle by which the image area (30) is tilted about a horizontal axis (121) when the image dataset (100) is acquired, wherein the horizontal axis (121) is oriented perpendicular to the optical axis (43) and perpendicular to the reference direction (50), wherein the evaluation unit (20) is configured to determine from the position data a projective transformation that maps the image dataset (100) onto a projection plane (120) in dependence on both the rotation and the tilting of the image area (30), wherein the projection plane (120) is tilted with respect to the image area (30) according to the tilt angle (54) and intersects the image area (30) along an intersection line that is rotated in the image area (30), according to the rotation angle (52), with respect to a central axis (41, 42) of the image area (30), wherein the evaluation unit (20) is configured to determine, in the projection plane (120), an image portion (60, 81, 82, 83) for the image dataset (100) mapped onto the projection plane (120) by the projective transformation, and wherein the evaluation unit (20) is further configured to display in the graphical user output interface (16, 17), together with the image portion (60, 81, 82, 83), at least the area of the scene (1) imaged onto the image area (30) that is located within the image portion (60, 81, 82, 83).
  2. The image acquisition apparatus (10) according to claim 1, wherein the evaluation unit (20) is configured to determine the image portion (60, 81, 82, 83) as a rectangular portion such that, in the projection plane (120), a first central axis (75) of the image portion (60, 81, 82, 83) extends parallel to the horizontal axis (121) and a second central axis (76) of the image portion (60, 81, 82, 83), perpendicular to the first central axis (75), extends parallel to the reference direction (50).
  3. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine the image portion (60, 81, 82, 83) such that, while a predefined aspect ratio is maintained, at least two corners (65, 66, 67, 68) of the image portion (60, 81, 82, 83) are located on edges (31, 32, 33, 34) of the image area (30) that are mapped onto the projection plane (120) by the projective transformation.
  4. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine the image portion (60, 81, 82, 83), independently of the position of the image sensor (12), such that a center (39) of the image area (30) projected onto the projection plane (120) by the projective transformation is located on a central axis (76) of the image portion (60, 81, 82, 83).
  5. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine a corner (65, 66, 67, 68) of the image portion (60, 81, 82, 83) in the projection plane (120) as the intersection of a diagonal (69) of the image portion (60, 81, 82, 83), predefined by the aspect ratio, with an edge (31, 32, 33, 34) of the image area (30) projected onto the projection plane (120), at least when the tilt angle (54) is equal to zero.
  6. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine a corner (65, 66, 67, 68) of the image portion (60, 81, 82, 83) in the projection plane (120) as the intersection of a diagonal (72, 73) of one half (70) of the image portion (60, 81, 82, 83), predefined by the aspect ratio, with an edge (31, 32, 33, 34) of the image area (30) projected onto the projection plane (120), at least when the tilt angle (54) differs from zero by at least a threshold value and the rotation angle (52) is equal to zero, wherein, in the projection plane (120), the diagonal (72, 73) extends through a center (74) of that edge (61, 62, 63, 64) of the image portion (60, 81, 82, 83) which is aligned parallel to the horizontal axis (121).
  7. The image acquisition apparatus (10) according to claim 1 or 2, wherein a projection center (125) of the projective transformation is arranged on the optical axis (43), and wherein a distance (126) from the image area (30) to the projection center (125) corresponds to a focal length of an imaging optics (18) of the image acquisition apparatus (10) that images the scene (1) onto the image sensor (12), the focal length being normalized to a diagonal of the image area (30).
  8. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine the image portion (60, 81, 82, 83) using only corner points (35, 36, 37, 38) of the image area (30) projected onto the projection plane (120).
  9. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to determine the image portion (60, 81, 82, 83) based on a predefined aspect ratio, wherein the predefined aspect ratio differs from the aspect ratio of the image sensor (12) and/or the aspect ratio of the user output interface (16, 17), and wherein the evaluation unit (20) is configured to receive user input via a user input interface (17) specifying the predefined aspect ratio.
  10. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to display, in the graphical user output interface (16, 17), the area of the scene imaged onto the image area (30) that lies within the image portion as a region of the image dataset (100) transformed into the projection plane (120) by means of the projective transformation, and wherein the evaluation unit (20) is configured to display the image portion (60, 81, 82, 83) by cropping the transformed image dataset (100).
  11. The image acquisition apparatus (10) according to claim 1 or 2, wherein the evaluation unit (20) is configured to display the scene (1) imaged onto the image area (30) in its entirety in the graphical user output interface (16, 17) without applying the projective transformation, and wherein the evaluation unit (20) is configured to indicate the image portion (60, 81, 82, 83) by means of a frame superimposed on the imaged scene (1).
  12. The image acquisition apparatus (10) according to claim 11, wherein the evaluation unit (20) is configured to display in the user output interface (16, 17) the positions of measurement points (91, 92) for determining acquisition parameters of the image acquisition apparatus (10), wherein the positions of the measurement points (91, 92) are displayed relative to the complete, untransformed scene (1) imaged onto the image area (30), wherein the image acquisition apparatus (10) comprises a combined user interface (17) comprising the user output interface and a position input interface superimposed on it for defining a position of a measurement point (91, 92) relative to the untransformed scene (1), and wherein the combined user interface is configured to detect the position of the measurement point (91, 92) as the position within the untransformed scene (1) at which an actuation of the superimposed position input interface is detected.
  13. The image acquisition apparatus (10) according to claim 1 or 2, wherein the reference direction (50) lies in the projection plane (120), or wherein the projection plane (120) is inclined with respect to the reference direction (50) by a residual angle that is not equal to zero and is smaller than the tilt angle (54).
  14. The image acquisition apparatus (10) according to claim 1 or 2, wherein the image sensor (12) is configured to acquire a sequence of image datasets representing the scene (1) imaged onto the image area (30) at successive points in time, wherein the position sensor (14) is configured to detect a respective spatial position of the image sensor (12) for each image dataset (100) and to provide respective position data, wherein the evaluation unit (20) is configured to determine, for each image dataset (100) mapped onto a respective projection plane (120) by a projective transformation determined from the respective position data, a respective image portion (60, 81, 82, 83), and wherein the evaluation unit (20) is further configured to continuously display in the user output interface (16, 17) the respective image portion (60, 81, 82, 83) together with at least the area of the scene (1) imaged onto the image area (30) that is located within the respective image portion (60, 81, 82, 83).
  15. The image acquisition apparatus (10) according to claim 10, wherein the image portion (60, 81, 82, 83) is displayed by cropping the transformed image dataset (100) at an edge of the user output interface (16, 17).
  16. A method (300) for operating an image acquisition device (10), the method (300) comprising: acquiring (305) an image dataset (100) using an image sensor (12) of the image acquisition device, wherein the image dataset (100) represents an imaging of a scene (1) located in front of the image sensor (12) onto an image area (30) of the image sensor (12); detecting (310) a spatial position of the image area (30) with respect to a reference direction (50); providing position data indicative of both a rotation angle (52) and a tilt angle (54), wherein the rotation angle is the angle by which the image area (30) is rotated about an optical axis (43) of the image sensor (12) when the image dataset (100) is acquired (305), and the tilt angle is the angle by which the image area (30) is tilted about a horizontal axis (121) when the image dataset (100) is acquired (305), wherein the horizontal axis (121) is oriented perpendicular to the optical axis (43) and perpendicular to the reference direction (50); determining (320) a projective transformation from the position data, wherein the projective transformation maps the image dataset (100) onto a projection plane (120) according to both the rotation and the tilting of the image area (30), wherein the projection plane (120) is tilted with respect to the image area (30) about the horizontal axis (121) according to the tilt angle (54) and intersects the image area (30) along an intersection line that is rotated in the image area (30), according to the rotation angle (52), with respect to a central axis (41) of the image area (30); determining (340) an image portion (60, 81, 82, 83) in the projection plane (120) for the image dataset (100) mapped onto the projection plane (120) by the projective transformation; and displaying the image portion (60, 81, 82, 83) together with at least the area of the scene (1) imaged onto the image area (30) that is located within the image portion (60, 81, 82, 83) in a graphical user output interface (16, 17) of the image acquisition device (10).
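Claims 3, 5 and 8 describe finding an image portion of predefined aspect ratio whose corners lie on the edges of the projected image area, using only its projected corner points. The sketch below uses a bisection on the rectangle's scale as a simple stand-in for the diagonal-intersection construction of claims 5 and 6; all names, the centered placement, and the convex/counter-clockwise assumptions are illustrative, not from the patent:

```python
import numpy as np

def project_corners(H: np.ndarray, w: float = 1.0, h: float = 1.0) -> np.ndarray:
    """Map only the four corner points of the image area through the
    homography H (cf. claim 8), returning a 4x2 projected quadrilateral."""
    pts = np.array([[-w/2, -h/2, 1], [w/2, -h/2, 1],
                    [w/2,  h/2, 1], [-w/2, h/2, 1]], float).T
    q = H @ pts
    return (q[:2] / q[2]).T

def inscribed_rect_scale(quad: np.ndarray, aspect: float,
                         center=(0.0, 0.0), iters: int = 40) -> float:
    """Scale of the largest centered, axis-aligned rectangle with the given
    aspect ratio whose corners all lie inside the projected quadrilateral.
    Assumes `quad` is convex with counter-clockwise winding."""
    def inside(p):
        # point-in-convex-polygon: all edge cross products non-negative
        for i in range(4):
            a, b = quad[i], quad[(i + 1) % 4]
            if (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0]) < 0:
                return False
        return True
    lo, hi = 0.0, 2.0
    cx, cy = center
    for _ in range(iters):  # bisect on the rectangle scale
        s = (lo + hi) / 2
        hw, hh = s * aspect / 2, s / 2
        corners = [(cx-hw, cy-hh), (cx+hw, cy-hh), (cx+hw, cy+hh), (cx-hw, cy+hh)]
        if all(inside(c) for c in corners):
            lo = s
        else:
            hi = s
    return lo
```

For an untransformed (identity) homography the inscribed square recovers the full image area, which matches the expectation that no cropping is needed when rotation and tilt are zero.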

Description

Image acquisition device

Technical Field

The present invention relates to an image acquisition device comprising a photoelectric image sensor and to a method of operating such a device.

Background

An image capture device (e.g., a camera or video camera) comprising an optoelectronic image sensor images a scene located in front of the device onto an image area of the image sensor and generates image data representing the scene imaged onto the image area. If, while the scene is being imaged, the image area is tilted with respect to a reference plane of the scene (e.g., with respect to the vertical plane of a facade), the reference plane is not aligned parallel to the image area of the image sensor. Different regions within the reference plane of the scene then lie at different distances from the image area and are therefore imaged onto it at different magnifications. This results in perspective distortion of the scene imaged onto the image area. For example, straight lines that run parallel to each other in the reference plane may be imaged as so-called converging lines, which occurs in particular when tall buildings are photographed from a low vantage point.

Perspective distortion caused by tilting of the image acquisition device can be compensated during subsequent processing of the image data by an inverse transformation. Typically, straight lines that run parallel to each other in the original scene are identified within the image data, and the image data are then corrected by the inverse transformation such that the identified lines are again aligned parallel in the processed image. However, the inverse transformation also distorts the edges of the image data, so the transformed image data must be cropped to avoid slanted edges. The usable image portion is thereby reduced. As a consequence, image areas that were present in the original acquisition and are important for the composition may have to be cropped away during subsequent processing and can no longer be used.

Disclosure of Invention

It is an object of the invention to enable a user of an image acquisition device to conveniently generate image data that, after correction of perspective distortion occurring during acquisition, still comprise all image areas relevant to the composition. This object is achieved by an image acquisition apparatus and a method of operating an image acquisition apparatus according to the independent claims. Further embodiments are specified in the dependent claims.

The image acquisition device has a photoelectric image sensor, a position sensor, a graphical user output interface, and an evaluation unit. The image sensor is configured to acquire an image dataset representing an imaging of a scene located in front of the image sensor onto an image area of the image sensor. The position sensor is configured to detect a spatial position of the image area relative to a reference direction and to provide position data indicative of both a rotation angle and a tilt angle. The rotation angle is the angle by which the image area is rotated, when the image dataset is acquired, about an optical axis of the image sensor, in particular relative to a perpendicular projection of the reference direction onto the image area; the tilt angle is the angle by which the image area is tilted about a horizontal axis, in particular relative to the reference direction, when the image dataset is acquired. The horizontal axis is oriented perpendicular to the optical axis and perpendicular to the reference direction.

The evaluation unit is configured to determine from the position data a projective transformation that maps the image dataset onto a projection plane according to both the rotation and the tilting of the image area. The projection plane is tilted with respect to the image area according to the tilt angle and intersects the image area along an intersection line that is rotated in the image area, according to the rotation angle, with respect to a central axis of the image area. The evaluation unit is further configured to determine an image portion in the projection plane for the image dataset mapped onto the projection plane by the projective transformation, and to display in the graphical user output interface, together with the image portion, at least the area of the scene imaged onto the image area that is located within the image portion.

Within the framework of the invention, a simple and, above all, correct correction of perspective distortion is achieved by automatically detecting the position of the image sensor relative to the acquired scene by mea
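The converging-lines effect described in the Background can be demonstrated numerically with a toy pinhole model: for a camera tilted upward, points higher on a vertical facade are farther away, so the image separation of two parallel verticals shrinks with height. The model, its units, and the tilt-sign convention below are illustrative assumptions:

```python
import math

def image_of_facade_point(x: float, y: float, tau: float, f: float = 1.0):
    """Image coordinates of a point (x, y) on a vertical facade seen by a
    camera tilted upward by angle `tau` (radians). Normalized units; this
    toy model only illustrates the converging-lines effect."""
    z = y * math.sin(tau) + math.cos(tau)  # depth grows with facade height
    return f * x / z, f * (y * math.cos(tau) - math.sin(tau)) / z

# Two parallel verticals at x = -1 and x = +1: their image separation
# shrinks with height when the camera is tilted upward.
tau = 0.3
sep_bottom = image_of_facade_point(1, 0, tau)[0] - image_of_facade_point(-1, 0, tau)[0]
sep_top    = image_of_facade_point(1, 3, tau)[0] - image_of_facade_point(-1, 3, tau)[0]
# sep_top < sep_bottom: the parallel lines converge toward the top
```

The patent's approach removes this distortion by mapping the image data onto a projection plane that compensates the measured tilt, so the verticals become parallel again in the transformed image.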