JP-7855154-B2 - Information processing device and information processing method

Inventors

  • 馬場 直哉
  • 羽下 哲司

Assignees

  • 三菱電機モビリティ株式会社 (Mitsubishi Electric Mobility Corporation)

Dates

Publication Date
2026-05-07
Application Date
2024-02-27

Claims (6)

  1. An information processing device comprising: an image acquisition unit that acquires a captured image of a person from an imaging device; an image rotation unit that rotates the captured image acquired by the image acquisition unit toward a reference angle by the amount of the installation angle of the imaging device relative to the reference angle; a face region detection unit that detects the region of the person's face from the captured image rotated by the image rotation unit; a reference region detection unit that detects, as a reference region, a non-tilted rectangular region inscribed in the region of the person's face from the captured image acquired by the image acquisition unit, based on the region of the person's face detected by the face region detection unit; and an information output unit that outputs, to the imaging device, information indicating the reference region or the average brightness in the reference region, based on the reference region detected by the reference region detection unit.
  2. The information processing device according to claim 1, wherein the reference region detection unit detects, as the reference region, a non-tilted rectangular region that is similar in shape to the non-tilted rectangular region circumscribing the region of the person's face and is inscribed in the region of the person's face, from the captured image acquired by the image acquisition unit, based on the region of the person's face detected by the face region detection unit.
  3. The information processing device according to claim 1, further comprising an accessory detection unit that detects whether the person is wearing an accessory based on the captured image acquired by the image acquisition unit, wherein, when the accessory detection unit detects that the person is wearing an accessory, the reference region detection unit excludes the region of the accessory from the reference region detection target in the captured image acquired by the image acquisition unit.
  4. The information processing device according to claim 3, wherein the accessory to be detected by the accessory detection unit is a mask.
  5. The information processing device according to claim 1, further comprising a person state detection unit that detects the state of the person based on the region of the person's face detected by the face region detection unit.
  6. An information processing method comprising: a step in which an image acquisition unit acquires a captured image of a person from an imaging device; a step in which an image rotation unit rotates the captured image acquired by the image acquisition unit toward a reference angle by the amount of the installation angle of the imaging device relative to the reference angle; a step in which a face region detection unit detects the region of the person's face from the captured image rotated by the image rotation unit; a step in which a reference region detection unit detects, as a reference region, a non-tilted rectangular region inscribed in the region of the person's face from the captured image acquired by the image acquisition unit, based on the region of the person's face detected by the face region detection unit; and a step in which an information output unit outputs, to the imaging device, information indicating the reference region or the average brightness in the reference region, based on the reference region detected by the reference region detection unit.
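The geometric core of claims 1, 2, and 6 — a non-tilted rectangle that is similar in shape to the circumscribing rectangle and inscribed in the tilted face region — can be sketched as follows. This is a minimal illustration under our own assumptions: the function and variable names are invented, the face region is modeled as an ideal tilted rectangle, and the patent's actual construction (e.g. via the two intersection points of Fig. 6B) may differ.

```python
import math

def inscribed_axis_aligned_rect(cx, cy, w, h, theta_deg):
    """Model the face region as a w x h rectangle centered at (cx, cy) and
    tilted by theta_deg (the camera's roll angle). Return the largest
    non-tilted rectangle that (a) is similar in shape to the axis-aligned
    rectangle circumscribing the face region and (b) is inscribed in the
    face region. Result is (cx, cy, width, height)."""
    t = math.radians(theta_deg)
    c, s = abs(math.cos(t)), abs(math.sin(t))
    # Circumscribing axis-aligned rectangle (cf. Fig. 6A).
    W = w * c + h * s
    H = w * s + h * c
    # Largest scale k such that a (k*W) x (k*H) axis-aligned rectangle
    # centered on the face stays inside the tilted face rectangle: every
    # corner (+-k*W/2, +-k*H/2), expressed in the face's rotated frame,
    # must satisfy |x'| <= w/2 and |y'| <= h/2.
    k = min(w / (W * c + H * s), h / (W * s + H * c))
    return cx, cy, k * W, k * H
```

As sanity checks, a zero tilt returns the face rectangle itself, and a 2x2 square tilted by 45 degrees yields an inscribed axis-aligned square of side sqrt(2).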

Description

This disclosure relates to an information processing device and an information processing method.

Conventionally, in face detection or face recognition using images acquired by an imaging device, an area within the face region of the image is set as the reference region used for optimal exposure control of the image. Typically, the reference region is a rectangular area without tilt relative to the image. However, if the installation angle (roll angle) of the imaging device is tilted due to its installation conditions, the face region in the captured image may also be tilted. When the face region is tilted in this way, the background region encroaches on the reference region. If exposure control is performed using a reference region that includes the background region in addition to the face region, ideal exposure control cannot be achieved. For example, at night the background region is darker than the face region, so the average brightness in the reference region becomes lower than expected, and exposure control may overexpose the captured image. Therefore, even when the face region is tilted, it is necessary to detect an ideal reference region that does not include the background region. In response, the imaging device disclosed in Patent Document 1, for example, has a configuration that keeps the background out of the photometric area even when the face is tilted: it detects the angle of the face and tilts the photometric area according to that angle.

Patent Document 1: Japanese Patent Publication No. 2010-200057

Brief description of the drawings:

  • Fig. 1 shows an example configuration of an occupant status detection system equipped with the information processing device according to Embodiment 1.
  • Fig. 2 shows an example of the configuration of the information processing device according to Embodiment 1.
  • Fig. 3 is a flowchart showing an example of the operation of the information processing device according to Embodiment 1.
  • Figs. 4A and 4B show examples of the operation of the image rotation unit in Embodiment 1: Fig. 4A shows an example of an image acquired by the image acquisition unit (before rotation by the image rotation unit), and Fig. 4B shows the image after rotation by the image rotation unit.
  • Fig. 5 shows an example of the operation of the face region detection unit in Embodiment 1.
  • Figs. 6A to 6C show examples of the operation of the reference region detection unit in Embodiment 1: Fig. 6A shows detection of the circumscribing rectangular region, Fig. 6B shows detection of two intersection points, and Fig. 6C shows detection of the inscribed rectangular region.
  • Fig. 7 shows an example of the configuration of the information processing device according to Embodiment 2.
  • Fig. 8 is a flowchart showing an example of the operation of the information processing device according to Embodiment 2.
  • Figs. 9A and 9B show examples of the operation of the information processing device according to Embodiment 2: Fig. 9A shows detection of a reference region when the person is not wearing an accessory, and Fig. 9B when the person is wearing an accessory (a mask).
  • Figs. 10A and 10B show examples of the hardware configuration of the information processing devices according to Embodiments 1 and 2.
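To see why a reference region that spills onto the background skews exposure control, consider the night-time case described above: the dark background pulls the average brightness of the reference region below that of the face alone, prompting the camera to overexpose. A toy NumPy sketch (the image values, sizes, and helper name are invented for illustration):

```python
import numpy as np

def mean_brightness(image, rect):
    """Mean luminance inside an axis-aligned rectangle (x, y, w, h)."""
    x, y, w, h = rect
    return float(image[y:y + h, x:x + w].mean())

# Toy night scene: dark background (luma 20) with a brighter 40x40 face patch.
frame = np.full((120, 120), 20, dtype=np.uint8)
frame[40:80, 40:80] = 150  # the face region

face_only = mean_brightness(frame, (40, 40, 40, 40))        # exactly the face
with_background = mean_brightness(frame, (30, 30, 60, 60))  # face + dark surround

# The background-contaminated average is lower, so brightness-based
# exposure control would drive the exposure up and overexpose the face.
assert with_background < face_only
```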
The embodiments will be described in detail below with reference to the drawings.

Embodiment 1.

Figure 1 shows an example of the configuration of an occupant status detection system equipped with the information processing device 2 according to Embodiment 1. The occupant status detection system according to Embodiment 1 is a system for detecting the status of occupants in a vehicle. The occupant status detected by the system includes information related to face detection or face recognition. Information related to face detection includes, for example, whether there are occupants, whether an occupant is dozing off, or whether an occupant is distracted. Information related to face recognition includes, for example, who the occupants are. As shown in Figure 1, the occupant status detection system includes, for example, an imaging device 1 and an information processing device 2. In this example, the imaging device 1 and the information processing device 2 are shown as applied to an occupant status detection system; however, they are not limited to this and may be applied to other systems as well. Furthermore, this example shows the case where the imaging device 1 and the information processing device 2 are separate components. However, the system is not limited to this configuration, and the imaging device 1 and the information proces