
JP-7854914-B2 - Remote monitoring system

JP 7854914 B2

Inventors

  • 久保(葛谷) 麻未
  • 石塚 博基
  • 海野 和也
  • 山田 貴洋

Assignees

  • 株式会社東海理化電機製作所

Dates

Publication Date
2026-05-07
Application Date
2022-09-30

Claims (11)

  1. A remote monitoring system comprising: a mobile body including four or more imaging units, each having an imaging range set so that adjacent imaging areas overlap to form overlapping areas, a composite image generation unit that combines the four or more images captured by the four or more imaging units by superimposing the overlapping areas to generate a composite image covering the entire horizontal periphery of the mobile body, and a transmission unit that transmits the composite image generated by the composite image generation unit; and a remote monitoring device including a receiving unit that receives the composite image transmitted from the mobile body, a display unit, and a display image generation unit that generates, from the composite image received by the receiving unit, a display image to be displayed on the display unit, wherein the display image generation unit generates the display image in a viewpoint or display mode desired by the monitor observing the display unit of the remote monitoring device, and superimposes on the display image an image indicating the area in the direction the monitor wants to view, allowing the monitor to change that direction by moving the superimposed image.
  2. The remote monitoring system according to claim 1, wherein the display image generation unit generates, as the display image, an overhead view image from a viewpoint desired by the monitor.
  3. The remote monitoring system according to claim 1, wherein the display image generation unit generates the display image by cropping the composite image in the direction desired by the monitor.
  4. A remote monitoring system comprising: a mobile body including four or more imaging units, each having an imaging range set so that adjacent imaging areas overlap to form overlapping areas, a composite image generation unit that combines the four or more images captured by the four or more imaging units by superimposing the overlapping areas to generate a composite image covering the entire horizontal periphery of the mobile body, and a transmission unit that transmits the composite image generated by the composite image generation unit; and a remote monitoring device including a receiving unit that receives the composite image transmitted from the mobile body, a display unit, and a display image generation unit that generates, from the composite image received by the receiving unit, a display image to be displayed on the display unit in accordance with the mode set on the mobile body, wherein the mobile body is capable of autonomous driving and allows setting of at least a normal driving mode and an abnormal driving mode, and the display image generation unit changes the appearance of the display image between the normal driving mode and the abnormal driving mode, generating the display image based on the composite image received by the receiving unit when the normal driving mode is set, and based on notification information from the automated driving system and information detected by sensors mounted on the mobile body when the abnormal driving mode is set.
  5. The remote monitoring system according to claim 4, wherein the mobile body allows setting of a recovery driving mode in which it starts moving again after stopping, and the display image generation unit generates the display image so that the area below the mobile body is visible when the recovery driving mode is set.
  6. The remote monitoring system according to claim 1 or 4, wherein each of the four or more captured images is set to include a part of the mobile body.
  7. The remote monitoring system according to claim 1 or 4, wherein the mobile body further includes a lower imaging unit whose imaging range is set so that the area below the mobile body, including a part of the mobile body, becomes its imaging area, and the composite image generation unit also combines the image captured by the lower imaging unit into the composite image.
  8. The remote monitoring system according to claim 1 or claim 4 , wherein the composite image is a 360-degree spherical image.
  9. The remote monitoring system according to claim 1 or 4 , wherein the display image generation unit generates the display image based on the driving status of the moving body.
  10. The remote monitoring system according to claim 1 , wherein the display image generation unit generates the display image according to the mode set on the mobile body.
  11. The remote monitoring system according to claim 1 or claim 4 , wherein the composite image has a vertical-to-horizontal ratio of 1:2.
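The 1:2 vertical-to-horizontal ratio of claim 11 is characteristic of an equirectangular projection, which covers 360 degrees horizontally and 180 degrees vertically. The patent does not give an implementation; the following is a minimal illustrative sketch of how a monitor-selected viewing direction (as in claims 1 and 3) could map to a position in such a composite. The function name, parameters, and image size are assumptions for illustration only, not from the source.

```python
def direction_to_equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw in [-180, 180] degrees, pitch in
    [-90, 90] degrees) to pixel coordinates in an equirectangular
    composite image. Covering 360 degrees horizontally and 180 degrees
    vertically is why the composite's vertical-to-horizontal ratio is 1:2."""
    u = (yaw_deg + 180.0) / 360.0 * (width - 1)    # column: left edge = -180 deg
    v = (90.0 - pitch_deg) / 180.0 * (height - 1)  # row: top edge = +90 deg
    return u, v

# Example: a 2048 x 1024 composite (1:2 ratio); looking straight ahead
# (yaw 0, pitch 0) maps to the centre of the image.
u, v = direction_to_equirect_pixel(0.0, 0.0, 2048, 1024)
```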

Description

Applicable under Article 30, Paragraph 2 of the Patent Act: published on June 21, 2022 at http://www.tokai-rika.co.jp/topics/2022/220621.pdf, and exhibited at the Automotive Engineering Exposition 2022 Nagoya from June 29, 2022 to July 1, 2022.

This invention relates to a remote monitoring system. Patent Document 1 discloses a vehicle remote control system that generates a composite image (an overhead view) of the area surrounding a vehicle as seen from a virtual viewpoint, based on multiple images captured by multiple cameras mounted on the vehicle. In this vehicle remote control system, the vehicle generates a display image based on the overhead view and transmits it to a remote monitoring device (operation terminal), where it is shown on the remote monitoring device's monitor.

Patent Document 1: Japanese Patent Publication No. 2019-156299

Brief description of the drawings:

  • Figure 1: Schematic diagram showing the general configuration of a remote monitoring system according to one embodiment of the present invention.
  • Figure 2: Block diagram showing the hardware configuration of the remote monitoring system.
  • Figure 3: Positions where the cameras are placed on the vehicle and the images captured at each position.
  • Figure 4: Explanatory diagram for how the captured images are combined.
  • Figure 5: Explanatory diagram for equirectangular composite images.
  • Figure 6: Explanatory diagram illustrating the conversion of a 360-degree image from equirectangular to spherical format.
  • Figures 7 to 12: Examples of display images shown on the display unit of the remote monitoring device.

A remote monitoring system 10 according to one embodiment of the present invention will be described using Figures 1 to 12. In the following description, the directions front, rear, left, right, up, and down refer to the corresponding directions of the vehicle. Figure 1 is a schematic diagram showing the configuration of the remote monitoring system 10 according to this embodiment, and Figure 2 is a block diagram showing its hardware configuration. As shown in Figure 1, the remote monitoring system 10 of this embodiment includes a vehicle 20 as a mobile body, a remote monitoring device 30, and a network 40.

The vehicle 20 is an autonomous, self-driving vehicle and, in this embodiment, is configured to carry a passenger. As shown in Figure 2, the vehicle 20 includes a camera 22, an image processing device 24, a vehicle control device 26, and a communication unit 28. In this embodiment, the camera 22 comprises, as an example, nine cameras: a front camera 22A, a right-front camera 22B, a right-side camera 22C, a right-rear camera 22D, a rear camera 22E, a left-rear camera 22F, a left-side camera 22G, a left-front camera 22H, and a downward camera 22J. These nine cameras 22A to 22J each correspond to an imaging unit of the present invention and, as an example, each uses a circular fisheye lens with a field of view larger than 180 degrees. An image captured through a circular fisheye lens is a fisheye image, in which subjects near the center of the imaging range appear large while subjects at its edges appear small.
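The description does not state which fisheye projection model the lenses follow. Assuming the common equidistant model for circular fisheyes, in which a pixel's distance from the image centre is proportional to the ray's angle from the optical axis, the size falloff described above can be sketched as follows. All names and the 190-degree field-of-view figure are illustrative assumptions, not values from the source.

```python
import math

def fisheye_pixel_to_angle(px, py, cx, cy, pixels_per_degree):
    """Under the (assumed) equidistant fisheye model, a pixel's radial
    distance from the image centre (cx, cy) is proportional to the angle
    between its ray and the optical axis. Subjects near the centre thus
    occupy more pixels per degree than those at the rim, matching the
    appearance described above."""
    r = math.hypot(px - cx, py - cy)                   # radial distance in pixels
    theta = r / pixels_per_degree                      # angle from optical axis, degrees
    phi = math.degrees(math.atan2(py - cy, px - cx))   # azimuth in the image plane
    return theta, phi

# Example: a hypothetical 190-degree fisheye whose image circle has a
# 950 px radius, so 95 degrees off-axis spans 950 px.
ppd = 950 / 95.0                                       # pixels per degree off-axis
theta, phi = fisheye_pixel_to_angle(1450, 500, 500, 500, ppd)
```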
The imaging range is represented as a circular image. In this embodiment, a "captured image" may be a still image or a video. Figure 3 shows the positions of the nine cameras 22A to 22J on the vehicle 20 and the images captured at each position. As shown in Figure 3, the front camera 22A is mounted on the front of the vehicle 20, and the image P1 it captures shows the front of the vehicle 20. The right-front camera 22B is mounted on the right front of the vehicle 20, and the image P2 it captures shows the right front of the vehicle 20. The right-side camera 22C is mounted on the right side of the vehicle 20, and the image P3 it captures shows the right side of the vehicle 20. The right-rear camera 22D is mounted on the right rear of the vehicle 20, and the image P4 it captures shows the right rear of the vehicle 20. The rear camera 22E is mounted on the rear of the vehicle 20, and the image P5 captured by the rear ca