
JP-7856460-B2 - Control device, control method, and control program

JP 7856460 B2

Inventors

  • 髙橋 宏輔 (Kosuke Takahashi)

Assignees

  • 富士フイルム株式会社 (FUJIFILM Corporation)

Dates

Publication Date
2026-05-11
Application Date
2022-03-25

Claims (9)

  1. A control device comprising at least one processor, wherein, when a first user observes an image captured by a camera of a glasses-type display device using the glasses-type display device and a second user observes the captured image using a display device different from the glasses-type display device, the processor performs control to output, to the glasses-type display device, a first image obtained by imaging controlled with reference to a first region within the captured image, and to output, to the display device, a second image obtained by imaging controlled with reference to a second region within the captured image.
  2. The control device according to claim 1, wherein, as the imaging control, the processor controls the camera to capture a single image with a dynamic range suited to capturing both a subject appearing in the first region and a subject appearing in the second region, and performs control to output the first image and the second image, each corresponding to the single captured image.
  3. The control device according to claim 2, wherein the processor generates the first image by adjusting brightness values of the single captured image for the glasses-type display device, and generates the second image by adjusting brightness values of the single captured image for the display device.
  4. The control device according to claim 1, wherein, as the imaging control, the processor sets a focus position of an optical system of the camera between a first focus position at which a subject appearing in the first region is in focus and a second focus position at which a subject appearing in the second region is in focus, controls the camera to capture a single image with the focus position set such that the subject in the first region and the subject in the second region are both within the depth of field, and performs control to output the first image and the second image, each corresponding to the single captured image.
  5. The control device according to claim 1, wherein the first image is a first captured image obtained by controlling an exposure of the camera with reference to a subject appearing in the first region, and the second image is a second captured image obtained by controlling the exposure of the camera with reference to a subject appearing in the second region.
  6. The control device according to any one of claims 1 to 5, wherein the second image is a partial image obtained by cutting out the second region from the captured image.
  7. The control device according to any one of claims 1 to 6, wherein the processor determines the first region based on a gaze of the first user detected by a gaze detection device.
  8. A control method performed by a processor provided in a control device, the method comprising: when a first user observes an image captured by a camera of a glasses-type display device using the glasses-type display device and a second user observes the captured image using a display device different from the glasses-type display device, outputting, to the glasses-type display device, a first image obtained by imaging controlled with reference to a first region within the captured image, and outputting, to the display device, a second image obtained by imaging controlled with reference to a second region within the captured image.
  9. A control program causing a processor provided in a control device to execute processing comprising: when a first user observes an image captured by a camera of a glasses-type display device using the glasses-type display device and a second user observes the captured image using a display device different from the glasses-type display device, outputting, to the glasses-type display device, a first image obtained by imaging controlled with reference to a first region within the captured image, and outputting, to the display device, a second image obtained by imaging controlled with reference to a second region within the captured image.
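The idea in claims 2 and 3 can be sketched in code. The following is a minimal illustrative sketch, not the patented implementation: it assumes a single capture whose dynamic range covers the subjects in both regions, then derives a device-specific image by adjusting brightness with reference to each region. All function names, the rectangular region format, and the simple gain-based brightness adjustment are assumptions introduced for illustration.

```python
def region_mean(image, region):
    """Mean brightness of a rectangular region (x0, y0, x1, y1) in a row-major grayscale image."""
    x0, y0, x1, y1 = region
    vals = [px for row in image[y0:y1] for px in row[x0:x1]]
    return sum(vals) / len(vals)

def adjust_for_display(image, region, target=128.0):
    """Scale brightness so the region of interest sits near a target level, clipped to 8 bits."""
    gain = target / max(region_mean(image, region), 1e-6)
    return [[min(255, px * gain) for px in row] for row in image]

def make_display_images(captured, first_region, second_region):
    """Derive the first image (for the glasses) and the second image (for the
    separate display) from one captured frame, per-region brightness adjusted."""
    first_image = adjust_for_display(captured, first_region)
    second_image = adjust_for_display(captured, second_region)
    return first_image, second_image
```

With a frame whose first region is dark and second region is bright, each output image brings its own region of interest toward the mid-tone level while the other region clips or darkens, which is the point of generating two images from the one capture.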

Description

This disclosure relates to a control device, a control method, and a control program.

Conventionally, technologies are known that allow users other than a specific user to view the images that the specific user observes through a glasses-type display device. For example, Patent Document 1 describes a technology for sharing content within virtual-reality content viewed by a player wearing a head-mounted display with other viewers; the images provided to the viewers are scaled up or down depending on the viewers' positions.

[Patent Document 1] JP 2019-516159 A (published Japanese translation of a PCT application)

Brief description of the drawings:

  • A configuration diagram showing an example of the configuration of an image display system according to an embodiment.
  • A diagram showing an example of the configuration of the AR glasses and the smartphone.
  • A perspective view showing an example of the AR glasses according to an embodiment.
  • A diagram illustrating an image captured and viewed through the AR glasses.
  • A block diagram showing an example of the hardware configuration of a smartphone according to an embodiment.
  • A diagram illustrating perception of the real world through the displays of the AR glasses and the image display device.
  • A block diagram showing an example of the configuration of a smartphone according to an embodiment.
  • Diagrams illustrating the first and second images of the first embodiment.
  • A block diagram showing an example of the hardware configuration of an image display device according to an embodiment.
  • A block diagram showing an example of the configuration of an image display device according to an embodiment.
  • A flowchart showing an example of control processing performed by the smartphone and display control processing performed by the image display device according to the first embodiment.
  • Diagrams illustrating the first and second images of the second embodiment.
  • A flowchart showing an example of control processing performed by the smartphone and display control processing performed by the image display device according to the second embodiment.
  • Diagrams illustrating the first and second images of the third embodiment.
  • A flowchart showing an example of control processing performed by the smartphone and display control processing performed by the image display device according to the third embodiment.

The following describes in detail, with reference to the drawings, examples of embodiments for carrying out the technology of this disclosure.

[First Embodiment]

The configuration of the image display system 1 of this embodiment will be described with reference to Figure 1. As shown in Figure 1, the image display system 1 comprises AR (Augmented Reality) glasses 10, a smartphone 12, and an image display device 14. The smartphone 12 and the image display device 14 are connected via a network 19 by wired or wireless communication. The AR glasses 10 of this embodiment are an example of the glasses-type display device of this disclosure, and the smartphone 12 of this embodiment is an example of the control device of this disclosure. Furthermore, the image display device 14 of this embodiment is an example of a display device different from the glasses-type display device of this disclosure.

The image display system 1 of this embodiment has the function of displaying an image of the real world captured by the camera 27 of the AR glasses 10 on both the display 20 of the AR glasses 10 and the display 56 of the image display device 14. That is, according to the image display system 1 of this embodiment, the user of the image display device 14 can also see the real world as seen by the user of the AR glasses 10. Hereinafter, the user of the AR glasses 10 is referred to as the "first user," the user of the image display device 14 as the "second user," and both collectively, without distinction, simply as "users."
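The dual-output flow just described, combined with the cropping variant of claim 6, can be sketched as follows. This is a hedged illustration under assumed names: `route_frame`, the row-major image representation, and the `(x0, y0, x1, y1)` region format are all assumptions for the sketch, not structures named in this disclosure. One frame captured by the glasses' camera yields a first image for the AR glasses and, for the separate image display device, a second image cut out of the second region.

```python
def crop(image, region):
    """Cut out region (x0, y0, x1, y1) from a row-major image."""
    x0, y0, x1, y1 = region
    return [row[x0:x1] for row in image[y0:y1]]

def route_frame(captured, second_region):
    """Return (first_image, second_image) for one captured frame:
    the glasses get the full captured view; the display device gets
    a partial image cropped to the second region (as in claim 6)."""
    first_image = captured
    second_image = crop(captured, second_region)
    return first_image, second_image
```

A frame router like this would sit on the control device (the smartphone in this embodiment), forwarding the second image over the network to the image display device.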
The configuration of the AR glasses 10 and the smartphone 12 of this embodiment will be described with reference to Figure 2. The AR glasses 10 are a device that enables the user to view images corresponding to projected images by projecting a left-eye projection image from a left-eye OLED (Organic Light Emitting Diode) 26L onto a left-eye lens 22L, and projecting a right-eye projection image from a right-eye OLED 26R onto a right-eye lens 22R. In the following, the "left-eye projection image" and the "right-eye projection image" are collectively referred to as the "projection image." Figure 3 shows a perspective view of an example of the AR glasses 10 of this embodiment. As shown in Figures 2 and 3, the AR glasses 10 comprise the display 20, the left-eye OLED 26L, the right-eye OLED 26R, the camera 27, and a gaze detection sensor 28. The display 20 includes a left-eye lens 22L and a left-eye light guide plate 24L, and a right-eye lens 22R and a right-eye light guide plate 24R. Light corresponding to the left-eye projection image projected