JP-2022533714-A5

Dates

Publication Date
2023-05-15
Application Date
2020-05-12

Description

In one or more embodiments, the near-left frame and the near-right frame are displayed to the user at a first depth. The far-left frame and the far-right frame may be displayed to the user at a second depth, the second depth being greater than the first depth. The first and second depths correspond to approximately 1.96 and approximately 0.67 diopters, respectively. When the near-left frame, the far-left frame, the near-right frame, and the far-right frame are displayed to the user, the user can perceive a 3D image. The 3D image may correspond to frames of 3D image data. The present invention provides, for example, the following:

(Item 1) A method for displaying a three-dimensional ("3D") image in a blend mode, the method comprising: rendering a frame of 3D image data; analyzing the frame of 3D image data to generate depth data; using the depth data to segment the 3D image data into i) at least one near frame of two-dimensional ("2D") image data corresponding to a near depth and ii) at least one far frame of 2D image data corresponding to a far depth farther from a viewpoint than the near depth; and displaying the near frame at the near depth and the far frame at the far depth, wherein the near frame and the far frame are displayed simultaneously.

(Item 2) The method of item 1, wherein the near depth corresponds to approximately 1.96 diopters and the far depth corresponds to approximately 0.67 diopters.

(Item 3) The method of item 1, wherein the frame of 3D image data comprises depth segmentation data, stereo color pair data, and real-world mesh data.

(Item 4) The method of item 1, wherein analyzing the frame of 3D image data to generate the depth data comprises: generating a disparity map from the frame of 3D image data; and reprojecting the frame of 3D image data.
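The depth-based segmentation of item 1 can be illustrated with a minimal NumPy sketch. This is not from the patent; the per-pixel depth map, the midpoint cutoff between the two display planes, and the function name `segment_by_depth` are all assumptions made for illustration.

```python
import numpy as np

# Display-plane depths in diopters, taken from the description above.
NEAR_PLANE_D = 1.96
FAR_PLANE_D = 0.67

def segment_by_depth(rgb, depth_diopters,
                     cutoff=(NEAR_PLANE_D + FAR_PLANE_D) / 2):
    """Split one rendered RGB frame into a near frame and a far frame.

    The cutoff (midway between the two planes, an assumption) decides
    which frame a pixel belongs to; a closer pixel has a larger diopter
    value. Pixels not assigned to a frame are zeroed.
    """
    near_mask = depth_diopters >= cutoff
    near = np.where(near_mask[..., None], rgb, 0.0)
    far = np.where(~near_mask[..., None], rgb, 0.0)
    return near, far

# Toy 4x4 frame with depths spanning both sides of the cutoff.
rgb = np.random.rand(4, 4, 3)
depth = np.linspace(0.5, 2.5, 16).reshape(4, 4)
near, far = segment_by_depth(rgb, depth)
```

Because the two masks partition the pixels, the near and far frames sum back to the original frame, which is the sense in which the simultaneous display of both frames reproduces the full image.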
(Item 5) The method of item 1, wherein segmenting the 3D image data into the near and far frames of 2D image data using the depth data comprises: identifying a near-only set of virtual objects/pixels having respective depths within a near depth range; identifying a far-only set of virtual objects/pixels having respective depths within a far depth range; and identifying a near-and-far overlapping set of virtual objects/pixels having respective depths within a central depth range.

(Item 6) The method of item 5, wherein segmenting the 3D image data into the near and far frames of 2D image data using the depth data further comprises: adding the near-only set of virtual objects/pixels to the near frame of 2D image data; adding the far-only set of virtual objects/pixels to the far frame of 2D image data; performing a blend analysis on the near-and-far overlapping set of virtual objects/pixels to identify a near overlapping set of virtual objects/pixels and a far overlapping set of virtual objects/pixels; adding the near overlapping set of virtual objects/pixels to the near frame of 2D image data; and adding the far overlapping set of virtual objects/pixels to the far frame of 2D image data.

(Item 7) The method of item 6, wherein the blend analysis comprises linear interpolation, nonlinear interpolation, and/or a plurality of linear interpolations.

(Item 8) A method for displaying a three-dimensional ("3D") image in a blend mode, the method comprising: rendering a frame of 3D image data; analyzing the frame of 3D image data to generate depth data;
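The linear-interpolation variant of the blend analysis in item 7 can be sketched as follows. This is a hypothetical reading, not the patent's implementation: a pixel in the central (overlap) depth range contributes to both frames, with a near-frame weight that varies linearly with its depth between the two display planes.

```python
import numpy as np

# Display-plane depths in diopters, from the description (item 2).
NEAR_D, FAR_D = 1.96, 0.67

def blend_weights(depth_d):
    """Linear near-frame weight in [0, 1] for a depth in diopters.

    A pixel at the near plane gets weight 1 (fully in the near frame),
    a pixel at the far plane gets weight 0 (fully in the far frame),
    and the far-frame weight is always the complement (1 - w).
    """
    w = (depth_d - FAR_D) / (NEAR_D - FAR_D)
    return np.clip(w, 0.0, 1.0)

# A pixel exactly midway between the planes splits 50/50.
w_mid = blend_weights((NEAR_D + FAR_D) / 2)
```

Splitting an overlap pixel's intensity by `w` into the near frame and by `1 - w` into the far frame is one simple way the near overlapping and far overlapping sets of item 6 could be produced.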
using the depth data to segment the 3D image data into a plurality of frames of two-dimensional ("2D") image data; and displaying the plurality of frames, wherein the plurality of frames comprises: a near-left frame of 2D image data corresponding to a near depth; a far-left frame of 2D image data corresponding to a far depth farther from a viewpoint than the near depth; a near-right frame of 2D image data corresponding to the near depth; and a far-right frame of 2D image data corresponding to the far depth, wherein the near-left frame and the far-left frame are displayed simultaneously, and the near-right frame and the far-right frame are displayed simultaneously.

(Item 9) The method of item 8, wherein the near-left frame and the far-left frame are displayed to a left eye of a user.

(Item 10) The method of item 8, wherein the near-right frame and the far-right frame are displayed to a right eye of the user.

(Item 11) The method of item 8, wherein the near-left frame and the near-right frame are displayed to the user at a first depth.

(Item 12) The method of item 11, wherein the far-left frame and the far-right frame are displayed to the user at a second depth, the second depth being greater than the first depth.
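The first and second depths of items 11 and 12 are stated in diopters, which are reciprocal meters, so each display plane corresponds to a fixed viewing distance. This conversion is standard optics, not patent-specific:

```python
def diopters_to_meters(d):
    """Convert optical power in diopters to viewing distance in meters."""
    return 1.0 / d

near_m = diopters_to_meters(1.96)  # about 0.51 m
far_m = diopters_to_meters(0.67)   # about 1.49 m
```

This is why the second depth (approximately 0.67 diopters, about 1.49 m) is greater than the first (approximately 1.96 diopters, about 0.51 m): a smaller diopter value means a farther plane.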