KR-20260067848-A - A PORTABLE DISPLAY DEVICE DISPLAYING 3-DIMENSIONAL IMAGE AND CONTROLLING METHOD THEREOF
Abstract
A portable display device for displaying stereoscopic images and a method for controlling the same may be provided. Specifically, the portable display device may include: a camera; an inertial measurement unit (IMU) sensor; a memory storing at least one instruction; and at least one processor, wherein the at least one processor, by executing the at least one instruction, causes the portable display device to detect the position of a user's eyes through the camera, correct the pose of the user's eyes, acquire IMU information related to the orientation of the portable display device through the IMU sensor, primarily predict the position of the user's eyes for a plurality of future situations, and secondarily predict the position of the user's eyes using the IMU information.
Inventors
- 윤건수
- 원광현
- 오영호
Assignees
- 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Dates
- Publication Date
- 2026-05-13
- Application Date
- 2024-11-06
Claims (20)
- A portable display device (100) for displaying a stereoscopic image, comprising: a camera (110); an inertial measurement unit (IMU) sensor (220); a memory (240) storing at least one instruction; and at least one processor (230), wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: detect the position of a user's eyes through the camera (110); correct the pose of the user's eyes; acquire IMU information related to the orientation of the portable display device (100) through the inertial measurement unit sensor (220); primarily predict the position of the user's eye for a plurality of future situations; and secondarily predict the position of the user's eye using the IMU information.
- The portable display device (100) of claim 1, wherein the IMU information comprises the movement of the portable display device (100), the movement of the user's eyes, and the angle between the portable display device (100) and the user's line of sight.
- The portable display device (100) of claim 1 or claim 2, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: detect the position of the user's eye and the direction of the eye's gaze; calculate a first position of the user's eye on the assumption that the gaze direction of the user's eye is a first axis direction perpendicular to the portable display device (100); correct the first position by correcting the distance in the first axis direction from the portable display device (100) to the first position; and set a second position such that the user's eye is parallel to the portable display device (100).
- The portable display device (100) of any one of claims 1 to 3, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: capture the user using the camera (110); extract a plurality of three-dimensional (3D) landmarks from the captured face of the user; and estimate the pose of the user based on the plurality of 3D landmarks.
- The portable display device (100) of any one of claims 1 to 4, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to predict the speed of the user's eyes.
- The portable display device (100) of any one of claims 1 to 5, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: predict a first future situation at a target time point to be predicted, a second future situation at a time point 0.5 frames before the target time point, and a third future situation at a time point 0.5 frames after the target time point; and predict the position of the user's eye in each of the first future situation, the second future situation, and the third future situation.
- The portable display device (100) of any one of claims 1 to 6, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to correct the primarily predicted eye position using the interpupillary distance (IPD).
- The portable display device (100) of any one of claims 1 to 7, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to apply the IMU information to each of the plurality of future situations and compare the results of the application to obtain the secondarily predicted eye position.
- The portable display device (100) of any one of claims 1 to 8, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: accumulate the IMU information; and correct the position of the eye using the accumulated IMU information.
- The portable display device (100) of any one of claims 1 to 9, wherein the at least one processor (230), by executing the at least one instruction, causes the portable display device (100) to: acquire a plurality of features from the movement of the user; perform compensation for each of the plurality of features; and reflect the result of the compensation in the viewpoint mapping of the stereoscopic image.
- A method for controlling a portable display device that displays a stereoscopic image, the method comprising: detecting the position of a user's eyes through a camera of the portable display device; correcting, by the portable display device, the pose of the user's eyes; acquiring IMU information related to the orientation of the portable display device through an inertial measurement unit (IMU) sensor of the portable display device; primarily predicting, by the portable display device, the position of the user's eye for a plurality of future situations; and secondarily predicting, by the portable display device, the position of the user's eye using the IMU information.
- The method of claim 11, wherein the IMU information comprises the movement of the portable display device, the movement of the user's eyes, and the angle between the portable display device and the user's line of sight.
- The method of claim 11 or claim 12, wherein correcting the pose comprises: detecting the position of the user's eye and the direction of the eye's gaze; calculating a first position of the user's eye on the assumption that the gaze direction of the user's eye is a first axis direction perpendicular to the portable display device; correcting the first position by correcting the distance in the first axis direction from the portable display device to the first position; and setting a second position such that the user's eye is parallel to the portable display device.
- The method of any one of claims 11 to 13, wherein correcting the pose comprises: capturing the user using the camera; extracting a plurality of three-dimensional (3D) landmarks from the captured face of the user; and estimating the pose of the user based on the plurality of 3D landmarks.
- The method of any one of claims 11 to 14, further comprising predicting the speed of the user's eyes.
- The method of any one of claims 11 to 15, wherein primarily predicting the position of the user's eye for the plurality of future situations comprises: predicting a first future situation at a target time point to be predicted, a second future situation at a time point 0.5 frames before the target time point, and a third future situation at a time point 0.5 frames after the target time point; and predicting the position of the user's eye in each of the first future situation, the second future situation, and the third future situation.
- The method of any one of claims 11 to 16, further comprising correcting the primarily predicted eye position using the interpupillary distance (IPD).
- The method of any one of claims 11 to 17, wherein secondarily predicting the position of the user's eye using the IMU information comprises applying the IMU information to each of the plurality of future situations and comparing the results of the application to obtain the secondarily predicted eye position.
- The method of any one of claims 11 to 18, further comprising: accumulating the IMU information; and correcting the position of the eye using the accumulated IMU information.
- The method of any one of claims 11 to 19, further comprising: acquiring a plurality of features from the movement of the user; performing compensation for each of the plurality of features; and reflecting the result of the compensation in the viewpoint mapping of the stereoscopic image.
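Claims 1, 6, and 8 (and their method counterparts 11, 16, and 18) together describe a two-stage prediction: a primary prediction of the eye position for a target time point and for time points 0.5 frames before and after it, followed by a secondary prediction that applies the IMU information to each candidate and compares the results. A minimal sketch of that pipeline follows; the constant-velocity model, the function names, and the use of device translation as a stand-in for relative eye motion are illustrative assumptions, since the claims do not specify these details.

```python
import numpy as np

def primary_predict(eye_pos, eye_vel, target_dt, frame_dt):
    """Primarily predict the eye position for three future situations:
    the target time point and 0.5 frames before/after it (claim 6)."""
    offsets = (-0.5 * frame_dt, 0.0, 0.5 * frame_dt)
    # Constant-velocity extrapolation is an illustrative assumption.
    return [eye_pos + eye_vel * (target_dt + off) for off in offsets]

def secondary_predict(candidates, last_measured, imu_delta):
    """Secondarily predict the eye position by applying the IMU
    information to each future situation and comparing the results
    (claim 8): the candidate closest to the IMU-propagated estimate
    is selected."""
    imu_estimate = last_measured + imu_delta  # eye motion implied by device motion
    errors = [np.linalg.norm(c - imu_estimate) for c in candidates]
    return candidates[int(np.argmin(errors))]
```

For example, with an eye last seen at (0, 0, 300) mm and moving at 60 mm/s along x, the three candidates half a frame apart straddle the target time point, and the IMU-propagated estimate selects the most consistent one.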
Description
A portable display device displaying a 3-dimensional image and a control method thereof

The present disclosure relates to a portable display device for displaying stereoscopic images and a method for controlling the same. Specifically, the present disclosure relates to a technology for reducing crosstalk that occurs when a light field display (LFD) is applied to a portable display device such as a tablet or a smartphone.

A light field display (LFD) can be a 3D display that creates stereoscopic images by generating a light field, represented as a vector distribution of light in space, using a flat display and optical elements. A light field can be a vector function representing the direction of propagation and the intensity of light at every point in 3D space. A light field display can render the depth and sides of an object, enabling more natural stereoscopic images, and can have the characteristic of displaying different information depending on the direction from which the user looks.

To apply light field display technology, it may be necessary to accurately determine where the user's eyes are located. Conventional stereoscopic display devices applying light field display technology used two cameras to determine the position of the user's eyes. Recently, there have been attempts to apply light field displays to portable display devices such as tablets and smartphones. When a light field display is applied to a portable display device, which unlike conventional stereoscopic display devices uses a single camera to determine the position of the user's eyes, it may not be easy to accurately measure the distance between the portable display device and the eyes. If this distance cannot be accurately measured, crosstalk may occur in the stereoscopic image. Additionally, at least one of the portable display device and the user may move during use.
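The single-camera distance problem described above is commonly handled with a pinhole-camera relation between the user's physical interpupillary distance (IPD) and its apparent size in the image, which also motivates the IPD-based correction recited in claims 7 and 17. The sketch below is illustrative only; the 63 mm average IPD and the focal length are assumed values, not details from the disclosure.

```python
def eye_distance_from_ipd(ipd_px, ipd_mm=63.0, focal_px=1500.0):
    """Estimate the camera-to-eyes distance from the interpupillary
    distance (IPD) observed with a single camera, using the pinhole
    model: distance = focal_length * real_IPD / pixel_IPD.
    The 63 mm average IPD and the focal length are assumptions."""
    return focal_px * ipd_mm / ipd_px

# With an assumed 1500 px focal length, a 63 mm IPD spanning 315 px
# implies the eyes are about 300 mm from the camera.
```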
If at least one of the portable display device and the user moves, a technology may be required to predict the position of the user's eyes and display a stereoscopic image that matches the predicted position of the user's eyes. If the position of the user's eyes is not accurately predicted, crosstalk may occur in the stereoscopic image.

- FIG. 1 is a drawing showing a stereoscopic image displayed on a portable display device according to one embodiment, directed toward the user's eyes.
- FIG. 2 is a block diagram showing a portable display device according to one embodiment.
- FIG. 3 is a drawing showing at least one of a portable display device and a user moving according to one embodiment.
- FIG. 4 is a flowchart illustrating a control method for a portable display device according to one embodiment.
- FIG. 5 is a flowchart illustrating in more detail a control method for a portable display device according to one embodiment.
- FIG. 6 is a diagram showing a portable display device according to one embodiment correcting a user's pose.
- FIG. 7 is a flowchart illustrating a method for a portable display device to estimate a user's pose according to one embodiment.
- FIG. 8 is a diagram showing a portable display device according to one embodiment estimating a user's pose.
- FIG. 9 is a diagram showing a portable display device according to one embodiment correcting a user's pose.
- FIG. 10 is a drawing showing a portable display device according to one embodiment correcting a user's pose.
- FIG. 11 is a drawing showing a portable display device according to one embodiment correcting a user's pose.
- FIG. 12 is a diagram showing a portable display device according to one embodiment calculating the speed at which a user's eye moves.
- FIG. 13 is a diagram showing a portable display device according to one embodiment calculating the speed at which a user's eye moves.
- FIG. 14 is a diagram showing a portable display device according to one embodiment calculating the speed at which a user's eye moves.
- FIG. 15 is a diagram showing a portable display device according to one embodiment primarily predicting the position of a user's eyes.
- FIG. 16 is a diagram showing a portable display device according to one embodiment correcting an eye position predicted using the interpupillary distance (IPD).
- FIG. 17 is a diagram showing a portable display device according to one embodiment secondarily predicting the position of the user's eyes.
- FIG. 18 is a flowchart illustrating a method for a portable display device according to one embodiment to display a stereoscopic image in accordance with the position of the user's eyes.
- FIG. 19 is a flowchart illustrating a method in which a portable display device according to one embodiment uses accumulated IMU information to correct the position of a user's eye and displays a stereoscopic image to match the corrected position of the eye.
- FIG. 20 is a diagram showing a portable display device according to one embodiment correcting the position of a user's eyes using accumulated
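Claims 9 and 19 (illustrated by FIGS. 19 and 20) describe accumulating IMU information and using the accumulated information to correct the eye position between camera detections. A minimal sketch under stated assumptions follows; the class and method names, and treating accumulated device translation as opposite relative eye motion, are illustrative assumptions rather than details from the disclosure.

```python
import numpy as np

class ImuAccumulator:
    """Accumulate IMU translation increments between camera frames and
    use the accumulated motion to correct the last detected eye
    position (claims 9 and 19)."""

    def __init__(self):
        self.accumulated = np.zeros(3)

    def add(self, delta_translation):
        """Add one IMU-derived device translation increment."""
        self.accumulated += np.asarray(delta_translation, dtype=float)

    def correct(self, eye_pos):
        # If the device moved by `accumulated`, then relative to the
        # device the eye appears to move in the opposite direction.
        return np.asarray(eye_pos, dtype=float) - self.accumulated

    def reset(self):
        """Clear the accumulator, e.g. after a fresh camera detection."""
        self.accumulated[:] = 0.0
```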