KR-102964656-B1 - pupil tracking device and controlling method for the same

KR 102964656 B1

Abstract

The present invention relates to a binocular pupil tracking device and a control method thereof. Images of both eyes of a VR/AR glasses wearer are acquired as a single binocular mixed image using a beam-splitter optical system and an X-Cube optical system, and the mixed image is then processed and restored as a binarized image. This reduces the number of expensive camera sensors from two to one, lowering power consumption and significantly reducing the manufacturing cost of the VR/AR glasses device. Furthermore, because the binarized binocular image for eye tracking is obtained by applying an eye-tracking algorithm to the mixed image from a single camera sensor, the desynchronization that frequently occurs between two physical cameras does not arise, minimizing eye-tracking errors and maximizing the usability of the VR/AR glasses.
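The abstract's core idea is that each eye is encoded in a different color channel of one frame (blue for the right eye via the R-side X-cube, red for the left eye via the L-side X-cube), so a single camera frame can be split back into two per-eye images. The patent gives no code; the sketch below is an illustrative assumption of how that channel separation could look, with all names hypothetical:

```python
import numpy as np

def split_mixed_frame(mixed_bgr: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an H x W x 3 BGR frame into right-eye and left-eye images.

    Assumes (per the abstract) the right eye occupies the blue channel
    and the left eye the red channel of the mixed frame.
    """
    right_eye = mixed_bgr[:, :, 0]  # blue channel: R-side X-cube output
    left_eye = mixed_bgr[:, :, 2]   # red channel: L-side X-cube output
    return right_eye, left_eye

# Tiny synthetic mixed frame: uniform blue = 50, uniform red = 200.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:, :, 0] = 50   # simulated right-eye (blue) intensity
frame[:, :, 2] = 200  # simulated left-eye (red) intensity
right, left = split_mixed_frame(frame)
```

Because both per-eye images come from the same sensor exposure, they are synchronized by construction, which is the point the abstract makes against a two-camera design.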

Inventors

  • 박지용
  • 구정식
  • 김수민

Assignees

  • 재단법인 구미전자정보기술원

Dates

Publication Date
2026-05-12
Application Date
2023-12-08

Claims (14)

  1. A binocular pupil tracking device comprising: a VR/AR device that a wearer wears on the ears or head to view VR/AR content, and that detects the pupils of the right and left eyes and outputs their color information mixed into one so that the wearer's gaze position can be determined; and a gaze position determination unit comprising a channel separation unit that separates the binocular mixed color information detected by the VR/AR device, according to a set procedure, into right-eye pupil color information and left-eye pupil color information, and that determines the wearer's current gaze position by applying an eye-tracking algorithm to the separated right-eye and left-eye pupil color information, wherein the VR/AR device includes: a device body that the wearer can wear on the ears or head to view VR/AR content; a white light source unit installed at the center of the device body facing the wearer, emitting white light toward both of the wearer's eyes; a beam splitter mounted inside the device body that receives the light reflected from both eyes under the white light, reflecting a portion of that light and transmitting the remainder; an R-side X-cube installed at a position able to receive light reflected from the wearer's right eye via the beam splitter, which uses a built-in filter to separate and output only the blue-wavelength image from the light received from the right eye; an L-side X-cube installed at a position able to receive light reflected from the wearer's left eye via the beam splitter, which uses a built-in filter to separate and output only the red-wavelength image from the light received from the left eye; and a single camera sensor, provided as exactly one, installed at the center of the device body opposite the wearer's right and left eyes, which simultaneously mixes the blue-wavelength image and the red-wavelength image output from the R-side X-cube and the L-side X-cube, respectively, to detect a single binocular mixed image, and wherein the gaze position determination unit further includes a channel separation module that separates the binocular mixed pupil image or binocular mixed color information detected as one by the single camera sensor, according to a set procedure, into right-eye pupil color information and left-eye pupil color information channels.
  2. (Deleted)
  3. (Deleted)
  4. The binocular pupil tracking device of claim 1, wherein the single camera sensor further includes a memory unit that stores the binocular mixed pupil image in the form of color information, obtained by mixing the infrared signals respectively reflected from the right and left eyes and detecting them as a single image.
  5. (Deleted)
  6. The binocular pupil tracking device of claim 1, wherein the gaze position determination unit further includes a binarization processing module that processes the right-eye pupil color information and the left-eye pupil color information, each separated by the channel separation module, into binarized signals using an eye-tracking algorithm and outputs them.
  7. The binocular pupil tracking device of claim 6, wherein the gaze position determination unit further includes a gaze position determination module that identifies the position of the left pupil and the position of the right pupil by applying an eye-tracking algorithm to the right-eye and left-eye pupil signals binarized by the binarization processing module, calculates the difference between the identified pupil positions, and outputs a result value indicating where the wearer's gaze is currently directed.
  8. A binocular pupil tracking device comprising: a VR/AR device including a device body that a wearer can wear on the ears or head to view VR/AR content; a white light source unit installed at the center of the device body facing the wearer, emitting white light toward both of the wearer's eyes; a beam splitter mounted inside the device body that receives the light reflected from both eyes under the white light, reflecting a portion of that light and transmitting the remainder; an R-side X-cube installed at a position able to receive light reflected from the wearer's right eye via the beam splitter, which uses a built-in filter to separate and output only the blue-wavelength image from the light received from the right eye; an L-side X-cube installed at a position able to receive light reflected from the wearer's left eye via the beam splitter, which uses a built-in filter to separate and output only the red-wavelength image from the light received from the left eye; and a single camera sensor installed at the center of the device body opposite the wearer's right and left eyes, configured to detect a single binocular mixed image by simultaneously mixing the blue-wavelength image and the red-wavelength image output from the R-side X-cube and the L-side X-cube, respectively; and a gaze position determination unit that separates the binocular mixed color information detected by the VR/AR device into channels according to a set procedure to obtain right-eye pupil color information and left-eye pupil color information, binarizes the separated right-eye and left-eye pupil color information using an eye-tracking algorithm, and compares them to determine the wearer's current gaze position, wherein the gaze position determination unit further includes a channel separation module that separates the binocular mixed pupil image or binocular mixed color information detected as one by the single camera sensor, according to a set procedure, into right-eye and left-eye pupil color information channels.
  9. A control method for a binocular pupil tracking device, comprising: a first step in which a wearer wears a VR/AR device on the ears or head to view VR/AR content; a second step in which, after the first step, the VR/AR device detects the pupils of the right and left eyes and outputs binocular color information mixed into one so that the wearer's gaze position can be determined; and a third step in which, after the second step, a gaze position determination unit separates the binocular mixed color information detected by the VR/AR device into channels according to a set procedure to obtain right-eye pupil color information and left-eye pupil color information, and determines the wearer's current gaze position by applying an eye-tracking algorithm to each of the separated right-eye and left-eye pupil color information, wherein the third step further includes a channel separation step in which the gaze position determination unit, through a channel separation module, separates the binocular mixed pupil image or binocular mixed color information detected as one by a single camera sensor, according to a set procedure, into right-eye pupil color information and left-eye pupil color information.
  10. The control method of claim 9, wherein the second step further comprises a binocular coherence image extraction step in which a white light source unit installed at the center of a device body facing the wearer emits white light toward both of the wearer's eyes; a beam splitter, upon receiving the light reflected from both eyes under the white light, reflects half of the light amount and outputs only the remaining half; and an R-side X-cube and an L-side X-cube linked to the beam splitter use built-in filters to separate only the blue-wavelength and red-wavelength images from the light received via the beam splitter from the right eye and left eye, respectively, and output them respectively.
  11. The control method of claim 9, wherein the second step further comprises a binocular mixed image extraction step in which a single camera sensor simultaneously mixes the blue-wavelength image and the red-wavelength image output from the R-side X-cube and the L-side X-cube, respectively, to detect a single binocular mixed image.
  12. (Deleted)
  13. The control method of claim 9, wherein the third step further includes a binarization processing step in which the gaze position determination unit processes the right-eye pupil color information and the left-eye pupil color information, each separated into channels, into binarized signals through a binarization processing module using an eye-tracking algorithm, and outputs them.
  14. The control method of claim 9, wherein the third step further includes a gaze position determination step in which the gaze position determination unit, through a gaze position identification module, identifies the position of the left pupil and the position of the right pupil by applying an eye-tracking algorithm to the right-eye and left-eye pupil signals binarized by the binarization processing module, calculates the difference between the identified pupil positions, and outputs a result value indicating where the wearer's gaze is currently directed.
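Claims 6, 7, 13, and 14 describe a pipeline of binarizing each eye's image and then comparing the two pupil positions. The patent does not disclose the algorithm itself; the following is a minimal illustrative sketch, assuming a simple dark-pixel threshold and centroid as the pupil detector (all function names and the threshold value are hypothetical):

```python
import numpy as np

def binarize(eye: np.ndarray, thresh: int = 60) -> np.ndarray:
    # The pupil is darker than the surrounding iris/sclera, so pixels
    # below the threshold become 1 (pupil candidate) and the rest 0.
    return (eye < thresh).astype(np.uint8)

def pupil_center(binary: np.ndarray) -> tuple[float, float]:
    # Centroid of the pupil-candidate pixels as (x, y).
    ys, xs = np.nonzero(binary)
    return float(xs.mean()), float(ys.mean())

def gaze_offset(right_eye: np.ndarray, left_eye: np.ndarray) -> tuple[float, float]:
    # Difference between the two identified pupil positions, the
    # "result value" the gaze position determination module outputs.
    rx, ry = pupil_center(binarize(right_eye))
    lx, ly = pupil_center(binarize(left_eye))
    return rx - lx, ry - ly

# Synthetic eyes: bright background (200) with a dark 2x2 "pupil" (10).
right = np.full((10, 10), 200, dtype=np.uint8)
right[4:6, 6:8] = 10   # pupil centroid near x = 6.5, y = 4.5
left = np.full((10, 10), 200, dtype=np.uint8)
left[4:6, 2:4] = 10    # pupil centroid near x = 2.5, y = 4.5
dx, dy = gaze_offset(right, left)
```

A production eye tracker would use a more robust detector (e.g. ellipse fitting after glint removal), but the threshold-then-centroid structure mirrors the binarization and position-difference steps the claims recite.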

Description

Pupil tracking device and controlling method for the same

The present invention relates to a binocular pupil tracking device and a control method thereof, and more specifically, to a binocular pupil tracking device and control method that acquire images of both eyes of a VR/AR glasses wearer as a single binocular mixed image through one camera sensor and then restore the acquired mixed image as a binarized image, thereby avoiding the desynchronization that occurs when multiple camera sensors are used and significantly reducing the manufacturing cost of VR/AR glasses.

In general, various wearable devices are being developed in line with the trend toward lighter and smaller digital devices. These include VR (Virtual Reality)/AR (Augmented Reality) devices worn by hooking them onto the ears like glasses, and Head Mounted Displays (hereinafter, HMDs). VR/AR devices are worn on the ears in the form of glasses, such as Google Glass, while an HMD is worn on the user's head; both refer to devices capable of providing multimedia content. In other words, such VR/AR devices or HMDs are worn on the user's body and provide images to the user in various environments as the user moves. While a VR/AR device or HMD may be equipped with a monocular lens, it may also be equipped with binocular lenses that provide images to both eyes. When providing images to both eyes, since the interpupillary distance varies from wearer to wearer, it must be easily adjustable to fit the wearer in order to experience a clear, well-focused image.
In particular, such VR/AR devices require the wearer to focus on objects in front of them through binocular eye movements in order to perceive them accurately; through this behavior, the wearer naturally recognizes an object and visually confirms its shape. Therefore, to provide realistic VR/AR or metaverse content, such devices must identify these binocular eye movements and transmit the wearer's gaze information to the content. Accordingly, tracking the gaze of a person wearing a VR/AR device requires a device and method for acquiring information about both of the wearer's eyes within the device.

Referring to FIG. 1, the pupil tracking device of a conventional VR/AR device is configured as follows: a device body (70) that a wearer can wear on the ears or head to view VR/AR content; an infrared light source (71) installed at a suitable location, such as the center, of the device body (70) facing the wearer, emitting infrared rays toward both of the wearer's eyes; an R (right-eye) camera (73A) and an L (left-eye) camera (73B) installed at suitable locations on the device body (70) opposite the wearer's right and left eyes, which respectively pick up the infrared signals reflected back after the rays emitted from the infrared light source (71) strike the right and left eyes, and store them in memory (72A, 72B); and a computer unit (74) that retrieves the infrared-color binocular signals of the wearer's right and left eyes picked up by the R camera (73A) and L camera (73B) and stored in memory (72A, 72B), detects the position of each pupil using an eye-tracking algorithm, determines the wearer's gaze position by comparing the detected pupil positions, and then reflects this in the content.
Meanwhile, the operation of this conventional pupil tracking device proceeds as follows. First, the wearer puts the device body (70), through which VR/AR content can be viewed, on the ears or head and turns on a set switch. The infrared light source (71) installed at a suitable location, such as the center, of the device body facing the wearer then emits infrared rays toward both of the wearer's eyes. Next, the R camera (73A) and L camera (73B), installed at suitable locations on the device body (70) opposite the wearer's right and left eyes, pick up the infrared signals reflected back from the right and left eyes after the rays emitted from the infrared light source (71) strike them, and store the signals in memory (72A, 72B). The computer unit (74) linked with the device body (70) then retrieves the infrared-color binocular signals reflected by the wearer's right and left eyes, which were picked up by the R camera (73A) and L camera (73B) and stored in memory (72A, 72B), detects the position of each pupil using an eye-tracking algorithm, compares the detected pupil positions to determine the wearer's gaze position, and then reflects this in the content.
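The drawback of this conventional two-camera arrangement, per the abstract, is desynchronization: two free-running cameras with slightly different frame clocks drift apart over time, so paired left/right frames are not captured at the same instant. This is not worked out in the patent text; the short simulation below is an illustrative sketch of the effect, with the frame rates chosen purely as an example:

```python
def frame_times(fps: float, n: int) -> list[float]:
    """Timestamps (seconds) of n frames from a camera running at fps."""
    return [i / fps for i in range(n)]

# Two independent cameras: the L camera's clock runs slightly slow.
right = frame_times(60.0, 600)    # R camera at exactly 60 fps
left = frame_times(59.94, 600)    # L camera at 59.94 fps (example drift)

# Worst-case timing mismatch between "paired" frames over 10 seconds.
max_skew = max(abs(a - b) for a, b in zip(right, left))
```

After only ten seconds the paired frames are already roughly 10 ms apart, over half a frame period, whereas the single-sensor design of the present invention captures both eyes in one exposure, so the per-eye channels are synchronized by construction.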