CN-116452530-B - Eye movement tracking method and eye movement tracking device
Abstract
The application relates to an eye movement tracking method and an eye movement tracking device. The method comprises constructing a human eye movement tracking system, collecting the gaze point of the user's eyes in real time, and computing a trajectory map and a heat map of the gaze points as the person observes an object. The result is used to evaluate the focus of attention and the key regions of interest when people browse external things. The application can be used to improve and adjust the design style of products so that they better match the subjective preferences of the public and attract the audience's attention. It can also be used to evaluate emotional and fatigue states and, in the field of medical neurology, to support the clinical treatment of emotional disorders.
Inventors
- GUO WEIJIE
- WU ZONGYU
- LI GUANGYAO
- LIN XIAOLIN
- LIN RENHUI
- LV YIJUN
- CHEN ZHONG
Assignees
- Xiamen University (厦门大学)
Dates
- Publication Date
- 20260512
- Application Date
- 20230407
Claims (8)
- 1. An eye movement tracking method, characterized by comprising the following steps: S1, shooting an eye image with an infrared eye camera and performing image processing on the eye image to obtain the pupil center; S2, illuminating the eye with an infrared light source to form a bright spot (glint) on the surface of the eyeball, detecting the bright spot in the eye image shot by the infrared eye camera, calculating the centroid of the bright spot, and constructing an eye vector from the centroid and the pupil center; S3, constructing a mapping equation that maps eye vectors to scene coordinates, the mapping equation being a quadratic polynomial conversion from eye movement vectors to scene-camera coordinates, and acquiring several groups of eye images and scene images while the user gazes at a calibration card so as to obtain several groups of eye movement vectors and scene-camera coordinates; the conversion equation from the eye movement vector to the scene-camera coordinates is: Xs = a0 + a1·Xe + a2·Ye + a3·Xe·Ye + a4·Xe² + a5·Ye²; Ys = b0 + b1·Xe + b2·Ye + b3·Xe·Ye + b4·Xe² + b5·Ye²; where Xe and Ye are the components of the eye movement vector, Xs and Ys are the scene-camera coordinates, and a0–a5 and b0–b5 are the calibration coefficients; S4, acquiring an eye image and a scene image of the user with the infrared eye camera and a far-view camera respectively, obtaining an eye movement vector from the eye image, and substituting the eye movement vector into the mapping equation of step S3 to obtain the screen fixation coordinates; and S5, drawing a heat map from the screen fixation coordinates and superimposing it on the scene image to obtain an image fused with the heat map, wherein the fused image is drawn as follows: if the user has a fixation at position (x, y), the maximum interest value is set at the center (x, y) and decreases linearly outwards, and different colors are then assigned according to the different interest values.
- 2. The eye movement tracking method according to claim 1, wherein S1 comprises: S11, converting the eye image into a grayscale image; S12, binarizing the grayscale image; S13, performing contour detection on the binarized image to obtain the contour center, which is used as the pupil center; S14, finding, with a Hough circle detection function, the circle on the grayscale image whose center is closest to that contour center; and S15, marking the detected contour in the original image.
- 3. The eye movement tracking method according to claim 1, wherein S1 comprises: S11, converting the eye image into a grayscale image; S12, binarizing the grayscale image; S13, applying two iterations of the morphological opening operation to the binarized image with a 3×3 convolution kernel; and S14, performing contour detection with ellipse fitting on the opened image, taking the second-largest of all fitted contours as the pupil contour and the center of the fitted ellipse as the pupil center.
- 4. The eye movement tracking method according to claim 1, wherein S2 comprises: S21, performing grayscale conversion on the eye image; S22, binarizing the grayscale image processed in step S21, a threshold value being set for the binarization function; S23, performing the opening operation on the image with a 3×3 convolution kernel; S24, performing contour detection on the image processed in step S23 to obtain the bright-spot contour; and S25, calculating the centroid of the bright spot from its contour and then constructing the eye vector from the centroid and the pupil center.
- 5. The eye movement tracking method according to claim 4, wherein S5 comprises: S51, reading the screen fixation coordinates identified by eye movement tracking; S52, putting all screen fixation coordinates into a list-type variable data, where data = [[x1, y1], [x2, y2], ...]; and S53, drawing the heat map and overlaying it, weighted, on the original scene image.
- 6. An eye movement tracking device, characterized by comprising a frame and a main control module, wherein a far-view camera, an infrared eye camera and an infrared light source are arranged on the frame; the infrared eye camera and the infrared light source are arranged on the side of the frame close to the temples of the glasses, and the far-view camera is arranged on the side of the frame away from the temples; the far-view camera, the infrared eye camera and the infrared light source are in signal connection with the main control module; and the main control module is used for obtaining an image fused with the scene image by the method of any one of claims 1-5 and transmitting the image to a communication terminal via wireless communication.
- 7. The eye movement tracking device of claim 6, wherein the infrared light source is an infrared LED light source with a wavelength of 940 nm.
- 8. The eye movement tracking device of claim 6, wherein the main control module supplies power to the infrared LED light source through a USB-to-Type-C circuit, and the circuit comprises a 6-pin Type-C female receptacle.
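The quadratic polynomial mapping of claim 1 can be sketched in plain Python. This is an illustrative stand-in, not the patent's implementation: the function name `gaze_map` and the sample coefficient values are assumptions, and in the method the twelve coefficients a0–a5 and b0–b5 are obtained from the calibration-card groups of step S3 (e.g. by least squares over at least six point pairs).

```python
def gaze_map(xe, ye, a, b):
    """Map an eye movement vector (Xe, Ye) to scene-camera coordinates
    (Xs, Ys) using the quadratic polynomial of claim 1.  `a` and `b`
    hold the six calibration coefficients for Xs and Ys respectively."""
    terms = (1.0, xe, ye, xe * ye, xe * xe, ye * ye)
    xs = sum(c * t for c, t in zip(a, terms))
    ys = sum(c * t for c, t in zip(b, terms))
    return xs, ys

# Hypothetical coefficients: an identity-like mapping with a fixed offset.
a = [10.0, 1.0, 0.0, 0.0, 0.0, 0.0]   # Xs = 10 + Xe
b = [20.0, 0.0, 1.0, 0.0, 0.0, 0.0]   # Ys = 20 + Ye
print(gaze_map(3.0, 4.0, a, b))        # (13.0, 24.0)
```

With real calibration data the higher-order terms (Xe·Ye, Xe², Ye²) absorb the nonlinearity between eyeball rotation and scene-camera pixels, which is why a purely linear mapping is usually insufficient.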
Description
Eye movement tracking method and eye movement tracking device
Technical Field
The present application relates to the field of eye tracking, and in particular to an eye tracking method and an eye tracking device.
Background
Current eye movement tracking research mainly uses near-eye tracking to meet the needs of emotion analysis and fatigue assessment of the person under test. Its focus is on changes of the gaze point of the human eye rather than on the external environment the eyes are looking at, so the focus of attention and the key regions of interest when people browse external things cannot be evaluated. This is unfavorable for improving and adjusting the design style of products (web pages, books, paintings and building designs).
Disclosure of Invention
In view of the technical problems in the background, the present application provides an eye movement tracking method and an eye movement tracking device, adopting the following technical scheme. In a first aspect, the present application provides an eye tracking method comprising the following steps: S1, shooting an eye image with an infrared eye camera and performing image processing on the eye image to obtain the pupil center; S2, illuminating the eye with an infrared light source to form a bright spot (glint) on the surface of the eyeball, detecting the bright spot in the eye image shot by the infrared eye camera, calculating the centroid of the bright spot, and constructing an eye vector from the centroid and the pupil center; S3, constructing a mapping equation that maps eye vectors to scene coordinates; S4, acquiring an eye image and a scene image of the user with the infrared eye camera and a far-view camera respectively, obtaining an eye movement vector from the eye image, and substituting the eye movement vector into the mapping equation of step S3 to obtain the screen fixation coordinates; and S5, drawing a heat map from the screen fixation coordinates and superimposing it on the scene image to obtain an image fused with the heat map.
With this technical scheme, the gaze point of the human eye is acquired in real time by the constructed eye gaze tracking system, and the trajectory map and heat map of the gaze points of a person observing something (a web page, book, building, landscape, etc.) are calculated. The result is used to evaluate the focus of attention and the key regions of interest when people browse external things. The application can be used to improve and adjust the design style of products (web pages, books, paintings and building designs) so that they better match the subjective preferences of the public and attract the audience's attention; it can also be used to evaluate emotional and fatigue states and, in the field of medical neurology, to support the clinical treatment of emotional disorders.
Preferably, S1 comprises: S11, converting the eye image into a grayscale image; S12, binarizing the grayscale image; S13, performing contour detection on the binarized image to obtain the contour center, which is used as the pupil center; S14, finding, with a Hough circle detection function, the circle on the grayscale image whose center is closest to that contour center; and S15, marking the detected contour in the original image.
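The linear-decay interest model of step S5 can be sketched in plain Python. This is an illustrative simplification under stated assumptions: the name `interest_map`, the decay `radius`, and the nested-list pixel grid are not from the patent, and a real implementation would render the values as a color-mapped heat map with an image library before weighting it onto the scene image.

```python
def interest_map(width, height, fixations, radius=2.0):
    """Per-pixel interest values as described in S5: value 1.0 at each
    fixation (x, y), decreasing linearly to 0 at `radius` pixels away;
    where the influence of fixations overlaps, the strongest one wins."""
    grid = [[0.0] * width for _ in range(height)]
    for fx, fy in fixations:
        for y in range(height):
            for x in range(width):
                d = ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
                grid[y][x] = max(grid[y][x], max(0.0, 1.0 - d / radius))
    return grid

# One fixation at the centre of a 3x3 grid, decay radius of 2 pixels.
g = interest_map(3, 3, [(1, 1)])
print(g[1][1], g[0][1])  # 1.0 0.5
```

Matching colors to the resulting values (e.g. red for high interest, blue for low) and alpha-blending the grid over the scene image yields the fused image of step S53.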
Preferably, S1 comprises: S11, converting the eye image into a grayscale image; S12, binarizing the grayscale image; S13, applying two iterations of the morphological opening operation to the binarized image with a 3×3 convolution kernel; and S14, performing contour detection with ellipse fitting on the opened image, taking the second-largest of all fitted contours as the pupil contour and the center of the fitted ellipse as the pupil center.
Preferably, S2 specifically comprises: S21, performing grayscale conversion on the eye image; S22, binarizing the grayscale image processed in step S21, a relatively high threshold being set for the binarization function; S23, performing the opening operation on the image with a 3×3 convolution kernel; S24, performing contour detection on the image processed in step S23 to obtain the bright-spot contour; S25, calculating the centroid of the bright spot from its contour and then constructing the eye vector from the centroid and the pupil center.
Preferably, step S3 specifically comprises: S31, acquiring several groups of eye images and scene images while the user gazes at the calibration card, so as to obtain several groups of eye movement vectors and scene-camera coordinates; S32, substituting the several groups of eye movement vectors and scene-camera coordinates into a conversion equation from the eye movement vector
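The glint centroid and eye-vector construction of steps S21–S25 can be sketched as follows. This is illustrative only: the function names are assumptions, and the patent's own pipeline operates on binarized, opened camera images via contour detection rather than on an explicit pixel list.

```python
def centroid(points):
    """Centroid of a set of (x, y) pixels, e.g. the binarized glint
    region obtained after thresholding and the opening operation."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

def eye_vector(glint_pixels, pupil_center):
    """Eye vector of step S25: from the glint centroid to the pupil
    centre.  This difference vector is what steps S3-S4 map onto
    scene-camera coordinates."""
    gx, gy = centroid(glint_pixels)
    return (pupil_center[0] - gx, pupil_center[1] - gy)

# Hypothetical glint pixels around (10, 10); pupil centre at (14, 13).
glint = [(9, 10), (11, 10), (10, 9), (10, 11)]
print(eye_vector(glint, (14.0, 13.0)))  # (4.0, 3.0)
```

Using the glint-to-pupil difference rather than the raw pupil position makes the vector largely insensitive to small slips of the glasses frame, since the corneal glint and the pupil shift together under head-mounted camera motion.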