CN-115916032-B - Eye movement tracking system, eye movement tracking method, and recording medium


Abstract

An eye tracking system according to an embodiment includes at least one processor. The at least one processor dynamically sets a partial region of first content displayed on a screen as a guide region in order to make a user look at the partial region, determines a first viewpoint coordinate of the user on the screen based on movement of the eyes of the user looking at the guide region, calculates a difference between the determined first viewpoint coordinate and a region coordinate of the guide region on the screen, and corrects, using the difference, a second viewpoint coordinate of the user viewing second content displayed on the screen.
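The calibration procedure in the abstract can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the patent; the class and method names (`GazeCalibrator`, `record`, `correct`) and the use of the mean as the "statistic" of the differences are assumptions made for the sketch.

```python
# Hypothetical sketch of the correction described in the abstract.
# Names and the choice of statistic (mean) are illustrative only.

class GazeCalibrator:
    """Accumulates (measured gaze, guide region) pairs and derives a
    correction offset applied to later viewpoint coordinates."""

    def __init__(self):
        self._diffs = []  # per-screen (dx, dy) differences

    def record(self, first_viewpoint, guide_region_center):
        # Difference between where the tracker says the user looked
        # and where the guide region actually is on the screen.
        dx = first_viewpoint[0] - guide_region_center[0]
        dy = first_viewpoint[1] - guide_region_center[1]
        self._diffs.append((dx, dy))

    def correct(self, second_viewpoint):
        # Use a statistic (here: the mean) of the collected differences
        # to correct a viewpoint measured on the second content.
        if not self._diffs:
            return second_viewpoint
        n = len(self._diffs)
        mean_dx = sum(d[0] for d in self._diffs) / n
        mean_dy = sum(d[1] for d in self._diffs) / n
        return (second_viewpoint[0] - mean_dx, second_viewpoint[1] - mean_dy)


cal = GazeCalibrator()
cal.record(first_viewpoint=(110, 205), guide_region_center=(100, 200))
cal.record(first_viewpoint=(330, 395), guide_region_center=(320, 390))
corrected = cal.correct((510, 300))  # -> (500.0, 295.0)
```

Because the offset is gathered from guide regions embedded in ordinary content (for example, buttons the user selects), no dedicated calibration screen is needed.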

Inventors

  • Kawakami Nobuo
  • Odagiri Kaneri

Assignees

  • 多玩国株式会社

Dates

Publication Date
2026-05-08
Application Date
2021-09-01
Priority Date
2020-09-30

Claims (6)

  1. An eye tracking system comprising at least one processor that executes an operating system and an application program, wherein the at least one processor performs the following: in order to make a user look at a partial area of first content displayed on a screen, dynamically setting, as a guide area, the partial area in the first content displayed on each of a plurality of screens displayed after the application program is started, the partial area being an area, in each of the plurality of screens, in which a selection object selectable by the user is displayed; determining a first viewpoint coordinate of the user in each of the plurality of screens based on movement of the eyes of the user looking at the guide area when the selection object in that screen is selected; calculating, for each of the plurality of screens, a difference between the determined first viewpoint coordinate and a region coordinate of the guide area in that screen; and correcting, using a statistic of the differences corresponding to the plurality of screens, a second viewpoint coordinate of the user viewing second content displayed on the screen.
  2. The eye tracking system according to claim 1, wherein the at least one processor sets the partial area as the guide area by adjusting the resolution of the screen such that the resolution of the partial area becomes higher than the resolution of the area other than the partial area.
  3. The eye tracking system according to claim 1 or 2, wherein the at least one processor sets the partial area as the guide area by blurring the area other than the partial area.
  4. The eye tracking system according to claim 1 or 2, wherein the at least one processor sets the partial area as the guide area by surrounding the outer edge of the partial area with a border line.
  5. An eye tracking method executed by an eye tracking system having at least one processor that executes an operating system and an application program, the method comprising: in order to make a user look at a partial area of first content displayed on a screen, dynamically setting, as a guide area, the partial area in the first content displayed on each of a plurality of screens displayed after the application program is started, the partial area being an area, in each of the plurality of screens, in which a selection object selectable by the user is displayed; determining a first viewpoint coordinate of the user in each of the plurality of screens based on movement of the eyes of the user looking at the guide area when the selection object in that screen is selected; calculating, for each of the plurality of screens, a difference between the determined first viewpoint coordinate and a region coordinate of the guide area in that screen; and correcting, using a statistic of the differences corresponding to the plurality of screens, a second viewpoint coordinate of the user viewing second content displayed on the screen.
  6. A recording medium having an eye tracking program recorded thereon, the eye tracking program causing a computer to execute: in order to make a user look at a partial area of first content displayed on a screen, dynamically setting, as a guide area, the partial area in the first content displayed on each of a plurality of screens displayed after the eye tracking program is started, the partial area being an area, in each of the plurality of screens, in which a selection object selectable by the user is displayed; determining a first viewpoint coordinate of the user in each of the plurality of screens based on movement of the eyes of the user looking at the guide area when the selection object in that screen is selected; calculating, for each of the plurality of screens, a difference between the determined first viewpoint coordinate and a region coordinate of the guide area in that screen; and correcting, using a statistic of the differences corresponding to the plurality of screens, a second viewpoint coordinate of the user viewing second content displayed on the screen.
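Claims 2 to 4 describe three ways of making the partial area stand out as the guide area: raising its resolution, blurring everything else, or drawing a border around it. The blurring variant of claim 3 can be illustrated with a minimal sketch; the function name, the representation of the image as a 2-D list of grayscale values, and the 3x3 box blur are all assumptions made for illustration, not details from the patent.

```python
# Hypothetical illustration of claim 3: emphasize the guide area by
# blurring everything outside it. Names and the box-blur choice are
# illustrative only; a real system would operate on actual frames.

def blur_outside(image, region):
    """Apply a 3x3 box blur to pixels outside `region`.

    `region` is (top, left, bottom, right), inclusive bounds of the
    partial area (the guide area) that stays sharp.
    """
    top, left, bottom, right = region
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if top <= y <= bottom and left <= x <= right:
                continue  # keep the guide area untouched
            neighbours = [
                image[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ]
            out[y][x] = sum(neighbours) // len(neighbours)
    return out


sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
blurred = blur_outside(sharp, region=(1, 1, 1, 1))  # centre pixel stays sharp
```

The same skeleton would accommodate the other two variants: claim 2 would upsample `region` instead of blurring its complement, and claim 4 would overwrite the pixels along the region's outer edge with a border colour.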

Description

Eye movement tracking system, eye movement tracking method, and recording medium

Technical Field

One aspect of the present disclosure relates to an eye movement tracking system, an eye movement tracking method, and a recording medium.

Background

Eye tracking systems that calculate the position of a user's viewpoint are known. Patent document 1 describes a correction method for a head-mounted eye tracking device. In this correction method, while the wearer of the eye tracker looks in a reference direction, the eye tracker acquires eye data relating to the position of the wearer's eyes and correlates the eye data with the line-of-sight direction corresponding to the reference direction. The eye tracking device includes a spectacle frame holding an ophthalmic lens, and determines the line-of-sight direction corresponding to the reference direction in consideration of the optical refractive function of the ophthalmic lens. Other examples of correction methods are described in patent documents 2 and 3.

Prior Art Literature

Patent document 1: Japanese patent No. 6656156
Patent document 2: Japanese patent application laid-open No. 2010-259605
Patent document 3: Japanese patent application laid-open No. 2001-204692

Disclosure of Invention

Problems to be Solved by the Invention

It is desirable to be able to perform correction for eye movement tracking simply.

Means for Solving the Problems

An aspect of the present disclosure relates to an eye tracking system having at least one processor.
The at least one processor performs a process of dynamically setting a partial region of first content displayed on a screen as a guide region in order to make a user look at the partial region, determining a first viewpoint coordinate of the user on the screen based on movement of the eyes of the user looking at the guide region, calculating a difference between the determined first viewpoint coordinate and a region coordinate of the guide region on the screen, and correcting, using the difference, a second viewpoint coordinate of the user viewing second content displayed on the screen. In this aspect, a guide area for making the user look is set dynamically within arbitrary content (the first content), and the difference used for correction is calculated using that guide area. It is therefore unnecessary to prepare dedicated calibration content in advance, so correction for eye movement tracking can be performed more simply.

Effects of the Invention

According to one aspect of the present disclosure, correction for eye movement tracking can be performed easily.

Brief Description of Drawings

Fig. 1 is a diagram showing an example of application of the assistance system according to the embodiment.
Fig. 2 is a diagram showing an example of a hardware configuration of the assistance system according to the embodiment.
Fig. 3 is a diagram showing an example of a functional configuration of the assistance system according to the embodiment.
Fig. 4 is a flowchart showing an example of the operation of the assistance system according to the embodiment.
Fig. 5 is a flowchart showing an example of the operation of the eye tracking system according to the embodiment.
Fig. 6 is a diagram showing an example of a guide area set in the first content.
Fig. 7 is a diagram showing an example of a guide area set in the first content.
Fig. 8 is a flowchart showing an example of the operation of the assistance system according to the embodiment.
Fig.
9 is a flowchart showing an example of the operation of the assistance system according to the embodiment.
Fig. 10 is a diagram showing an example of auxiliary information.

Detailed Description

Embodiments of the present disclosure are described in detail below with reference to the drawings. In the description of the drawings, identical or equivalent elements are denoted by the same reference numerals, and repeated descriptions are omitted.

[Overview of System]

The assistance system of the embodiment is a computer system that assists a user who visually confirms content. Content is information, provided by a computer or computer system, that a person can recognize. Electronic data representing content is referred to as content data. The form of expression of the content is not limited; the content may be expressed, for example, by a document, an image (for example, a photograph or a video), or a combination thereof. The purpose and usage scenario of the content are also not limited; the content can be used for various purposes such as education, news, lectures, business transactions, entertainment, medical care, games, and chat. The assistance system provides the content to the user by transmitting content data to a user terminal. The user is a person who wants to obtain information from the assistance system, that is, a viewer of the content. The user terminal may therefore also be referred to as a "viewer terminal". The assistance system may provide the content data to the user terminal in a