EP-3933554-B1 - VIDEO PROCESSING
Inventors
- Walker, Andrew William
- Williams, Nigel John
- Jenabzadeh, Mandana
- Leonardi, Rosario
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2021-06-22
Claims (13)
- Apparatus comprising: a video display (1400) to display video images to a user; a gaze detector (1420) configured to detect a gaze direction for one or both eyes of the user while the user views the displayed video images; a head tracker (1410) configured to detect a head orientation of the user; an image processor (1430) configured to generate the video images for display by the video display; the image processor being responsive to one or more control functions dependent upon the gaze direction detected by the gaze detector; and a controller (1450) configured to detect a predetermined condition and, in response to detection of the predetermined condition, to control the image processor to be responsive to one or more control functions dependent upon the head orientation detected by the head tracker in place of the one or more control functions dependent upon the gaze direction detected by the gaze detector, in which detection of the predetermined condition comprises a detection of an operational error condition by the gaze detector, and in which detection of the predetermined condition comprises detection of a predetermined control operation by the user indicating a deemed failure by the user of a gaze tracking process performed by the gaze detector.
- The apparatus of claim 1, in which the predetermined control operation comprises a predetermined pattern of eye movement by the user.
- The apparatus of claim 1, in which the predetermined control operation comprises a detection of closure of one or both of the user's eyes for at least a predetermined eye closure period.
- The apparatus of claim 1, the apparatus comprising a microphone, in which the predetermined control operation comprises receipt by the microphone of a predetermined audio signal.
- The apparatus of any one of the preceding claims, in which the controller is configured, after a detection of the predetermined condition, to control the image processor to return to being responsive to the one or more control functions dependent upon the gaze direction detected by the gaze detector when the predetermined condition is no longer detected.
- The apparatus of claim 5, in which the controller is configured, in response to detection of the predetermined condition, to control the image processor for at least a predetermined control period to be responsive to the one or more control functions dependent upon the head orientation detected by the head tracker in place of the one or more control functions dependent upon the gaze direction detected by the gaze detector.
- The apparatus of any one of the preceding claims, in which the image processor is configured to generate video images for display by the video display as a representation of a part of a virtual scene, the part being dependent upon a viewpoint defined by a current head orientation of the user.
- The apparatus of any one of the preceding claims, in which the one or more control functions comprise one or more of: selection of menu items and movement of a displayed cursor.
- The apparatus of any one of the preceding claims, comprising a head mountable display, HMD, for wearing by the user, the HMD comprising the video display and one or more cameras to provide images of one or both of the user's eyes to the gaze detector.
- Video game apparatus comprising the apparatus of any one of the preceding claims.
- A method comprising: displaying (2100) video images to a user; detecting (2110), with a gaze detector, a gaze direction for one or both eyes of the user while the user views the displayed video images; detecting (2120) a head orientation of the user; generating (2130) the video images for display by the video display in response to one or more control functions dependent upon the gaze direction detected by the gaze detector; detecting (2140) a predetermined condition; and in response to detection of the predetermined condition, controlling (2150) the generating step to be responsive to one or more control functions dependent upon the detected head orientation in place of the one or more control functions dependent upon the detected gaze direction, in which detection of the predetermined condition comprises a detection of an operational error condition by the gaze detector, and in which detection of the predetermined condition comprises detection of a predetermined control operation by the user indicating a deemed failure by the user of a gaze tracking process performed by the gaze detector.
- Computer software which, when executed by the apparatus of any one of claims 1 to 10, causes the apparatus to perform the method of claim 11.
- A non-transitory, machine-readable storage medium which stores the computer software of claim 12.
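The control scheme defined by the claims above can be summarised as: follow gaze-based control functions by default; on detection of a predetermined condition (a gaze-tracker error, or a user control operation such as sustained eye closure indicating a deemed failure of gaze tracking), switch to head-orientation-based control functions, optionally for at least a predetermined control period, and revert once the condition is no longer detected. The following is a minimal illustrative sketch of that state logic only; all names, thresholds, and the class structure are assumptions for illustration, not an implementation disclosed in the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FallbackController:
    """Illustrative sketch of the claimed fallback behaviour.

    Names and values here are hypothetical, not taken from the patent.
    """
    eye_closure_threshold: float = 2.0  # predetermined eye closure period, seconds (cf. claim 3)
    control_period: float = 5.0         # minimum fallback duration, seconds (cf. claim 6)
    source: str = "gaze"                # which control functions the image processor follows
    _closure_start: Optional[float] = None
    _fallback_until: float = 0.0

    def update(self, gaze_error: bool, eyes_closed: bool, now: float) -> str:
        # Track how long the user's eyes have been continuously closed.
        if eyes_closed:
            if self._closure_start is None:
                self._closure_start = now
        else:
            self._closure_start = None
        long_closure = (self._closure_start is not None
                        and now - self._closure_start >= self.eye_closure_threshold)

        # Predetermined condition: operational error by the gaze detector,
        # or a user control operation (here, sustained eye closure).
        if gaze_error or long_closure:
            self.source = "head"
            self._fallback_until = now + self.control_period
        elif self.source == "head" and now >= self._fallback_until:
            # Condition no longer detected and the predetermined control
            # period has elapsed: revert to gaze-based control (cf. claims 5-6).
            self.source = "gaze"
        return self.source
```

In this sketch the control period acts as a simple hysteresis: once the fallback is triggered, head-based control is held for at least `control_period` seconds before reverting, which avoids rapid switching while a marginal error condition comes and goes.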
Description
This disclosure relates to video processing. When images are displayed to a user wearing a head mountable display (HMD), it is desirable to make the user's experience as realistic as possible. However, some aspects of the human physiological and psychovisual response to viewed images do not lend themselves to being triggered by images displayed by an HMD. Previously proposed arrangements are disclosed in US 2015/185831 A1, US 2018/364810 A1, WO 2019/122493 A1, US 2016/109961 A1, US 2014/078049 A1, and US 2012/272179 A1. It is in this context that the present disclosure arises. This disclosure is defined by claim 1. Further aspects and features of the present disclosure are defined in the appended claims and within the text of the accompanying description. Embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings, in which:
- Figure 1 schematically illustrates an HMD worn by a user;
- Figure 2 is a schematic plan view of an HMD;
- Figure 3 schematically illustrates the formation of a virtual image by an HMD;
- Figure 4 schematically illustrates another type of display for use in an HMD;
- Figure 5 schematically illustrates a pair of stereoscopic images;
- Figure 6a schematically illustrates a plan view of an HMD;
- Figure 6b schematically illustrates a near-eye tracking arrangement;
- Figure 7 schematically illustrates a remote tracking arrangement;
- Figure 8 schematically illustrates a gaze tracking environment;
- Figure 9 schematically illustrates a gaze tracking system;
- Figure 10 schematically illustrates a human eye;
- Figure 11 schematically illustrates a graph of human visual acuity;
- Figures 12 and 13 schematically illustrate the use of head tracking;
- Figure 14 schematically illustrates an example video processing system;
- Figures 15 and 16 are schematic flowcharts illustrating respective methods;
- Figure 17 schematically illustrates an eye motion track; and
- Figures 18 to 21 are schematic flowcharts illustrating respective methods.
Example Embodiments
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, embodiments of the present disclosure are described. In Figure 1, a user 10 is wearing an HMD 20 (as an example of a generic head-mountable apparatus; other examples include audio headphones or a head-mountable light source) on the user's head 30. The HMD comprises a frame 40, in this example formed of a rear strap and a top strap, and a display portion 50. As noted above, many gaze tracking arrangements may be considered particularly suitable for use in HMD systems; however, use with such an HMD system should not be considered essential. Note that the HMD of Figure 1 may comprise further features, to be described below in connection with other drawings, but which are not shown in Figure 1 for clarity of this initial explanation. The HMD of Figure 1 completely (or at least substantially completely) obscures the user's view of the surrounding environment. All that the user can see is the pair of images displayed within the HMD, as supplied by an external processing device such as a games console in many embodiments. Of course, in some embodiments images may instead (or additionally) be generated by a processor or obtained from memory located at the HMD itself. The HMD has associated headphone audio transducers or earpieces 60 which fit into the user's left and right ears 70. The earpieces 60 replay an audio signal provided from an external source, which may be the same as the video signal source which provides the video signal for display to the user's eyes. The combination of the fact that the user can see only what is displayed by the HMD and, subject to the limitations of the noise blocking or active cancellation properties of the earpieces and associated electronics, can hear only what is provided via the earpieces, means that this HMD may be considered as a so-called "full immersion" HMD.
Note however that in some embodiments the HMD is not a full immersion HMD, and may provide at least some facility for the user to see and/or hear the user's surroundings. This could be by providing some degree of transparency or partial transparency in the display arrangements, and/or by projecting a view of the outside (captured using a camera, for example a camera mounted on the HMD) via the HMD's displays, and/or by allowing the transmission of ambient sound past the earpieces and/or by providing a microphone to generate an input sound signal (for transmission to the earpieces) dependent upon the ambient sound. Such a microphone may also provide for the capture of audio signals representing spoken commands by the wearer of the HMD. A front-facing camera 122 may capture images to the front of the HMD, in use. Such images may be used for head tracking purposes in some embodiments, and the camera may also be suitable for capturing images for an augmented reality (AR) style experience. A Bluetooth® antenna