
EP-3682311-B1 - HEAD-MOUNTED DISPLAY AND CONTROL APPARATUS AND METHOD

EP 3682311 B1

Inventors

  • JAMES, Ian, Geoffrey
  • RIDGE, Edmond, Richard
  • SIMM, David, John
  • PIGOTT, Malcolm, Grant

Dates

Publication Date
2026-05-13
Application Date
2018-09-10

Claims (10)

  1. A head-mounted display apparatus (1) for displaying an operational area, the head-mounted display apparatus (1) located in an aircraft, the head-mounted display apparatus (1) comprising a headset (10) for placing over a user's eyes, the headset including a viewing device (12) configured to provide to said user, in use, a view of a real-world environment (46), a display generating device for depicting an operational area, said operational area being defined within said real-world environment (46) and comprising a plurality of functional regions each defining a different one or more selectable functions or operations that can be performed in respect of said operational area, the head-mounted display apparatus (1) being configured to transfer image data from said display generating device into said user's view of said real-world environment (46) at said viewing device (12) to generate an augmented reality environment, the head-mounted display apparatus (1) further comprising a control module (26) including a control device comprising a manually operable input device (28) provided at a HOTAS; and configured to be selectively communicably coupled to all of said functional regions to enable a user to selectively perform the respective one or more functions or operations associated therewith, wherein the control device is communicably coupled to a selected functional region only in response to a respective actuation signal, and an eye tracker module (24) configured to monitor said user's gaze relative to said augmented reality environment, in use, and, when said user's gaze is directed at a selected functional region displayed therein, generate and transmit to said control module (26) a said actuation signal and when said user's gaze is no longer directed at said selected functional region, cause said control device to be decoupled therefrom; the head-mounted display apparatus (1) further comprising an interface adapted to enable the user to select one or more 
functional regions within a remote head-down display (30) located in the aircraft and move it into the display on the viewing device (12) of the head-mounted display apparatus (1); and, wherein the head-mounted display apparatus (1) is arranged and configured such that whilst the user's gaze is directed at a selected functional region displayed on said viewing device (12) and projected into said user's view of the real-world environment (46), the control device is coupled thereto to enable interaction therewith, and when the user's gaze moves away from the selected functional region, the control device is decoupled therefrom.
  2. A head-mounted display apparatus according to claim 1, wherein a plurality of operational areas is defined within said real-world environment, each operational area being associated with a user's relative or absolute head position and/or orientation, the apparatus further comprising a head tracking module (22) for tracking and determining a user's relative or absolute head position and/or orientation, the processor being configured to receive, from said head tracking module, data representative of said user's relative or absolute head position or orientation, determine a current operational area associated therewith and define one or more selectable functional regions associated with said current operational area.
  3. A head-mounted display apparatus according to claim 1 or claim 2, wherein said viewing device is a transparent or translucent visor, such that an external and internal real-world environment can be viewed by the user, in use, the display generating device being arranged and configured such that image data displayed thereon is transferred into the user's view of the real-world environment through said visor.
  4. A head-mounted display apparatus according to any of the preceding claims, wherein a functional region is defined and displayed on the viewing device in the form of an augmented reality information window, with which the user can interact, using said control device, only when said user's gaze is directed thereto within said augmented reality environment.
  5. A head-mounted display apparatus according to claim 4, wherein said augmented reality information window is displayed on the screen when the user directs said gaze at a predefined location within said user's view of the external real-world environment corresponding to a predefined location within a respective operational area, and is removed from the viewing device when the user directs said gaze away from that predefined location.
  6. A head-mounted display apparatus according to any of the preceding claims, wherein when a user's gaze is directed at a selected functional region within a remote head-down display, the control module is configured to communicably couple one or more control devices to said selected functional region, to enable a user to interact therewith, on said head-down display.
  7. A head-mounted display apparatus according to any of the preceding claims, wherein once a control device has been decoupled from a functional region, such a functional region remains as a selectable functional region within the respective operational area, such that it is once again coupled to a control device when the user's gaze is directed thereto within the augmented reality environment.
  8. A head-mounted display apparatus according to any of the preceding claims, including an interface comprising at least one control device to selectively interact with all functional regions available to the user.
  9. A head-mounted display apparatus according to any of the preceding claims wherein the control device comprises a voice recognition module for inputting commands.
  10. A method of displaying an operational area, comprising providing a headset (10) for placing over a user's eyes, the headset included in a head-mounted display apparatus (1) located in an aircraft, the headset including a viewing device (12) configured to provide to said user, in use, a view of a real-world environment (46), a display generating device for depicting an operational area, said operational area being defined within said real-world environment (46), the display generating device being arranged and configured to transfer image data created thereby into said user's view of said real-world environment (46) through said viewing device (12) to generate an augmented reality environment, the method further comprising providing a control module (26) including a control device comprising a manually operable input device (28) provided at a HOTAS configured to be selectively communicably coupled to a plurality of functional regions to enable a user to selectively perform the respective one or more functions or operations associated therewith, wherein the control device is communicably coupled to a selected functional region only in response to a respective actuation signal, and an eye tracker module (24) configured to monitor said user's gaze relative to said augmented reality environment, in use, and, when said user's gaze is directed at a selected functional region displayed therein, generate and transmit to said control module (26) a said actuation signal and when said user's gaze is no longer directed at said selected functional region, cause said control device to be decoupled therefrom; providing an interface adapted to enable the user to select one or more functional regions within a remote head-down display (30) located in the aircraft and move it into the display on the viewing device (12) of the headset; such that whilst the user's gaze is directed at a selected functional region displayed on said viewing device (12) and projected into said 
user's view of the real-world environment (46), cause the control device to be coupled thereto to enable interaction therewith, and when the user's gaze moves away from the selected functional region, cause the control device to be decoupled therefrom.
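The gaze-driven coupling scheme of claims 1 and 10 can be summarised as a simple state machine: the control device is coupled to whichever functional region the user's gaze currently falls on, and is decoupled as soon as the gaze leaves it. The following is a minimal illustrative sketch only, not an implementation from the patent; all class and method names (`FunctionalRegion`, `ControlModule`, `on_gaze`) and the rectangular region geometry are assumptions introduced for illustration.

```python
from dataclasses import dataclass


@dataclass
class FunctionalRegion:
    """A selectable region of the operational area, with display-space bounds."""
    name: str
    bounds: tuple  # (x0, y0, x1, y1) in display coordinates (assumed geometry)

    def contains(self, x: float, y: float) -> bool:
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1


class ControlModule:
    """Couples a single control device (e.g. a HOTAS input) to at most one
    functional region at a time, driven by eye-tracker actuation signals."""

    def __init__(self, regions):
        self.regions = regions
        self.coupled = None  # currently coupled region, or None

    def on_gaze(self, x: float, y: float):
        # The eye tracker reports a gaze position; find the region under it.
        hit = next((r for r in self.regions if r.contains(x, y)), None)
        # Coupling follows the gaze: moving onto a region couples the control
        # device to it; moving away (hit is None) decouples it.
        self.coupled = hit
        return self.coupled


regions = [FunctionalRegion("fuel", (0, 0, 10, 10)),
           FunctionalRegion("radar", (20, 0, 30, 10))]
ctrl = ControlModule(regions)
assert ctrl.on_gaze(5, 5).name == "fuel"    # gaze on a region: coupled
assert ctrl.on_gaze(25, 5).name == "radar"  # gaze moves: recoupled elsewhere
assert ctrl.on_gaze(50, 50) is None         # gaze away: decoupled
```

Per claim 7, a region that has been decoupled remains selectable, which the sketch reflects by leaving the region list unchanged when the gaze moves away.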

Description

This invention relates generally to a head-mounted display apparatus and method and, more particularly, to a head-mounted display apparatus and method incorporating a control module for enabling a user to interact with regions of an operational area displayed on a head-mounted display screen. It is known, particularly in the field of military fighter aircraft and the like, to provide a helmet-mounted display (HMD), wherein a helmet having a transparent visor is provided with a display generating device, such as an internal screen, such that the wearer can view their real-world environment, whilst also viewing additional images and/or data in a virtual reality format displayed on the screen and projected into the wearer's view of the external real-world environment through the visor. As such, a processor is provided that receives real-time data from multiple sources, such as external image capture devices, speed sensors, weapon sensors, or target tracking systems, and generates two-dimensional image data representative of that data. The image data is displayed on the screen in the form of representative images, and then projected onto the wearer's view of the external real-world environment, thus effectively being superimposed on the wearer's real-world field of view through the visor assembly. Platform operatives need to spend as much time as possible looking out of the platform to maintain safe operation. However, as missions become more complex and the quantity of data required to be assimilated increases accordingly, a careful balance needs to be maintained. Some of this data can be displayed within the operative's view of the real world, as described above, but putting displays in the operative's vision can be very dangerous, as it may obscure their view of a physical entity in the outside world or safety critical parameters in the head-up display (HUD) in the cockpit. 
The decision about whether to provide augmented reality information within the pilot's field of view through the helmet visor therefore has to outweigh the negative implications, and it is common for large amounts of such data instead to be displayed (or made available) on a head-down display (HDD) in the cockpit, which forces the operative to move their eyes away from the outside world to read the data. Furthermore, interaction with each type of data typically requires a different control device, which results in a very complex control panel. US 4109145 A describes a line of sight (LOS) apparatus comprising an eye tracker to determine where a user is looking on a screen. The user is able to select commands on the screen based upon a predetermined combination of direction of gaze and time of fixation of gaze. Through this mechanism, the user is able to call up displays onto the screen and further interact using their gaze. US 2016/246384 A1 is directed to interactions with a head-mounted display in the field of gaming technology. US 2014/237366 A1 teaches a head-mounted display for context-aware augmented reality. EP 3138607 A1 is directed to a head-up display using voice recognition technology in obscured environments. It would be desirable to provide a helmet-mounted display that is able to selectively display large amounts of data within the user's field of view without requiring large numbers of control devices and, in accordance with a first aspect of the present invention, there is provided a head-mounted display apparatus according to claim 1. Thus, the present invention provides a head-mounted display apparatus that can effectively incorporate most or all of the data required for use by the operative, without unnecessarily obscuring their view and requiring only a single (or very few) control devices to interact with that data, thereby simplifying the control panel and providing a much more intuitive display system. 
In an exemplary embodiment, a plurality of operational areas may be defined within said real-world environment, each operational area being associated with a user's relative or absolute head position and/or orientation, the apparatus further comprising a head tracking module for tracking and determining a user's relative or absolute head position and/or orientation, the processor being configured to receive, from said head tracking module, data representative of said user's relative or absolute head position and/or orientation, determine a current operational area associated therewith and define one or more selectable functional regions associated with said current operational area included within said augmented reality environment. In an exemplary embodiment, said viewing device is a transparent or translucent visor, such that the external and internal real-world environment can be viewed therethrough by the user, and the display generating device is arranged and configured to transfer image data displayed thereon into said user's view of the real-world environment such that the or each operational ar