EP-4735979-A1 - GAZE INTERACTIONS WITH USER INTERFACES


Abstract

A method for controlling gaze interactivity with a user interface, UI. The method comprises tracking the gaze of a user on the UI to allow the user to select a UI control with the user's gaze and determining whether the user's gaze is moving towards a region of interest, ROI, on the UI. In response to determining that the user's gaze is moving towards the ROI, gaze interactions with the UI are locked to prevent the user's gaze from selecting a different UI control.

Inventors

  • TUINHOUT, Jelle Jeroen
  • BUIL, Vincentius Paulus
  • KUHLMANN, Daan
  • LAUTE, Niels
  • WIJN, Victor
  • MAREGUDDI, Praveen Narayan

Assignees

  • Koninklijke Philips N.V.

Dates

Publication Date
2026-05-06
Application Date
2024-06-21

Claims (15)

  1. A method for controlling gaze interactivity with a user interface, UI, the method comprising: tracking (302) the gaze of a user on the UI to allow the user to select a UI control with the user’s gaze; in response to the user interacting with a first physical control, locking gaze interactions with the UI to prevent the user’s gaze from selecting a different UI control, wherein the first physical control is configured to adapt a parameter corresponding to a selected UI control.
  2. The method of claim 1, further comprising unlocking gaze interactions with the UI in response to the user interacting with a second physical control.
  3. The method of claim 2, wherein the first physical control is a first function of a physical input device and the second physical control is a second function of the physical input device.
  4. The method of claim 3, wherein the physical input device comprises a rotary button, the first function comprises rotation, and the second function comprises clicking.
  5. The method of any of claims 1 to 4, further comprising displaying, on the UI, the value of the parameter being adapted by the user at the position of the user’s gaze on the UI.
  6. The method of claim 5, further comprising adapting the position at which the parameter is displayed to avoid a region of interest on the UI, or a portion thereof, in response to the gaze of the user being on the region of interest.
  7. The method of any of claims 1 to 6, further comprising unlocking gaze interactions with the UI in response to the user not interacting with the first physical control for at least a first period of time.
  8. The method of claim 7, wherein the first period of time when the user is adapting a first parameter is different to the first period of time when the user is adapting a second, different parameter.
  9. The method of any of claims 1 to 8, wherein the UI is a medical UI configured to display a medical image and UI controls to adapt parameters of the medical image.
  10. The method of any of claims 1 to 9, wherein locking gaze interactions with the UI is further in response to determining that the user’s gaze is moving towards a region of interest on the UI.
  11. The method of any of claims 1 to 10, wherein the first physical control comprises at least one of: a knob; a slider; a button; a dial; a switch; a rotary control/trackball/button; a touchpad; a joystick; and a scroll wheel.
  12. A computer program carrier comprising computer program code which, when executed on a computer, causes the computer to perform all of the steps according to any of claims 1 to 11.
  13. A system for controlling gaze interactivity with a user interface, UI, the system comprising: a gaze tracker configured to track the gaze of a user relative to the UI to allow the user to select a UI control with the user’s gaze; a first physical control configured to adapt a parameter corresponding to a selected UI control; and a processor configured to perform all of the steps according to any of claims 1 to 12.
  14. The system of claim 13, further comprising the UI.
  15. The system of claim 13 or 14, wherein the first physical control comprises at least one of: a knob; a slider; a button; a dial; a switch; a rotary control/trackball/button; a touchpad; a joystick; and a scroll wheel.

Description

GAZE INTERACTIONS WITH USER INTERFACES

FIELD OF THE INVENTION

The invention relates to controlling gaze interactivity with user interfaces.

BACKGROUND OF THE INVENTION

Interactions with medical user interfaces (e.g., ultrasound cart systems) are usually one-handed, as the user (e.g., a sonographer/specialist) often holds a medical device (e.g., a transducer) in the other hand. This is also often the non-dominant hand. When the user interface (e.g., a touch screen) is used, this results in inaccurate and slow interactions on the touch screen. Additionally, a large proportion of long-time users develop muscle issues in the dominant hand, arm and shoulder from using the medical device. These users may switch hands, using the injured dominant arm to operate the user interface. Interactions with the medical user interface typically require lifting the hand from a physical control panel to, for example, a touch screen, which can be problematic if the shoulder is injured. Thus, there is a need to improve interactions with medical user interfaces and user interfaces in general.

EP 3848779 A1 describes a head-mounted display system for controlling a medical imaging device, comprising a head-mounted display to be worn by an operator.

SUMMARY OF THE INVENTION

The invention is defined by the claims.

According to examples in accordance with an aspect of the invention, there is provided a method for controlling gaze interactivity with a user interface, UI, the method comprising: tracking the gaze of a user on the UI to allow the user to select a UI control with the user’s gaze; determining whether the user’s gaze is moving towards a region of interest on the UI; and in response to determining that the user’s gaze is moving towards the region of interest, locking gaze interactions with the UI to prevent the user’s gaze from selecting a different UI control.
The use of gaze interactions with the UI provides a seamless and intuitive interaction for the user and avoids the user having to raise their hands to interact with the UI (e.g., for a touchscreen UI). Gaze interaction/input refers to a method of human-computer interaction in which the direction or position of a person's gaze is used as an input to interact with a computer system or application (in this case, the medical UI). Instead of using traditional input devices such as a mouse, gaze interaction relies on tracking the movements and focus of a person's eyes. In other words, gaze interactions use the user’s gaze as an alternative to, for example, the movement of a mouse to highlight controls on the UI.

Gaze interaction typically involves the use of eye-tracking technology, which can precisely track the movement and position of the eyes. This technology uses cameras or sensors to capture and analyze the reflection or position of the eyes, allowing the computer system to interpret the user's gaze. Gaze interaction can thus be integrated into UIs, allowing for eye-controlled interactions.

The first control may be a physical control (e.g., button, rotary control, touchpad etc.), namely a control requiring manual interaction rather than being controlled only by gaze. The medical UI may be configured to display one or more UI controls, wherein the user can adapt parameters of an image with the UI controls. When a control displayed on the UI is highlighted by the gaze, a separate button/control can then be used to adapt the parameters corresponding to the control. For example, in an ultrasound UI, the control may correspond to gain for an ultrasound image. The gaze can be used to select the gain control when it is focused on the control, and a rotary control could be used to adapt the gain. However, it is common for users to look towards a region of interest as they are adapting the parameter.
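The combination of gaze selection with a physical control adapting the selected parameter can be sketched as follows. This is an illustrative sketch only, not the patented implementation; all names (`UIControl`, `GazeUI`, the `gain`/`depth` controls and their bounds) are assumptions introduced for the example.

```python
# Hypothetical sketch: gaze highlights a UI control, a physical
# rotary then adapts the parameter of whichever control is selected.

class UIControl:
    def __init__(self, name, bounds, value=0.0):
        self.name = name
        self.bounds = bounds  # (x, y, width, height) in screen pixels
        self.value = value

    def contains(self, gaze):
        x, y, w, h = self.bounds
        return x <= gaze[0] <= x + w and y <= gaze[1] <= y + h

class GazeUI:
    def __init__(self, controls):
        self.controls = controls
        self.selected = None

    def on_gaze(self, gaze):
        """Select/highlight whichever control the gaze rests on."""
        for control in self.controls:
            if control.contains(gaze):
                self.selected = control

    def on_rotary(self, delta):
        """The physical rotary adapts the gaze-selected parameter."""
        if self.selected is not None:
            self.selected.value += delta

ui = GazeUI([UIControl("gain", (0, 0, 100, 40)),
             UIControl("depth", (0, 50, 100, 40))])
ui.on_gaze((20, 10))   # gaze on the gain control selects it
ui.on_rotary(+2.5)     # rotating the knob adapts the gain
```

Note that `on_rotary` deliberately ignores where the gaze currently is: once a control is selected, the physical input adapts only that parameter, which is the behaviour the locking mechanism described below is meant to protect.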
For example, the region of interest may be an ultrasound image displayed on an ultrasound UI. Often, the gaze of the user is quicker than the reaction time of a hand, or other physical control. Thus, a different UI control may be selected/highlighted by the gaze when the user begins to adapt the parameter with the physical control. As such, it is proposed to lock the gaze interactions when the user’s gaze moves towards the region of interest. Locking gaze interactions prevents the user’s gaze from being able to select/highlight a different UI control and thus unintentionally adapting a different parameter.

Gaze interactions with the UI may be unlocked in response to the user not interacting with a first control for at least a first period of time, wherein the first control is for adapting a parameter corresponding to a UI control.

Determining whether the user’s gaze is moving towards a region of interest on the UI may comprise determining the speed of the gaze, relative to the UI, in the direction of the region of interest, being above a first speed threshold. When the user is looking towards the region of interest (e.g., an image), it can be assumed that the user wishes to check how t
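The lock/unlock logic above can be sketched as a small state machine: lock when the component of gaze velocity toward the region of interest exceeds a threshold, and unlock after a period without physical-control input. This is a minimal sketch under assumed names and values; `GazeLock`, `SPEED_THRESHOLD` and `UNLOCK_TIMEOUT` are illustrative, not from the patent.

```python
import math
import time

SPEED_THRESHOLD = 500.0  # px/s toward the ROI; hypothetical value
UNLOCK_TIMEOUT = 2.0     # seconds without physical-control input; hypothetical

class GazeLock:
    def __init__(self, roi_center):
        self.roi_center = roi_center
        self.locked = False
        self.last_control_input = time.monotonic()

    def on_gaze_sample(self, prev_pos, pos, dt):
        """Lock when gaze speed toward the ROI exceeds the threshold."""
        vx = (pos[0] - prev_pos[0]) / dt
        vy = (pos[1] - prev_pos[1]) / dt
        # Vector from the current gaze position toward the ROI centre.
        dx = self.roi_center[0] - pos[0]
        dy = self.roi_center[1] - pos[1]
        dist = math.hypot(dx, dy) or 1.0
        # Component of the gaze velocity in the ROI direction.
        speed_toward_roi = (vx * dx + vy * dy) / dist
        if speed_toward_roi > SPEED_THRESHOLD:
            self.locked = True

    def on_physical_control(self):
        """Record physical-control activity, which keeps the lock alive."""
        self.last_control_input = time.monotonic()

    def tick(self):
        """Unlock after a period without physical-control interaction."""
        if self.locked and time.monotonic() - self.last_control_input > UNLOCK_TIMEOUT:
            self.locked = False
```

Projecting the velocity onto the ROI direction means a fast gaze movement away from, or tangential to, the region of interest does not trigger the lock; only movement toward it does, matching the "moving towards a region of interest" condition in the claims.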