EP-4365658-B1 - APPARATUS, METHODS AND COMPUTER PROGRAMS FOR CONTROLLING IMAGE AUGMENTATIONS

EP 4365658 B1

Inventors

  • SALMIMAA, Marja
  • JÄRVENPÄÄ, Toni Johan
  • LEHTINIEMI, ARTO

Dates

Publication Date
2026-05-06
Application Date
2022-11-01

Claims (15)

  1. An apparatus (107) comprising means for: using gaze tracking to determine (301) a location of a user's pupil wherein the user (117) is wearing a head mounted item (103); and characterized by enabling display (303) of an indication (501) of the location of the user's pupil so that the indication (501) is visible on a surface of the head mounted item (103) and wherein the indication (501) is configured to be used for positioning at least one image augmentation for a captured image comprising the user (117).
  2. An apparatus as claimed in claim 1 wherein the at least one image augmentation comprises a graphical overlay configured to be superimposed over at least part of the image comprising the user.
  3. An apparatus as claimed in any preceding claim wherein at least part of the indication is configured to be incorporated as part of the at least one image augmentation.
  4. An apparatus as claimed in any preceding claim wherein the indication of the location of the user's pupil is displayed on a shutter of the head mounted item.
  5. An apparatus as claimed in any preceding claim wherein the indication of the location of the user's pupil comprises outcoupled light from the head mounted item.
  6. An apparatus as claimed in claim 5 wherein at least part of an image displayed by the head mounted item is adapted to enable the outcoupled light to be used to identify the position of the user's pupil.
  7. An apparatus as claimed in any preceding claim wherein the means are for enabling exchange of information with an electronic device being used to capture the image comprising the user wherein the information comprises information relating to at least one of: relative positions of the electronic device and the head mounted item, or relative orientations of the electronic device and the head mounted item.
  8. An apparatus as claimed in any preceding claim wherein the head mounted item comprises an augmented reality headset.
  9. A head mounted item comprising an apparatus as claimed in any preceding claim.
  10. A method comprising: using gaze tracking to determine (301) a location of a user's pupil wherein the user is wearing a head mounted item (103); and characterized by enabling display (303) of an indication (501) of the location of the user's pupil so that the indication (501) is visible on a surface of the head mounted item (103) and wherein the indication (501) is configured to be used for positioning at least one image augmentation for a captured image comprising the user (117).
  11. A computer program comprising instructions which, when executed by an apparatus comprising the necessary hardware, cause the apparatus (107) to perform at least: using gaze tracking to determine (301) a location of a user's pupil wherein the user is wearing a head mounted item; and enabling display (303) of an indication (501) of the location of the user's pupil so that the indication (501) is visible on a surface of the head mounted item (103) and wherein the indication (501) is configured to be used for positioning at least one image augmentation for a captured image comprising the user (117).
  12. An apparatus (113) comprising means for: detecting (305) an indication (501) of the location of a user's pupil wherein the user (117) is wearing a head mounted item (103) and characterized in that the indication (501) is displayed so that it is visible on a surface of the head mounted item (103); capturing (307) one or more images comprising the user (117); and using (309) the indication (501) of the location of the user's pupil to position at least one image augmentation for at least one captured image comprising the user (117).
  13. An apparatus as claimed in claim 12 wherein the means are for image processing the one or more images of the user to remove the indication of the location of the user's pupil from respective ones of the captured one or more images comprising the user.
  14. A method comprising: detecting (305) an indication (501) of the location of a user's pupil wherein the user (117) is wearing a head mounted item (103) and characterized in that the indication (501) is displayed so that it is visible on a surface of the head mounted item (103); capturing (307) one or more images comprising the user (117); and using (309) the indication (501) of the location of the user's pupil to position at least one image augmentation for at least one captured image comprising the user (117).
  15. A computer program comprising instructions which, when executed by an apparatus comprising the necessary hardware (113), cause the apparatus (113) to perform at least: detecting (305) an indication (501) of the location of a user's pupil wherein the user (117) is wearing a head mounted item (103) and characterized in that the indication (501) is displayed so that it is visible on a surface of the head mounted item (103); capturing (307) one or more images comprising the user (117); and using (309) the indication (501) of the location of the user's pupil to position at least one image augmentation for at least one captured image comprising the user (117).
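Claims 10 and 14 describe two cooperating procedures: the head mounted item displays an indication of the gaze-tracked pupil location on its outward-facing surface, and a separate capture device detects that indication and uses it to position an image augmentation. The following is a minimal sketch of that flow; all function names, the coordinate conventions, and the calibration-offset scheme are hypothetical illustrations, not prescribed by the claims.

```python
# Hypothetical sketch of the two-sided flow of claims 10 and 14.
# Coordinate systems, helper names and the offset-based calibration
# are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class Indication:
    """An indication of the pupil location, made visible on a surface
    of the head mounted item (claim 10)."""
    x: float  # horizontal position on the outward-facing surface
    y: float  # vertical position on the outward-facing surface


def display_indication(pupil_xy, surface_offset=(0.0, 0.0)):
    """Head mounted item side (claim 10): map the gaze-tracked pupil
    location to an indication rendered on the item's surface.

    `surface_offset` stands in for whatever calibration relates the
    gaze tracker's frame to the surface's frame."""
    px, py = pupil_xy
    ox, oy = surface_offset
    return Indication(px + ox, py + oy)


def position_augmentation(indication, overlay_size):
    """Capture device side (claim 14): use the detected indication to
    position an image augmentation (e.g. a graphical overlay) so that
    it is centred on the indicated pupil location."""
    w, h = overlay_size
    # Anchor the overlay's top-left corner so its centre coincides
    # with the indicated pupil position.
    return (indication.x - w / 2, indication.y - h / 2)


# Example: pupil tracked at (120, 80); centre a 40x40 overlay on it.
ind = display_indication((120.0, 80.0))
print(position_augmentation(ind, (40.0, 40.0)))  # overlay top-left corner
```

In practice the capture device would locate the indication in its own camera frames (claim 12, "detecting"), and per claim 13 could additionally image-process the captured frames to remove the indication itself before rendering the augmented result.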

Description

TECHNOLOGICAL FIELD

Examples of the disclosure relate to apparatus, methods and computer programs for controlling image augmentations. Some relate to apparatus, methods and computer programs for controlling image augmentations for images of a user wearing a head mounted item.

BACKGROUND

Image augmentations comprise filters or graphical items that can be added to captured images. For example, the filters or graphical items can be positioned overlaying a captured image or part of a captured image. Such image augmentations could be used in mediated reality applications, in messaging applications, or in any other suitable applications.

US 2016/209917 A1 teaches a method to provide visual feedback for gaze-based user-interface navigation. This includes presenting, on a display, a first image representing a digital object available for user interaction, recognizing a user gaze axis, and computing a point of intersection of the user gaze axis through the first image. An offset distance between the point of intersection and a reference position of the first image is then recognized, and a second image is presented on the display. The second image is presented displaced from the point of intersection by an amount dependent on the offset distance.

US 2019/0187482 A1 teaches a head-mounted display device for providing augmented reality content to a wearer. The head-mounted display device includes an eye tracker, a light projector, a beam steerer and a combiner. The eye tracker is configured to determine a position of a pupil of an eye of the wearer. The light projector is configured to project light for rendering images. The beam steerer is configured to change a direction of the light from the light projector based on the position of the pupil.
The combiner is configured to combine the light from the light projector and light from outside of the head-mounted display device, for providing an overlap of the rendered image and a real image that corresponds to the light from outside of the head-mounted display device.

US 2015/0212330 A1 teaches a head mounted display and a control method thereof. The control method comprises the following steps. An application processor controls a pico-projector unit to project a virtual image having a virtual object located at a virtual image coordinate in a virtual image coordinate system. An eye image sensing unit captures eye image data. A sensing apparatus senses a touch object to output sensing data. An application-specific integrated circuit (ASIC) obtains a real image coordinate of the touch object in a real image coordinate system according to the sensing data. The ASIC obtains a pupil position according to the eye image data, and controls an adjustment unit to adjust an imaging position of the virtual image according to the pupil position. The ASIC determines whether the touch object has touched the virtual object according to the pupil position, the real image coordinate and the virtual image coordinate.

US 2014/085198 A1 relates to correlating pupil position to gaze location within a scene.
At least some of the illustrative embodiments are methods including: receiving, by a first computer system, a first video stream depicting an eye of the user, the first video stream comprising a first plurality of frames; receiving, by the first computer system, a second video stream depicting a scene in front of the user, the second video stream comprising a second plurality of frames; determining, by the first computer system, pupil position within the first plurality of frames; calculating, by the first computer system, gaze location in the second plurality of frames based on pupil position within the first plurality of frames; and sending an indication of the gaze location to a second computer system, the second computer system being distinct from the first computer system, the sending in real-time with creation of the first video stream.

US 2018/314324 A1 teaches methods and systems for controlling an electronic device. The method includes acquiring, by a processor of the electronic device, a facial image. The facial image includes a face of a user having one or more eyes. The processor detects a center position of the eye and a pupil position corresponding to a center of a pupil of the eye, determines an eye gaze position based on the center position and the pupil position, analyzes the eye gaze position in consecutively captured facial images, and performs a function associated with a touch event at the eye gaze position in response to determining that the gaze position corresponds to the location of the touch event in the consecutively captured facial images.

BRIEF SUMMARY

According to various, but not necessarily all, examples of the disclosure there may be provided an apparatus comprising means for: using gaze tracking to determine a location of a user's pupil wherein the user is wearing a head mounted item; and enabling display of an indication of the location of the user's pupil so that the indication is