EP-3992582-B1 - METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR TEMPORALLY BASED DYNAMIC AUDIO SHIFTING

EP 3992582 B1

Inventors

  • GRANI, FRANCESCO
  • LOPEZ BATRES, MARIO

Dates

Publication Date
2026-05-06
Application Date
2021-10-25

Claims (15)

  1. An apparatus (20) comprising at least one processor (24) and at least one non-transitory memory (26) including computer program code instructions, the computer program code instructions configured to, when executed, cause the apparatus to at least: receive an indication of location based information for a user (600); provide for generation of an auditory cue (300) associated with the location based information; determine a duration of the auditory cue; and provide for generation of a transition portion (306) of the auditory cue having a dynamic virtual source location (204, 316) moving from a first virtual source location (202, 314) along a trajectory and ending at a second virtual source location (206, 318), wherein the transition portion of the auditory cue has a transition portion duration determined by the duration of the auditory cue, characterized in that the transition portion duration of the auditory cue does not exceed a predefined transition portion duration threshold.
  2. The apparatus of claim 1, wherein the apparatus is further caused to: generate a first portion (304) of the auditory cue in response to the duration of the auditory cue exceeding the predefined transition portion duration threshold, wherein the first portion of the auditory cue has a stationary virtual source location at the first virtual source location, wherein the first portion of the auditory cue does not exceed a predefined first portion duration threshold.
  3. The apparatus of claim 2, wherein the apparatus is further caused to: generate a third portion (308) of the auditory cue in response to the duration of the auditory cue exceeding a total of the predefined transition portion duration threshold and the predefined first portion duration threshold, wherein the third portion of the auditory cue has a stationary virtual source location at the second virtual source location.
  4. The apparatus of claim 3, wherein the third portion of the auditory cue has a duration that is equal to the duration of the auditory cue less the predefined first portion duration threshold and the predefined transition portion duration threshold.
  5. The apparatus of claim 1, wherein causing the apparatus to provide for generation of the transition portion of the auditory cue having a dynamic virtual source location moving from the first virtual source location along the trajectory and ending at the second virtual source location comprises causing the apparatus to provide for generation of the transition portion of the auditory cue having the dynamic virtual source location moving from the first virtual source location along the trajectory and ending at the second virtual source location using three-dimensional spatial audio effects.
  6. The apparatus of claim 1, wherein the second virtual source location is a location positioned between the user and a location identified in the location based information.
  7. The apparatus of claim 1, wherein the first virtual source location is proximate a head (208) of the user and wherein the trajectory is a curved trajectory from the first virtual source location to the second virtual source location.
  8. The apparatus of claim 1, wherein the transition portion of the auditory cue comprises a natural language instruction indicating an action to be taken.
  9. A method comprising: providing for generation of an auditory cue (300); determining a duration of the auditory cue; and providing for generation of a transition portion (306) of the auditory cue having a dynamic virtual source location (204, 316) moving, relative to a user (600), from a first virtual source location (202, 314) along a trajectory and ending at a second virtual source location (206, 318), wherein the transition portion of the auditory cue has a transition portion duration determined by the duration of the auditory cue, characterized in that the transition portion duration of the auditory cue does not exceed a predefined transition portion duration threshold.
  10. The method of claim 9, further comprising: generating a first portion (304) of the auditory cue in response to the duration of the auditory cue exceeding the predefined transition portion duration threshold, wherein the first portion of the auditory cue has a stationary virtual source location at the first virtual source location, wherein the first portion of the auditory cue does not exceed a predefined first portion duration threshold or has a duration substantially equal to the auditory cue duration less the predefined transition portion duration threshold.
  11. The method of claim 9, further comprising: generating a third portion (308) of the auditory cue in response to the duration of the auditory cue exceeding a total of the predefined transition portion duration threshold and the predefined first portion duration threshold, wherein the third portion of the auditory cue has a stationary virtual source location at the second virtual source location.
  12. The method of claim 11, wherein the third portion of the auditory cue has a duration that is equal to the duration of the auditory cue less the predefined first portion duration threshold and the predefined transition portion duration threshold.
  13. The method of claim 9, wherein the auditory cue comprises location based information for the user, wherein the first virtual source location is proximate a head (208) of the user, wherein the trajectory is a curved trajectory from the first virtual source location to the second virtual source location, and wherein the second virtual source location is a location positioned between the user and a location identified in the location based information.
  14. A computer program configured to perform the method of any preceding method claim.
  15. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code instructions stored therein, the computer-executable program code instructions comprising program code instructions to: receive an indication of location based information for a user (600); provide for generation of an auditory cue (300) associated with the location based information; determine a duration of the auditory cue; in response to the duration of the auditory cue being less than a predefined transition portion duration threshold: generate a transition portion (306) comprising the auditory cue having a dynamic virtual source location (204, 316) moving from a first virtual source location (202, 314) along a trajectory and ending at a second virtual source location (206, 318); in response to the duration of the auditory cue being greater than a predefined transition portion duration threshold: generate a first portion (304) of the auditory cue having a stationary virtual source location at the first virtual source location; and generate a transition portion (306) of the auditory cue having a duration equal to the predefined transition portion duration and having a dynamic virtual source location (204, 316) moving from a first virtual source location (202, 314) along a trajectory and ending at a second virtual source location (206, 318).
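The partitioning scheme in claims 1-4 and 15 can be sketched as a small function: if the cue fits within the transition threshold, the whole cue is rendered as the moving transition portion; otherwise a stationary first portion precedes a full-length transition, and any remaining time becomes a stationary third portion at the ending location. This is a minimal illustrative sketch, not the patented implementation; the threshold values, the `CuePortion` type, and the function name are all assumptions for the example.

```python
from dataclasses import dataclass

# Illustrative assumptions only -- the patent defines the thresholds as
# "predefined" but does not give concrete values.
FIRST_PORTION_MAX = 1.0  # predefined first portion duration threshold (s)
TRANSITION_MAX = 2.0     # predefined transition portion duration threshold (s)

@dataclass
class CuePortion:
    kind: str        # "first" (stationary at the first virtual source location),
                     # "transition" (moving along the trajectory), or
                     # "third" (stationary at the second virtual source location)
    duration: float  # seconds

def partition_cue(cue_duration: float) -> list[CuePortion]:
    """Split an auditory cue into portions per the claimed scheme."""
    # Claim 15: cue shorter than the transition threshold -> the entire
    # cue is the moving transition portion.
    if cue_duration <= TRANSITION_MAX:
        return [CuePortion("transition", cue_duration)]
    # Claims 2 and 10: a stationary first portion absorbs the excess, but
    # never exceeds its own predefined threshold.
    first = min(cue_duration - TRANSITION_MAX, FIRST_PORTION_MAX)
    portions = [CuePortion("first", first), CuePortion("transition", TRANSITION_MAX)]
    # Claims 3 and 4: whatever remains beyond both thresholds becomes a
    # stationary third portion at the second virtual source location.
    remainder = cue_duration - first - TRANSITION_MAX
    if remainder > 0:
        portions.append(CuePortion("third", remainder))
    return portions
```

With the assumed thresholds, a 1.5 s cue yields only a transition portion; a 4.0 s cue yields a 1.0 s first portion, a 2.0 s transition, and a 1.0 s third portion, matching claim 4's arithmetic (third = cue duration less both thresholds).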

Description

TECHNOLOGICAL FIELD

An example embodiment of the present invention relates generally to navigation assistance and user interface techniques, and more particularly, to a method, apparatus and computer program product for providing temporally-based spatial auditory cues to facilitate user interaction with navigational assistance or at least semi-autonomous vehicle control.

BACKGROUND

Maps have been used for centuries for providing route geometry and geographical information, while routes have conventionally been planned by hand along paths defined by the maps. Conventional paper maps including static images of roadways and geographic features from a snapshot in history have given way to digital maps presented on computers and mobile devices, and navigation has been enhanced through the use of graphical user interfaces. Digital maps and navigation can provide dynamic route guidance to users as they travel along a route. Further, dynamic map attributes such as route traffic, route conditions, and other dynamic map-related information may be provided to enhance the digital maps and facilitate navigation. Different map service providers along with different user interfaces (e.g., different mobile devices or different vehicle navigation systems) may result in nonuniform map and route guidance interfaces, which may not be intuitive or easily understood by a user, particularly one that is accustomed to a different type of map and navigation interface. Further, visual displays of route guidance instructions may not always be convenient or safe for a user to reference. As such, route guidance is often coupled with audible commands regarding maneuvers such as turns. However, these audible commands may be confusing or difficult to understand, for example, when provided in a complex intersection or when faced with multiple similar maneuver options.

US patent No. 10,477,338 discloses an apparatus caused to: receive an indication of location based information for a user; provide for generation of a first auditory cue, where generation of the first auditory cue may include generating an audio cue having a virtual source location using three-dimensional audio effects. Providing for generation of the first auditory cue may include causing the apparatus to: provide for generation of a beginning of the first auditory cue at a first virtual source location; and provide for generation of a transition phase of the first auditory cue moving the virtual source location from the first virtual source location along a trajectory and ending at a second virtual source location, where the second virtual source location is a location positioned between the user and a location identified in the location based information.

BRIEF SUMMARY

A method, apparatus, and computer program product are therefore provided for providing a user interface for navigation. Embodiments may provide an apparatus according to appended claim 1, a method according to appended claim 9, a computer program according to appended claim 14 and a computer program product according to appended claim 15.
BRIEF DESCRIPTION OF THE DRAWINGS

Having thus described certain example embodiments of the present invention in general terms, reference will hereinafter be made to the accompanying drawings which are not necessarily drawn to scale, and wherein:

Figure 1 is a block diagram of an apparatus according to an example embodiment of the present disclosure;
Figure 2 is a block diagram of a system of implementing route guidance on a navigation system according to an example embodiment of the present disclosure;
Figure 3 depicts an example environment and graphical representation of auditory cues according to an example embodiment of the present disclosure;
Figure 4 illustrates a timeline and spatial representation of an auditory cue provided to a user according to an example embodiment of the present disclosure;
Figure 5 illustrates another timeline and spatial representation of an auditory cue provided to a user according to an example embodiment of the present disclosure;
Figure 6 illustrates still another timeline and spatial representation of an auditory cue provided to a user according to an example embodiment of the present disclosure;
Figure 7 illustrates an auditory cue having a virtual source position proximate a head of a user according to an example embodiment of the present disclosure;
Figure 8 illustrates a transition portion of an auditory cue having a dynamic virtual source position that moves along a trajectory relative to a user according to an example embodiment of the present disclosure;
Figure 9 illustrates an auditory cue having a virtual source position in a direction of a point of interest or location according to an example embodiment of the present disclosure;
Figure 10 is a flowchart of a method for providing temporally-based spatial auditory cues to facilitate user interaction with navigational assistance or at least semi-autonomous vehicle control according to an example embodiment of the present disclosure.