
EP-4735977-A1 - REAL-WORLD SENSORY STIMULI IN VIRTUAL WORLDS


Abstract

There are provided techniques for controlling how activation of a real-world sensory stimuli affects rendering of virtual worlds. A controller obtains an indication that a real-world sensory stimuli is scheduled to be activated at a real-world location by a sensory stimuli releasing device. The real-world location corresponds to a first virtual location in a first virtual world rendered at a first XR device and to a second virtual location in a second virtual world rendered at a second XR device. According to the indication, the scheduled activation of the real-world sensory stimuli is caused by a trigger. In response to having obtained the indication, the controller performs an action that affects at least one of: rendering of the first virtual world at the first XR device, rendering of the second virtual world at the second XR device.

Inventors

  • WIDMARK, Tobias
  • KRISTENSSON, ANDREAS

Assignees

  • Telefonaktiebolaget LM Ericsson (publ)

Dates

Publication Date
20260506
Application Date
20230627

Claims (18)

  1. A controller (200) for controlling how activation of a real-world sensory stimuli (150a, 150b) affects rendering of virtual worlds (120a, 120b), wherein the controller (200) is configured to communicate with at least a first XR device (110a), a second XR device (110b), and sensory stimuli releasing devices (140a, 140b), wherein the controller (200) comprises processing circuitry (210), and wherein the processing circuitry (210) is configured to cause the controller (200) to: obtain an indication that a real-world sensory stimuli (150a, 150b) is scheduled to be activated at a real-world location by one of the sensory stimuli releasing devices (140a, 140b), wherein the real-world location has a correspondence to a first virtual location in a first virtual world (120a) rendered at the first XR device (110a) and to a second virtual location in a second virtual world (120b) rendered at the second XR device (110b), and wherein, according to the indication, that the real-world sensory stimuli (150a, 150b) is scheduled to be activated is caused by a trigger; and in response thereto: perform an action that affects at least one of: rendering of the first virtual world (120a) at the first XR device (110a), rendering of the second virtual world (120b) at the second XR device (110b).
  2. The controller (200) according to claim 1, wherein the trigger is any of: an action taken in the first virtual world (120a) by a user (160a) of the first XR device (110a), a timer expiring.
  3. The controller (200) according to claim 2, wherein the action is based on a location of the second XR device (110b) relative to said real-world location.
  4. The controller (200) according to any preceding claim, wherein the controller (200) is configured to facilitate an operative connection between the first XR device (110a) and the second XR device (110b).
  5. The controller (200) according to any preceding claim, wherein the indication that the real-world sensory stimuli (150a, 150b) is scheduled to be activated is obtained as a request from the first XR device (110a), and wherein the processing circuitry (210) is configured to cause the controller (200) to: notify the second XR device (110b) that the real-world sensory stimuli (150a, 150b) is scheduled to be activated.
  6. The controller (200) according to claim 5, wherein the processing circuitry (210) is configured to cause the controller (200) to: request said one of the sensory stimuli releasing devices (140a, 140b) to activate the real-world sensory stimuli (150a, 150b).
  7. The controller (200) according to claim 2 and claim 5, wherein the processing circuitry (210) is configured to cause the controller (200) to: obtain a response from the second XR device (110b) requesting blocking of activation of the real-world sensory stimuli (150a, 150b), wherein the action is based on the response.
  8. The controller (200) according to claim 7, wherein the processing circuitry (210) is configured to cause the controller (200) to: notify the first XR device (110a) that the activation of the real-world sensory stimuli (150a, 150b) has been blocked.
  9. The controller (200) according to claim 2 and claim 8, wherein the action involves changing virtual content of the first virtual world (120a).
  10. The controller (200) according to claims 6 and 7, wherein the processing circuitry (210) is configured to cause the controller (200) to: notify the second XR device (110b) that the real-world sensory stimuli (150a, 150b) has been activated.
  11. The controller (200) according to claims 2, 6 and 7, or claims 2 and 10, wherein the action involves changing virtual content of the second virtual world (120b).
  12. The controller (200) according to claims 2, 6 and 7, or claims 2 and 10, wherein the virtual content is a map of the second virtual world (120b), and wherein the action involves changing the map to either guide a user of the second XR device (110b) towards said real-world location or away from said real-world location.
  13. The controller (200) according to claims 2, 6 and 7, or claims 2 and 10, wherein the virtual content is a storyline of the second virtual world (120b), and wherein the action involves changing the storyline to either include or exclude the real-world sensory stimuli (150a, 150b).
  14. The controller (200) according to any preceding claim, wherein at which real-world location the real-world sensory stimuli (150a, 150b) is scheduled to be activated and by which of the sensory stimuli releasing devices (140a, 140b) the real-world sensory stimuli (150a, 150b) is scheduled to be activated is based on real-world location information of the first XR device (110a) and real-world location information of the sensory stimuli releasing devices (140a, 140b).
  15. The controller (200) according to any preceding claim, wherein activation of the real-world sensory stimuli (150a, 150b) involves any, or any combination of: release a scent, play out an audio clip, play out a video clip, affect temperature of a radiator, start a fan, provide a haptic signal.
  16. The controller (200) according to any preceding claim, wherein the controller (200) is part of, integrated with, or collocated with, either the first XR device (110a) or the second XR device (110b), or is part of, integrated with, or collocated with, a server device.
  17. A computer program (920) for controlling how activation of a real-world sensory stimuli (150a, 150b) affects rendering of virtual worlds (120a, 120b), the computer program comprising computer code which, when run on processing circuitry (210) of a controller (200), wherein the controller (200) is configured to communicate with at least a first XR device (110a), a second XR device (110b), and sensory stimuli releasing devices (140a, 140b), causes the controller (200) to: obtain an indication that a real-world sensory stimuli (150a, 150b) is scheduled to be activated at a real-world location by one of the sensory stimuli releasing devices (140a, 140b), wherein the real-world location has a correspondence to a first virtual location in a first virtual world (120a) rendered at the first XR device (110a) and to a second virtual location in a second virtual world (120b) rendered at the second XR device (110b), and wherein, according to the indication, that the real-world sensory stimuli (150a, 150b) is scheduled to be activated is caused by a trigger; and in response thereto: perform an action that affects at least one of: rendering of the first virtual world (120a) at the first XR device (110a), rendering of the second virtual world (120b) at the second XR device (110b).
  18. A computer program product (910) comprising a computer program (920) according to claim 17, and a computer readable storage medium (930) on which the computer program is stored.
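
The message flow recited in the claims (notify other devices of a scheduled stimulus, honor blocking responses, otherwise request activation) can be illustrated with a minimal sketch, here in Python. All class and method names (`Controller`, `handle_indication`, `block_stimuli`, etc.) are illustrative assumptions for this sketch only and are not part of the claimed subject matter; the nearest-device selection loosely models claim 14's location-based choice of releasing device.

```python
import math
from dataclasses import dataclass, field


@dataclass
class ReleasingDevice:
    """A sensory stimuli releasing device (cf. 140a, 140b) at a real-world location."""
    device_id: str
    location: tuple

    def activate(self, stimulus: str) -> str:
        return f"{self.device_id} released {stimulus}"


@dataclass
class XRDevice:
    """An XR device (cf. 110a, 110b); block_stimuli models a user vetoing real-world stimuli."""
    device_id: str
    location: tuple
    block_stimuli: bool = False
    notifications: list = field(default_factory=list)

    def notify(self, message: str) -> None:
        self.notifications.append(message)


class Controller:
    """Coordinates activation of a real-world stimulus across XR devices (cf. 200)."""

    def __init__(self, releasers, xr_devices):
        self.releasers = releasers
        self.xr_devices = xr_devices

    def handle_indication(self, requester: XRDevice, stimulus: str) -> str:
        # Pick the releasing device nearest the requesting XR device
        # (cf. claim 14: selection based on real-world location information).
        releaser = min(self.releasers,
                       key=lambda r: math.dist(r.location, requester.location))
        others = [d for d in self.xr_devices if d is not requester]
        # Notify the other XR devices that activation is scheduled (cf. claim 5)
        # and collect any blocking responses (cf. claim 7).
        blocked = False
        for d in others:
            d.notify(f"scheduled: {stimulus}")
            if d.block_stimuli:
                blocked = True
        if blocked:
            # Tell the requester the activation was blocked (cf. claim 8); the
            # action degrades to changing virtual content only (cf. claim 9).
            requester.notify(f"blocked: {stimulus}")
            return "virtual-only rendering"
        # Otherwise request the selected device to activate the stimulus
        # (cf. claim 6) and notify the others it has been activated (cf. claim 10).
        result = releaser.activate(stimulus)
        for d in others:
            d.notify(f"activated: {stimulus}")
        return result
```

In this sketch, a request from the first device either triggers the nearest releaser (with the second device notified before and after activation) or, if the second user blocks real-world stimuli, falls back to a virtual-only rendering action.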

Description

REAL-WORLD SENSORY STIMULI IN VIRTUAL WORLDS

TECHNICAL FIELD

Embodiments presented herein relate to a method, a controller, a computer program, and a computer program product for controlling how activation of a real-world sensory stimuli affects rendering of virtual worlds.

BACKGROUND

In general terms, virtual reality (VR) is a simulated experience that can be similar to or completely different from the real world. Non-limiting examples of VR technologies can be found in entertainment applications (such as video games and movies), in education applications (such as medical or military training, robot navigation, construction modelling, and airplane simulation), and in business applications (such as virtual meetings or computer conferencing). VR can be combined with augmented reality technologies and mixed reality technologies, sometimes referred to as extended reality technologies. VR systems are examples of more general extended reality (XR) systems, which in addition to VR systems also include augmented reality (AR) systems and mixed reality (MR) systems. XR systems commonly use either headsets or multi-projected environments to generate realistic images, sounds, and other sensations that simulate a user's physical presence in a virtual environment defining the simulated experience. A user using a piece of XR equipment is able to look around in the simulated experience, move around in it, and interact with virtual features or items. The effect is commonly created by headsets comprising a head-mounted display with a small screen to be placed in front of the eyes of the user, but can also be created through specially designed rooms with multiple large screens. VR systems typically incorporate auditory and visual feedback, but may also allow other types of sensory and force feedback through haptic technologies.
While visual and auditory stimuli have been at the forefront of XR development, further immersion of the user in the virtual space might be achieved by providing stimuli that also affect other senses. As an illustrative example, attempts have been made to use olfactory stimuli (i.e., scents, smells, fragrances, etc.) to enhance audio-visual experiences, although often with limited success. Still, users who engage with XR systems that have an olfactory component congruent with the virtual environment report a higher level of immersion and satisfaction, even when the fragrances are light enough not to be consciously noticed.

In this respect, one issue concerns how to provide olfactory stimuli, or other types of real-world sensory stimuli such as audio and/or visual stimuli, to users without overloading or overwriting other senses. Taking olfactory stimuli as an example, the human olfactory system's sensitivity to smells makes human beings susceptible to virtual smells lingering when they should be gone. Likewise, if a scent is perceived as incongruent with other types of real-world sensory stimuli, immersion might even be lost.

Technology is available for fitting conventional XR headsets with olfactory devices. For example, an olfactory device can be attached as a peripheral device to some XR headsets. In one non-limiting example, such a peripheral device can be based on discs containing scent stored using micro-encapsulation techniques, possibly having scent guide pipes directing the scents to the user's nose. Some technologies are based on using a liquid perfume-based cartridge system, where a block with a limited number of scents is inserted into a device mounted on an XR headset. Scents can then be released as the user engages with the virtual environment. Since the cartridges require a perfume vial for each scent, only a few scents can be utilized at the same time.
Since the devices need to fit the cartridges along with the necessary mechanics, they tend to be significantly bulky as well, often being half the size of an ordinary XR headset. This limits not only the different scents that can be released but also how many times any given scent can be released. A further consequence of the above is that each user is limited to the scents that can be released by the olfactory device attached to their own XR headset. While these issues apply to XR devices with respect to handling of scents in particular, the same issues apply also to XR devices with respect to handling of other types of real-world sensory stimuli, such as auditory feedback, visual feedback, and sensory and force feedback.

SUMMARY

An object of embodiments herein is to address the above issues. A particular object is to take advantage of real-world sensory stimuli that is available for multiple users. A particular object is to enable communication between XR devices in the context of activation of real-world sensory stimuli. A particular object is to enable coordinated activation of real-world sensory stimuli for multiple XR devices. According to a first aspect the