
US-12626453-B2 - Method and arrangements for graphically visualizing data transfer in a 3D virtual environment

US12626453B2

Abstract

Method and arrangements ( 14; 15; 21; 500 ) for graphically visualizing a data transfer ( 12; 13 ) on a first device ( 21 ) for a first user ( 31 ) who, by means of the first device ( 21 ), is visually experiencing a 3D virtual environment ( 301 ) through a first person perspective, 1PP, field of view ( 302 ) from a first virtual location ( 371 ) in the 3D virtual environment ( 301 ). The first device ( 21 ) is configured to provide tracking between a real-world field of view orientation of the first user ( 31 ) and said 1PP field of view ( 302 ) in the 3D virtual environment ( 301 ). A data transfer ( 12; 13 ) is identified between a second device ( 21; 22 ) and a third device ( 23 ). The identified data transfer is graphically visualized ( 407 ), on the first device ( 21 ), in the 3D virtual environment ( 301 ) as a graphical flow ( 385; 385; 389 ) between a second virtual location ( 372; 371 ), associated with the second device ( 22 ), and a third virtual location ( 373 ), associated with the third device ( 23 ). The first user ( 31 ) can thereby, through said 1PP field of view, visually experience the data transfer in the 3D virtual environment ( 301 ).

Inventors

  • Peter Ökvist
  • Tommy Arngren

Assignees

  • TELEFONAKTIEBOLAGET LM ERICSSON (PUBL)

Dates

Publication Date
2026-05-12
Application Date
2021-05-07

Claims (20)

  1. A method for graphically visualizing data transfer on a first device for a first user that by means of the first device is visually experiencing a 3D virtual environment through a first person perspective (1PP) field of view from a first virtual location in the 3D virtual environment, wherein said first device is configured to provide tracking between a real world field of view orientation of the first user and said 1PP field of view in the 3D virtual environment, and wherein the method comprises: identifying a data transfer between a second device and a third device, and graphically visualizing, on the first user device, the identified data transfer in the 3D virtual environment as a graphical flow between a second virtual location, associated with the second device, in the 3D virtual environment and a third virtual location, associated with the third device, in the 3D virtual environment, wherein the first user through said 1PP field of view can visually experience the data transfer in the 3D virtual environment.
  2. The method of claim 1, wherein the method further comprises: obtaining information relating to said data transfer, which information identifies one or more of the following: a data rate of the data transfer, a type of data being transferred, a duration of the data transfer, a bit rate of the data transfer, a latency of the data transfer, data transfer capabilities of the second and/or third device relevant for said data transfer, a protection level associated with said data transfer, and a size of data segments used in the data transfer, and wherein said graphical visualization is further based on the obtained information.
  3. The method of claim 1, wherein said first device is located at a first physical location in the real world, said second device is located at a second physical location in the real world and said third device is located at a third physical location in the real world, and wherein the method further comprises: assigning said third virtual location to the third device based on at least said third physical location.
  4. The method of claim 3, wherein the method further comprises: computing, based on said first physical location and said third physical location, a real world direction that is a direction from the first physical location towards the third physical location where the third device is located, and wherein said third virtual location is assigned to the third device in the 3D virtual environment based on the computed real world direction.
  5. The method of claim 4, wherein said third virtual location is located in a virtual cardinal direction corresponding to said computed real world direction.
  6. The method of claim 5, wherein said virtual cardinal direction corresponds to said computed real world direction by being determined in relation to one or more visual cardinal directions visually indicated in and associated with the 3D virtual environment so that said virtual cardinal direction in relation to said one or more visual cardinal directions is the same cardinal direction as a real world cardinal direction corresponding to the computed real world direction.
  7. The method of claim 5, wherein said virtual cardinal direction corresponds to said computed real world direction by being the same as a real world cardinal direction corresponding to the computed real world direction.
  8. The method of claim 3, wherein the method further comprises: computing, based on said first physical location and said third physical location, a real world distance that is a distance between the first physical location and the third physical location, and wherein said third virtual location is assigned to the third device in the 3D virtual environment based on the computed real world distance.
  9. The method of claim 1, wherein the second device is the first device.
  10. The method of claim 1, wherein the second device and/or a second user thereof is graphically represented in the 3D virtual environment at said second virtual location that is different from said first virtual location.
  11. A non-transitory computer readable storage medium storing a program comprising instructions that, when executed by one or more processors, cause a device to perform the method of claim 1.
  12. An apparatus for graphically visualizing data transfer on a first device for a first user that by means of the first device is visually experiencing a 3D virtual environment through a first person perspective (1PP) field of view from a first virtual location in the 3D virtual environment, wherein said first device is configured to provide tracking between a real world field of view orientation of the first user and said 1PP field of view in the 3D virtual environment, and wherein said apparatus is configured to: identify a data transfer between a second device and a third device, and graphically visualize, on the first user device, the identified data transfer in the 3D virtual environment as a graphical flow between a second virtual location, associated with the second device, in the 3D virtual environment and a third virtual location, associated with the third device, in the 3D virtual environment, wherein the first user through said 1PP field of view can visually experience the data transfer in the 3D virtual environment.
  13. The apparatus of claim 12, further configured to: obtain information relating to said data transfer, which information identifies one or more of the following: a data rate of the data transfer, a type of data being transferred, a duration of the data transfer, a bit rate of the data transfer, a latency of the data transfer, data transfer capabilities of the second and/or third device relevant for said data transfer, a protection level associated with said data transfer, and a size of data segments used in the data transfer, and wherein said graphical visualization is further based on the obtained information.
  14. The apparatus of claim 12, wherein said first device is located at a first physical location in the real world, said second device is located at a second physical location in the real world and said third device is located at a third physical location in the real world, and wherein the apparatus is further configured to: assign said third virtual location to the third device based on at least said third physical location.
  15. The apparatus of claim 14, wherein the apparatus is further configured to: compute, based on said first physical location and said third physical location, a real world direction that is a direction from the first physical location towards the third physical location where the third device is located, and wherein said third virtual location is assigned to the third device in the 3D virtual environment based on the computed real world direction.
  16. The apparatus of claim 15, wherein said third virtual location is located in a virtual cardinal direction corresponding to said computed real world direction, and said virtual cardinal direction corresponds to said computed real world direction by being determined in relation to one or more visual cardinal directions visually indicated in and associated with the 3D virtual environment so that said virtual cardinal direction in relation to said one or more visual cardinal directions is the same cardinal direction as a real world cardinal direction corresponding to the computed real world direction.
  17. The apparatus of claim 15, wherein said third virtual location is located in a virtual cardinal direction corresponding to said computed real world direction, and said virtual cardinal direction corresponds to said computed real world direction by being the same as a real world cardinal direction corresponding to the computed real world direction.
  18. The apparatus of claim 14, wherein the apparatus is further configured to: compute, based on said first physical location and said third physical location, a real world distance that is a distance between the first physical location and the third physical location, and wherein said third virtual location is assigned to the third device in the 3D virtual environment based on the computed real world distance.
  19. The apparatus of claim 12, wherein the second device is the first device.
  20. The apparatus of claim 12, wherein the second device and/or a second user thereof is graphically represented in the 3D virtual environment at said second virtual location that is different from said first virtual location.
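Claims 4–8 describe computing a real-world direction and distance from the first device's physical location towards the third device's, and assigning the third virtual location accordingly. The sketch below is purely illustrative and not taken from the patent: the function names, the eight-way cardinal mapping, the flat virtual-scene axes and the 10-unit placement radius are all assumptions; one plausible reading uses the standard initial great-circle bearing between two latitude/longitude points.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def cardinal(bearing):
    """Map a bearing to one of eight cardinal/intercardinal labels."""
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return labels[int((bearing + 22.5) // 45) % 8]

def assign_virtual_location(first_pos, third_pos, radius=10.0):
    """Place the third device's virtual location on a circle around the
    first user's virtual location, in the virtual cardinal direction
    corresponding to the computed real-world direction (cf. claims 4-5).
    Assumes +z is 'virtual north' and +x is 'virtual east' in the scene."""
    b = bearing_deg(*first_pos, *third_pos)
    theta = math.radians(b)
    return (radius * math.sin(theta), 0.0, radius * math.cos(theta)), cardinal(b)
```

A distance-based variant (claim 8) would scale `radius` with the real-world distance between the first and third physical locations instead of using a fixed value.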

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a 35 U.S.C. § 371 National Stage of International Patent Application No. PCT/EP2021/062182, filed 2021 May 7.

TECHNICAL FIELD

Embodiments herein concern graphical visualization on a device for a user that by means of the device is visually experiencing a three dimensional (3D) virtual environment, such as a 3D extended reality (XR) environment, through a first person perspective, "1PP", field of view from a virtual location in the 3D virtual environment, and where the device is configured to provide tracking between a real world field of view orientation of the user and said 1PP field of view in the 3D virtual environment.

BACKGROUND

3D extended reality (XR) environments, such as 3D virtual reality (VR) or 3D augmented reality (AR) environments, are today well known and have been used in various applications on both mobile and/or stationary devices, e.g. computers or game consoles with connected VR headsets, smart phones with or without VR headset integration, etc. The applications have for example been for informative, educational and/or entertainment purposes. A 3D XR environment may in general be considered a mixed reality environment, that is, a mix of the real world and a virtual world that a user is experiencing with some connection therebetween. It may also be described as an environment where digital, or virtual, objects have been brought into the physical world, or where physical world objects have been brought into the virtual, or digital, world. A device configured to provide the experience of such a 3D virtual environment provides graphical visualization for a user that is using the device and by means of the device is visually experiencing the 3D virtual environment through a first person perspective (1PP) field of view from a virtual location in the 3D virtual environment. The means of the device providing the visual experience to the user may e.g. be or include a VR headset, XR or AR glasses or even contact lens(es), or a smartphone, just to give some examples.

An important subgroup of such applications and devices are those that utilize tracking between a real world field of view orientation of the user and said 1PP field of view in the 3D virtual environment. That is, when the user physically moves the head, device and/or eyes to change the field of view in the real world, this is tracked and the 1PP field of view in the virtual environment changes correspondingly. This is a typical way of providing a connection between the real world and the virtual world, enhancing the experience of the mix of the real and virtual worlds.

Virtual conference arenas or "rooms", where users connect for communication by means of text and/or voice and/or video with each other, have been around for even longer than VR environments, although the number of users and the time spent in videoconference meetings have dramatically increased lately. Many meetings are held remotely via teleconference, where teleconference may be considered to also include video conference. There are several different video conferencing solutions existing today, such as Microsoft® Teams™, Skype®, Zoom®, etc. They all offer ways to have virtual meetings that are fully distributed, or that connect groups of participants in conference rooms equipped with cameras, screens and microphones, or combinations thereof. There are also solutions, including e.g. some multiplayer computer games, where users meet and communicate in a 3D virtual environment, such as by means of VR headsets. Each user therein is typically represented by an avatar, e.g. a virtual person or game character, corresponding to a player in the game.

Solutions have also been presented where videoconferencing as above has been extended with 3D virtual meeting rooms or environments, in which the users are virtually represented and in which communication between users can take place; current examples are Spatial, AltspaceVR, MeetVR and RecRoom. This kind of solution is expected to be increasingly common in the future with more accessible, improved and more cost efficient devices and means, e.g. VR headsets, XR contact lenses, and similar, that enable users to participate in such meetings. This expected development will also lead to more and more situations and various applications where users are experiencing 3D virtual environments, such as 3D XR environments.

People that have experienced 3D virtual environments, e.g. through a VR headset, realize that there are both pros and cons with the experience of the mixed real and virtual worlds. It can be confusing, unpleasant or disturbing to the user in some aspects and situations that can occur, while at the same time this offers possibilities to inform users in many new and/or improved ways that are not possible in e.g. conventional 2D video conferencing and similar situations. It is desirable
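The tracking described above, where a change of the user's real-world field of view orientation produces a corresponding change of the 1PP field of view, could be modeled minimally as follows. This is a purely illustrative sketch, not the patent's implementation: the class name, the degree-based yaw/pitch representation and the pitch clamp are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Camera1PP:
    """First-person-perspective virtual camera whose orientation tracks
    the user's real-world head pose (hypothetical minimal model)."""
    yaw: float = 0.0    # degrees, clockwise from virtual north
    pitch: float = 0.0  # degrees, positive when looking up

    def on_head_pose(self, head_yaw, head_pitch):
        # Tracking: mirror the real-world orientation in the 1PP field
        # of view; normalize yaw and clamp pitch to avoid camera flip.
        self.yaw = head_yaw % 360.0
        self.pitch = max(-89.0, min(89.0, head_pitch))

cam = Camera1PP()
cam.on_head_pose(450.0, 120.0)  # e.g. readings from a headset IMU
```

In a real system the pose updates would arrive at display rate from the headset's sensors, with the renderer drawing each frame from the resulting camera orientation.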