EP-4735982-A1 - USAGE OF A MOBILE DEVICE OR INTEGRATED DEVICE SUPPORTING AND/OR ENABLING MOTION SYNCHRONIZED EXPERIENCES ON A MOVING PLATFORM
Abstract
The disclosure relates to a method (100) for operating at least one experience device (4) in a movable and/or moving platform (1) using at least one portable device (2) and/or at least one integrated device (3), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is limited or fixed, the method comprises the steps of: - performing at least one measurement (110) to determine a pose and/or motion and/or acceleration of the platform (1); - leveraging the measurement (120) to enable a motion-synchronized experience on the at least one experience device (4); and - operating (130) the at least one experience device (4) based on the measurement.
Inventors
- Lochmann, Gerrit
- Schweisshelm, Jakob Julian
- Nobili, Simona
Assignees
- Holoride Technologies Group Pte. Ltd.
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2024-06-27
Claims (19)
- 1. A method (100) for operating at least one experience device (4) in a movable and/or moving platform (1) using at least one portable device (2) and/or at least one integrated device (3), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is limited or fixed, the method comprising the steps of: - performing at least one measurement (110) to determine a pose and/or motion and/or acceleration of the platform (1); - leveraging the measurement (120) to enable a motion-synchronized experience on the at least one experience device (4); and - operating (130) the at least one experience device (4) based on the measurement.
- 2. A method (200) for operating a portable device (2) and/or an integrated device (3) in a movable and/or moving platform (1), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is motion-limited or fixed, and wherein the portable device (2) and/or integrated device (3) supports at least one experience device (4) being located in the platform (1) to determine a pose and/or a movement and/or an acceleration of the respective experience device (4) relative to the platform (1).
- 3. The method (100, 200) according to any of the preceding claims, wherein the movable and/or moving platform (1) comprises at least one of: a bike, a rollercoaster, an industrial vehicle, a car, a bus, a train, a truck, a plane, a helicopter, and/or a ship, or a similar moving platform.
- 4. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) comprises: - a smartphone, - a tablet PC, - a smart watch, and/or - a multi-functional device.
- 5. The method (100, 200) according to any of the preceding claims, wherein the integrated device (3) comprises at least one integrated sensor within or attached to the platform (1), selected from the group consisting of an inertial measurement unit (IMU), global navigation satellite system (GNSS) receiver, camera, optical sensor, magnetometer, accelerometer, wheel sensor, and steering sensor, and wherein the integrated device (3) is configured to gather data from these sensors to determine the pose and/or movement and/or acceleration of the platform (1).
- 6. The method (100, 200) according to any of the preceding claims, wherein the experience device (4) comprises a unit for presenting audio and/or visual content, including spatial and/or VR content and/or AR content and/or MR content, or a display unit for 2D content, or a vehicle infotainment system capable of rendering content based on data received from the portable device (2) and/or integrated device (3), in particular in a motion-synchronized manner.
- 7. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) performs loading and pre-processing of map data, generating a virtual scene and/or experience and/or audio and/or rendered images and/or virtual elements from the map data, and streaming the pre-processed content to the at least one experience device (4).
- 8. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) enables, for multiple experience devices (4-x) within the platform (1), synchronized spatial audio and/or location- and/or motion-aware audio and/or visual experiences based on the relative poses and motion of the experience devices (4-x).
- 9. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) integrates data from platform sensors (5) to enhance the accuracy and/or redundancy and/or robustness of the user experience when using the at least one experience device (4).
- 10. The method (100, 200) according to any of the preceding claims 6 to 9, wherein computational tasks for audio and/or visual content are dynamically distributed between the portable device (2) and/or an integrated device (3) and the at least one experience device (4) based on computational load and task complexity.
- 11. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) automatically activates and calibrates the experience device (4) upon detecting its fixed position within the platform (1) using sensor heuristics and/or NFC elements.
- 12. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) acts as a hub, gathering data from multiple sources and distributing it to the at least one experience device (4).
- 13. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) incorporates weather and/or environmental data from internet services and/or from internal and/or peripheral sensors, and/or objects detected by those sensors into the at least one experience device (4), providing a realistic representation and/or artistic re-interpretation of the current conditions; and wherein the portable device (2) and/or an integrated device (3) integrates data from external data sources, such as internet services or the like, to enhance the experience.
- 14. The method (100, 200) according to any of the preceding claims, wherein pose data describing the pose of the portable device (2) and/or the integrated device (3) can be determined and/or adjusted based on user input on the device's display, and/or platform display, and/or the experience device display.
- 15. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) ensures correct positioning of virtual avatars in a multi-user experience based on the relative pose and/or motion of the at least one experience device (4) and the body poses tracked by those devices.
- 16. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) is mounted and/or placed and/or held in various poses within the platform (1), including fixed mounts, handheld positions, and/or semi-rigid attachments and/or placements.
- 17. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) automatically detects its mount pose within the platform (1) and adjusts the calibration and data processing accordingly.
- 18. A portable device (2) that is configured to perform a method according to any of the preceding method claims.
- 19. A computer program having program code or program means, wherein, if the computer program is executed on a computer or a computer-based processing unit, the computer program is stored on a computer readable medium, wherein the program code or the program means causes the computer or the computer-based processing unit to execute a method according to any of the preceding method claims.
Description
Usage of a mobile device or integrated device supporting and/or enabling motion-synchronized experiences on a moving platform

An XR headset, as an example of an experience device, can be designed as a head-mounted device that can display or present VR content (VR - virtual reality) and/or AR content (AR - augmented reality) and/or MR content (MR - mixed reality) to the user wearing the XR headset, in particular by providing a pose-aware and motion-synchronized output to the eyes of the user. As such, XR is a subset of all spatial media output devices, in the following referred to as “experience devices”, that replace or augment real-world signals with pose-aware and motion-synchronized virtual signals, in the following referred to as “spatial content”. Spatial audio headphones or a 2D screen that functions as a portable camera or window into a virtual 3D space are other examples of experience devices and are equally taken into account.

For synchronizing movements of the user’s head with the shown spatial content, the experience device can comprise at least one sensor that provides a sensor signal that is correlated with a pose (position and/or spatial orientation/rotation) and/or motion and/or acceleration of the experience device in space / the surroundings. The at least one sensor can for example comprise at least one camera and/or an IMU (inertial measurement unit).

Operating such a spatial media output device in a moving platform, in particular in a vehicle (like a passenger vehicle or a truck or a passenger bus or a motorbike) or a plane or a ship/boat, comes with the technical problem that the at least one sensor of the experience device will sense both the movements of the user inside the platform and the movements of the platform in the environment in the same way. These two types of position/movement need to be distinguished when presenting spatial content to the user via a pose- and motion-tracked experience device.
Otherwise, when the platform moves in a curve with the user sitting still inside the platform, the spatial content will be changed in the same way as when the user turns the device inside the platform with world-reference tracking. In the case of vehicle-reference tracking, the motion relative to the world is unknown. The two types are preferably separated into information on the pose of the vehicle in relation to its environment / surroundings and information on the pose of the experience device in relation to the platform.

Thus, there is a need in the state of the art to provide a method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or integrated device that can overcome, at least partially, the disadvantages of the above-described approaches. Therefore, it is an objective of the disclosure to provide such a method, and a corresponding portable device, each of which is suitable for enriching the known state of the art.

According to a first aspect, the disclosure relates to a method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or at least one integrated device. A movement and/or pose of the portable device and/or integrated device relative to the platform is limited or fixed. This feature can enhance the accuracy, reliability, and overall performance of the system in which the portable and/or integrated device operates and/or support the usage of further devices. By limiting or fixing the movement and/or pose of the portable device and/or integrated device relative to the platform, the portable device and/or integrated device can achieve higher positional accuracy.
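The separation described above, i.e. recovering the device pose relative to the platform from the platform's pose in the world and the device's pose in the world, can be sketched for a simplified planar (2D) case. This is an illustrative sketch only, not part of the claimed method; the function name, the (x, y, yaw) pose representation, and the planar simplification are all assumptions made for clarity:

```python
import math

def relative_pose_2d(platform_pose, device_pose):
    """Express a device's world pose in the platform's reference frame.

    Both poses are (x, y, yaw) tuples in the world frame; yaw is in
    radians. Hypothetical helper illustrating the frame separation
    described in the text, assuming purely planar motion.
    """
    px, py, pyaw = platform_pose
    dx, dy, dyaw = device_pose
    # Translate the device position into the platform origin ...
    tx, ty = dx - px, dy - py
    # ... then rotate by the negative platform yaw to enter its frame.
    c, s = math.cos(-pyaw), math.sin(-pyaw)
    rel_x = c * tx - s * ty
    rel_y = s * tx + c * ty
    # Relative heading, wrapped to (-pi, pi].
    rel_yaw = (dyaw - pyaw + math.pi) % (2 * math.pi) - math.pi
    return rel_x, rel_y, rel_yaw

# Example: platform at (10, 0) heading +y; device 1 m ahead of it with
# the same heading. In the platform frame the device sits 1 m forward.
print(relative_pose_2d((10.0, 0.0, math.pi / 2),
                       (10.0, 1.0, math.pi / 2)))
```

In the full 3D case the same composition would typically be expressed with rotation matrices or quaternions (conceptually, the inverse platform transform applied to the device transform), so that content rendered with vehicle-reference tracking remains unaffected by the platform's motion through the world.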
This stabilization can reduce errors caused by unintended movements or vibrations, ensuring that the devices maintain a consistent and precise location within the platform. The stabilization of the devices can lead to improved quality of sensor data: sensors such as accelerometers, gyroscopes, and cameras can operate more effectively when the device is held in a stable position, reducing noise and enhancing the fidelity of measurements and observations.

Fixing or limiting the movement of the device can facilitate more consistent calibration. A stable device position can simplify the calibration process, ensuring that once the device is calibrated relative to the platform, it remains accurate over time, thus reducing the need for frequent recalibrations. Further, when the movement and/or pose of the device is controlled, the integration of data from multiple sources (e.g., GNSS, IMUs, cameras, lasers) can be more reliable. The consistency in device positioning can enable more accurate data fusion, leading to better overall system performance.

In the meaning of the disclosure, the experience device refers to a digital o