
US-12628115-B2 - Tracking system and tracking method

US12628115B2

Abstract

A tracking system is provided. The tracking system includes a first tracking device, a second tracking device, and a wearable tracking device. The first tracking device is disposed on a vehicle and is configured to obtain map information and first measurement information. The second tracking device is disposed on the vehicle and is configured to obtain second measurement information. The wearable tracking device is disposed on a user in the vehicle and is configured to obtain third measurement information. Further, the wearable tracking device is configured to obtain local position information of the user based on the map information, the first measurement information, the second measurement information, and the third measurement information. Furthermore, the local position information indicates a user position of the user within the vehicle.

Inventors

  • Kuan-Hsun Yu
  • Wen-Shan Yang

Assignees

  • HTC CORPORATION

Dates

Publication Date
2026-05-12
Application Date
2023-01-12

Claims (18)

  1. A tracking system, comprising: a first tracking device, disposed on a vehicle and configured to obtain map information and first measurement information; a second tracking device, disposed on the vehicle and configured to obtain second measurement information; and a wearable tracking device, disposed on a user in the vehicle and configured to obtain third measurement information, wherein the wearable tracking device is configured to obtain global position information of the vehicle based on the map information, obtain a tracker-vehicle pose relationship based on the first measurement information and the second measurement information, and obtain a user-tracker pose relationship based on the second measurement information and the third measurement information, wherein the wearable tracking device is further configured to obtain local position information of the user based on the global position information, the tracker-vehicle pose relationship, and the user-tracker pose relationship, wherein the local position information indicates a user position of the user within the vehicle.
  2. The tracking system according to claim 1, wherein the map information comprises a simultaneous localization and mapping (SLAM) map, and the wearable tracking device is configured to: obtain global position information of the vehicle based on the SLAM map.
  3. The tracking system according to claim 2, wherein the SLAM map is obtained according to a plurality of exterior images outside the vehicle through an exterior camera of the first tracking device.
  4. The tracking system according to claim 2, wherein the SLAM map is obtained through a light detection and ranging (LiDAR) device or a global positioning system (GPS) device.
  5. The tracking system according to claim 1, wherein the first tracking device comprises a first inertial measurement unit (IMU) sensor and the first measurement information comprises a first inertial measurement value, the second tracking device comprises a second IMU sensor and the second measurement information comprises a second inertial measurement value, and the wearable tracking device comprises a third IMU sensor and the third measurement information comprises a third inertial measurement value.
  6. The tracking system according to claim 5, wherein each of the first inertial measurement value, the second inertial measurement value, and the third inertial measurement value comprises changes in six degrees of freedom, and the six degrees of freedom comprise three translation values corresponding to three perpendicular axes and three rotation values corresponding to the three perpendicular axes.
  7. The tracking system according to claim 6, wherein the wearable tracking device is configured to: obtain the tracker-vehicle pose relationship based on a tracker-vehicle difference between the three rotation values of the first inertial measurement value and the three rotation values of the second inertial measurement value; and obtain the user-tracker pose relationship based on a user-tracker difference between the three rotation values of the second inertial measurement value and the three rotation values of the third inertial measurement value.
  8. The tracking system according to claim 1, wherein the second tracking device is configured to provide a tracker pattern, the wearable tracking device comprises an interior camera, and the wearable tracking device is configured to: obtain, from the interior camera, a plurality of interior images, wherein at least one of the plurality of interior images comprises an image of the tracker pattern; and obtain the user-tracker pose relationship based on the tracker pattern, the second measurement information, and the third measurement information.
  9. The tracking system according to claim 8, wherein the tracker pattern comprises at least one of a predetermined pattern, an AVA group of the University of Cordoba (ArUco) marker, and a light emitting device.
  10. A tracking method, comprising: obtaining, through a first tracking device disposed on a vehicle, map information and first measurement information; obtaining, through a second tracking device disposed on the vehicle, second measurement information; obtaining, through a wearable tracking device disposed on a user in the vehicle, third measurement information; obtaining global position information of the vehicle based on the map information; obtaining a tracker-vehicle pose relationship based on the first measurement information and the second measurement information; obtaining a user-tracker pose relationship based on the second measurement information and the third measurement information; and obtaining local position information of the user based on the global position information, the tracker-vehicle pose relationship, and the user-tracker pose relationship, wherein the local position information indicates a user position of the user within the vehicle.
  11. The tracking method according to claim 10, wherein the map information comprises a simultaneous localization and mapping (SLAM) map, and the tracking method further comprises: obtaining global position information of the vehicle based on the SLAM map.
  12. The tracking method according to claim 11, further comprising: obtaining the SLAM map according to a plurality of exterior images outside the vehicle through an exterior camera of the first tracking device.
  13. The tracking method according to claim 11, further comprising: obtaining the SLAM map through a light detection and ranging (LiDAR) device or a global positioning system (GPS) device.
  14. The tracking method according to claim 10, wherein the first tracking device comprises a first inertial measurement unit (IMU) sensor and the first measurement information comprises a first inertial measurement value, the second tracking device comprises a second IMU sensor and the second measurement information comprises a second inertial measurement value, and the wearable tracking device comprises a third IMU sensor and the third measurement information comprises a third inertial measurement value.
  15. The tracking method according to claim 14, wherein each of the first inertial measurement value, the second inertial measurement value, and the third inertial measurement value comprises changes in six degrees of freedom, and the six degrees of freedom comprise three translation values corresponding to three perpendicular axes and three rotation values corresponding to the three perpendicular axes.
  16. The tracking method according to claim 15, further comprising: obtaining the tracker-vehicle pose relationship based on a tracker-vehicle difference between the three rotation values of the first inertial measurement value and the three rotation values of the second inertial measurement value; and obtaining the user-tracker pose relationship based on a user-tracker difference between the three rotation values of the second inertial measurement value and the three rotation values of the third inertial measurement value.
  17. The tracking method according to claim 10, further comprising: providing, through the second tracking device, a tracker pattern; obtaining, from an interior camera of the wearable tracking device, a plurality of interior images, wherein at least one of the plurality of interior images comprises an image of the tracker pattern; and obtaining the user-tracker pose relationship based on the tracker pattern, the second measurement information, and the third measurement information.
  18. The tracking method according to claim 17, wherein the tracker pattern comprises at least one of a predetermined pattern, an AVA group of the University of Cordoba (ArUco) marker, and a light emitting device.
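The pose composition recited in claims 1 and 10 can be sketched in code: the user's local pose in the vehicle frame follows from composing the tracker-vehicle pose relationship with the user-tracker pose relationship. The following is a minimal illustration using 4x4 homogeneous transforms and NumPy; the function names, the fixed-axis Euler convention, and the example values are assumptions made for illustration and are not taken from the patent.

```python
import numpy as np

def pose_matrix(translation, rotation_deg):
    """Build a 4x4 homogeneous transform from a translation (x, y, z) and
    Euler rotations (roll, pitch, yaw) in degrees about fixed axes,
    applied in z-y-x order. This mirrors the claims' six degrees of
    freedom: three translation values and three rotation values."""
    rx, ry, rz = np.radians(rotation_deg)
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    R = (np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]) @
         np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]) @
         np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]]))
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = translation
    return T

def local_user_pose(tracker_vehicle, user_tracker):
    """Compose the tracker-vehicle and user-tracker pose relationships to
    express the user's pose in the vehicle frame; the translation column
    of the result is the user position within the vehicle."""
    return tracker_vehicle @ user_tracker

# Hypothetical example: the tracker is mounted 1 m ahead of the vehicle
# origin, and the wearable device sits 0.5 m to the tracker's left.
T_tracker_in_vehicle = pose_matrix([1.0, 0.0, 0.0], [0.0, 0.0, 0.0])
T_user_in_tracker = pose_matrix([0.0, 0.5, 0.0], [0.0, 0.0, 0.0])
T_user_in_vehicle = local_user_pose(T_tracker_in_vehicle, T_user_in_tracker)
user_position = T_user_in_vehicle[:3, 3]  # user position within the vehicle
```

For global positioning, the same composition would be applied once more with the vehicle's global pose from the SLAM map on the left. The rotation-difference calibration of claims 7 and 16 would correspond to estimating the rotation parts of these transforms from differences between the respective IMU readings.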

Description

BACKGROUND

Technical Field

The disclosure relates to a tracking system; particularly, the disclosure relates to a tracking system and a tracking method.

Description of Related Art

In order to bring an immersive experience to users, various technologies, such as augmented reality (AR) and virtual reality (VR), are constantly being developed. AR technology allows users to bring virtual elements into the real world. VR technology allows users to enter a whole new virtual world to experience a different life. Wearable devices are often used to provide this kind of immersive experience. In addition, to provide an immersive in-vehicle user experience, a wearable device may be further integrated with a vehicle. For this integration, the wearable device may obtain information from the vehicle by communicating with the advanced driver assistance system (ADAS). That is, the wearable device may obtain engine status from the engine of the vehicle or tracking information from the tracking system of the vehicle. However, the setup of such an integration is usually complicated and not user-friendly. Further, the tracking system of the vehicle might be expensive.

SUMMARY

The disclosure is directed to a tracking system and a tracking method, so as to track a position of a wearable device inside a vehicle.

In this disclosure, a tracking system is provided. The tracking system includes a first tracking device, a second tracking device, and a wearable tracking device. The first tracking device is disposed on a vehicle and is configured to obtain map information and first measurement information. The second tracking device is disposed on the vehicle and is configured to obtain second measurement information. The wearable tracking device is disposed on a user in the vehicle and is configured to obtain third measurement information. Further, the wearable tracking device is configured to obtain local position information of the user based on the map information, the first measurement information, the second measurement information, and the third measurement information. Furthermore, the local position information indicates a user position of the user within the vehicle.

In this disclosure, a tracking method is provided. The tracking method includes: obtaining, through a first tracking device disposed on a vehicle, map information and first measurement information; obtaining, through a second tracking device disposed on the vehicle, second measurement information; obtaining, through a wearable tracking device disposed on a user in the vehicle, third measurement information; and obtaining local position information of the user based on the map information, the first measurement information, the second measurement information, and the third measurement information, wherein the local position information indicates a user position of the user within the vehicle.

Based on the above, according to the tracking system and the tracking method, a user-friendly and low-cost integration of the wearable device with the vehicle is achieved. To make the aforementioned more comprehensible, several embodiments accompanied with drawings are described in detail as follows.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.

FIG. 1 is a schematic diagram of a tracking system according to an embodiment of the disclosure.
FIG. 2 is a schematic diagram of a vehicle positioning scenario according to an embodiment of the disclosure.
FIG. 3A is a schematic diagram of a calibration scenario according to an embodiment of the disclosure.
FIG. 3B is a schematic diagram of a tracking scenario according to an embodiment of the disclosure.
FIG. 4 is a schematic diagram of a conversion scenario according to an embodiment of the disclosure.
FIG. 5 is a schematic flowchart of a tracking method according to an embodiment of the disclosure.
FIG. 6 is a schematic flowchart of a tracking method according to an embodiment of the disclosure.

DESCRIPTION OF THE EMBODIMENTS

Reference will now be made in detail to the exemplary embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Whenever possible, the same reference numbers are used in the drawings and the description to refer to the same or like components. Certain terms are used throughout the specification and appended claims of the disclosure to refer to specific components. Those skilled in the art should understand that electronic device manufacturers may refer to the same components by different names. This article does not intend to distinguish those components with the same function but different names. In the following description and claims, the wo