US-12626392-B2 - Pose calculating apparatus and method
Abstract
A pose calculating apparatus and method are provided. The pose calculating apparatus receives a plurality of real-time images and a plurality of inertial measurement parameters corresponding to at least one inertial sensor worn by a user. The pose calculating apparatus determines a pose calculating mode corresponding to each of a plurality of body regions of the user based on the real-time images and the inertial measurement parameters, wherein the pose calculating mode corresponds to a static mode or a motion mode. The pose calculating apparatus calculates a pose corresponding to each of the body regions based on the pose calculating mode corresponding to each of the body regions.
Inventors
- Yen-Ting Liu
- Yu-Heng Hong
- Jia-Yau Shiau
Assignees
- HTC CORPORATION
Dates
- Publication Date
- 20260512
- Application Date
- 20230710
Claims (18)
- 1 . A pose calculating apparatus, comprising: a transceiver interface, being communicatively connected to an image capturing device and at least one inertial sensor; and a processor, being electrically connected to the transceiver interface, and being configured to perform the following operations: receiving a plurality of real-time images and a plurality of inertial measurement parameters corresponding to the at least one inertial sensor worn by a user; determining a pose calculating mode corresponding to each of a plurality of body regions of the user based on the real-time images and the inertial measurement parameters, wherein the pose calculating mode corresponds to a static mode or a motion mode; and calculating a pose corresponding to each of the body regions based on the pose calculating mode corresponding to each of the body regions; wherein each of the at least one inertial sensor corresponds to one of the body regions, and the processor is further configured to perform the following operations: periodically performing the following operations on one of the body regions corresponding to the motion mode based on a preset time interval: calculating a third drift calibration value corresponding to each of the body regions corresponding to the motion mode based on the real-time images, wherein each of the body regions corresponding to the motion mode comprises one of the at least one inertial sensor; and calibrating the inertial measurement parameters generated by each of the at least one inertial sensor based on the third drift calibration value.
- 2 . The pose calculating apparatus of claim 1 , wherein the processor is further configured to perform the following operations: in response to determining that a first body region among the body regions corresponds to the static mode, calculating the pose corresponding to the first body region based on the real-time images.
- 3 . The pose calculating apparatus of claim 2 , wherein the processor is further configured to perform the following operations: calculating at least one candidate pose corresponding to the first body region based on the real-time images; and in response to determining that a candidate pose amount of the at least one candidate pose corresponding to the first body region is greater than one, determining the pose corresponding to the first body region based on the inertial measurement parameters.
- 4 . The pose calculating apparatus of claim 3 , wherein the processor is further configured to perform the following operations: calculating a confidence value for the pose corresponding to the first body region; and in response to determining that the confidence value corresponding to the pose corresponding to the first body region is lower than a standard value, switching the pose calculating mode of the first body region to the motion mode.
- 5 . The pose calculating apparatus of claim 2 , wherein the processor is further configured to perform the following operations: calculating an exercise intensity value for the first body region based on the inertial measurement parameters; and in response to the exercise intensity value exceeding a threshold value corresponding to the first body region, switching the pose calculating mode of the first body region to the motion mode.
- 6 . The pose calculating apparatus of claim 2 , wherein the processor is further configured to perform the following operations: determining whether an occlusion state occurs in the first body region based on the real-time images; and in response to the occlusion state occurring in the first body region, switching the pose calculating mode of the first body region to the motion mode.
- 7 . The pose calculating apparatus of claim 1 , wherein the processor is further configured to perform the following operations: in response to determining that a second body region among the body regions corresponds to the motion mode, calculating the pose corresponding to the second body region based on the inertial measurement parameters.
- 8 . The pose calculating apparatus of claim 7 , wherein each of the at least one inertial sensor corresponds to one of the body regions, and the processor is further configured to perform the following operations: determining from the body regions whether a body part corresponding to the second body region is in a visible state based on the real-time images; in response to determining that the body part corresponding to the second body region is in the visible state, calculating a first drift calibration value corresponding to the body part based on the real-time images corresponding to the body part; and calibrating the inertial measurement parameters generated by a first inertial sensor corresponding to the second body region based on the first drift calibration value.
- 9 . The pose calculating apparatus of claim 7 , wherein each of the at least one inertial sensor corresponds to one of the body regions, and the processor is further configured to perform the following operations: determining from the body regions whether a first inertial sensor corresponding to the second body region is in a visible state based on the real-time images; in response to determining that the first inertial sensor corresponding to the second body region is in the visible state, calculating a second drift calibration value corresponding to the first inertial sensor based on the real-time images corresponding to the second body region; and calibrating the inertial measurement parameters generated by the first inertial sensor based on the second drift calibration value.
- 10 . A pose calculating method, being adapted for use in an electronic apparatus, wherein the pose calculating method comprises: receiving a plurality of real-time images and a plurality of inertial measurement parameters corresponding to at least one inertial sensor worn by a user; determining a pose calculating mode corresponding to each of a plurality of body regions of the user based on the real-time images and the inertial measurement parameters, wherein the pose calculating mode corresponds to a static mode or a motion mode; and calculating a pose corresponding to each of the body regions based on the pose calculating mode corresponding to each of the body regions; wherein each of the at least one inertial sensor corresponds to one of the body regions, and the pose calculating method further comprises the following steps: periodically performing the following operations on one of the body regions corresponding to the motion mode based on a preset time interval: calculating a third drift calibration value corresponding to each of the body regions corresponding to the motion mode based on the real-time images, wherein each of the body regions corresponding to the motion mode comprises one of the at least one inertial sensor; and calibrating the inertial measurement parameters generated by each of the at least one inertial sensor based on the third drift calibration value.
- 11 . The pose calculating method of claim 10 , wherein the pose calculating method further comprises the following steps: in response to determining that a first body region among the body regions corresponds to the static mode, calculating the pose corresponding to the first body region based on the real-time images.
- 12 . The pose calculating method of claim 11 , wherein the pose calculating method further comprises the following steps: calculating at least one candidate pose corresponding to the first body region based on the real-time images; and in response to determining that a candidate pose amount of the at least one candidate pose corresponding to the first body region is greater than one, determining the pose corresponding to the first body region based on the inertial measurement parameters.
- 13 . The pose calculating method of claim 12 , wherein the pose calculating method further comprises the following steps: calculating a confidence value for the pose corresponding to the first body region; and in response to determining that the confidence value corresponding to the pose corresponding to the first body region is lower than a standard value, switching the pose calculating mode of the first body region to the motion mode.
- 14 . The pose calculating method of claim 11 , wherein the pose calculating method further comprises the following steps: calculating an exercise intensity value for the first body region based on the inertial measurement parameters; and in response to the exercise intensity value exceeding a threshold value corresponding to the first body region, switching the pose calculating mode of the first body region to the motion mode.
- 15 . The pose calculating method of claim 11 , wherein the pose calculating method further comprises the following steps: determining whether an occlusion state occurs in the first body region based on the real-time images; and in response to the occlusion state occurring in the first body region, switching the pose calculating mode of the first body region to the motion mode.
- 16 . The pose calculating method of claim 10 , wherein the pose calculating method further comprises the following steps: in response to determining that a second body region among the body regions corresponds to the motion mode, calculating the pose corresponding to the second body region based on the inertial measurement parameters.
- 17 . The pose calculating method of claim 16 , wherein each of the at least one inertial sensor corresponds to one of the body regions, and the pose calculating method further comprises the following steps: determining from the body regions whether a body part corresponding to the second body region is in a visible state based on the real-time images; in response to determining that the body part corresponding to the second body region is in the visible state, calculating a first drift calibration value corresponding to the body part based on the real-time images corresponding to the body part; and calibrating the inertial measurement parameters generated by a first inertial sensor corresponding to the second body region based on the first drift calibration value.
- 18 . The pose calculating method of claim 16 , wherein each of the at least one inertial sensor corresponds to one of the body regions, and the pose calculating method further comprises the following steps: determining from the body regions whether a first inertial sensor corresponding to the second body region is in a visible state based on the real-time images; in response to determining that the first inertial sensor corresponding to the second body region is in the visible state, calculating a second drift calibration value corresponding to the first inertial sensor based on the real-time images corresponding to the second body region; and calibrating the inertial measurement parameters generated by the first inertial sensor based on the second drift calibration value.
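The per-region pipeline recited in claims 1 and 10 — a static or motion pose calculating mode per body region, pose computation from either the real-time images or the inertial measurement parameters, and periodic vision-based drift calibration for motion-mode regions — can be illustrated with a short sketch. This is a minimal illustration only, not the patented implementation: every class, method, and parameter name is hypothetical, the additive drift-correction model is an assumption, and the pose estimators are stubs.

```python
import time
from enum import Enum

class Mode(Enum):
    STATIC = 1   # pose derived from the real-time images (computer vision)
    MOTION = 2   # pose derived from the inertial measurement parameters

class PoseCalculator:
    """Minimal sketch of claims 1/10; all names here are hypothetical."""

    def __init__(self, regions, calib_interval_s=5.0, clock=time.monotonic):
        self.mode = {r: Mode.STATIC for r in regions}  # pose calculating mode per body region
        self.drift = {r: 0.0 for r in regions}         # current drift calibration value
        self.calib_interval_s = calib_interval_s       # the "preset time interval"
        self._clock = clock
        self._last_calib = clock()

    def step(self, images, imu_params):
        # Compute each region's pose with the source its current mode selects.
        poses = {}
        for region, mode in self.mode.items():
            if mode is Mode.STATIC:
                poses[region] = self.pose_from_images(region, images)
            else:
                poses[region] = self.pose_from_imu(region, imu_params)
        # Claims 1/10: periodically derive a drift calibration value from
        # vision for every motion-mode region and use it to correct the IMU.
        now = self._clock()
        if now - self._last_calib >= self.calib_interval_s:
            for region, mode in self.mode.items():
                if mode is Mode.MOTION:
                    self.drift[region] = self.estimate_drift(region, images)
            self._last_calib = now
        return poses

    def calibrate_if_visible(self, region, images, visible):
        # Claims 8/9: when the body part (or the worn sensor itself) is in a
        # visible state in the images, recalibrate drift from vision at once.
        if visible:
            self.drift[region] = self.estimate_drift(region, images)

    # Placeholder estimators so the sketch runs; a real system would use a
    # vision pose model and IMU integration here.
    def pose_from_images(self, region, images):
        return ("vision", region)

    def pose_from_imu(self, region, imu_params):
        # Simple additive drift correction (an assumption, not from the patent).
        return ("imu", region, imu_params.get(region, 0.0) - self.drift[region])

    def estimate_drift(self, region, images):
        return 0.1
```

A typical step would pass the latest camera frames and IMU readings; regions whose mode is static are posed by vision, regions in motion mode by the drift-corrected IMU values.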
Description
BACKGROUND

Field of Invention

The present invention relates to a pose calculating apparatus and method. More particularly, the present invention relates to a pose calculating apparatus and method capable of correspondingly adjusting the pose calculating mode of each of a plurality of body regions.

Description of Related Art

In recent years, various technologies related to interaction have developed rapidly, and various applications related to interaction have been proposed one after another. In the prior art, the pose of the user can be calculated by analyzing real-time images (e.g., determining the pose by computer vision) or by an inertial sensor worn by the user (e.g., determining the pose from the inertial measurement parameters). However, since the detection frequency of the real-time images is relatively low (e.g., 30 frames per second), this approach is not suitable when the user is exercising. In addition, since computer vision is calculated based on the content of the real-time images, pose calculation errors are easily caused by environmental problems such as the user's body parts being occluded, insufficient ambient light, and insufficient contrast. Although the detection frequency of the inertial sensor is relatively high (e.g., 60 frames per second), the accuracy of pose calculation decreases over time because the inertial sensor generates drift values that cause errors, and the drift values accumulate over time. Furthermore, the process of resetting the inertial sensor is cumbersome, and the user's movement may cause the position of the inertial sensor to shift, thus reducing the accuracy of the inertial sensor. Accordingly, there is an urgent need for a pose calculating technology that can correspondingly adjust the pose calculating mode of each of a plurality of body regions.

SUMMARY

An objective of the present disclosure is to provide a pose calculating apparatus.
The pose calculating apparatus comprises a transceiver interface and a processor. The transceiver interface is communicatively connected to an image capturing device and at least one inertial sensor, and the processor is electrically connected to the transceiver interface. The processor receives a plurality of real-time images and a plurality of inertial measurement parameters corresponding to the at least one inertial sensor worn by a user. The processor determines a pose calculating mode corresponding to each of a plurality of body regions of the user based on the real-time images and the inertial measurement parameters, wherein the pose calculating mode corresponds to a static mode or a motion mode. The processor calculates a pose corresponding to each of the body regions based on the pose calculating mode corresponding to each of the body regions.

Another objective of the present disclosure is to provide a pose calculating method, which is adapted for use in an electronic apparatus. The pose calculating method comprises the following steps: receiving a plurality of real-time images and a plurality of inertial measurement parameters corresponding to at least one inertial sensor worn by a user; determining a pose calculating mode corresponding to each of a plurality of body regions of the user based on the real-time images and the inertial measurement parameters, wherein the pose calculating mode corresponds to a static mode or a motion mode; and calculating a pose corresponding to each of the body regions based on the pose calculating mode corresponding to each of the body regions.

According to the above descriptions, the pose calculating technology (at least including the apparatus and the method) provided by the present disclosure determines the pose calculating mode corresponding to each of the plurality of body regions of the user through the real-time images and the inertial measurement parameters.
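The switching mechanism between the static mode and the motion mode described in the claims (a confidence value below a standard value, an exercise intensity value above a per-region threshold, an occlusion state, and tie-breaking among multiple candidate poses) can be expressed as two small decision functions. This is a hedged sketch under assumed names and thresholds; none of the identifiers or default values come from the disclosure itself.

```python
def should_switch_to_motion(confidence, standard_value,
                            intensity, intensity_threshold, occluded):
    """Sketch of claims 4-6 / 13-15: a static-mode body region switches to
    the motion mode when the vision pose confidence falls below a standard
    value, the exercise intensity computed from the inertial measurement
    parameters exceeds the region's threshold, or an occlusion state occurs.
    All names and thresholds are hypothetical."""
    return (confidence < standard_value
            or intensity > intensity_threshold
            or occluded)

def resolve_candidates(candidate_poses, pick_with_imu):
    """Sketch of claims 3/12: when vision yields more than one candidate
    pose for a static-mode region, the inertial measurement parameters
    break the tie (here via a caller-supplied selector function)."""
    if len(candidate_poses) > 1:
        return pick_with_imu(candidate_poses)
    return candidate_poses[0]
```

In use, a tracker would evaluate `should_switch_to_motion` per region on every frame and fall back to `resolve_candidates` whenever the vision stage is ambiguous.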
In addition, the pose calculating technology provided by the present disclosure can select a suitable pose calculating mode to calculate the pose corresponding to each of the body regions through the switching mechanism of the static mode and the motion mode. The pose calculating technology provided by the present disclosure can correspondingly adjust the pose calculating mode of each of the plurality of body regions, thereby overcoming the disadvantage of the conventional technology that accuracy may decrease when the pose is determined based only on the real-time images or only on the inertial sensors. The detailed technology and preferred embodiments implemented for the subject disclosure are described in the following paragraphs accompanying the appended drawings for people skilled in this field to well appreciate the features of the claimed invention.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram depicting the structure of the pose calculating apparatus of the first embodiment; FIG. 2 is a schematic diagram depicting the operating environment of some embodiments; FIG. 3 is a schem