KR-20260065144-A - APPARATUS AND METHOD FOR MEASURING EXERCISE MOVEMENT


Abstract

An exercise motion measuring device according to one embodiment disclosed in this document comprises: an RGBD camera capable of capturing three-dimensional images; a plurality of IMU sensors each attached to a plurality of body parts related to the user’s exercise motion and capable of measuring the rotation angles of the body parts; and a processor functionally connected to the RGBD camera and the plurality of IMU sensors. The processor calculates reference angles of the plurality of IMU sensors corresponding to the user’s reference posture using the plurality of IMU sensors and the RGBD camera while the user assumes a static posture in a calibration mode of the plurality of IMU sensors, and in a user’s exercise motion measuring mode, measures the user’s exercise motion by comparing the rotation angles of the body parts calculated using the plurality of IMU sensors with the reference angles.
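The two modes summarized in the abstract reduce, at their core, to rotation arithmetic: the calibration mode stores a reference rotation per body part, and the measurement mode compares each IMU's current rotation against that reference. A minimal sketch of the comparison step using plain rotation matrices (all function names are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z-axis; used only to build example inputs."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def relative_rotation(r_current, r_reference):
    """Rotation of a body part away from its calibrated reference angle."""
    return np.linalg.inv(r_reference) @ r_current

def rotation_angle_deg(r):
    """Magnitude of a rotation matrix, in degrees, recovered from its trace."""
    cos_t = np.clip((np.trace(r) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_t))
```

For example, if an IMU on the forearm currently reads a 55° rotation and its calibrated reference is 10°, the relative movement angle is 45°, which the device could then compare against a target angle for the exercise.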

Inventors

  • 이동춘

Assignees

  • 한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)

Dates

Publication Date
2026-05-08
Application Date
2024-11-01

Claims (14)

  1. An exercise motion measuring device comprising: an RGBD camera capable of capturing three-dimensional images including depth image data; a plurality of IMU sensors, each attached to one of a plurality of body parts related to a user's exercise motion and capable of measuring a rotation angle of that body part; and a processor functionally connected to the RGBD camera and the plurality of IMU sensors, wherein the processor, in a calibration mode of the plurality of IMU sensors, calculates reference angles of the plurality of IMU sensors corresponding to a reference posture of the user, using the plurality of IMU sensors and the RGBD camera while the user assumes a static posture, and, in an exercise motion measurement mode, measures the user's exercise motion by comparing the rotation angles of the body parts calculated using the plurality of IMU sensors with the reference angles.
  2. The device of claim 1, wherein the processor calculates first rotation angles of each body part using the plurality of IMU sensors, generates three-dimensional image data including the plurality of body parts using the RGBD camera, calculates second rotation angles of each body part using the three-dimensional image data, converts the second rotation angles into an IMU coordinate system by multiplying each calculated second rotation angle by the rotation angle of the RGBD camera, and calculates the reference angles of the body parts corresponding to the reference posture by comparing the first rotation angles with the converted second rotation angles.
  3. The device of claim 1, wherein the processor calculates three-dimensional position values of the user's joint points from the image data based on an artificial intelligence model, and calculates, from the three-dimensional rotation values of the joint points, second rotation angles of the body parts corresponding to the degree to which each body part has rotated from the reference posture.
  4. The device of claim 3, wherein the processor calculates the second rotation angles of the body parts by multiplying the rotation value of the RGBD camera according to the reference coordinate system by the rotation value calculated for each body part.
  5. The device of claim 2, wherein the processor calculates the reference angles by multiplying the inverse matrix of the second rotation angles by the first rotation angles.
  6. The device of claim 1, further comprising a display that outputs exercise movements, wherein the processor, while an exercise motion is performed by the user, calculates current rotation angles of the body parts using the sensors, calculates relative rotation angles of the body parts by comparing the current rotation angles with the reference angles, and measures the exercise motion using the relative rotation angles.
  7. The device of claim 6, wherein the processor calculates a relative movement angle from the relative rotation angles and a joint connection structure, compares the relative movement angle with target movement angles related to the motion, and outputs, through the display, a guidance screen regarding the accuracy of the motion based on the comparison result.
  8. A method for measuring exercise motion using at least one processor, the method comprising: acquiring sensing values related to the motion of a plurality of body parts using a plurality of IMU sensors, each mounted on one of the plurality of body parts of a user, and acquiring three-dimensional image data including the body parts using an RGBD camera; calculating reference angles of each body part, according to a coordinate system of the plurality of IMU sensors, corresponding to a reference posture of the user, using the three-dimensional image data and the sensing values; and measuring the user's exercise motion by comparing the rotation angles of the body parts calculated using the plurality of IMU sensors with the reference angles.
  9. The method of claim 8, wherein the calculating comprises: calculating first rotation angles of each body part using the plurality of IMU sensors, generating three-dimensional image data including the plurality of body parts using the RGBD camera, and calculating second rotation angles of each body part using the image data; and calculating reference angles of the body parts corresponding to the reference posture by comparing the first rotation angles with the second rotation angles.
  10. The method of claim 8, wherein calculating the second rotation angles comprises: calculating three-dimensional rotation values of the user's joint points from the image data; and calculating second rotation angles of the body parts corresponding to the degree to which each body part has rotated from the reference posture, based on the calculated three-dimensional rotation values.
  11. The method of claim 10, wherein calculating the three-dimensional rotation values includes converting each of the calculated three-dimensional rotation values into an IMU coordinate system using the rotation value of the RGBD camera according to the coordinate system, and wherein calculating the second rotation angles corresponding to the degree of rotation comprises calculating the second rotation angles of the body parts using the converted three-dimensional rotation values.
  12. The method of claim 8, wherein calculating the reference angles comprises multiplying the inverse matrix of the second rotation angles by the first rotation angles.
  13. The method of claim 8, further comprising: calculating current rotation angles of the body parts using the sensors while an exercise movement is performed by the user, and calculating relative rotation angles of the body parts by comparing the current rotation angles with the reference angles; calculating a relative joint angle using the relative rotation angles based on a joint dependency structure; and generating and outputting a guidance screen regarding the accuracy of the movement by comparing stored target joint angles related to the movement with the relative joint angles.
  14. A method for measuring exercise motion using at least one processor, the method comprising: acquiring three-dimensional image data including a plurality of body parts related to a user's exercise movements using an RGBD camera; calculating three-dimensional position values of a plurality of target points for each body part using the three-dimensional image data; calculating a rotation angle for each body part relative to a reference posture of the user using the three-dimensional position values of the target points; and calculating, from the rotation angles for each body part, reference angles for each body part corresponding to the reference posture according to the coordinate system of the IMU sensors, using rotation angles calculated through IMU sensors mounted on each of the plurality of body parts, wherein the rotation angles calculated through the IMU sensors are time-synchronized with the three-dimensional image data, and the reference angles for each body part are used to measure the user's exercise motion.
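Claims 4, 5, and 12 describe the core calibration arithmetic: a body-part rotation derived from the RGBD camera is first mapped into the IMU coordinate system by multiplying by the camera's rotation, and the reference angle is then obtained by multiplying the inverse of that converted second rotation by the first (IMU-measured) rotation. A minimal sketch with rotation matrices (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z-axis; used only to build example inputs."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def to_imu_frame(r_body_cam, r_cam):
    """Claim 4: convert a camera-derived body rotation into the IMU
    coordinate system by multiplying by the RGBD camera's rotation."""
    return r_cam @ r_body_cam

def reference_rotation(r_first_imu, r_second_imu):
    """Claims 5 and 12: reference angle = inverse matrix of the converted
    second rotation multiplied by the first (IMU-measured) rotation."""
    return np.linalg.inv(r_second_imu) @ r_first_imu
```

Under this reading, the reference rotation absorbs any mounting offset of the sensor: if the camera and IMU agree exactly, the reference is the identity, and a sensor attached with a fixed angular offset yields exactly that offset as its reference.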

Description

Apparatus and Method for Measuring Exercise Movement

The various embodiments disclosed in this document relate to motion monitoring technology.

Stroke patients generally experience various motor impairments and undergo rehabilitation exercise therapy to overcome them. Conventional rehabilitation therapy, however, has patients simply and repeatedly follow a therapist's movements, which patients find tedious and difficult. Recently, much research has focused on incorporating game content into rehabilitation exercise therapy to relieve boredom and make the exercises enjoyable. In particular, VR/AR rehabilitation content, in which the patient plays game content while wearing an HMD (a VR/AR device), is an active research area. VR/AR rehabilitation content can track the user's location, gaze direction, and a tracker held in the user's hand, and can assist rehabilitation therapy by adapting the content to the location and gaze direction of the user wearing the HMD.

FIG. 1 shows an example implementation of a motion measurement device according to one embodiment. FIG. 2 shows a configuration diagram of a motion measurement device according to one embodiment. FIG. 3 shows an example of body parts on which IMU sensors are mounted according to one embodiment. FIG. 4 shows a schematic flowchart of a method for measuring movement according to one embodiment. FIG. 5 shows a flowchart of IMU sensor calibration according to one embodiment. FIG. 6 shows a flowchart of a method for measuring exercise motion in an exercise measurement mode according to one embodiment. FIG. 7 shows an example of a reference posture according to one embodiment. In the description of the drawings, the same or similar reference numerals may be used for identical or similar components.

Generally, user behavior can be captured in three main ways.
First, high-precision motion capture systems such as OptiTrack capture user movements very precisely but require a large space. Second, camera-based systems such as Kinect have lower precision and a short movement recognition distance but are easy to use. Third, IMU sensor-based motion capture offers medium precision and captures user movements from data received from IMU sensors attached to the user's body joints. The first two technologies rely on cameras, so motion capture fails whenever a joint is occluded from the camera's field of view. In contrast, IMU-based capture does not use a camera, so motion can be captured smoothly even when a joint is occluded. However, the IMU sensors must first be calibrated while attached to the user's key joint areas. IMU sensor calibration may be a step of collecting measurement data from the IMU sensors attached to each joint while the user assumes a reference posture (e.g., an N-pose or T-pose). During subsequent motion capture, the rotation values of each joint point can then be calculated by comparing the IMU sensor data with the reference posture data collected during calibration. However, for users who have difficulty assuming a standard posture, such as stroke patients with hemiplegia, rehabilitation content training techniques using IMU sensors may be difficult to apply. Furthermore, the attachment position or orientation of an IMU sensor on the user's body may vary with the subject being measured or the person attaching the sensor. Therefore, when capturing user motion with IMU sensors, it has conventionally always been necessary to calibrate the sensors with the user in an N-pose or T-pose.

FIG. 1 shows an example implementation of a motion measurement device according to one embodiment, FIG. 2 shows a configuration diagram of a motion measurement device according to one embodiment, and FIG. 3 shows an example of body parts on which IMU sensors are mounted according to one embodiment. Referring to FIGS. 1 and 2, a motion measuring device (100) according to one embodiment may include an RGBD camera (110), a plurality of IMU sensors (120), an input device (130), an output device (140), a memory (150), and a processor (160). In one embodiment, the motion measuring device (100) may omit some of these components or include additional components. Some of the components of the motion measuring device (100) may also be combined into a single entity that performs the same functions as the components before combination. The RGBD camera (110) may be a camera including an infrared camera and an RGB camera. Alternatively, the RGBD camera (110) may include a stereo camera and an infrared camera. The RGBD camera (110) can capture and generate three-dimensional images related to the user's actions under the control