KR-20260064775-A - Exercise analysis method and system using unmanned aerial vehicles and wearable devices
Abstract
A motion analysis method according to one embodiment of the present disclosure is performed on a service server linked with an unmanned aerial vehicle and a wearable device, and comprises: receiving sensing data for a target object from the wearable device; receiving flight collection data, including a captured image of the target object, from the unmanned aerial vehicle; analyzing a degree of motion of the target object based on the sensing data and the image data; and generating degree-of-motion information for the target object and follow-up action information thereon.
Inventors
- 최재호
Assignees
- 최재호
Dates
- Publication Date
- 2026-05-08
- Application Date
- 2024-10-29
Claims (10)
- A motion analysis method using an unmanned aerial vehicle and a wearable device, performed on a service server linked with the unmanned aerial vehicle and the wearable device, the method comprising: receiving sensing data for a target object from the wearable device; receiving flight collection data, including a captured image of the target object, from the unmanned aerial vehicle; analyzing a degree of motion of the target object based on the sensing data and the image data; and generating degree-of-motion information for the target object and follow-up action information thereon.
- The method of claim 1, wherein receiving the flight collection data comprises: setting one of the target objects as a tracking target and requesting tracking shooting of the set target object; and receiving target image data for the target object, generated by the unmanned aerial vehicle while tracking the target object.
- The method of claim 2, further comprising: generating a three-dimensional skeleton of the target object from the target image data, and performing body shape and posture analysis of the target object based on the three-dimensional skeleton.
- The method of claim 3, wherein performing body shape and posture analysis of the target object based on the three-dimensional skeleton comprises: analyzing the body shape of the target object; analyzing the posture of the target object based on movement of the three-dimensional skeleton; and performing a comprehensive growth prediction for the target object by combining the body shape and the posture.
- The method of claim 1, wherein analyzing the degree of motion of the target object comprises: setting the sensing data as a first element and the flight collection data as a second element; performing temporal synchronization between the first element and the second element; performing basic preprocessing on the data collected from each of the first element and the second element; and calculating the degree of motion by assigning an individual weight to each of the preprocessed first-element data and the preprocessed second-element data and reflecting the weighted data.
- The method of claim 5, wherein the individual weights are dynamically and variably set based on the movement of the target object.
- The method of claim 5, wherein generating the degree-of-motion information for the target object and the follow-up action information thereon comprises: storing a plurality of follow-up action output models, each labeled differently according to the degree of motion; and determining the one follow-up action output model corresponding to the calculated degree of motion, and setting a follow-up action using the determined model.
- The method of claim 2, wherein requesting tracking shooting of the target object comprises: identifying target objects in the captured image; specifying at least one target object among the identified target objects; and generating visual feature information for the at least one target object and providing it to the unmanned aerial vehicle, wherein the unmanned aerial vehicle flies to identify and track the target object based on the visual feature information.
- A computer-readable recording medium storing a computer program which, in combination with hardware, performs the motion analysis method of any one of claims 1 to 8.
- A motion analysis system using an unmanned aerial vehicle and a wearable device, comprising: a wearable device attached to a target object and generating sensing data for the target object; an unmanned aerial vehicle generating flight collection data including a captured image of the target object; and a service server that receives the sensing data and the image data, analyzes a degree of motion of the target object based on the sensing data and the image data, and generates degree-of-motion information for the target object and follow-up action information thereon.
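The posture analysis of claim 4 operates on the three-dimensional skeleton of claim 3. A common primitive for such analysis is the angle at a joint formed by three skeleton keypoints. The following is a minimal Python sketch of that primitive; the function name and the choice of keypoints are illustrative assumptions, not part of the claimed method.

```python
import math

Point = tuple[float, float, float]

def joint_angle(a: Point, b: Point, c: Point) -> float:
    """Angle at joint b (in degrees) formed by skeleton keypoints a-b-c.

    Example: with a = hip, b = knee, c = ankle, this gives knee flexion.
    """
    # Vectors from the joint toward the two neighboring keypoints.
    v1 = tuple(a[i] - b[i] for i in range(3))
    v2 = tuple(c[i] - b[i] for i in range(3))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))
```

Tracking such angles over time across frames of the target image data is one plausible basis for the claimed posture analysis.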
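Claims 5 and 6 describe temporal synchronization of the wearable (first element) and UAV (second element) streams, followed by a weighted combination. A minimal Python sketch of that pipeline follows; the nearest-neighbor pairing, the tolerance value, and the default weights are illustrative assumptions. Per claim 6, the weights could instead be varied dynamically with the target object's movement.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Sample:
    t: float      # timestamp in seconds
    value: float  # preprocessed scalar feature (e.g., acceleration or image-derived speed)

def synchronize(first: list[Sample], second: list[Sample],
                tol: float = 0.05) -> list[tuple[float, float]]:
    """Pair each first-element sample with the nearest-in-time second-element sample."""
    times = [s.t for s in second]  # assumed sorted by timestamp
    pairs = []
    for s in first:
        i = bisect_left(times, s.t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(second)]
        j = min(candidates, key=lambda j: abs(second[j].t - s.t))
        if abs(second[j].t - s.t) <= tol:  # drop samples with no close counterpart
            pairs.append((s.value, second[j].value))
    return pairs

def degree_of_motion(pairs: list[tuple[float, float]],
                     w_sensor: float = 0.6, w_image: float = 0.4) -> float:
    """Weighted combination of synchronized sensor and image features."""
    if not pairs:
        return 0.0
    return sum(w_sensor * a + w_image * b for a, b in pairs) / len(pairs)
```

For example, two wearable samples at t = 0.0 and t = 1.0 pair with UAV samples at t = 0.01 and t = 1.02 within the 50 ms tolerance, and the weighted average of the paired values yields a single degree-of-motion score.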
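Claim 7 stores multiple follow-up action output models under labels keyed to the degree of motion, then selects the one matching the calculated degree. The sketch below illustrates that selection; the labels, thresholds, and advice strings are all hypothetical placeholders, and real output models could be anything from templates to trained networks.

```python
# Hypothetical labels and output models; a real system would store trained models here.
FOLLOW_UP_MODELS = {
    "low":    lambda d: f"low activity ({d:.1f}): suggest additional light exercise",
    "medium": lambda d: f"moderate activity ({d:.1f}): maintain current routine",
    "high":   lambda d: f"high activity ({d:.1f}): suggest rest and recovery",
}

def label_for(degree: float) -> str:
    """Map a calculated degree of motion to a stored model label (illustrative thresholds)."""
    if degree < 1.0:
        return "low"
    if degree < 3.0:
        return "medium"
    return "high"

def follow_up_action(degree: float) -> str:
    """Select the output model matching the degree and generate a follow-up action."""
    return FOLLOW_UP_MODELS[label_for(degree)](degree)
```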
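Claim 8 has the server hand the UAV visual feature information so it can re-identify the tracking target. One simple, self-contained form such a feature could take is a coarse color histogram compared by histogram intersection; this sketch assumes that representation purely for illustration, since the patent does not specify the feature type.

```python
def color_histogram(pixels, bins: int = 4) -> list[float]:
    """Coarse RGB histogram as a compact visual feature vector.

    pixels: iterable of (r, g, b) tuples with channel values in 0-255.
    """
    hist = [0] * (bins ** 3)
    n = 0
    for r, g, b in pixels:
        idx = ((r * bins) // 256) * bins * bins + ((g * bins) // 256) * bins + (b * bins) // 256
        hist[idx] += 1
        n += 1
    return [h / n for h in hist] if n else [0.0] * (bins ** 3)

def match_score(f1: list[float], f2: list[float]) -> float:
    """Histogram intersection; 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(f1, f2))

def identify_target(candidate_features: list[list[float]],
                    target_feature: list[float],
                    threshold: float = 0.7):
    """Return the index of the candidate best matching the target feature, or None."""
    scores = [match_score(f, target_feature) for f in candidate_features]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None
```

In this reading, the UAV would compute a feature for each detected person per frame and keep following whichever detection `identify_target` returns.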
Description
Exercise analysis method and system using unmanned aerial vehicles and wearable devices

The present disclosure relates to motion analysis technology; according to various embodiments, a method and system for motion analysis using an unmanned aerial vehicle and a wearable device are disclosed.

Exercise and various physical activities are known to have a wide range of effects on the body, and accordingly diverse exercise and physical activities are carried out from childhood. However, precise analysis of the effects of such exercise has typically required very expensive equipment, which is a significant limitation. Accordingly, there is a broad need to develop video-based exercise analysis. For individuals whose bodies are still developing, such as children and adolescents, there is also a need to predict and analyze growth based on exercise analysis; in particular, demand is growing for simple and convenient analysis that does not require expensive equipment.

The information described above may be provided as related art for the purpose of aiding understanding of the present disclosure. No claim or determination is made as to whether any of the foregoing is applicable as prior art with respect to the present disclosure.

FIG. 1 is a conceptual diagram illustrating a motion analysis system using an unmanned aerial vehicle and a wearable device according to one embodiment of the present disclosure. FIG. 2 is a diagram illustrating an example of a motion analysis method using an unmanned aerial vehicle and a wearable device, performed in the motion analysis system illustrated in FIG. 1. FIG. 3 is a flowchart illustrating an example of a motion degree analysis method performed in the motion analysis system illustrated in FIG. 1. FIG. 4 is a flowchart illustrating an example of a follow-up action setting method performed in the motion analysis system illustrated in FIG. 1.
FIG. 5 is a flowchart illustrating an example of a posture analysis and growth prediction method performed in the motion analysis system illustrated in FIG. 1. FIG. 6 is a flowchart illustrating an example of a three-dimensional skeleton generation method performed in the motion analysis system illustrated in FIG. 1. FIG. 7 is a flowchart illustrating an example of a method for specifying and capturing a target object, performed in the motion analysis system illustrated in FIG. 1. FIGS. 8 and 9 are diagrams illustrating examples of an unmanned aerial vehicle tracking and photographing a target object. FIG. 10 is a diagram illustrating an exemplary computing operating environment of a wearable device according to one embodiment of the present disclosure. FIG. 11 is a diagram illustrating an exemplary computing operating environment of an unmanned aerial vehicle according to one embodiment of the present disclosure. FIG. 12 is a diagram illustrating an exemplary computing operating environment of a service server according to one embodiment of the present disclosure.

Hereinafter, embodiments of the present disclosure are described in detail with reference to the drawings so that those skilled in the art can easily practice them. However, the present disclosure may be embodied in various different forms and is not limited to the embodiments described herein. In the description of the drawings, the same or similar reference numerals may be used for identical or similar components, and descriptions of well-known functions and configurations may be omitted for clarity and brevity. The various embodiments of this document and the terms used therein are not intended to limit the technical features described herein to specific embodiments, and should be understood to include various modifications, equivalents, or substitutions of said embodiments.
In connection with the description of the drawings, similar reference numerals may be used for similar or related components. The singular form of a noun corresponding to an item may include one or more of said items unless the relevant context clearly indicates otherwise. In this document, phrases such as "A or B," "at least one of A and B," "at least one of A or B," "A, B or C," "at least one of A, B and C," and "at least one of A, B, or C" may each include any one of the items listed together in the corresponding phrase, or all possible combinations thereof. Terms such as "first" and "second," or "1st" and "2nd," may be used simply to distinguish one component from another and do not limit the components in any other aspect (e.g., importance or order). Where one (e.g., first) component is referred to as "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively," it means that the one component may be connected to the other component directly (e.g., via a wire), wirelessly, o