CN-122023842-A - Intelligent clothing try-on matching method and system based on human body posture recognition
Abstract
The invention belongs to the technical field of intelligent clothing try-on and provides an intelligent clothing try-on matching method and system based on human body posture recognition. The method comprises: acquiring three-dimensional size data of the waist and abdomen region of a target try-on subject in a standard standing posture, together with elasticity parameters of the trouser material; acquiring a sample set containing try-on data of a plurality of try-on subjects with known body fat rates, the try-on data comprising three-dimensional size data of each subject's waist and abdomen region in the standard standing posture and in a preset target posture set; and establishing a posture-size change mapping model based on the subjects' waist and abdomen size change values under the different preset target postures. Using only the standard-standing-posture three-dimensional size data of the target subject, the method can accurately predict the waist and abdomen size changes under preset target postures such as sitting, bending, or squatting, solving the prior-art problem of inaccurate try-on matching caused by the inability to measure size data in those postures directly.
Inventors
- YANG ZIHE
Assignees
- 新疆理工职业大学
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2026-01-14
Claims (10)
- 1. An intelligent clothing try-on matching method based on human body posture recognition, characterized by comprising the following steps: acquiring three-dimensional size data of the waist and abdomen region of a target try-on subject in a standard standing posture, and elasticity parameters of the trouser material; acquiring a sample set, wherein the sample set comprises try-on data of a plurality of try-on subjects with known body fat rates, the try-on data comprising three-dimensional size data of each subject's waist and abdomen region in the standard standing posture and in a preset target posture set, and establishing a posture-size change mapping model based on the subjects' waist and abdomen size change values under different preset target postures; obtaining, from the posture-size change mapping model, a predicted size change value of the target subject's waist and abdomen region under a preset target posture according to the posture parameters corresponding to that posture; determining a soft tissue deformation correction coefficient from the differences in waist and abdomen size change values among subjects of different body fat rates under the same preset target posture, and correcting the predicted size change value; determining, based on a preset fabric stress-strain relation and a ring pressure model, the fabric deformation of the trouser waist and abdomen region under the different preset target postures from the three-dimensional size data, the corrected predicted size change value, and the trouser material elasticity parameters, and calculating the corresponding waist and abdomen contact pressure value; and outputting, based on the contact pressure values, a trouser comfort evaluation and overall fitting advice for the target subject under the different preset target postures.
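The final step of claim 1 maps the computed contact pressures to a comfort evaluation and fitting advice. A minimal sketch of that step is given below; the pressure thresholds, labels, and function names are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical comfort bands (kPa upper bounds) for waist-region
# contact pressure; the patent does not specify numeric thresholds.
COMFORT_BANDS_KPA = [
    (1.0, "comfortable"),
    (2.5, "acceptable"),
    (4.0, "tight"),
]

def comfort_rating(pressure_kpa):
    """Map a waist-region contact pressure (kPa) to a comfort label."""
    for upper, label in COMFORT_BANDS_KPA:
        if pressure_kpa <= upper:
            return label
    return "uncomfortable"

def fitting_advice(pressures_by_posture):
    """Overall suggestion from per-posture pressures (dict: posture -> kPa)."""
    worst = max(pressures_by_posture.values())
    if worst <= 2.5:
        return "fits well in all tested postures"
    tight = [p for p, v in pressures_by_posture.items() if v > 2.5]
    return "consider a larger size; tight in: " + ", ".join(sorted(tight))
```

Evaluating every preset target posture and reporting the worst case mirrors the claim's requirement that advice cover the subject's postures beyond standing.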
- 2. The intelligent clothing try-on matching method based on human body posture recognition according to claim 1, wherein the step of acquiring a sample set containing try-on data of a plurality of try-on subjects with known body fat rates, the try-on data comprising three-dimensional size data of each subject's waist and abdomen region in a standard standing posture and in a preset target posture set, and establishing a posture-size change mapping model based on the subjects' waist and abdomen size change values under different preset target postures, comprises the following steps: acquiring the sample set, wherein the sample set comprises try-on data of a plurality of try-on subjects with known body fat rates, and the try-on data comprise three-dimensional size data of each subject's waist and abdomen region in the standard standing posture and in the preset target posture set; comparing each subject's standard-standing-posture waist and abdomen three-dimensional size data with the waist and abdomen three-dimensional size data corresponding to each preset target posture, and calculating each subject's waist and abdomen size change values under the different preset target postures; identifying the posture parameters corresponding to each preset target posture, and classifying the size change values according to those posture parameters; and establishing, from the classified size change values, a mapping relation model between the posture parameters of the preset target postures and the waist and abdomen size change values.
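The mapping step of claim 2 can be sketched as an ordinary least-squares fit from posture parameters to waist-girth change. All names, the choice of hip/knee flexion angles as posture parameters, the linear model form, and every numeric value below are illustrative assumptions only.

```python
import numpy as np

# Hypothetical sample-set rows: (hip_flexion_deg, knee_flexion_deg) for a
# preset target posture, paired with the measured waist-girth change (cm)
# relative to the standard standing posture.
posture_params = np.array([
    [90.0, 90.0],    # sitting
    [70.0, 10.0],    # bending
    [110.0, 130.0],  # squatting
    [85.0, 80.0],    # relaxed sitting
])
girth_change_cm = np.array([3.2, 2.1, 4.5, 2.9])

# Fit a linear posture -> size-change mapping by least squares;
# a column of ones gives the model an intercept term.
X = np.hstack([posture_params, np.ones((len(posture_params), 1))])
coef, *_ = np.linalg.lstsq(X, girth_change_cm, rcond=None)

def predict_girth_change(hip_deg, knee_deg):
    """Predict the waist-girth change (cm) for a given posture."""
    return float(np.array([hip_deg, knee_deg, 1.0]) @ coef)
```

The patent only requires some mapping from posture parameters to size change; a nonlinear regressor could be substituted without changing the surrounding pipeline.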
- 3. The intelligent clothing try-on matching method based on human body posture recognition according to claim 1, wherein the step of determining a soft tissue deformation correction coefficient from the differences in waist and abdomen size change values among sample subjects of different body fat rates under the same preset target posture, and correcting the predicted size change value, comprises the following steps: comparing the waist and abdomen size change values of the sample subjects with different body fat rates under the same preset target posture to obtain a comparison result; judging from the comparison result whether the size change values differ across body fat rates under that posture; when a difference exists, drawing a change curve of waist and abdomen size change value against body fat rate; obtaining the target body fat rate of the target try-on subject, and taking the slope of the change curve at the target body fat rate as the soft tissue deformation correction coefficient for that body fat rate; and correcting, with the soft tissue deformation correction coefficient, the predicted size change values of the target subject's waist and abdomen region under the different preset target postures.
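Claim 3's correction coefficient is the slope of the fitted body-fat-rate change curve at the target body fat rate. A minimal sketch, assuming a quadratic curve form and invented sample values (the patent specifies neither):

```python
import numpy as np

# Hypothetical sample-set measurements: body fat rate (%) vs. waist-girth
# change (cm) in the sitting posture. All values are illustrative only.
body_fat_pct = np.array([12.0, 18.0, 24.0, 30.0, 36.0])
girth_change_cm = np.array([2.0, 2.6, 3.4, 4.4, 5.6])

# Fit a quadratic change curve (assumed form: deformation grows faster
# at higher body fat rates); its derivative gives the local slope.
poly = np.polyfit(body_fat_pct, girth_change_cm, deg=2)
dpoly = np.polyder(poly)

def soft_tissue_correction(target_fat_pct):
    """Slope of the change curve at the target body fat rate."""
    return float(np.polyval(dpoly, target_fat_pct))

def corrected_change(predicted_cm, target_fat_pct, ref_fat_pct=24.0):
    # One plausible use of the coefficient (assumption): shift the
    # prediction in proportion to the distance from a reference fat rate.
    k = soft_tissue_correction(target_fat_pct)
    return predicted_cm + k * (target_fat_pct - ref_fat_pct)
```

With these sample values the curve is exactly quadratic, so the slope rises smoothly with body fat rate, matching the claim's premise that higher-fat subjects deform more.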
- 4. The intelligent clothing try-on matching method based on human body posture recognition according to claim 1, wherein the step of determining, based on the preset fabric stress-strain relation and the ring pressure model, the fabric deformation of the trouser waist and abdomen region under different preset target postures from the three-dimensional size data, the corrected predicted size change value, and the trouser material elasticity parameters, and calculating the corresponding waist and abdomen contact pressure value, comprises the following steps: acquiring, from the three-dimensional size data of the target subject's waist and abdomen region in the standard standing posture, the initial fabric size of the trouser waist and abdomen region in that posture; determining, from the corrected predicted size change value, the predicted three-dimensional size data of the target subject's waist and abdomen region under each preset target posture, and determining the corresponding fabric deformation in combination with the trouser material elasticity parameters; determining the fabric tension corresponding to that deformation based on the elasticity parameters and the preset fabric stress-strain relation; and calculating, from the fabric tension and the preset ring pressure model, the contact pressure value of the target subject's waist and abdomen region under the different preset target postures.
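The pressure step of claim 4 can be sketched under two common simplifications: a linear fabric stress-strain relation (tension per unit width T = K · strain) and a circular ring pressure model following Laplace's law (P = T / r). The stiffness value, girths, and function names below are hypothetical, and the patent's actual stress-strain relation and ring model may be more elaborate.

```python
import math

# Hypothetical fabric tensile stiffness per unit width (N/m);
# real values would come from the trouser material elasticity parameters.
K_FABRIC = 2000.0

def contact_pressure_kpa(cloth_rest_girth_m, body_girth_m):
    """Contact pressure (kPa) of a stretched circular waistband.

    Assumes linear elasticity and a circular cross-section; a slack
    waistband (no stretch) exerts zero pressure.
    """
    strain = max(0.0, (body_girth_m - cloth_rest_girth_m) / cloth_rest_girth_m)
    tension = K_FABRIC * strain                 # N per metre of band width
    radius = body_girth_m / (2.0 * math.pi)    # ring model: girth -> radius
    return tension / radius / 1000.0           # Laplace's law, Pa -> kPa

# Standing vs. sitting: adding the corrected predicted girth change
# raises the strain and therefore the contact pressure.
standing = contact_pressure_kpa(0.78, 0.82)
sitting = contact_pressure_kpa(0.78, 0.82 + 0.034)
```

The monotone link from girth change to pressure is what lets the method rank postures by tightness before the comfort evaluation step.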
- 5. The intelligent clothing try-on matching method based on human body posture recognition according to claim 1, wherein the preset target posture set comprises a plurality of preset target postures that each cause the three-dimensional size change value of the waist and abdomen region to exceed a preset deformation threshold, the preset target postures comprising at least one of a sitting posture, a bending posture, or a squatting posture.
- 6. An intelligent clothing try-on matching system based on human body posture recognition, characterized in that the system comprises: a data acquisition module, used for acquiring three-dimensional size data of the waist and abdomen region of a target try-on subject in a standard standing posture and elasticity parameters of the trouser material; a mapping model establishing module, used for acquiring a sample set, wherein the sample set comprises try-on data of a plurality of try-on subjects with known body fat rates, the try-on data comprising three-dimensional size data of each subject's waist and abdomen region in the standard standing posture and in a preset target posture set, and for establishing a posture-size change mapping model based on the subjects' waist and abdomen size change values under different preset target postures; a size change prediction module, used for obtaining, from the posture-size change mapping model, a predicted size change value of the target subject's waist and abdomen region under a preset target posture according to the posture parameters corresponding to that posture; a soft tissue deformation correction module, used for determining a soft tissue deformation correction coefficient from the differences in waist and abdomen size change values among subjects of different body fat rates under the same preset target posture, and correcting the predicted size change value; a pressure calculation module, used for determining, based on the preset fabric stress-strain relation and the ring pressure model, the fabric deformation of the trouser waist and abdomen region under the different preset target postures from the three-dimensional size data, the corrected predicted size change value, and the trouser material elasticity parameters, and calculating the corresponding waist and abdomen contact pressure value; and a comfort output module, used for outputting, based on the contact pressure values, a trouser comfort evaluation and overall fitting advice for the target subject under the different preset target postures.
- 7. The intelligent clothing try-on matching system based on human body posture recognition according to claim 6, wherein the mapping model establishing module specifically comprises: a sample acquisition unit, used for acquiring the sample set, wherein the sample set comprises try-on data of a plurality of try-on subjects with known body fat rates, and the try-on data comprise three-dimensional size data of each subject's waist and abdomen region in a standard standing posture and in a preset target posture set; a size change calculation unit, used for comparing each subject's standard-standing-posture waist and abdomen three-dimensional size data with the waist and abdomen three-dimensional size data corresponding to each preset target posture, and calculating each subject's waist and abdomen size change values under the different preset target postures; a posture parameter classification unit, used for identifying the posture parameters corresponding to each preset target posture and classifying the size change values according to those parameters; and a mapping relation establishing unit, used for establishing, from the classified size change values, a mapping relation model between the posture parameters of the preset target postures and the waist and abdomen size change values.
- 8. The intelligent clothing try-on matching system based on human body posture recognition according to claim 7, wherein the soft tissue deformation correction module specifically comprises: a size change comparison unit, used for comparing the waist and abdomen size change values of sample subjects with different body fat rates under the same preset target posture to obtain a comparison result; a difference judging unit, used for judging from the comparison result whether the size change values differ across body fat rates under that posture; a curve fitting unit, used for drawing, when a difference exists, a change curve of waist and abdomen size change value against body fat rate; a correction coefficient determining unit, used for obtaining the target body fat rate of the target try-on subject and taking the slope of the change curve at the target body fat rate as the soft tissue deformation correction coefficient for that body fat rate; and a size correction unit, used for correcting, with the soft tissue deformation correction coefficient, the predicted size change values of the target subject's waist and abdomen region under the different preset target postures.
- 9. The intelligent clothing try-on matching system based on human body posture recognition according to claim 8, wherein the pressure calculation module specifically comprises: an initial size acquisition unit, used for acquiring, from the three-dimensional size data of the target subject's waist and abdomen region in the standard standing posture, the initial fabric size of the trouser waist and abdomen region in that posture; a deformation determining unit, used for determining, from the corrected predicted size change value, the predicted three-dimensional size data of the target subject's waist and abdomen region under each preset target posture, and determining the corresponding fabric deformation in combination with the trouser material elasticity parameters; a tension determining unit, used for determining the fabric tension corresponding to that deformation based on the elasticity parameters and the preset fabric stress-strain relation; and a pressure calculation unit, used for calculating, from the fabric tension and the preset ring pressure model, the contact pressure value of the target subject's waist and abdomen region under the different preset target postures.
- 10. The intelligent clothing try-on matching system based on human body posture recognition according to claim 9, wherein the preset target posture set comprises a plurality of preset target postures that each cause the three-dimensional size change value of the waist and abdomen region to exceed a preset deformation threshold, the preset target postures comprising at least one of a sitting posture, a bending posture, or a squatting posture.
Description
Intelligent clothing try-on matching method and system based on human body posture recognition

Technical Field

The invention belongs to the technical field of intelligent clothing try-on, and in particular relates to an intelligent clothing try-on matching method and system based on human body posture recognition.

Background

With the development of three-dimensional human body scanning, virtual fitting, and human body posture recognition technologies, intelligent clothing fitting systems based on three-dimensional human body models are gradually being applied to online clothing sales and clothing design. Existing intelligent try-on systems generally realize trouser appearance display or basic size recommendation by acquiring the three-dimensional size data of a try-on subject in a standard standing posture and combining it with clothing style information and a fabric model. However, such systems typically rely only on standard-standing-posture size data for matching and do not account for the size changes caused by human soft tissue deformation in target postures such as sitting, bending, or squatting during actual wear; as a result, the displayed fit differs markedly from the actual wearing experience, and the fit and comfort of the trousers in different postures are difficult to reflect faithfully. In the prior art, human body size changes in a target posture are obtained mainly by direct three-dimensional scanning or by estimation from a generic deformation model.
However, the waist and abdomen region in a target posture undergoes complex soft tissue deformation driven by skeletal motion, muscle contraction, and compression of fat tissue, and the three-dimensional scanning process is easily disturbed by posture occlusion, fabric occlusion, and the instability of dynamic deformation, so the corresponding three-dimensional size data are difficult to acquire accurately. Meanwhile, estimation based on a unified deformation model usually ignores the influence of body fat rate differences on soft tissue deformation amplitude, leading to large deviations in the predicted waist and abdomen sizes of subjects of different body types and failing to meet individualized try-on matching requirements. In addition, existing intelligent fitting technology generally focuses on appearance display or size recommendation, lacks the ability to quantitatively evaluate wearing comfort based on the elasticity of the trouser material and a human body mechanical model, and cannot output fitting advice that matches actual wearing experience according to the waist and abdomen contact pressure value. The prior art therefore struggles with three technical problems: target-posture size changes cannot be accurately predicted from standard standing posture data alone, the influence of body fat rate is ignored, and trouser comfort cannot be quantified, so the reliability of try-on matching results is insufficient.

Disclosure of Invention

The invention aims to provide an intelligent clothing try-on matching method and system based on human body posture recognition so as to solve the problems identified in the background above.
The invention is realized as follows. An intelligent clothing try-on matching method based on human body posture recognition comprises the following steps: acquiring three-dimensional size data of the waist and abdomen region of a target try-on subject in a standard standing posture, and elasticity parameters of the trouser material; acquiring a sample set, wherein the sample set comprises try-on data of a plurality of try-on subjects with known body fat rates, the try-on data comprising three-dimensional size data of each subject's waist and abdomen region in the standard standing posture and in a preset target posture set, and establishing a posture-size change mapping model based on the subjects' waist and abdomen size change values under different preset target postures; obtaining, from the posture-size change mapping model, a predicted size change value of the target subject's waist and abdomen region under a preset target posture according to the posture parameters corresponding to that posture; determining a soft tissue deformation correction coefficient from the differences in waist and abdomen size change values among subjects of different body fat rates under the same preset target posture, and correcting the predicted size change value; and determining, based on a preset fabric stress-strain relation and a ring pressure model, the fabric deformation of the trouser waist and abdomen region under the different preset target postures from the three-dimensional size data, the corrected predicted size change value, and the trouser material elasticity parameters, and calculating the corresponding waist and abdomen contact pressure value.