EP-4736126-A1 - METHOD AND APPARATUS OF PATTERNS FOR CLOTHING SIMULATION USING NEURAL NETWORK MODEL
Abstract
A clothing simulation method and apparatus are provided. The clothing simulation method includes: obtaining pattern information for each of the patterns of a garment, the pattern information including information about sample points extracted from each of the patterns; predicting sewing information about the patterns based on an embedding vector for each of the patterns, the embedding vector being obtained by applying the pattern information to a pattern embedding model trained to estimate a correlation between input pattern information; and generating a simulation result of the garment based on the sewing information.
Inventors
- KANG, DONG SIG
- JU, EUN JUNG
Assignees
- CLO Virtual Fashion Inc.
Dates
- Publication Date: 2026-05-06
- Application Date: 2024-08-20
Claims (15)
- A clothing simulation method, comprising: obtaining pattern information for each of patterns of a garment, the pattern information including information about sample points extracted from each of the patterns; predicting sewing information about the patterns based on an embedding vector for each of the patterns obtained by applying the pattern information to a pattern embedding model trained to estimate a correlation between input pattern information; and based on the sewing information, generating a simulation result of the garment.
- The clothing simulation method of claim 1, further comprising: extracting a predetermined number of sample points from each of the patterns of the garment; and based on a position of a predetermined type of a point included in each of the patterns, adjusting positions of the sample points.
- The clothing simulation method of claim 1, wherein the obtaining of the pattern information for each of the patterns comprises obtaining information about sample points of a target pattern, based on a length of an outline from a reference point of the target pattern among the patterns of the garment to each sample point extracted from the target pattern.
- The clothing simulation method of claim 1, wherein: the pattern embedding model comprises a transformer encoder, and the predicting of the sewing information comprises obtaining information indicating a sewing line of a pattern pair extracted from the patterns by applying, to a transformer decoder, an embedding vector pair corresponding to the pattern pair.
- The clothing simulation method of claim 1, wherein the pattern embedding model comprises: a first transformer encoder trained to estimate a correlation between sample points extracted from a same pattern from the input pattern information; and a second transformer encoder trained to estimate a correlation between patterns from encoding data for each of the patterns obtained from embedding vectors of sample points output from the first transformer encoder.
- The clothing simulation method of claim 5, wherein the predicting of the sewing information comprises: obtaining encoding data of a first pattern including an embedding vector for each sample point corresponding to the first pattern by applying pattern information of the first pattern to the first transformer encoder; obtaining encoding data of a second pattern including an embedding vector for each sample point corresponding to the second pattern by applying pattern information of the second pattern to the first transformer encoder; obtaining an embedding vector of the first pattern and an embedding vector of the second pattern by applying the encoding data of the first pattern and the encoding data of the second pattern to the second transformer encoder; and predicting sewing information about a pattern pair of the first pattern and the second pattern by applying the embedding vector of the first pattern and the embedding vector of the second pattern to a transformer decoder.
- The clothing simulation method of claim 1, wherein the sewing information comprises a pair of lines sewn together within the patterns.
- The clothing simulation method of claim 7, wherein the sewing information further includes information indicating a sewing direction of the lines.
- The clothing simulation method of claim 4, wherein the information indicating the sewing line of the pattern pair comprises: information about sample points that are extracted from a first pattern of the pattern pair and correspond to a starting point and an end point of the sewing line included in the first pattern; and information about sample points that are extracted from a second pattern of the pattern pair and correspond to a starting point and an end point of the sewing line included in the second pattern.
- The clothing simulation method of claim 1, wherein the pattern embedding model is trained based on a loss based on a difference between the predicted sewing information and ground truth.
- The clothing simulation method of claim 1, wherein: the predicting of the sewing information about the patterns comprises predicting arrangement information about the patterns and the sewing information, based on an embedding vector for each of the patterns, and the generating of the simulation result of the garment comprises generating the simulation result of the garment, based on the sewing information and the arrangement information.
- The clothing simulation method of claim 11, wherein the pattern embedding model is trained based on a loss based on a difference between the predicted sewing information and ground truth, and a loss based on a difference between the predicted arrangement information and ground truth.
- The clothing simulation method of claim 1, wherein the information about the sample points comprises at least one of: information indicating positions of the sample points within a pattern of the garment; information indicating positions of the sample points within the garment; information indicating a positional relationship between adjacent ones of the sample points; and information indicating a type of the sample points.
- A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method of claim 1.
- A clothing simulation apparatus comprising: one or more processors; and memory storing instructions thereon that, when executed by the one or more processors, cause the one or more processors to: obtain pattern information for each of patterns of a garment, the pattern information including information about sample points extracted from each of the patterns; predict sewing information about the patterns based on an embedding vector for each of the patterns obtained by applying the pattern information to a pattern embedding model trained to estimate a correlation between input pattern information; and based on the sewing information, generate a simulation result of the garment.
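The patent discloses no implementation, but the two-stage encoding recited in claims 5 and 6 can be illustrated with a minimal NumPy sketch: single-head self-attention stands in for each transformer encoder, mean pooling produces one token per pattern from its sample-point embeddings, and a pairwise score matrix stands in for the transformer decoder that predicts sewing relations between pattern pairs. All dimensions, weights, and the pooling/scoring choices are illustrative assumptions, not the patented model.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention over the rows of x,
    # standing in for a full transformer encoder layer.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
d = 16  # illustrative embedding width
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))  # toy shared weights

# Stage 1 (cf. claim 5, first transformer encoder): relate the sample
# points extracted from the same pattern. Here: 4 patterns, 32 sample
# points each, with random features in place of real pattern information.
patterns = [rng.normal(size=(32, d)) for _ in range(4)]
point_encodings = [self_attention(p, Wq, Wk, Wv) for p in patterns]

# Pool the per-point embeddings into one encoding vector per pattern.
pattern_tokens = np.stack([e.mean(axis=0) for e in point_encodings])

# Stage 2 (cf. claim 5, second transformer encoder): relate the patterns
# of the garment to one another.
pattern_embeddings = self_attention(pattern_tokens, Wq, Wk, Wv)

# Decoder stand-in (cf. claims 4 and 6): score every pattern pair for a
# sewing relation; a real decoder would also emit sewing-line endpoints.
pair_scores = pattern_embeddings @ pattern_embeddings.T
```

Because the pair scores come from a symmetric similarity, each candidate pattern pair (i, j) receives the same score as (j, i); the claimed transformer decoder instead consumes an embedding-vector pair and predicts directed sewing-line information, including the start and end sample points on each pattern.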
Description
METHOD AND APPARATUS OF PATTERNS FOR CLOTHING SIMULATION USING NEURAL NETWORK MODEL

Embodiments described herein relate to automatically placing patterns of clothing on a three-dimensional (3D) avatar for simulation.

A garment appears three-dimensional (3D) when worn by a person, but the garment is closer to a two-dimensional (2D) object because it is a combination of pieces of fabric cut according to 2D patterns. Because fabric, the material of a garment, is flexible, the shape of the fabric may vary depending on the body shape or movement of the person wearing the garment. In addition, different fabrics may have different physical properties (e.g., strength, elasticity, and shrinkage). Because of such differences, garments made from identically shaped patterns exhibit different behavior, look, and feel when donned on a 3D avatar.

In the garment industry, computer-based clothing simulation technology is widely used to develop actual clothing designs. During clothing simulation, a user typically arranges the clothing patterns of a garment manually at suitable positions on a 3D avatar. Such arrangement of patterns may take a great deal of time and is difficult for a user who lacks expertise in clothing design or simulation.

These and/or other aspects, features, and advantages of the invention will become apparent and more readily appreciated from the following description of embodiments, taken in conjunction with the accompanying drawings, of which:

FIG. 1 is a flowchart of automatically arranging patterns of a garment on a 3D avatar, according to an embodiment.
FIG. 2 is a diagram illustrating sample points of a pattern and information indicating positions of the sample points within the pattern, according to an embodiment.
FIG. 3 is a diagram illustrating sample point information, according to an embodiment.
FIGS.
4A and 4B are diagrams illustrating sewing information and automatically matched sewing parts, according to an embodiment.
FIG. 5 is a block diagram illustrating a structure of a neural network model for a clothing simulation of predicting sewing information, according to an embodiment.
FIG. 6 is a block diagram illustrating a structure of a neural network model for predicting sewing information and arrangement information, according to an embodiment.
FIG. 7 is a diagram illustrating arrangement points and associated arrangement plates, according to an embodiment.
FIG. 8 is a block diagram illustrating an automatic arrangement device, according to an embodiment.

The following detailed structural or functional description is provided as an example only, and various alterations and modifications may be made to the embodiments. Accordingly, the embodiments are not to be construed as limited to the disclosure and should be understood to include all changes, equivalents, and replacements within the idea and the technical scope of the disclosure.

With regard to the description of the drawings, similar reference numerals may be used to refer to similar or related components. It is to be understood that a singular form of a noun corresponding to an item may include one or more of the things, unless the relevant context clearly indicates otherwise. As used herein, each of the phrases "A or B", "at least one of A and B", "at least one of A or B", "A, B or C", "at least one of A, B and C", and "at least one of A, B, or C" may include any one of the items listed together in the corresponding phrase, or all possible combinations thereof. Terms such as "1st" and "2nd," or "first" and "second," may be used simply to distinguish a corresponding component from other components, and do not limit the components in other aspects (e.g., importance or order).
For example, a first component may be referred to as a second component, and similarly the second component may also be referred to as the first component. It is to be understood that if a component (e.g., a first component) is referred to, with or without the term "operatively" or "communicatively," as "coupled with," "coupled to," "connected with," or "connected to" another component (e.g., a second component), the component may be coupled with the other component directly (e.g., wiredly), wirelessly, or via a third component.

The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises/comprising" and/or "includes/including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Unless otherwise defined, all terms, including technical and scientific terms, used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present disclosu