CN-116075859-B - Generating a target texture from a plurality of source textures

CN116075859B

Abstract

A computer-implemented method for generating a target texture (201) from at least two source textures (211, 221, 231) includes performing a statistical texture synthesis operation. Each of the source textures is assigned an interpolation weight, and the target texture is synthesized using the source textures and the assigned interpolation weights. Synthesizing the target texture includes randomly selecting one of the source textures, randomly extracting a texture tile (212) from the selected source texture (211), modifying the extracted texture tile in a manner such that at least one statistical property (215) of the modified texture tile (214) approximates the corresponding average statistical property (202), and inserting the modified texture tile into the target texture such that the modified texture tile seamlessly fits the existing texture content (203) in the target texture.
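The loop of steps (a)-(d) in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch, not the patented implementation: the function and variable names (`synthesize`, `tile`, etc.) are invented, the "statistical property" is reduced to the mean pixel value, seam optimization in step (d) is omitted, and the target dimensions are assumed to be multiples of the tile size.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize(sources, weights, target_shape, tile=32):
    """Sketch of the claimed loop: pick a source by weight, cut a random
    tile, shift its statistic toward the weighted average, paste."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    # Average statistical property (202): here simply the weighted mean pixel value.
    target_stat = sum(w * s.mean() for w, s in zip(weights, sources))
    target = np.full(target_shape, np.nan)
    for y in range(0, target_shape[0], tile):
        for x in range(0, target_shape[1], tile):
            # (a) randomly select a source with probability proportional to its weight
            src = sources[rng.choice(len(sources), p=weights)]
            # (b) randomly extract a tile from the selected source
            ty = rng.integers(0, src.shape[0] - tile + 1)
            tx = rng.integers(0, src.shape[1] - tile + 1)
            patch = src[ty:ty + tile, tx:tx + tile].astype(float)
            # (c) modify pixel values so the tile's statistic approximates the target
            patch = patch + (target_stat - patch.mean())
            # (d) insert; seam computation (cf. claim 12) is omitted in this sketch
            target[y:y + tile, x:x + tile] = patch
    return target
```

A full implementation would match richer statistics such as a histogram of pixel values (cf. claim 10) and would stitch each tile to the existing content along a computed seam (cf. claim 12).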

Inventors

  • FRANCIS LAMY

Assignees

  • X-Rite Europe GmbH

Dates

Publication Date
2026-05-05
Application Date
2021-07-07
Priority Date
2020-07-07

Claims (19)

  1. A computer-implemented method for generating a target texture (201) from at least two source textures (211, 221, 231), the method comprising performing a statistical texture synthesis operation comprising: (i) assigning an interpolation weight to each of the source textures (211, 221, 231); and (ii) synthesizing the target texture (201) using the source textures (211, 221, 231) and the assigned interpolation weights, wherein synthesizing the target texture (201) comprises: (a) randomly selecting one of the source textures (211, 221, 231) with a probability proportional to its interpolation weight; (b) randomly extracting a texture tile (212) from the selected source texture (211); (c) modifying the extracted texture tile (212) by modifying pixel values in the extracted texture tile to obtain a modified texture tile (214), the modification of the pixel values being carried out in such a way that at least one statistical property (215) of the modified texture tile (214) approximates a corresponding average statistical property (202), the average statistical property (202) being determined by forming a weighted average over the source textures (211, 221, 231), the source textures (211, 221, 231) being weighted by their interpolation weights; (d) inserting the modified texture tile (214) into the target texture such that the modified texture tile (214) seamlessly fits existing texture content (203) in the target texture; and (e) repeating steps (a)-(d) until the target texture is completely filled.
  2. The computer-implemented method of claim 1, wherein the target texture (201) is associated with a set of target coordinates (a', b') indicating a specific combination of illumination and viewing directions for which the target texture (201) is indicative of a spatial variation of appearance, and wherein each source texture (211, 221, 231) is associated with a different set of source coordinates (a'-f'), each set of source coordinates (a'-f') being indicative of a combination of illumination and viewing directions for which the respective source texture (211, 221, 231) is indicative of a spatial variation of appearance.
  3. The computer-implemented method of claim 2, wherein assigning interpolation weights to each of the source textures (211, 221, 231) comprises: creating a Delaunay triangulation of the sets of source coordinates (a'-f'); finding a simplex in the Delaunay triangulation that contains the set of target coordinates (a', b'), the found simplex having a plurality of corners; and using the barycentric coordinates of the target coordinates (a', b') relative to the found simplex as interpolation weights for the source textures at the corners of the found simplex.
  4. The computer-implemented method of claim 1, wherein the target texture (201) is indicative of a spatial variation in appearance of a composite material comprising at least two components according to a recipe, the recipe defining a concentration of each component in the composite material, wherein each source texture (211, 221, 231) is associated with one of the components, and wherein the source textures (211, 221, 231) are assigned interpolation weights according to the concentrations of the components as defined by the recipe.
  5. The computer-implemented method of claim 4, further comprising: generating an adjusted target texture by additionally adjusting pixel values in the target texture to correct for effects of absorption and scattering in the composite material.
  6. The computer-implemented method of claim 5, wherein the pixel values are adjusted in such a way that at least one statistical property of the adjusted target texture matches a reference property of the composite material.
  7. The computer-implemented method of claim 6, wherein the pixel values are adjusted in such a way that an average color space value of the adjusted target texture matches a reference color space value of the composite material, the color space value and the reference color space value preferably being expressed in a perceptual color space.
  8. The computer-implemented method of claim 7, wherein the target texture (201) is indicative of a spatial variation of the appearance of the composite material for a specific combination of illumination and viewing directions, and wherein the method comprises fitting parameters of a BRDF model to reference colors at a plurality of other combinations of illumination and viewing directions, and evaluating the BRDF model at the specific combination to obtain the reference color space value for the specific combination.
  9. The computer-implemented method of claim 5, comprising: obtaining individual optical parameters that describe, at least approximately, the scattering and absorption behavior of each component in the composite material; determining combined optical parameters describing the scattering and absorption behavior of the composite material based on the concentrations of the components and their individual optical parameters; performing an optical simulation of the light flux within the composite material, for at least one layer below the surface of the composite material, to determine attenuation factors for incident and reflected light of effect pigments in the layer; and adjusting the pixel values of the target texture based on the attenuation factors.
  10. The computer-implemented method of any one of the preceding claims, wherein the at least one statistical property (215) of the modified texture tile (214) is a histogram of pixel values.
  11. The computer-implemented method of any one of claims 1-9, wherein modifying the extracted texture tile (212) comprises applying a monotonically non-decreasing pointwise transform to the pixel values.
  12. The computer-implemented method of any one of claims 1-9, wherein inserting the modified texture tile (214) into the target texture comprises: calculating a seam (205) that enhances visual smoothness between the existing texture content (203) and the modified texture tile (214); and stitching the existing texture content (203) to the modified texture tile (214) along the seam (205).
  13. The computer-implemented method of any one of claims 1-9, comprising visualizing the target texture (201) using a display device (70).
  14. The computer-implemented method of any one of claims 1-9, comprising generating an instance (56; 66) of an appearance model, the appearance model comprising a discrete texture surface comprising a plurality of target textures (201), each target texture (201) being associated with a different set of target coordinates (a', b'), wherein at least one of the target textures (201) in the discrete texture surface is generated by performing the statistical texture synthesis operation.
  15. The computer-implemented method of claim 14, further comprising: visualizing, using a display device (70), a virtual object (72) having a continuous three-dimensional surface geometry, using the instance (56; 66) of the appearance model.
  16. The computer-implemented method of claim 14, the method comprising: determining a set of measured appearance properties (54) of a first material by performing measurements on a target object (50) comprising the first material, using an appearance capture device (52); generating a first instance (56) of the appearance model based on the set of measured appearance properties (54) of the first material; determining a recipe (60) for a second material based on the measured appearance properties (54) of the first material and on predetermined appearance properties associated with a plurality of reference materials; generating a second instance (66) of the appearance model based on the recipe and the predetermined appearance properties associated with the reference materials; and visualizing, using a display device (70), a scene comprising at least one virtual object (72), a first portion of the scene being visualized using the first instance (56) of the appearance model and a second portion of the scene being visualized using the second instance (66) of the appearance model.
  17. An apparatus for generating a target texture (201) associated with a set of target coordinates (a', b'), the apparatus comprising a processor (310) and a memory (320), the memory (320) comprising program instructions (102, 104, 108) configured to cause the processor (310) to perform the method of any one of the preceding claims.
  18. The apparatus of claim 17, further comprising a display device (70).
  19. A computer program product comprising program instructions (102, 104, 108) which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 16.
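Claims 10 and 11 together describe adjusting a tile so that its histogram of pixel values approximates a target distribution by means of a monotonically non-decreasing pointwise transform. A standard technique that satisfies both constraints is classic histogram matching; the sketch below is illustrative only, assumes single-channel pixel data, and uses invented names (`monotone_match`):

```python
import numpy as np

def monotone_match(tile, reference):
    """Map the tile's pixel values onto the reference distribution with a
    monotonically non-decreasing pointwise transform (histogram matching):
    the k-th smallest tile value is replaced by the corresponding quantile
    of the reference values, so value ordering is preserved."""
    flat = tile.ravel()
    order = np.argsort(flat)                      # ranks of the tile's pixels
    ref_sorted = np.sort(np.asarray(reference).ravel())
    # resample the reference quantiles to the tile's pixel count
    idx = np.linspace(0, ref_sorted.size - 1, flat.size).round().astype(int)
    out = np.empty_like(flat, dtype=float)
    out[order] = ref_sorted[idx]                  # assign quantiles by rank
    return out.reshape(tile.shape)
```

In the context of claim 1, the `reference` distribution would be the weighted-average histogram (202) computed over the source textures.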

Description

Generating a target texture from a plurality of source textures

Technical Field

The invention relates to improving the rendering of virtual objects based on a limited set of appearance measurements of actual objects. In particular, the present invention relates to a method of generating a target texture from a plurality of source textures. The invention also relates to an apparatus for performing such a method and to a corresponding computer program.

Background

Photorealistically rendering a virtual object with a computer requires more than knowing the shape and intrinsic color of the object. The rendering operation should ideally reproduce the visual impression that the actual material of the object makes in its environment under defined illumination and viewing conditions. This is known to those skilled in the art as "appearance". Appearance may include color and texture, and both color and texture may appear different depending on the illumination and viewing conditions. An appearance model can be used to describe the appearance of a material or object under arbitrary illumination and viewing conditions, that is, for a wide range of illumination types (directed, diffuse), viewing angles relative to illumination angles, and object orientations. An "appearance model" is a formal construct that describes an appearance in mathematical terms using a number of material-related parameters called "appearance attributes". The appearance model provides a mathematical description of the appearance in such a form and at such a level of completeness that it becomes possible to visualize (i.e., render and display) objects having arbitrary three-dimensional shapes under arbitrary illumination and viewing conditions. In a simple embodiment, the appearance model describes only the dependence of color on the illumination and viewing conditions, without taking into account spatial variations of appearance across the surface of the object.
For example, in a simple embodiment, the appearance model may be a BRDF model that describes spectral reflectance as a function of the illumination and viewing directions only, disregarding spatial variation. In more complex embodiments, the appearance model may include a "texture", i.e., a spatial variation of appearance across the surface of the object in addition to the angular dependence of reflection and/or transmission. In particular, the appearance model may be a spatially varying bidirectional reflectance distribution function ("SVBRDF") model, a bidirectional texture function ("BTF") model, a bidirectional surface scattering distribution function ("BSSRDF") model, a dedicated model for automotive paints, or the like. Many such models are known in the art. Accordingly, the appearance attributes used in the appearance model may include both color attributes and texture attributes. Texture, understood in the broad sense of describing spatial variations of appearance across the surface of the object, may depend strongly on the illumination and viewing directions relative to the object. For example, in metallic effect paints, which typically comprise highly reflective flakes, the positions on the surface at which strong reflections are observed may vary discontinuously even when the illumination and/or viewing direction is varied continuously, because flakes at different positions within the paint reflect differently, and in different proportions, under different combinations of illumination and viewing directions. As described above, the appearance model may include a description of the texture with associated texture attributes. The texture attributes may comprise image data indicating the texture in a specific surface area with an arbitrary parameterization, or data derived from such image data, such as a height or normal map, roughness values, a sparkle distribution, etc.
In the case of an appearance model suitable for describing the appearance of metallic and pearlescent effect paints, the texture information may, for example, take the form of one or more images of representative surface portions of the paint coating, the texture attributes being the pixel values in these images. It may be desirable to predict the appearance of an object for which only some appearance attributes are known. In particular, color and appearance measurement devices typically have a limited set of measurement geometries, i.e., a limited set of combinations of illumination and viewing angles. For example, a handheld imaging spectrophotometer may have three to eleven illumination angles, two spectrophotometer viewing angles, and one texture camera viewing angle. In such examples, texture attributes are known only for a very limited, "sparse" set of illumination and viewing directions, while reliable prediction of appearance may require predicting textures for a "dense" range of illumination and viewing directions.
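Interpolating from such a sparse set of measured directions to a dense range is where interpolation weights such as the barycentric coordinates of claim 3 come in: a target geometry inside a triangle of measured geometries receives weights that sum to one. A minimal sketch for the 2D case follows; the coordinate parameterization is abstracted to plain 2D points, and `barycentric_weights` is an invented name:

```python
import numpy as np

def barycentric_weights(target, triangle):
    """Barycentric coordinates of a 2D target point relative to a triangle
    of source coordinates; usable directly as interpolation weights for
    the source textures at the triangle's corners."""
    a, b, c = (np.asarray(p, dtype=float) for p in triangle)
    t = np.asarray(target, dtype=float)
    # Solve t = a + wb*(b - a) + wc*(c - a) for (wb, wc).
    m = np.column_stack((b - a, c - a))
    wb, wc = np.linalg.solve(m, t - a)
    return np.array([1.0 - wb - wc, wb, wc])
```

In a full implementation along the lines of claim 3, the containing triangle (simplex) would first be located in a Delaunay triangulation of all measured source coordinates.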