CN-121999064-A - Textile color fastness prediction method and system

CN 121999064 A

Abstract

The application relates to the field of image processing, and in particular to a method and system for predicting textile color fastness. The method comprises: acquiring a textile image after a rubbing test and converting it into the CIELAB color space; calculating an adaptive compactness for each pixel and dividing the image into a texture region and a chromaticity region; improving the clustering distance based on the adaptive compactness and performing superpixel segmentation of the image; determining a signal-to-noise-ratio weight for each pixel from its luminance texture fluctuation index and using these weights to compute the representative color of each superpixel region; calculating, within the chromaticity region, the original color difference of each superpixel region; and mapping the maximum original color difference to a color fastness grade. This scheme overcomes interference from the texture of the fabric surface and improves the accuracy of textile color fastness prediction.

Inventors

  • Liu Jiawei
  • Zhang Pengxiang
  • Ye Yongsen

Assignees

  • 广东毅彤新材科技有限公司

Dates

Publication Date
2026-05-08
Application Date
2026-01-27

Claims (10)

  1. A method for predicting the color fastness of a textile, the method comprising: acquiring a textile image after a rubbing test and converting it into the CIELAB color space; calculating a luminance texture fluctuation index and a chromaticity gradient magnitude for each pixel, wherein the luminance texture fluctuation index represents noise intensity caused by the textile texture and the chromaticity gradient magnitude represents signal intensity caused by color change; calculating an adaptive compactness for each pixel from the luminance texture fluctuation index and the chromaticity gradient magnitude, and dividing the textile image according to the adaptive compactness into a texture region, where the luminance texture fluctuation index is dominant, and a chromaticity region, where the chromaticity gradient magnitude is dominant; performing superpixel segmentation on the image using a clustering distance improved by the adaptive compactness to generate a plurality of superpixel regions; determining a signal-to-noise-ratio weight for each pixel from the luminance texture fluctuation index, and computing the weighted average color of each superpixel region with these weights as its representative color; and calculating, within the chromaticity region, the original color difference of each superpixel region's representative color relative to a standard color, and mapping the maximum original color difference to a color fastness grade.
  2. The method of claim 1, wherein calculating the luminance texture fluctuation index and the chromaticity gradient magnitude of each pixel comprises: taking each pixel as the center, computing the standard deviation of L-channel pixel values in a preset neighborhood to obtain the luminance texture fluctuation index; and computing the gradient modulus of the pixel on the a-b chromaticity plane to obtain the chromaticity gradient magnitude.
  3. The method of claim 1, wherein calculating the adaptive compactness of each pixel from the luminance texture fluctuation index and the chromaticity gradient magnitude comprises: computing a texture-chromaticity dominant factor for each pixel, equal to the difference between the luminance texture fluctuation index and the chromaticity gradient magnitude divided by their sum; the adaptive compactness equals a reference compactness constant multiplied by a correction factor, where the correction factor is 1 plus the product of a preset sensitivity and the texture-chromaticity dominant factor.
  4. The method of claim 1, wherein a pixel belongs to the texture region in response to its texture-chromaticity dominant factor being greater than a segmentation threshold, and belongs to the chromaticity region in response to its texture-chromaticity dominant factor being not greater than the segmentation threshold, the segmentation threshold being 0 or determined by the maximum inter-class variance (Otsu) method.
  5. The method of claim 1, wherein the clustering distance improved by the adaptive compactness comprises: computing the color distance and the spatial distance between a pixel and a cluster center; and taking the square of the ratio of the pixel's adaptive compactness to a preset step length as the fusion weight of the spatial distance, and fusing the spatial distance and the color distance under this weight to obtain the clustering distance used for superpixel segmentation.
  6. The method of claim 1, wherein performing superpixel segmentation on the image to generate a plurality of superpixel regions comprises: uniformly distributing a plurality of initial cluster centers over the textile image; within a local window centered on each cluster center, computing the clustering distance between each pixel and the cluster center, and assigning each pixel to the cluster center with the smallest clustering distance; and repeating these steps until the movement of every cluster center is smaller than a preset threshold, thereby obtaining the plurality of superpixel regions.
  7. The method of claim 1, wherein the signal-to-noise-ratio weight of a pixel is the reciprocal of the sum of its luminance texture fluctuation index and a preset constant.
  8. The method of claim 1, wherein calculating the weighted average color of each superpixel region using the signal-to-noise-ratio weights comprises: acquiring the original color values of all pixels in the superpixel region; and normalizing the signal-to-noise-ratio weights within the region, then computing the weighted sum of the original color values to obtain the representative color.
  9. The method of claim 1, wherein the standard color is the average of the representative colors of the superpixel blocks in the texture region.
  10. A textile color fastness prediction system, comprising a processor and a memory, the memory storing computer program instructions which, when executed by the processor, implement the textile color fastness prediction method of any one of claims 1 to 9.
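The core quantities in claims 3, 5, 7 and 8 can be sketched as follows. This is an illustrative NumPy sketch, not the patent's implementation: the reference compactness `m0`, sensitivity `k`, constant `c`, and the small `eps` guard against division by zero are assumed values that the claims do not specify.

```python
import numpy as np

def adaptive_compactness(T, G, m0=10.0, k=0.5, eps=1e-6):
    # Claim 3: texture-chromaticity dominant factor rho = (T - G) / (T + G),
    # then compactness = m0 * (1 + k * rho). eps avoids division by zero.
    rho = (T - G) / (T + G + eps)
    return m0 * (1.0 + k * rho)

def cluster_distance(d_color, d_space, m_adaptive, step):
    # Claim 5: fusion weight is (compactness / step)^2 applied to the
    # spatial distance, fused with the color distance (SLIC-style).
    w = (m_adaptive / step) ** 2
    return np.sqrt(d_color ** 2 + w * d_space ** 2)

def representative_color(colors, T, c=1.0):
    # Claims 7-8: per-pixel SNR weight 1 / (T + c), normalized within the
    # superpixel region, then a weighted sum of the original color values.
    # colors: (n_pixels, 3) Lab values; T: (n_pixels,) fluctuation indexes.
    w = 1.0 / (T + c)
    w = w / w.sum()
    return (colors * w[:, None]).sum(axis=0)
```

Pixels in flat, texture-free areas (small `T`) receive large weights, so the representative color is driven by reliable pixels rather than highlight spots on the yarns.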

Description

Textile color fastness prediction method and system

Technical Field

The application relates to the field of image processing, and in particular to a method and system for predicting textile color fastness.

Background

Textile color fastness, especially rubbing fastness, is one of the key indexes for evaluating textile quality. In the textile industry, dyed textiles are commonly dry-rubbed or wet-rubbed with standard rubbing cloths, and the degree of staining on the rubbing cloth is then graded according to the relevant national standards. With the development of machine vision, automated grading methods based on image processing are increasingly applied: an industrial camera captures an image of the rubbing cloth, the image is converted to a color space such as RGB or CIELAB, and the color fastness grade is determined from the color difference between the rubbed and unrubbed areas. In practice, however, a textile is not a smooth plane but a structure with surface texture formed by interwoven or knitted yarns, and during imaging the undulating yarns produce many highlight spots. Existing color extraction algorithms usually apply simple area averaging or Gaussian blurring and therefore cannot distinguish light-and-shadow fluctuation caused by the physical texture from genuine color change caused by dye fading; the computed color difference is erroneous, and the predicted color fastness is inaccurate. How to extract the true color change of a textile and achieve high-precision automated color fastness prediction is therefore an urgent technical problem.
Disclosure of Invention

To solve the technical problem of inaccurate textile color fastness prediction, the application provides a textile color fastness prediction method and system that overcome interference from the fabric surface texture and improve prediction accuracy. According to a first aspect of the application, a textile color fastness prediction method is provided, comprising: acquiring a textile image after a rubbing test and converting it into the CIELAB color space; calculating a luminance texture fluctuation index and a chromaticity gradient magnitude for each pixel, wherein the luminance texture fluctuation index represents noise intensity caused by the textile texture and the chromaticity gradient magnitude represents signal intensity caused by color change; calculating an adaptive compactness for each pixel from these two quantities and dividing the textile image into texture regions, where the luminance texture fluctuation index is dominant, and chromaticity regions, where the chromaticity gradient magnitude is dominant; performing superpixel segmentation on the image with a clustering distance improved by the adaptive compactness to generate a plurality of superpixel regions; determining a signal-to-noise-ratio weight for each pixel from its luminance texture fluctuation index and computing the weighted average color of each superpixel region with these weights as its representative color; and calculating, within the chromaticity regions, the original color difference of each representative color relative to the standard color, and mapping the maximum original color difference to a color fastness grade.

By calculating the luminance texture fluctuation index and the chromaticity gradient magnitude, dividing the image into texture and chromaticity regions on that basis, and then performing superpixel segmentation and signal-to-noise-ratio-weighted computation of representative colors with the adaptive compactness, the method separates genuine color information in the textile image from spurious variation, eliminates the interference of the textile's texture structure on color extraction, and ensures that the original color difference computed from the representative colors truly reflects the degree of fading.

Preferably, calculating the luminance texture fluctuation index and the chromaticity gradient magnitude of each pixel comprises: taking each pixel as the center, computing the standard deviation of L-channel pixel values in a preset neighborhood to obtain the luminance texture fluctuation index; and computing the gradient modulus of the pixel on the a-b chromaticity plane to obtain the chromaticity gradient magnitude.
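The preferred feature computation described above (local standard deviation of the L channel, and gradient modulus on the a-b plane) can be sketched in NumPy as follows. The window size and the reflective padding at image borders are illustrative assumptions; the patent only specifies "a preset neighborhood".

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def luminance_texture_fluctuation(L, size=5):
    # Standard deviation of L-channel values in a size x size window
    # around each pixel; border handling via reflective padding (assumed).
    pad = size // 2
    Lp = np.pad(L, pad, mode="reflect")
    windows = sliding_window_view(Lp, (size, size))
    return windows.std(axis=(-2, -1))

def chroma_gradient_magnitude(a, b):
    # Gradient modulus on the a-b chromaticity plane: the Euclidean norm
    # of the spatial gradients of both chroma channels.
    ga_y, ga_x = np.gradient(a)
    gb_y, gb_x = np.gradient(b)
    return np.sqrt(ga_x**2 + ga_y**2 + gb_x**2 + gb_y**2)
```

A flat patch yields a zero fluctuation index (pure signal), while a uniform chroma ramp yields a constant gradient magnitude, matching the intended noise/signal interpretation of the two quantities.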