US-12620134-B2 - Generating color correction model including color space conversion, through an interpolation algorithm according to color data of reference sampling points
Abstract
A method of generating a color correction model, including: acquiring first color coordinates of a sample pixel in a first color space; converting the first color coordinates of the sample pixel to second color coordinates of the sample pixel in a second color space; inputting the second color coordinates of the sample pixel into an initial color correction model to generate sample output data; and training the initial color correction model according to the sample output data and theoretical output data corresponding to the first color coordinates, to obtain a trained color correction model, where the trained color correction model is configured to perform color correction on a target pixel in each of a plurality of video standards.
Inventors
- Yanhong Wu
- Xian Wang
- Dan Zhu
Assignees
- BOE TECHNOLOGY GROUP CO., LTD.
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2022-06-30
Claims (17)
- 1 . A method of generating a color correction model, comprising: acquiring first color coordinates of a sample pixel in a first color space; converting the first color coordinates of the sample pixel to second color coordinates of the sample pixel in a second color space; inputting the second color coordinates of the sample pixel into an initial color correction model to generate sample output data; and training the initial color correction model according to the sample output data and theoretical output data corresponding to the first color coordinates, to obtain a trained color correction model, wherein the trained color correction model is configured to perform color correction on a target pixel in each of a plurality of video standards, wherein the initial color correction model comprises: a first color mapping table configured to record sampling coordinates and color data of each of a plurality of first sampling points in a first sampling point space; a reference point determination module configured to determine sampling coordinates of a plurality of first reference sampling points corresponding to the sample pixel in the first sampling point space, according to the second color coordinates of the sample pixel; a query module configured to determine color data of each of the plurality of first reference sampling points, according to the sampling coordinates of the first reference sampling point and the first color mapping table; and an interpolation module configured to acquire the sample output data through an interpolation algorithm, according to the color data of the plurality of first reference sampling points, wherein the training the initial color correction model comprises: updating the color data in the first color mapping table.
- 2 . The method according to claim 1 , wherein the first sampling point space comprises a plurality of first sampling cubes, each of the plurality of first sampling cubes is defined by eight first sampling points as vertices of the first sampling cube, and the first sampling cube is divided into a plurality of tetrahedrons; and the reference point determination module is configured to map the second color coordinates of the sample pixel into the first sampling point space, to obtain a first mapping point of the second color coordinates of the sample pixel in the first sampling point space; and taking each vertex of the tetrahedron where the first mapping point is located as the first reference sampling point, and determining sampling coordinates of the first reference sampling point.
- 3 . The method according to claim 2 , wherein the interpolation module is configured to acquire the sample output data according to the following formula: f̂(p) = [f(p0) f(p1) f(p2) f(p3)] · [x0 x1 x2 x3 ; y0 y1 y2 y3 ; z0 z1 z2 z3 ; 1 1 1 1]^(-1) · [x y z 1]^T, wherein p represents the first mapping point, and p0, p1, p2 and p3 represent the vertices of the tetrahedron where the first mapping point is located; f̂(p) is the sample output data; f(p0), f(p1), f(p2), and f(p3) are the color data of the four vertices of the tetrahedron, respectively; (x0, y0, z0) are coordinates of the vertex p0 with respect to a first datum point; (x1, y1, z1) are coordinates of the vertex p1 with respect to the first datum point; (x2, y2, z2) are coordinates of the vertex p2 with respect to the first datum point; (x3, y3, z3) are coordinates of the vertex p3 with respect to the first datum point; and (x, y, z) are coordinates of the first mapping point with respect to the first datum point, wherein the first datum point is one of the vertices of the first sampling cube where the first mapping point is located.
- 4 . The method according to claim 1 , wherein the first color space is an RGB space, and the second color space is an XYZ space.
- 5 . The method according to claim 1 , wherein the sample pixel is a pixel in a sample image conforming to a preset video standard; and the converting the first color coordinates of the sample pixel to second color coordinates of the sample pixel in a second color space, comprises: converting the first color coordinates of the sample pixel to linear coordinates, according to a rule of non-linear conversion corresponding to the preset video standard; and converting the linear coordinates of the sample pixel to corresponding second color coordinates, according to a rule of conversion between the linear coordinates and the second color coordinates.
- 6 . The method according to claim 1 , wherein an image comprising the sample pixel conforms to a preset video standard; and prior to the training the initial color correction model according to the sample output data and theoretical output data corresponding to the first color coordinates, the method further comprises: acquiring a second color mapping table corresponding to the preset video standard, wherein the second color mapping table records sampling coordinates and corresponding color data of each of a plurality of second sampling points in a second sampling point space; determining sampling coordinates of a plurality of second reference sampling points corresponding to the second color coordinates of the sample pixel in the second sampling point space, according to the first color coordinates of the sample pixel; determining color data of each of the plurality of second reference sampling points, according to the sampling coordinates of the second reference sampling point and the second color mapping table; and acquiring theoretical output data of the sample pixel through an interpolation algorithm, according to the color data of the plurality of second reference sampling points.
- 7 . The method according to claim 6 , wherein the second sampling point space comprises a plurality of second sampling cubes, each of the plurality of second sampling cubes is defined by eight second sampling points as vertices of the second sampling cube, and the second sampling cube is divided into a plurality of tetrahedrons; and the determining sampling coordinates of a plurality of second reference sampling points corresponding to the second color coordinates of the sample pixel in the second sampling point space, according to the first color coordinates of the sample pixel, comprises: mapping the first color coordinates of the sample pixel into the second sampling point space, to obtain a second mapping point of the first color coordinates of the sample pixel in the second sampling point space; and taking each vertex of the tetrahedron where the second mapping point is located as the second reference sampling point, and determining sampling coordinates of the second reference sampling point.
- 8 . The method according to claim 7 , wherein the acquiring theoretical output data of the sample pixel through an interpolation algorithm, according to the color data of the plurality of second reference sampling points, comprises: acquiring theoretical output data of the sample pixel according to the following formula: f̂(p′) = [f(p′0) f(p′1) f(p′2) f(p′3)] · [x′0 x′1 x′2 x′3 ; y′0 y′1 y′2 y′3 ; z′0 z′1 z′2 z′3 ; 1 1 1 1]^(-1) · [x′ y′ z′ 1]^T, wherein p′ represents the second mapping point, and p′0, p′1, p′2 and p′3 represent the vertices of the tetrahedron where the second mapping point is located; f̂(p′) is the theoretical output data; f(p′0), f(p′1), f(p′2), and f(p′3) are the color data of the four vertices of the tetrahedron, respectively; (x′0, y′0, z′0) are coordinates of the vertex p′0 with respect to a second datum point; (x′1, y′1, z′1) are coordinates of the vertex p′1 with respect to the second datum point; (x′2, y′2, z′2) are coordinates of the vertex p′2 with respect to the second datum point; (x′3, y′3, z′3) are coordinates of the vertex p′3 with respect to the second datum point; and (x′, y′, z′) are coordinates of the second mapping point with respect to the second datum point, wherein the second datum point is one of the vertices of the second sampling cube where the second mapping point is located.
- 9 . A method of correcting color, comprising: acquiring first color coordinates of a target pixel in a first color space; converting the first color coordinates of the target pixel to second color coordinates of the target pixel in a second color space; and inputting the second color coordinates of the target pixel into a trained color correction model, to obtain target output data, wherein the trained color correction model is obtained according to the method according to claim 1 , wherein the trained color correction model comprises: a first color mapping table configured to record sampling coordinates and color coordinates of each of a plurality of first sampling points in a first sampling point space; a reference point determination module configured to determine sampling coordinates of a plurality of first reference sampling points corresponding to the target pixel in the first sampling point space, according to the second color coordinates of the target pixel; a query module configured to determine color data of each of the plurality of first reference sampling points, according to the sampling coordinates of the first reference sampling point and the first color mapping table; and an interpolation module configured to acquire the target output data through an interpolation algorithm, according to color data of the plurality of first reference sampling points.
- 10 . The method of correcting color according to claim 9 , wherein the first sampling point space comprises a plurality of first sampling cubes, each of the plurality of first sampling cubes is defined by eight first sampling points as vertices of the first sampling cube, and the first sampling cube is divided into a plurality of tetrahedrons; and the reference point determination module is configured to map the second color coordinates of the target pixel into the first sampling point space, to obtain a first mapping point of the second color coordinates of the target pixel in the first sampling point space; and to take each vertex of the tetrahedron where the first mapping point is located as the first reference sampling point, and determine sampling coordinates of the first reference sampling point.
- 11 . The method of correcting color according to claim 9 , wherein the converting the first color coordinates of the target pixel to second color coordinates of the target pixel in a second color space, comprises: converting the first color coordinates of the target pixel to linear coordinates according to a rule of non-linear conversion corresponding to a target video standard, wherein the target video standard is a video standard which is met by a target image comprising the target pixel; and converting the linear coordinates of the target pixel to corresponding second color coordinates, according to a rule of conversion between the linear coordinates and the second color coordinates.
- 12 . An apparatus for generating a color correction model, comprising: a first acquisition module configured to acquire first color coordinates of a sample pixel in a first color space; a first conversion module configured to convert the first color coordinates of the sample pixel to second color coordinates of the sample pixel in a second color space; and a training module configured to input the second color coordinates of the sample pixel into an initial color correction model, to generate sample output data, and train the initial color correction model according to the sample output data and theoretical output data corresponding to the first color coordinates, to obtain a trained color correction model, wherein the trained color correction model is configured to perform color correction on a target pixel in each of a plurality of video standards, wherein the initial color correction model comprises: a first color mapping table configured to record sampling coordinates and color data of each of a plurality of first sampling points in a first sampling point space; a reference point determination module configured to determine sampling coordinates of a plurality of first reference sampling points corresponding to the sample pixel in the first sampling point space, according to the second color coordinates of the sample pixel; a query module configured to determine color data of each of the plurality of first reference sampling points, according to the sampling coordinates of the first reference sampling point and the first color mapping table; and an interpolation module configured to acquire the sample output data through an interpolation algorithm, according to the color data of the plurality of first reference sampling points, wherein the training module is configured to update the color data in the first color mapping table.
- 13 . An apparatus for correcting color, comprising: a second acquisition module configured to obtain first color coordinates of a target pixel in a first color space; a second conversion module configured to convert the first color coordinates of the target pixel to second color coordinates of the target pixel in a second color space; and a correction module configured to input the second color coordinates of the target pixel into a trained color correction model, to obtain target output data, wherein the trained color correction model is generated by the apparatus according to claim 12 , wherein the trained color correction model comprises: a first color mapping table configured to record sampling coordinates and color coordinates of each of a plurality of first sampling points in a first sampling point space; a reference point determination module configured to determine sampling coordinates of a plurality of first reference sampling points corresponding to the target pixel in a first sampling point space, according to the second color coordinates of the target pixel; a query module configured to determine color data of each of the plurality of first reference sampling points, according to the sampling coordinates of the first reference sampling point and the first color mapping table; and an interpolation module configured to acquire the target output data through an interpolation algorithm, according to color data of the plurality of first reference sampling points.
- 14 . A non-transitory computer readable medium storing a computer program which, when executed by a processor, causes the processor to perform the method according to claim 1 .
- 15 . A display device, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the method according to claim 1 .
- 16 . A non-transitory computer readable medium storing a computer program which, when executed by a processor, causes the processor to perform the method according to claim 9 .
- 17 . A display device, comprising: one or more processors; and a memory for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the method according to claim 9 .
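Claims 3 and 8 apply the same tetrahedral interpolation formula, whose weight vector M^(-1)·[x y z 1]^T is simply the barycentric coordinates of the mapping point inside the tetrahedron. The following is a minimal sketch of that formula, not the patent's implementation; the function name `tetra_interp` and the use of NumPy are illustrative choices:

```python
import numpy as np

def tetra_interp(vertices, values, p):
    """Evaluate f^(p) = [f(p0) f(p1) f(p2) f(p3)] @ M^{-1} @ [x y z 1]^T,
    where the columns of M are the homogeneous vertex coordinates.

    vertices: four tetrahedron vertices (x_i, y_i, z_i), relative to the
              datum vertex of the sampling cube, as in claims 3 and 8.
    values:   color data f(p_i) at each vertex (scalars or 3-vectors).
    p:        the mapping point (x, y, z), relative to the same datum.
    """
    M = np.vstack([np.array(vertices, dtype=float).T, np.ones(4)])  # 4x4
    # Solving M w = [x, y, z, 1]^T gives the barycentric weights directly,
    # avoiding an explicit matrix inverse.
    w = np.linalg.solve(M, np.append(p, 1.0))
    return np.asarray(values, dtype=float).T @ w

# Barycentric weights sum to one and reproduce affine functions exactly,
# so a linear ramp sampled at the vertices interpolates without error.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
f = lambda q: 2 * q[0] - q[1] + 3 * q[2] + 0.5
vals = [f(v) for v in verts]
print(tetra_interp(verts, vals, (0.6, 0.3, 0.1)))  # ≈ 1.7, the exact ramp value
```

In a full lookup table, `values` would be the stored color data of the four first reference sampling points returned by the query module, and training (claim 1) would adjust those stored entries rather than the interpolation itself.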
Description
TECHNICAL FIELD

The present disclosure relates to the field of display technology, and in particular to a method of generating a color correction model, a method of correcting color, an apparatus for generating a color correction model, an apparatus for correcting color, a computer readable medium, and a display device.

BACKGROUND

With the development of image and video technologies, people increasingly pursue a better display effect for video images. Among many image characteristics, color is one of the most direct perceptual attributes of the human visual system, and has been studied deeply and widely. Color management is a necessary function of a high-end display. Screens with differences in display characteristics can achieve the same color expression as a standard image after the colors of the screens are corrected.

SUMMARY

The present disclosure provides a method of generating a color correction model, a method of correcting color, an apparatus for generating a color correction model, an apparatus for correcting color, a computer readable medium, and a display device. In a first aspect, an embodiment of the present disclosure provides a method of generating a color correction model, including: acquiring first color coordinates of a sample pixel in a first color space; converting the first color coordinates of the sample pixel to second color coordinates of the sample pixel in a second color space; inputting the second color coordinates of the sample pixel into an initial color correction model to generate sample output data; and training the initial color correction model according to the sample output data and theoretical output data corresponding to the first color coordinates, to obtain a trained color correction model, wherein the trained color correction model is configured to perform color correction on a target pixel in each of a plurality of video standards.
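The conversion from first color coordinates (e.g. RGB) to second color coordinates (e.g. XYZ) is described later as two steps: a non-linear-to-linear conversion under the applicable video standard, then a linear-to-XYZ conversion. A minimal sketch of such a conversion follows, assuming the preset video standard is sRGB; the patent does not fix a particular transfer function or matrix, and the names `srgb_to_linear` and `rgb_to_xyz` are hypothetical:

```python
def srgb_to_linear(c):
    # Inverse of the sRGB non-linear encoding; other video standards
    # (e.g. BT.709, BT.2020) use different constants.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Linear sRGB (D65) to CIE XYZ matrix -- an assumption tied to the
# sRGB choice above, not taken from the patent.
M_RGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
                (0.2126, 0.7152, 0.0722),
                (0.0193, 0.1192, 0.9505)]

def rgb_to_xyz(r, g, b):
    # Step 1: non-linear coordinates -> linear coordinates.
    lin = [srgb_to_linear(c) for c in (r, g, b)]
    # Step 2: linear coordinates -> second color coordinates (XYZ).
    return tuple(sum(m * c for m, c in zip(row, lin)) for row in M_RGB_TO_XYZ)

print(rgb_to_xyz(1.0, 1.0, 1.0))  # ≈ (0.9505, 1.0, 1.089), the D65 white point
```

Training or correcting under a different video standard would swap in that standard's transfer function and matrix while keeping the same two-step structure.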
In some embodiments, the initial color correction model includes: a first color mapping table configured to record sampling coordinates and color data of each of a plurality of first sampling points in a first sampling point space; a reference point determination module configured to determine sampling coordinates of a plurality of first reference sampling points corresponding to the sample pixel in the first sampling point space, according to the second color coordinates of the sample pixel; a query module configured to determine color data of each of the plurality of first reference sampling points, according to the sampling coordinates of the first reference sampling point and the first color mapping table; and an interpolation module configured to acquire the sample output data through an interpolation algorithm, according to the color data of the plurality of first reference sampling points, wherein the training the initial color correction model includes: updating the color data in the first color mapping table.

In some embodiments, the first sampling point space includes a plurality of first sampling cubes, each of the plurality of first sampling cubes is defined by eight first sampling points as vertices of the first sampling cube, and the first sampling cube is divided into a plurality of tetrahedrons; and the reference point determination module is configured to map the second color coordinates of the sample pixel into the first sampling point space, to obtain a first mapping point of the second color coordinates of the sample pixel in the first sampling point space; and to take each vertex of the tetrahedron where the first mapping point is located as the first reference sampling point, and determine sampling coordinates of the first reference sampling point.
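The reference-point determination step can be sketched as follows. This assumes a uniform n × n × n grid of first sampling points and the common six-tetrahedron split of each sampling cube, selected by ordering the fractional coordinates; the patent does not mandate this particular split, and `reference_points` is an illustrative name:

```python
def reference_points(p, n):
    """Map point p (components in [0, 1]) into an n x n x n grid of
    sampling points, locate the containing sampling cube, and return the
    four vertices of the tetrahedron containing p as integer grid
    coordinates, which index into the first color mapping table.
    """
    cells = n - 1                                        # cubes per axis
    idx = [min(int(c * cells), cells - 1) for c in p]    # datum vertex of the cube
    frac = [c * cells - i for c, i in zip(p, idx)]       # position inside the cube

    # Six-tetrahedron split: walk from the datum vertex toward the
    # opposite corner, incrementing axes in order of descending
    # fractional coordinate; p lies inside the resulting tetrahedron.
    order = sorted(range(3), key=lambda a: -frac[a])
    verts = [tuple(idx)]
    v = list(idx)
    for a in order:
        v[a] += 1
        verts.append(tuple(v))
    return verts

print(reference_points((0.6, 0.3, 0.1), 2))
# → [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
```

The query module would then read the color data stored at these four grid coordinates, and the interpolation module would combine them with barycentric weights as in claim 3.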
In some embodiments, the interpolation module is configured to acquire the sample output data according to the following formula: f̂(p) = [f(p0) f(p1) f(p2) f(p3)] · [x0 x1 x2 x3 ; y0 y1 y2 y3 ; z0 z1 z2 z3 ; 1 1 1 1]^(-1) · [x y z 1]^T, wherein p represents the first mapping point, and p0, p1, p2 and p3 represent the vertices of the tetrahedron where the first mapping point is located; f̂(p) is the sample output data; f(p0), f(p1), f(p2), and f(p3) are the color data of the four vertices of the tetrahedron, respectively; (x0, y0, z0) are coordinates of the vertex p0 with respect to a first datum point; (x1, y1, z1) are coordinates of the vertex p1 with respect to the first datum point; (x2, y2, z2) are coordinates of the vertex p2 with respect to the first datum point; (x3, y3, z3) are coordinates of the vertex p3 with respect to the first datum point; and (x, y, z) are coordinates of the first mapping point with respect to the first datum point, wherein the first datum point is one of the vertices of the first sampling cube where the first mapping point is located.

In some embodiments, the first color space is an RGB space, and the second color space is an XYZ space.

In some embodiments, the sample pixel is a pixel in a sample image conforming to a preset video standard; and the co