
CN-122023600-A - Animation color correction optimization system and method based on artificial intelligence

CN 122023600 A

Abstract

The invention discloses an animation color correction optimization system and method based on artificial intelligence, relating to the technical field of color correction optimization. The system comprises a color data acquisition module, an AI analysis model training module, a dynamic correction module, a style matching module and an effect verification module. The method trains a deep-learning-based color analysis model from acquired color parameters and target style data of animation frames, performs real-time color correction for light changes and element motion characteristics in dynamic scenes, adjusts the correction parameters in combination with the overall style of the animation, and ensures the consistency of color effects through comparison and verification. The invention realizes automated, accurate correction of animation colors, eliminates the subjective errors of manual operation, improves color adaptation in dynamic scenes, ensures the uniformity of animation color styles, and is suitable for the color optimization workflows of various animation productions.

Inventors

  • YANG HAOGANG

Assignees

  • 环球墨非(北京)科技有限公司

Dates

Publication Date
2026-05-12
Application Date
2025-12-02

Claims (10)

  1. An artificial-intelligence-based animation color correction optimization method, characterized by comprising the following steps: S1, collecting frame color parameters and target style data of the animation to be corrected, wherein the frame color parameters comprise color gamut, color gradation, brightness, color distribution characteristics and time-sequence fluctuation characteristics, and the target style data comprise a reference style image, a style color matrix and a style feature vector; S2, inputting the training data set in frame time-sequence order and learning, in linkage, the time-sequence variation rule of the frame color parameters and the adaptation relation between dynamic scene characteristics and target style characteristics; extracting the transferable characteristics in the target style data, establishing a style adaptation rule, optimizing model parameters through a multidimensional loss function, and constructing an adaptation rule base in which dynamic correction rules and style matching rules are mutually associated, so that the model forms a mapping relation among frame color parameters, dynamic scene characteristics and target style characteristics, completing color analysis model training; S3, extracting and analyzing element motion characteristics in dynamic scenes of the animation based on the trained color analysis model, and generating real-time correction parameters in combination with real-time light change detection data; S4, comparing and verifying the correction result against the target style data and a preset color consistency standard using the dual indexes of color difference and style similarity; if the preset requirement is not met, generating a feedback signal and driving the model to iteratively optimize the correction parameters in the adaptation rule base until the color consistency and style consistency requirements are met.
  2. The artificial-intelligence-based animation color correction optimization method according to claim 1, wherein S1 comprises the following: analyzing the animation to be corrected frame by frame, extracting the color gamut, color gradation, brightness and color distribution characteristics of each frame, and calculating time-sequence fluctuation characteristics based on the color parameter sequences of N consecutive frames, wherein the time-sequence fluctuation characteristics comprise the inter-frame color parameter variation amplitude and variation frequency; the variation amplitude is calculated by the formula ΔP = |P_i − P_(i−1)|, where P_i is the target color parameter value of the i-th frame and P_(i−1) is the corresponding color parameter value of the (i−1)-th frame; the variation frequency is calculated by the formula F = M/T, where M is the number of frames within unit time whose variation amplitude ΔP exceeds a preset fluctuation threshold, and T is the unit time length; collecting a reference style image, extracting a style color matrix through color space conversion, and generating a style feature vector based on color distribution statistics, wherein the components of the style feature vector are the mean and variance of each color channel in the reference style image; performing standardized preprocessing on the obtained time-sequence frame color parameters and target style data, and normalizing the frame color parameters using the min-max normalization formula P_norm = (P − P_min)/(P_max − P_min), where P is the original parameter value, P_min is the minimum value of the corresponding parameter, and P_max is its maximum value; and dividing the preprocessed time-sequence frame color parameters and target style data into a training data set and a verification data set according to a preset proportion, retaining the time-sequence association of frames in both sets.
  3. The artificial-intelligence-based animation color correction optimization method according to claim 2, wherein S2 comprises the following: inputting the training data set into the artificial intelligence model in frame time-sequence order and constructing triple feature association dimensions to realize linkage learning, where each dimension is implemented as follows: establishing the temporal association between the time-sequence variation rule of the frame color parameters and the dynamic scene characteristics, using M consecutive frames as a time window, mapping the time-sequence fluctuation characteristics of the frame color parameters in each window one-to-one with the dynamic scene characteristics of that window, forming a combined color-timing and scene-dynamics feature vector by feature splicing, and relating color fluctuation and scene dynamic change in the same time dimension; establishing the adaptation priority association between the dynamic scene features and the target style features, and calculating the dynamic scene feature intensity value Sd by the formula Sd = w1×Vm + w2×ΔL, where Vm is the element moving average speed, ΔL is the light intensity change rate, and w1 and w2 are preset weight coefficients with w1 + w2 = 1; establishing the mapping weight association between the frame color parameters and the target style features, calculating the cosine similarity Sc of the frame color parameter vector and the target style feature vector as Sc = (X·Y)/(|X|×|Y|), where X is the frame color parameter vector and Y is the target style feature vector, and obtaining style adaptation weights for the various color parameters through Softmax normalization of Sc, the weights summing to 1 so that the adaptation proportion is allocated according to feature similarity; extracting the transferable features in the target style data by a contrastive learning anchor matching method, taking the target style feature vector as the anchor, calculating the cosine similarity S of each frame color feature in the training data set to the anchor as S = (A·B)/(|A|×|B|), where A is the frame color feature vector and B is the style anchor feature vector, and establishing the style adaptation rule based on the transferable features; defining the multidimensional loss function L_total = α×L_color + β×L_style + γ×L_temporal to optimize the model parameters, where L_color is the color difference loss term, L_color = (1/D) Σ|C_i − C_target|, D is the number of color parameter dimensions, C_i is the i-th frame color parameter output by the model, and C_target is the target style color parameter; L_style is the style similarity loss term, L_style = 1 − S; L_temporal is the time-sequence consistency loss term, L_temporal = (1/(D×(N−1))) Σ|C_i − C_(i−1)|, where N is the number of consecutive frames, C_i is the i-th frame output color parameter and C_(i−1) is the (i−1)-th frame output color parameter; α, β and γ are preset weight coefficients with α + β + γ = 1; and, based on the mapping relations learned by the model, establishing an adaptation rule base in which dynamic correction rules and style matching rules are interrelated, associating the corresponding dynamic correction parameter intervals and style adjustment coefficients with style feature vectors as index keys to form a mapping table of style features to dynamic correction rules and style matching rules, and continuing iterative optimization of the model parameters until the multidimensional loss function L_total converges to a preset threshold L0, completing color analysis model training.
  4. The artificial-intelligence-based animation color correction optimization method according to claim 3, wherein S3 comprises the following: extracting features of real-time frames of the animation's dynamic scene and calculating element motion features by the inter-frame difference method, wherein the element motion features comprise the element motion average speed Vm_real and the motion track complexity Cm, with Vm_real = Σ(Di/Δt)/M, where Di is the pixel displacement of a unit element between two adjacent frames, Δt is the frame interval time, and M is the number of motion elements in the scene, and Cm = Σ(Si/S_total)×Ei, where Si is the curvature integral of a unit element's motion track, S_total is the curvature integral sum of all element tracks, and Ei is the visual weight of the element in the scene; collecting real-time light change detection data through a color sensor and calculating the light intensity L_real and the light change gradient G_L = |L_real − L_prev|/Δt, where L_prev is the light intensity of the previous frame; applying min-max normalization to Vm_real, Cm and G_L to convert them into dimensionless feature values on the [0, 1] interval; dynamically allocating real-time weight coefficients w1_real, w2_real and w3_real based on the contribution ratios of motion, light and track complexity to color distortion learned by the model in S2, satisfying w1_real + w2_real + w3_real = 1, and updating the dynamic scene feature intensity by the formula Sd_real = w1_real×Vm_real + w2_real×G_L + w3_real×Cm; extracting the color feature vector of the real-time frame, calculating the cosine similarity Sc_real between it and the overall style feature vector of the animation, matching the style adjustment coefficient K_style in the adaptation rule base according to Sc_real, determining the dynamic correction parameter interval according to Sd_real, and calling the basic correction parameter P_base from that interval; constructing the dynamic color mapping matrix M_color = K_style×M_style_ref + Sd_real×M_dynamic, where M_style_ref is the target style color matrix and M_dynamic is the dynamic correction matrix generated from P_base; performing a matrix operation combining the frame color parameters and M_color to obtain the preliminary correction color parameters C_temp; introducing a time-sequence consistency constraint by calculating a smoothing coefficient λ between the correction parameters of the current and previous frames, λ = 1/(1 + e^(−|C_temp − C_prev_corr|)), where C_prev_corr is the final correction parameter of the previous frame; and optimizing the final correction parameters by the formula C_final = λ×C_temp + (1 − λ)×C_prev_corr to realize dynamic scene color adaptation correction, outputting the final correction parameters of the current frame and synchronously feeding back the real-time feature data and correction parameters to the adaptation rule base.
  5. The artificial-intelligence-based animation color correction optimization method according to claim 4, wherein S4 comprises the following: extracting the final color parameters C_final and the corresponding style feature vector Y_corr of the corrected animation frame, and synchronously calling the target style data of S1, including the style feature vector Y_target and the preset color consistency standard; quantifying the difference between C_final and the target style color parameters C_target using the Euclidean distance to obtain the color difference index D_color, and computing the style similarity index S_style from Y_corr and Y_target; presetting a color difference threshold D0_color and a style similarity threshold S0_style; if D_color ≤ D0_color and S_style ≥ S0_style, judging that the correction result meets the standard; if only a single index meets the standard, or neither index does, judging that the correction result does not meet the preset requirement; when D_color exceeds D0_color, generating color parameter adjustment feedback and correcting the corresponding dynamic correction parameter interval in the adaptation rule base based on ΔC = k1×(C_target − C_final), where k1 is a color correction coefficient; when S_style falls below S0_style, generating style weight adjustment feedback and optimizing the style adjustment coefficient K_style by the formula ΔK = k2×(1 − S_style), where k2 is a style adaptation coefficient; when neither index meets the standard, synchronously executing the color parameter adjustment and the style weight adjustment, with optimization priority allocated according to the dynamic scene feature intensity; and repeating iterative correction according to the preset parameter adjustment rules until D_color ≤ D0_color and S_style ≥ S0_style are both satisfied.
  6. An artificial-intelligence-based animation color correction optimization system applying the artificial-intelligence-based animation color correction optimization method of any one of claims 1-5, characterized in that the system comprises a color data acquisition module, an AI analysis model training module, a dynamic correction module, a style matching module and an effect verification module; the color data acquisition module is used for acquiring the frame color parameters and target style data of the animation to be corrected, performing standardized preprocessing on the acquired data, and constructing a training data set and a verification data set containing time-sequence frame sequences; the AI analysis model training module is used for receiving the preprocessed training data set, establishing the adaptation relations among multiple features through linkage learning, extracting the style transferable features, establishing the adaptation rules, and constructing the adaptation rule base after multidimensional loss function optimization to complete color analysis model training; the dynamic correction module is used for extracting the dynamic scene features of the animation and real-time light change data, generating real-time correction parameters based on the color analysis model, and realizing dynamic scene color adaptation correction in combination with the time-sequence consistency constraint; the style matching module is used for calculating the matching degree between real-time frame color features and target style features, calling the style matching rules in the adaptation rule base, and adjusting the color mapping matrix to ensure style uniformity; and the effect verification module is used for verifying the correction result with the dual indexes of color difference and style similarity, generating a feedback signal when either index does not reach the standard, optimizing the parameters of the adaptation rule base, and driving iterative correction until the preset requirement is met.
  7. The artificial-intelligence-based animation color correction optimization system of claim 6, wherein the color data acquisition module comprises a data acquisition unit and a data preprocessing unit; the data acquisition unit is used for analyzing the animation to be corrected frame by frame to extract frame color parameters, acquiring the reference style image, extracting the style color matrix and style feature vector, and integrating them to form the target style data; the data preprocessing unit is used for performing normalization, outlier rejection and time-sequence alignment on the frame color parameters and target style data, dividing the training data set and the verification data set according to a preset proportion, and retaining the time-sequence association of frames.
  8. The artificial-intelligence-based animation color correction optimization system of claim 6, wherein the AI analysis model training module comprises a feature association learning unit and a model optimization and rule base construction unit; the feature association learning unit is used for inputting the training data set in frame time-sequence order and constructing the triple association dimensions of frame color parameter time-sequence variation with dynamic scene features, dynamic scene features with target style features, and frame color parameters with target style features, realizing linkage learning; the model optimization and rule base construction unit is used for extracting the style transferable features by contrastive learning, establishing the style adaptation rules, optimizing the model parameters through the multidimensional loss function, and constructing the adaptation rule base in which dynamic correction rules and style matching rules are interrelated, until the model converges.
  9. The artificial-intelligence-based animation color correction optimization system of claim 6, wherein the dynamic correction module comprises a real-time feature extraction unit and a color correction execution unit; the real-time feature extraction unit is used for extracting the motion features of elements in the animation's dynamic scene, collecting real-time light change data, calculating the related feature indexes, dynamically allocating weights after normalization, and updating the dynamic scene feature intensity; the color correction execution unit is used for calling the basic correction parameters based on the dynamic scene feature intensity, generating the final correction parameters in combination with the time-sequence smoothing constraint, and synchronously feeding back the real-time feature data and correction parameters to the adaptation rule base.
  10. The artificial-intelligence-based animation color correction optimization system of claim 6, wherein the effect verification module comprises a dual index verification unit and a parameter optimization feedback unit; the dual index verification unit is used for extracting the corrected final color parameters and style feature vectors, calculating the color difference index and style similarity index, and comparing them with the preset thresholds to judge whether the correction result reaches the standard; the parameter optimization feedback unit is used for adjusting the dynamic correction parameters or style adjustment coefficients according to the substandard situation, allocating optimization priority according to the dynamic scene feature intensity, updating the adaptation rule base, and driving the dynamic correction module to iterate again until a preset number of consecutive frames reach the standard.
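The fluctuation and normalization formulas of claim 2 can be sketched in Python as follows. This is a minimal illustration only, not part of the patent; the function names, the use of NumPy, and the treatment of `params` as a 1-D per-frame sequence of a single color parameter are assumptions.

```python
import numpy as np

def temporal_fluctuation(params, threshold, unit_time):
    """Inter-frame variation amplitude and frequency (a sketch; `params` is a
    hypothetical 1-D sequence holding one color parameter per frame)."""
    # Variation amplitude: delta_P = |P_i - P_(i-1)|
    delta_p = np.abs(np.diff(np.asarray(params, dtype=float)))
    # Variation frequency: F = M / T, where M counts frames whose amplitude
    # exceeds the preset fluctuation threshold within the unit time T
    m = int(np.sum(delta_p > threshold))
    return delta_p, m / unit_time

def min_max_normalize(p):
    """Min-max normalization: P_norm = (P - P_min) / (P_max - P_min)."""
    p = np.asarray(p, dtype=float)
    return (p - p.min()) / (p.max() - p.min())
```

For example, a parameter sequence [0, 1, 3, 3] with threshold 1.5 yields amplitudes [1, 2, 0], of which one exceeds the threshold.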
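The cosine-similarity anchor matching and the multidimensional loss function of claim 3 admit a compact sketch. The weight values below are placeholders constrained only by α + β + γ = 1, and averaging L_color over frames as well as dimensions is an interpretation of the claim's summation; everything here is illustrative rather than the patented implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    """S = (A·B) / (|A| × |B|), as used for style anchor matching."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def multidimensional_loss(c_out, c_target, style_sim,
                          alpha=0.4, beta=0.3, gamma=0.3):
    """L_total = alpha*L_color + beta*L_style + gamma*L_temporal."""
    c_out = np.asarray(c_out, float)  # shape (N frames, D color parameters)
    n, d = c_out.shape
    # Color difference loss: mean absolute deviation from the target style color
    l_color = float(np.mean(np.abs(c_out - np.asarray(c_target, float))))
    # Style similarity loss: L_style = 1 - S
    l_style = 1.0 - style_sim
    # Temporal consistency loss: (1/(D*(N-1))) * sum of inter-frame changes
    l_temporal = float(np.sum(np.abs(np.diff(c_out, axis=0)))) / (d * (n - 1))
    return alpha * l_color + beta * l_style + gamma * l_temporal
```

A sequence that already matches the target color and style with no inter-frame change gives a total loss of zero, consistent with the convergence criterion L_total ≤ L0.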
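The real-time correction pipeline of claim 4 (scene intensity, dynamic color mapping, temporal smoothing) can be sketched as below. The default weights are placeholders (the method learns them in S2), the element-wise form of λ is an assumption where the claim leaves the shape unspecified, and all function names are hypothetical.

```python
import numpy as np

def scene_intensity(vm_real, g_l, cm, w=(0.4, 0.3, 0.3)):
    """Sd_real = w1*Vm_real + w2*G_L + w3*Cm on normalized [0, 1] features
    (the weight values here are illustrative placeholders)."""
    w1, w2, w3 = w
    assert abs(w1 + w2 + w3 - 1.0) < 1e-9
    return w1 * vm_real + w2 * g_l + w3 * cm

def correct_frame(frame_params, m_style_ref, m_dynamic,
                  k_style, sd_real, c_prev_corr):
    """One frame of the mapping and smoothing steps of claim 4 (a sketch)."""
    # Dynamic color mapping matrix: M_color = K_style*M_style_ref + Sd_real*M_dynamic
    m_color = (k_style * np.asarray(m_style_ref, float)
               + sd_real * np.asarray(m_dynamic, float))
    # Preliminary correction by matrix operation on the frame color parameters
    c_temp = m_color @ np.asarray(frame_params, float)
    # Smoothing coefficient: lambda = 1 / (1 + e^(-|C_temp - C_prev_corr|))
    lam = 1.0 / (1.0 + np.exp(-np.abs(c_temp - np.asarray(c_prev_corr, float))))
    # C_final = lambda*C_temp + (1 - lambda)*C_prev_corr
    return lam * c_temp + (1.0 - lam) * c_prev_corr
```

When C_temp equals the previous frame's correction, λ = 0.5 and the output is unchanged, so static scenes pass through the smoothing step without drift.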
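The dual-index verification and feedback generation of claim 5 can be sketched as follows. The threshold defaults D0_color and S0_style and the coefficients k1, k2 are assumed values for illustration; the patent leaves them as preset parameters.

```python
import numpy as np

def verify(c_final, c_target, s_style, d0_color=0.05, s0_style=0.9):
    """Dual-index check: returns (passed, feedback), where `feedback`
    carries the adjustment terms of claim 5 for any failing index."""
    c_final = np.asarray(c_final, float)
    c_target = np.asarray(c_target, float)
    # Color difference index: Euclidean distance between final and target colors
    d_color = float(np.linalg.norm(c_final - c_target))
    passed = d_color <= d0_color and s_style >= s0_style
    feedback = {}
    if d_color > d0_color:
        k1 = 0.5  # color correction coefficient (assumed value)
        feedback["delta_c"] = k1 * (c_target - c_final)   # ΔC = k1*(C_target - C_final)
    if s_style < s0_style:
        k2 = 0.5  # style adaptation coefficient (assumed value)
        feedback["delta_k"] = k2 * (1.0 - s_style)        # ΔK = k2*(1 - S_style)
    return passed, feedback
```

A caller would loop, applying the feedback to the adaptation rule base and re-correcting, until `passed` holds, mirroring the iterative correction loop of S4.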

Description

Animation color correction optimization system and method based on artificial intelligence

Technical Field

The invention relates to the technical field of color correction optimization, in particular to an animation color correction optimization system and method based on artificial intelligence.

Background

Animation color is a key technical element in animation production. Its visual uniformity and parameter coordination directly affect the final presentation quality, scene rendering effect and audience visual experience of the work, and it is a core link in guaranteeing the unified realization of the animation's technical content and artistic expression. Traditional animation color correction mainly relies on professional technicians manually adjusting parameters such as color gradation and color gamut according to subjective experience; it is inefficient, lacks a uniform correction standard, easily produces color deviations between frames and between production stages due to individual aesthetic differences, greatly prolongs the production cycle, and increases rework costs. In the prior art, although some digital correction tools attempt to simplify the operation, they are designed for static images and struggle to adapt to the complex requirements of dynamic animation scenes. For example, light intensity fluctuation, color distortion and transition faults are easily caused by the rapid movement of characters and scene elements in a dynamic scene; existing tools lack real-time response and dynamic compensation capability; existing schemes do not establish a global style matching mechanism, so single-frame correction results may conflict with the overall art style of the animation; and an accurate effect verification system is lacking, making color consistency difficult to quantify, so the color consistency of the work is insufficient.

Therefore, there is a need for an animation color processing technology that breaks through the manual dependency and realizes dynamic scene adaptation, unified style and automatic accurate correction, solving the bottleneck in the prior art.

Disclosure of Invention

The invention aims to provide an animation color correction optimization system and method based on artificial intelligence so as to solve the problems described in the background. To solve the technical problems, the invention provides the following technical scheme. The artificial-intelligence-based animation color correction optimization method comprises the following steps: S1, collecting frame color parameters and target style data of the animation to be corrected, wherein the frame color parameters comprise color gamut, color gradation, brightness, color distribution characteristics and time-sequence fluctuation characteristics, and the target style data comprise a reference style image, a style color matrix and a style feature vector; S2, inputting the training data set in frame time-sequence order and learning, in linkage, the time-sequence variation rule of the frame color parameters and the adaptation relation between dynamic scene characteristics and target style characteristics; extracting the transferable characteristics in the target style data, establishing a style adaptation rule, optimizing model parameters through a multidimensional loss function, and constructing an adaptation rule base in which dynamic correction rules and style matching rules are mutually associated, so that the model forms a mapping relation among frame color parameters, dynamic scene characteristics and target style characteristics, completing color analysis model training; S3, extracting and analyzing element motion characteristics in dynamic scenes of the animation based on the trained color analysis model, and generating real-time correction parameters in combination with real-time light change detection data; S4, comparing and verifying the correction result against the target style data and a preset color consistency standard using the dual indexes of color difference and style similarity; if the preset requirement is not met, generating a feedback signal and driving the model to iteratively optimize the correction parameters in the adaptation rule base until the color consistency and style consistency requirements are met. Further, S1 includes the following: analyzing the animation to be corrected frame by frame, extracting the color gamut, color gradation, brightness and color distribution characteristics of each frame, and calculating time-sequence fluctuation characteristics based on the color parameter sequences of N consecutive frames, wherein the time-sequence fluctuation characteristics comprise the inter-frame color parameter variation amplitude and variation frequency; the variation amplitude is calculated by the formula ΔP = |P_i − P_(i−1)|, where P_i is a target