
CN-122020318-A - Eye pattern degradation compensation method, eye pattern degradation compensation device, storage medium, and electronic apparatus

CN122020318A

Abstract

The application relates to an eye diagram degradation compensation method and device, a storage medium, and an electronic apparatus. The method comprises: collecting a plurality of parameter data and an eye diagram of a display device, wherein the parameter data are related to eye diagram quality; extracting features from the parameter data and the eye diagram to obtain a fusion feature vector; and inputting the fusion feature vector into a degradation traceability model to obtain an eye diagram degradation cause and a compensation strategy, then executing the compensation strategy to reduce the degree of eye diagram degradation. The degradation traceability model is built on a GBDT (gradient boosted decision tree) model. The method thus addresses the problem in the prior art that eye diagram degradation can only be detected, while its cause cannot be traced and no compensation strategy can be determined.

Inventors

  • YANG WENWU
  • XU PEI

Assignees

  • 惠科股份有限公司 (HKC Corporation Limited)

Dates

Publication Date
2026-05-12
Application Date
2026-03-31

Claims (10)

  1. An eye diagram degradation compensation method, the method comprising: collecting a plurality of parameter data and an eye diagram of a display device, wherein the parameter data are related to eye diagram quality; extracting features from the parameter data and the eye diagram to obtain a fusion feature vector; and inputting the fusion feature vector into a degradation traceability model to obtain an eye diagram degradation cause and a compensation strategy, and executing the compensation strategy to reduce the degree of eye diagram degradation, wherein the degradation traceability model is built on a GBDT model.
  2. The method of claim 1, wherein the plurality of parameter data at least includes a model number of a display panel, a specification of a cable, an error rate of a transmission link, a supply voltage and temperature of a chip on a driving board, an ambient temperature, and an ambient humidity, and wherein performing feature extraction on the plurality of parameter data and the eye diagram to obtain a fusion feature vector comprises: encoding the non-numeric parameter data to obtain a static feature vector; standardizing the numeric parameter data to obtain standard parameter data, and standardizing the eye diagram to obtain a standard eye diagram; performing statistical processing on the standard parameter data over a preset duration to obtain a dynamic environment feature vector, wherein the statistical processing at least comprises taking the mean, the variance, and the extremes; multiplying each element of the static feature vector by its corresponding weight coefficient to obtain a weighted static feature vector, multiplying each element of the dynamic environment feature vector by its corresponding weight coefficient to obtain a weighted dynamic environment feature vector, and multiplying each element of the standard eye diagram by its corresponding weight coefficient to obtain a weighted eye diagram, wherein the weight coefficients are determined by the degree of correlation with eye diagram quality; inputting the weighted static feature vector and the weighted dynamic environment feature vector into a trained first Transformer encoder to obtain a weighted time-sequence feature vector; inputting the weighted eye diagram into a trained first CNN network to obtain a weighted eye diagram feature vector; and inputting the weighted time-sequence feature vector and the weighted eye diagram feature vector into a trained fully connected layer to obtain the fusion feature vector.
  3. The method of claim 2, wherein inputting the fusion feature vector into the degradation traceability model to obtain the eye diagram degradation cause and the compensation strategy comprises: inputting the static feature vector and the dynamic environment feature vector into a trained second Transformer encoder to obtain a time-sequence feature vector; inputting the standard eye diagram into a trained second CNN network to obtain an eye diagram feature vector; and inputting the time-sequence feature vector, the eye diagram feature vector, and the fusion feature vector into the degradation traceability model to obtain the eye diagram degradation cause and the compensation strategy.
  4. The method of claim 1, wherein, after collecting the plurality of parameter data and the eye diagram of the display device and before performing feature extraction on them to obtain the fusion feature vector, the method comprises: for each type of parameter data whose sampling frequency is lower than a preset frequency, processing that parameter data by linear interpolation so that its storage frequency equals the preset frequency; and processing the eye diagram by the same linear interpolation so that its storage frequency equals the preset frequency.
  5. The method of claim 1, wherein, after performing feature extraction on the plurality of parameter data and the eye diagram to obtain the fusion feature vector, the method further comprises: inputting the fusion feature vector of a historical period into a trained quality prediction model to obtain eye diagram scores for a future period, wherein the quality prediction model consists of a multi-layer Bi-LSTM structure; and, when the eye diagram scores at a plurality of consecutive moments in the future period are below a preset eye diagram score, inputting the fusion feature vector into the degradation traceability model to obtain the eye diagram degradation cause and the compensation strategy.
  6. The method of claim 5, wherein, after executing the compensation strategy, the method further comprises: an acquisition step of obtaining a plurality of eye diagram quality characterization parameters, which at least include the eye diagram score and the error rate of the transmission link; an adjustment step of adjusting, by a preset percentage, the adjustment values of the compensation parameters in the compensation strategy when the eye diagram quality characterization parameters do not all fall within their corresponding threshold ranges; and repeating the acquisition step and the adjustment step in sequence up to M times, or stopping once the eye diagram quality characterization parameters all fall within their corresponding threshold ranges, wherein M is a positive integer.
  7. The method of claim 1, wherein executing the compensation strategy comprises: acquiring a level table of the current scene, wherein the level table records the priority levels of eye diagram degradation causes; and executing the compensation strategy corresponding to each eye diagram degradation cause in descending order of its priority in the level table of the current scene.
  8. An eye diagram degradation compensation device, the device comprising: an acquisition unit configured to collect a plurality of parameter data and an eye diagram of a display device, wherein the parameter data are related to eye diagram quality; a feature extraction unit configured to perform feature extraction on the parameter data and the eye diagram to obtain a fusion feature vector; and a tracing unit configured to input the fusion feature vector into a degradation traceability model to obtain an eye diagram degradation cause and a compensation strategy, and to execute the compensation strategy to reduce the degree of eye diagram degradation, wherein the degradation traceability model is built on a GBDT model.
  9. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the eye diagram degradation compensation method according to any one of claims 1 to 7.
  10. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the eye diagram degradation compensation method according to any one of claims 1 to 7.
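The preprocessing behind claims 1-2 can be sketched in plain Python. All names, the vocabulary, and the weights below are hypothetical; the patent's trained Transformer/CNN encoders and fully connected fusion layer are deliberately omitted, so this shows only the encoding, standardization, windowed statistics, weighting, and concatenation steps.

```python
from statistics import mean, pvariance

# Hypothetical vocabulary for the non-numeric parameters (panel model, cable spec).
STATIC_VOCAB = ["panel_A", "panel_B", "cable_30awg", "cable_32awg"]

def encode_static(panel_model, cable_spec):
    """One-hot encode the non-numeric parameters into a static feature vector."""
    return [1.0 if v in (panel_model, cable_spec) else 0.0 for v in STATIC_VOCAB]

def standardize(series):
    """Zero-mean, unit-variance standardization of a numeric parameter series."""
    m = mean(series)
    sd = pvariance(series) ** 0.5
    return [(x - m) / sd if sd > 0 else x - m for x in series]

def window_stats(series):
    """Mean, variance, and extremes over a preset-duration window (claim 2)."""
    return [mean(series), pvariance(series), min(series), max(series)]

def fuse(static_vec, dynamic_vec, eye_vec, w_static, w_dynamic, w_eye):
    """Element-wise weighting, then concatenation into one fusion vector.
    In the patent, the weighted vectors pass through trained encoders first."""
    weighted = lambda vec, w: [x * wi for x, wi in zip(vec, w)]
    return (weighted(static_vec, w_static)
            + weighted(dynamic_vec, w_dynamic)
            + weighted(eye_vec, w_eye))

# Usage with invented readings: two static parameters, a temperature window,
# and a tiny stand-in for eye-diagram features.
static_vec = encode_static("panel_A", "cable_30awg")
dynamic_vec = window_stats(standardize([24.0, 25.0, 26.0]))
fusion = fuse(static_vec, dynamic_vec, [0.8, 0.6],
              [1.0] * 4, [1.0] * 4, [0.5] * 2)
```

The fusion vector produced this way would be the input to the GBDT-based degradation traceability model of claim 1.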
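Claim 4's resampling step is ordinary linear interpolation: series sampled below the preset frequency are up-sampled so every stored series shares one time base. A minimal sketch, assuming `(timestamp, value)` pairs (the function name and calling convention are invented):

```python
def resample_linear(samples, preset_hz, duration_s):
    """Up-sample (t, value) pairs by linear interpolation so the stored
    series matches the preset frequency (claim 4)."""
    step = 1.0 / preset_hz
    out = []
    for i in range(int(duration_s * preset_hz)):
        t = i * step
        # Find the bracketing original samples and interpolate between them.
        for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                frac = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
                out.append((t, v0 + frac * (v1 - v0)))
                break
        else:
            out.append((t, samples[-1][1]))  # Hold the last value past the end.
    return out
```

For example, a 1 Hz series resampled to 2 Hz gains one interpolated point between each pair of originals; the same routine would be applied to the eye diagram samples.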
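Claim 5 triggers the traceability model only when the predicted eye diagram score stays below the preset score at several consecutive future moments. The Bi-LSTM predictor itself is out of scope here; this sketch (all names and thresholds invented) shows just the consecutive-run check:

```python
def should_trace(predicted_scores, preset_score, k):
    """Return True when k consecutive predicted scores fall below the
    preset eye diagram score (claim 5's trigger condition)."""
    run = 0
    for s in predicted_scores:
        run = run + 1 if s < preset_score else 0
        if run >= k:
            return True
    return False
```

Requiring a consecutive run, rather than any k low scores, avoids firing the (comparatively expensive) traceability step on isolated prediction dips.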
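Claim 6's acquire-then-adjust loop can be sketched as a bounded iteration: re-measure the quality characterization parameters, and while any metric is outside its threshold range, nudge the compensation parameters by the preset percentage, for at most M rounds. The measurement callback, parameter names, and percentage are all illustrative:

```python
def tune_compensation(measure_fn, params, pct, thresholds, M):
    """Repeat claim 6's acquisition and adjustment steps up to M times,
    stopping early once every metric sits inside its threshold range."""
    for _ in range(M):
        metrics = measure_fn(params)
        if all(lo <= metrics[k] <= hi for k, (lo, hi) in thresholds.items()):
            return params, True  # All metrics in range: compensation holds.
        params = {k: v * (1.0 + pct) for k, v in params.items()}
    return params, False  # Gave up after M adjustment rounds.

# Toy measurement: the eye score simply tracks a hypothetical "gain" parameter.
measure = lambda p: {"eye_score": p["gain"]}
tuned, ok = tune_compensation(measure, {"gain": 8.0}, 0.2,
                              {"eye_score": (10.0, 20.0)}, 5)
```

The positive-integer bound M keeps the loop from oscillating forever if the thresholds are unreachable.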
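Claim 7 orders execution by a per-scene priority level table. A minimal sketch, where the causes, priority levels, and strategy actions are entirely invented for illustration:

```python
def execute_by_priority(causes, level_table, strategies):
    """Run the compensation strategy for each diagnosed cause in descending
    priority order for the current scene (claim 7)."""
    ordered = sorted(causes, key=lambda c: level_table.get(c, 0), reverse=True)
    return [strategies[c]() for c in ordered]

# Hypothetical level table for one scene, and stand-in strategy actions.
LEVEL_TABLE = {"cable_loss": 3, "supply_ripple": 2, "thermal_drift": 1}
STRATEGIES = {
    "cable_loss": lambda: "boost pre-emphasis",
    "supply_ripple": lambda: "tighten voltage regulation",
    "thermal_drift": lambda: "recalibrate driver timing",
}
```

With this table, a diagnosis of both thermal drift and cable loss would execute the cable-loss strategy first, matching the high-to-low ordering the claim describes.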

Description

Eye pattern degradation compensation method, eye pattern degradation compensation device, storage medium, and electronic apparatus

Technical Field

The present application relates to the field of display technologies, and in particular to an eye diagram degradation compensation method, an eye diagram degradation compensation device, a storage medium, and an electronic apparatus.

Background

The eye diagram reflects the display quality of a display device. It is the graph produced on an oscilloscope by accumulating a series of digital signals transmitted from the driving board to the source driver, and it therefore reflects the integrity of those signals. That signal integrity directly determines whether the source driver can supply accurate driving voltages to the pixel array, and in turn determines the display quality of the device. As display refresh rates rise to 500 Hz and beyond and transmission rates exceed 48 Gbps, eye diagram degradation (a reduction in the integrity of the signals transmitted from the driving board to the source driver) has become a core bottleneck limiting display quality. In the prior art, however, it can only be determined whether the eye diagram has degraded; the cause of the degradation cannot be traced, and no compensation strategy can be determined.

Disclosure of Invention

The application provides an eye diagram degradation compensation method, an eye diagram degradation compensation device, a computer-readable storage medium, and an electronic apparatus, to solve the prior-art problem that eye diagram degradation can only be detected, while its cause cannot be traced and no compensation strategy can be determined.
The application provides an eye diagram degradation compensation method comprising: collecting a plurality of parameter data and an eye diagram of a display device, wherein the parameter data are related to eye diagram quality; extracting features from the parameter data and the eye diagram to obtain a fusion feature vector; inputting the fusion feature vector into a degradation tracing model to obtain an eye diagram degradation cause and a compensation strategy; and executing the compensation strategy to reduce the degree of eye diagram degradation, the degradation tracing model being built on a GBDT model.

Optionally, the plurality of parameter data at least includes the model of the display panel, the specification of the cable, the error rate of the transmission link, the supply voltage and temperature of a chip on the driving board, the ambient temperature, and the ambient humidity, and performing feature extraction on the parameter data and the eye diagram to obtain the fusion feature vector comprises: encoding the non-numeric parameter data to obtain a static feature vector; standardizing the numeric parameter data to obtain standard parameter data, and standardizing the eye diagram to obtain a standard eye diagram; performing statistical processing (at least taking the mean, the variance, and the extremes) on the standard parameter data over a preset duration to obtain a dynamic environment feature vector; multiplying each element of the static feature vector, the dynamic environment feature vector, and the standard eye diagram by its corresponding weight coefficient to obtain a weighted static feature vector, a weighted dynamic environment feature vector, and a weighted eye diagram, the weight coefficients being determined by the degree of correlation with eye diagram quality; inputting the weighted static feature vector and the weighted dynamic environment feature vector into a trained first Transformer encoder to obtain a weighted time-sequence feature vector; inputting the weighted eye diagram into a trained first CNN network to obtain a weighted eye diagram feature vector; and inputting the weighted time-sequence feature vector and the weighted eye diagram feature vector into a trained fully connected layer to obtain the fusion feature vector.

Optionally, inputting the fusion feature vector into the degradation traceability model to obtain the eye degradation cause and the compensation strategy comprises: inputting the static feature vector and the dynamic environment feature vector into a trained second Transformer encoder to obtain a time-sequence feature vector; inputting the standard eye pattern into a trained second CNN