CN-121986485-A - Encoding/decoding method, code stream, encoder, decoder, and storage medium
Abstract
The application discloses an encoding and decoding method, a code stream, an encoder, a decoder, and a storage medium. The method comprises: determining a texture feature index of a current block; determining a transform kernel group of the current block according to the texture feature index; determining a transform kernel of the current block according to the transform kernel group; determining transform coefficients of the current block; and performing an inverse transform on the transform coefficients of the current block according to the transform kernel to determine a residual block of the current block. In this way, the compression efficiency of inter prediction can be improved.
Inventors
- WANG FAN
Assignees
- GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD.
Dates
- Publication Date
- 20260505
- Application Date
- 20230928
Claims (20)
- A decoding method applied to a decoder, the method comprising: determining a texture feature index of a current block; determining a transform kernel group of the current block according to the texture feature index; determining a transform kernel of the current block according to the transform kernel group; and determining transform coefficients of the current block, and performing an inverse transform on the transform coefficients of the current block according to the transform kernel to determine a residual block of the current block.
- The method of claim 1, wherein the determining transform coefficients of the current block comprises: decoding a code stream to determine quantized coefficients of the current block; and dequantizing the quantized coefficients of the current block to determine the transform coefficients of the current block.
- The method of claim 1, wherein the method further comprises: inter-predicting the current block to determine a prediction block of the current block; and determining a reconstructed block of the current block according to the prediction block of the current block and the residual block of the current block.
- The method of claim 1, wherein the determining the transform kernel of the current block according to the transform kernel group comprises: decoding a code stream to determine a value of first syntax identification information of the current block; and when the first syntax identification information indicates that the current block uses a first transform mode, determining a transform kernel index of the current block, and determining the transform kernel of the current block according to the transform kernel group and the transform kernel index.
- The method of claim 4, wherein the determining the transform kernel index of the current block when the first syntax identification information indicates that the current block uses a first transform mode comprises: if the current block uses an intra prediction mode, decoding a code stream to determine the transform kernel index of the current block; and if the current block uses an inter prediction mode, determining the transform kernel index of the current block according to the value of the first syntax identification information.
- The method of claim 1, wherein the determining the texture feature index of the current block comprises: determining candidate pixels for deriving the texture feature index; and determining the texture feature index of the current block according to the candidate pixels.
- The method of claim 6, wherein the determining the texture feature index of the current block according to the candidate pixels comprises: determining a horizontal gradient value and a vertical gradient value of each candidate pixel; determining a texture feature index and a gradient strength value corresponding to each candidate pixel according to its horizontal gradient value and vertical gradient value; constructing a texture feature statistics table according to the texture feature indexes and gradient strength values corresponding to the candidate pixels; and determining the texture feature index of the current block according to the texture feature statistics table.
- The method of claim 6, wherein the method further comprises: determining the number of the candidate pixels according to a size parameter of the current block.
- The method of claim 6, wherein the method further comprises: determining a prediction block of the current block; and taking at least some pixels in the prediction block as the candidate pixels.
- The method of claim 9, wherein the method further comprises: determining neighboring pixels of a reconstructed region of the current block; and taking the neighboring pixels of the reconstructed region as the candidate pixels.
- The method of claim 10, wherein the method further comprises: taking the neighboring pixels of the reconstructed region and at least some pixels in the prediction block as the candidate pixels.
- The method of claim 6, wherein the method further comprises: determining a reference block of the current block; and taking at least some pixels in the reference block as the candidate pixels.
- The method of claim 12, wherein the determining the reference block of the current block comprises: determining the reference block as an integer-pixel reference block, or determining the reference block as a sub-pixel reference block.
- The method of claim 13, wherein, when the reference block is a sub-pixel reference block, the determining the reference block of the current block comprises: determining two reference image blocks when the current block performs bidirectional prediction; performing pixel interpolation filtering on the two reference image blocks to determine two sub-pixel reference image blocks; and performing a weighted combination of the two sub-pixel reference image blocks to determine the reference block of the current block.
- The method of claim 7, wherein the determining the texture feature index and the gradient strength value corresponding to the candidate pixel according to the horizontal gradient value and the vertical gradient value of the candidate pixel comprises: performing angle mapping according to the horizontal gradient value and the vertical gradient value of the candidate pixel to determine the texture feature index corresponding to the candidate pixel; and performing a gradient strength calculation according to the horizontal gradient value and the vertical gradient value of the candidate pixel to determine the gradient strength value corresponding to the candidate pixel.
- The method of claim 15, wherein the performing angle mapping according to the horizontal gradient value and the vertical gradient value of the candidate pixel to determine the texture feature index corresponding to the candidate pixel comprises: determining the texture feature index corresponding to the candidate pixel by using a preset lookup table according to the horizontal gradient value and the vertical gradient value of the candidate pixel.
- The method of claim 15, wherein the performing the gradient strength calculation according to the horizontal gradient value and the vertical gradient value of the candidate pixel to determine the gradient strength value corresponding to the candidate pixel comprises: adding the absolute value of the horizontal gradient value and the absolute value of the vertical gradient value to determine the gradient strength value corresponding to the candidate pixel.
- The method of claim 7, wherein the constructing a texture feature statistics table according to the texture feature index and the gradient strength value corresponding to the candidate pixel comprises: when the number of candidate pixels is at least one, determining at least one texture feature index and the corresponding at least one gradient strength value; determining at least one distinct reference texture feature index according to the at least one texture feature index, and accumulating the gradient strength values belonging to the same reference texture feature index to determine a gradient strength accumulated value corresponding to the at least one reference texture feature index; and constructing the texture feature statistics table according to the at least one reference texture feature index and the gradient strength accumulated value corresponding to the at least one reference texture feature index.
- The method of claim 18, wherein the determining the texture feature index of the current block according to the texture feature statistics table comprises: determining a maximum gradient strength accumulated value in the texture feature statistics table; and determining the reference texture feature index corresponding to the maximum gradient strength accumulated value as the texture feature index of the current block.
- The method of claim 18, wherein the determining the texture feature index of the current block according to the texture feature statistics table comprises: determining a maximum gradient strength accumulated value in the texture feature statistics table; and setting the texture feature index of the current block to a DC mode or a PLANAR mode when the maximum gradient strength accumulated value is smaller than a first threshold.
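The texture-feature-index derivation described in claims 7 and 15 through 20 can be sketched as follows. The gradient operator (central differences), the number of angle bins, the DC/PLANAR sentinel value, and the use of `atan2` instead of the preset lookup table of claim 16 are illustrative assumptions; the patent does not fix any of these.

```python
# Sketch of the texture-feature-index derivation (claims 7, 15-20).
# NUM_ANGLES, DC_MODE, the gradient operator, and the angle mapping are
# assumptions for illustration only.
import math

NUM_ANGLES = 8   # assumed number of directional texture classes
DC_MODE = -1     # assumed sentinel for the DC/PLANAR fallback (claim 20)

def texture_feature_index(pixels, width, height, threshold):
    """pixels: row-major list of candidate-pixel sample values."""
    # gradient-strength accumulator per reference texture feature index
    accum = [0] * NUM_ANGLES
    for y in range(1, height - 1):
        for x in range(1, width - 1):
            # horizontal and vertical gradients via central differences
            gh = pixels[y * width + (x + 1)] - pixels[y * width + (x - 1)]
            gv = pixels[(y + 1) * width + x] - pixels[(y - 1) * width + x]
            # angle mapping: quantize the gradient direction into
            # NUM_ANGLES bins (a codec would use an integer lookup table)
            angle = math.atan2(gv, gh) % math.pi
            idx = min(int(angle / math.pi * NUM_ANGLES), NUM_ANGLES - 1)
            # gradient strength: |gh| + |gv| (claim 17), accumulated per
            # index (claim 18)
            accum[idx] += abs(gh) + abs(gv)
    best = max(range(NUM_ANGLES), key=lambda i: accum[i])
    # fall back to DC/PLANAR when the dominant direction is weak (claim 20)
    return DC_MODE if accum[best] < threshold else best
```

A flat candidate region accumulates no gradient strength and falls through to the DC/PLANAR fallback, while a strongly striped region selects the bin of its dominant direction, matching the argmax rule of claim 19.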
Description
Encoding/decoding method, code stream, encoder, decoder, and storage medium
Technical Field
The embodiments of the application relate to the technical field of video encoding and decoding, and in particular to an encoding and decoding method, a code stream, an encoder, a decoder, and a storage medium.
Background
As requirements on video display quality rise, high-resolution videos such as high-definition and ultra-high-definition video have emerged. However, high-resolution video typically carries more information and therefore requires more bandwidth. To reduce bandwidth requirements, video coding standards involving video compression have been introduced. Video coding standards include intra-frame prediction and inter-frame prediction. In inter-frame prediction, an inter-coded block can usually find a good prediction block in a reference image, so the residual is small; however, the transform process of inter prediction does not fully account for texture features at various angles, which keeps compression efficiency low.
Disclosure of Invention
The embodiments of the application provide an encoding and decoding method, a code stream, an encoder, a decoder, and a storage medium, which can improve compression efficiency.
The technical solutions of the embodiments of the application can be implemented as follows. In a first aspect, an embodiment of the present application provides a decoding method, applied to a decoder, including: determining a texture feature index of a current block; determining a transform kernel group of the current block according to the texture feature index; determining a transform kernel of the current block according to the transform kernel group; and determining transform coefficients of the current block, and performing an inverse transform on the transform coefficients of the current block according to the transform kernel to determine a residual block of the current block. In a second aspect, an embodiment of the present application provides an encoding method, applied to an encoder, including: determining a texture feature index of a current block; determining a transform kernel group of the current block according to the texture feature index; determining a transform kernel of the current block according to the transform kernel group; determining a residual block of the current block, and transforming the residual block of the current block according to the transform kernel to determine transform coefficients of the current block; and encoding the transform coefficients of the current block, and writing the obtained encoded bits into the code stream.
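The decode-side flow of the first aspect can be sketched as below: a transform kernel group is selected by the texture feature index, a kernel is selected from the group, and the coefficients are inverse-transformed into the residual block. The kernel group layout, the 2x2 Hadamard-like kernel matrices, and the separable-transform formulation are illustrative assumptions; the patent does not specify the kernels.

```python
# Sketch of the first-aspect decode flow: texture feature index ->
# transform kernel group -> transform kernel -> inverse transform ->
# residual block. KERNEL_GROUPS and the 2x2 kernels are assumptions.

# hypothetical kernel groups: texture feature index -> list of
# (vertical kernel, horizontal kernel) 1-D transform matrices, row-major
KERNEL_GROUPS = {
    0: [([[1, 1], [1, -1]], [[1, 1], [1, -1]])],  # Hadamard-like pair
}

def matmul(a, b):
    """Plain matrix multiply on nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def inverse_transform(coeffs, texture_idx, kernel_idx=0):
    """Separable 2-D inverse transform: residual = Kv^T * C * Kh."""
    kv, kh = KERNEL_GROUPS[texture_idx][kernel_idx]
    return matmul(matmul(transpose(kv), coeffs), kh)
```

With the forward transform taken as C = Kv * R * Kh^T, this inverse reproduces the residual up to the kernels' scale factor (a factor of 4 for the unnormalized 2x2 Hadamard pair), which a real codec would absorb into quantization scaling.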
In a third aspect, an embodiment of the present application provides a code stream, where the code stream is generated by bit encoding of information to be encoded, and the information to be encoded includes at least one of: quantized coefficients of a current block, a texture feature index of the current block, a value of first syntax identification information, a value of second syntax identification information, a value of third syntax identification information, and a value of fourth syntax identification information. The first syntax identification information is used to indicate whether the current block uses a first transform mode and the transform kernel index used accordingly; the second syntax identification information is used to indicate whether the current sequence is allowed to use the first transform mode; the third syntax identification information is used to indicate whether the current picture is allowed to use the first transform mode; and the fourth syntax identification information is used to indicate whether the current slice is allowed to use the first transform mode. In a fourth aspect, an embodiment of the present application provides an encoder, including a first determining unit, a transform unit, and an encoding unit, wherein: the first determining unit is configured to determine a texture feature index of a current block, determine a transform kernel group of the current block according to the texture feature index, and determine a transform kernel of the current block according to the transform kernel group; the transform unit is configured to determine a residual block of the current block, transform the residual block of the current block according to the transform kernel, and determine transform coefficients of the current block; and the encoding unit is configured to encode the transform coefficients of the current block and write the obtained encoded bits into the code stream.
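The third-aspect flags form a simple enabling hierarchy, which can be sketched as below: the first transform mode is usable for a block only when the sequence-, picture-, and slice-level flags all permit it. The function and parameter names are illustrative; the patent only numbers the flags first through fourth.

```python
# Sketch of the enabling-flag hierarchy in the code stream (third aspect).
# Names are assumptions; the patent numbers the flags first through fourth.

def first_transform_mode_allowed(seq_flag, pic_flag, slice_flag):
    # second, third, and fourth syntax identification information:
    # sequence-, picture-, and slice-level permission, respectively
    return bool(seq_flag and pic_flag and slice_flag)

def block_uses_first_transform_mode(seq_flag, pic_flag, slice_flag,
                                    block_flag):
    # first syntax identification information (block level) only takes
    # effect when every higher-level flag allows the first transform mode
    return (first_transform_mode_allowed(seq_flag, pic_flag, slice_flag)
            and bool(block_flag))
```

This mirrors the usual layering of codec syntax, where a block-level flag is parsed or inferred only under the corresponding sequence-, picture-, and slice-level gates.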
In a fifth aspect, an embodiment of the present application provides an encoder, including a first memory and a first processor, wherein: the first memory is configured to store a computer program capable of running on the first processor; and the first processor is configured to perform the