CN-122029819-A - Method, apparatus and medium for video processing
Abstract
Embodiments of the present disclosure provide a solution for video processing. A method for video processing is presented. The method includes applying a filtering method to prediction samples associated with a video unit of a video for conversion between the video unit and a bitstream of the video, and performing the conversion based on the filtered prediction samples.
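The conversion described in the abstract can be sketched as follows. This is a minimal illustration only: a simple 3×3 mean filter stands in for the (unspecified) filtering method, and transform, quantization and entropy coding are omitted. All function names here are hypothetical.

```python
def mean3x3(pred):
    # Stand-in for the filtering method of the abstract: a 3x3 mean filter
    # with edge clamping (the actual method, e.g. a bilateral filter, is
    # not specified at this level of the disclosure).
    h, w = len(pred), len(pred[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny = min(max(y + dy, 0), h - 1)
                    nx = min(max(x + dx, 0), w - 1)
                    total += pred[ny][nx]
            out[y][x] = total / 9.0
    return out

def encode_unit(original, pred):
    # Encoder side: filter the prediction samples, then derive the
    # residual from the *filtered* prediction.
    filtered = mean3x3(pred)
    return [[original[y][x] - filtered[y][x] for x in range(len(pred[0]))]
            for y in range(len(pred))]

def decode_unit(pred, residual):
    # Decoder side: apply the identical filter so reconstruction matches.
    filtered = mean3x3(pred)
    return [[filtered[y][x] + residual[y][x] for x in range(len(pred[0]))]
            for y in range(len(pred))]
```

Because both sides apply the same filter to the same prediction samples, the reconstruction equals the original whenever the residual is carried losslessly.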
Inventors
- YIN WENBIN
- ZHANG KAI
- WANG YANG
- ZHANG LI
Assignees
- 抖音视界有限公司
- 字节跳动有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20241010
- Priority Date
- 20231011
Claims (20)
- 1. A method for video processing, comprising: applying a filtering method to prediction samples associated with a video unit of a video for conversion between the video unit and a bitstream of the video; and performing the conversion based on the filtered prediction samples.
- 2. The method of claim 1, wherein the filtering method comprises at least one of a Bilateral Filtering (BF) method or a deblocking filtering method.
- 3. The method of claim 1, wherein the prediction samples are included in one of the video unit, a Coding Unit (CU), a Prediction Unit (PU), a Transform Unit (TU), a block, a region, a picture, a tile, or a slice.
- 4. The method according to claim 1 or 2, wherein at least one of the prediction samples or pixels is filtered or modified by the BF method with decoded information and/or with statistical information.
- 5. The method of claim 4, wherein the filtered result of the BF method is determined by the following equation: Ĩ = I_C + σ·Σ_{k=1..N} f(ΔI_k, d_k)·ΔI_k, wherein Ĩ represents one of an updated or modified luminance sample, an updated or modified luminance pixel, an updated or modified chrominance sample, or an updated or modified chrominance pixel; I_C represents one of an unmodified luminance sample, an unmodified luminance pixel, an unmodified chrominance sample, or an unmodified chrominance pixel located at the center of the filter shape; ΔI_k represents the difference between the corresponding reference sample and the unmodified center sample; d_k represents the sum of the vertical distance between the reference sample and the center sample and the horizontal distance between the reference sample and the center sample; N represents the total number of samples within the filter shape; σ represents the strength factor; and f represents the function used to determine the filter weight of each location in the filter shape.
- 6. The method of claim 5, wherein the function f is included in a look-up table.
- 7. The method of claim 5, wherein the function f is expressed by the following formula: f(ΔI_k, d_k) = e^(−d_k²/(2σ_d²)) · e^(−ΔI_k²/(2σ_r²)), wherein σ_d represents a parameter for filtering, and σ_r represents another parameter for filtering.
- 8. The method of claim 5, wherein one of a unified parameter or a unified look-up table is used for different samples or different locations inside the intra reference samples.
- 9. The method of claim 5, wherein one of different parameters or different look-up tables is used for different samples or different locations inside the intra reference samples.
- 10. The method according to claim 5, wherein the parameters for filtering are predetermined parameters; or wherein the parameters for filtering are parameters to be searched; or wherein the parameters for filtering are determined in real time; or wherein the parameters for filtering are signaled in the bitstream.
- 11. The method of claim 7 or 10, wherein the parameters for filtering comprise σ_d and/or σ_r.
- 12. The method of claim 11, wherein the parameters are determined based on at least one of a codec mode of the video unit, a size of the video unit, or other decoded information of the video unit.
- 13. The method of claim 11, wherein the parameters are signaled from an encoder to a decoder.
- 14. The method of claim 5, wherein the strength factor is a predetermined parameter; or wherein the strength factor is a parameter to be searched; or wherein the strength factor is determined in real time; or wherein the strength factor is signaled in the bitstream.
- 15. The method of claim 14, wherein the strength factor is determined based on at least one of a codec mode of the video unit, a size of the video unit, or other decoded information of the video unit.
- 16. The method of claim 15, wherein the strength factor is determined based on a luma size of the video unit.
- 17. The method of claim 16, wherein the strength factor is determined based on a function of W and H, wherein W represents the luma width of the video unit and H represents the luma height of the video unit.
- 18. The method of claim 16, wherein the strength factor is determined based on W multiplied by H, wherein W represents the luma width of the video unit and H represents the luma height of the video unit.
- 19. The method of claim 15, wherein the strength factor is determined based on a chroma size of the video unit.
- 20. The method of claim 19, wherein the strength factor is determined based on a function of W_C and H_C, wherein W_C represents the chroma width of the video unit and H_C represents the chroma height of the video unit.
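The filter of claims 5–7 can be sketched as follows. The published text omits the formula images, so the concrete choices below (Gaussian range and spatial weights with parameters sigma_d and sigma_r, a 3×3 neighbourhood, and the default strength value) are assumptions reconstructed from the worded definitions; only the roles of I_C, ΔI_k, d_k, σ and f come from the claims.

```python
import math

def bilateral_filter(pred, sigma_d=1.0, sigma_r=10.0, strength=0.25):
    """Hedged sketch of the claimed bilateral filter on prediction samples.

    pred is a list of rows of sample values. For each centre sample I_C,
    every neighbour k in a 3x3 shape contributes a weighted difference
    f(dI_k, d_k) * dI_k, where d_k is the Manhattan distance (the sum of
    vertical and horizontal distances named in the claims) and dI_k the
    sample difference; sigma_d, sigma_r and strength are illustrative
    parameter names and values, not values from the published text.
    """
    h, w = len(pred), len(pred[0])
    out = [row[:] for row in pred]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for y in range(h):
        for x in range(w):
            ic = pred[y][x]
            acc = 0.0
            for dy, dx in offsets:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    di = pred[ny][nx] - ic        # dI_k
                    d = abs(dy) + abs(dx)         # d_k, Manhattan distance
                    f = math.exp(-d * d / (2 * sigma_d ** 2)) \
                        * math.exp(-di * di / (2 * sigma_r ** 2))
                    acc += f * di
            out[y][x] = ic + strength * acc       # I_C + sigma * sum(f * dI_k)
    return out
```

Flat regions pass through unchanged (all ΔI_k are zero), small noise is pulled toward its neighbourhood, and strong edges are preserved because the range weight e^(−ΔI²/(2σ_r²)) vanishes for large differences.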
Description
Method, apparatus and medium for video processing

Technical Field

Embodiments of the present disclosure relate generally to video processing technology and, more particularly, to a bilateral filter applied to prediction samples in video coding.

Background

Today, digital video capabilities are being applied to many aspects of people's lives. Various types of video compression techniques have been proposed for video encoding/decoding, such as MPEG-2, MPEG-4, ITU-T H.263, ITU-T H.264/MPEG-4 Part 10 Advanced Video Coding (AVC), the ITU-T H.265 High Efficiency Video Coding (HEVC) standard, and the Versatile Video Coding (VVC) standard. However, the coding efficiency of video codec technology is generally expected to be further improved.

Disclosure of Invention

Embodiments of the present disclosure provide a solution for video processing.

In a first aspect, a method for video processing is presented. The method includes applying a filtering method to prediction samples associated with a video unit for conversion between the video unit and a bitstream of the video, and performing the conversion based on the filtered prediction samples. Compared with conventional solutions, the method according to the first aspect of the present disclosure advantageously improves coding efficiency and performance by applying a filtering method.

In a second aspect, an apparatus for video processing is presented. The apparatus includes a processor and a non-transitory memory having instructions thereon. The instructions, when executed by the processor, cause the processor to perform a method according to the first aspect of the present disclosure.

In a third aspect, a non-transitory computer readable storage medium is presented. The non-transitory computer readable storage medium stores instructions that cause a processor to perform a method according to the first aspect of the present disclosure.

In a fourth aspect, another non-transitory computer readable recording medium is presented.
The non-transitory computer readable recording medium stores a bitstream of a video generated by a method performed by a video processing apparatus. The method includes applying a filtering method to prediction samples associated with a video unit of a video, and generating a bitstream based on the filtered prediction samples.

In a fifth aspect, a method for storing a bitstream of a video is presented. The method includes applying a filtering method to prediction samples associated with a video unit of a video, generating a bitstream based on the filtered prediction samples, and storing the bitstream in a non-transitory computer readable recording medium.

This summary is intended to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

Drawings

The above and other objects, features and advantages of the exemplary embodiments of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings. In the example embodiments of the present disclosure, like reference numerals generally refer to like elements.

FIG. 1 illustrates a block diagram of an example video codec system according to some embodiments of the present disclosure;
FIG. 2 illustrates a block diagram of a first example video encoder according to some embodiments of the present disclosure;
FIG. 3 illustrates a block diagram of an example video decoder according to some embodiments of the present disclosure;
FIG. 4 illustrates nominal vertical and horizontal positions of 4:2:2 luma and chroma samples in a picture;
FIG. 5 shows an example of a block diagram of an encoder;
FIG. 6 shows a picture with 18×12 luma Coding Tree Units (CTUs) partitioned into 12 tiles and 3 raster-scan slices;
FIG.
7 shows a picture with 18×12 luma CTUs partitioned into 24 tiles and 9 rectangular slices;
FIG. 8 shows a picture partitioned into 4 tiles, 11 bricks and 4 rectangular slices;
FIG. 9A shows CTBs crossing the bottom boundary of a picture;
FIG. 9B shows CTBs crossing the right boundary of a picture;
FIG. 9C shows CTBs crossing the lower-right boundary of a picture;
FIG. 10 shows the 67 intra prediction modes;
FIG. 11 shows a schematic of picture samples and horizontal and vertical block boundaries on an 8×8 grid, and non-overlapping blocks of 8×8 samples;
FIG. 12 shows pixels involved in the filter on/off decision and strong/weak filter selection;
FIGS. 13A to 13C show filter shapes of an Adaptive Loop Filter (ALF), respectively;
FIGS. 14A to 14C show relative coordinates supported by a 5×5 diamond filter, respectively;
FIG. 15 shows an example of relative coordinates supported by a 5×5 diamond filter;
FIG. 16 illustrates a flow chart of a method for video processing according to an embodiment of the present disclosure;
FIG.