
US-12627806-B2 - Adaptive chroma intra mode coding in video compression

US 12627806 B2

Abstract

An example method of video decoding includes receiving a video bitstream that comprises a chroma component and a luma component. The method further includes identifying a context based on a nominal angle corresponding to the chroma component, and entropy-decoding an indicator from the video bitstream according to the context, the indicator corresponding to a delta angle for the chroma component. The method also includes decoding video data from the video bitstream using the delta angle.

Inventors

  • Liang Zhao
  • Xin Zhao
  • Shan Liu

Assignees

  • Tencent America LLC

Dates

Publication Date
2026-05-12
Application Date
2024-06-21

Claims (20)

  1. A method of video decoding, comprising: receiving a video bitstream that comprises a chroma component and a luma component; identifying a context based on a nominal angle corresponding to the luma component; entropy-decoding an indicator from the video bitstream according to the context based on the nominal angle corresponding to the luma component, the indicator indicating a delta angle for the chroma component; and decoding video data from the video bitstream by using the delta angle to decode the chroma component.
  2. The method of claim 1, wherein the context is further based on a second nominal angle for the chroma component.
  3. The method of claim 2, wherein: when the second nominal angle for the chroma component is within a threshold distance of the nominal angle for the luma component, a first context is used as the context; and when the second nominal angle for the chroma component is not within the threshold distance of the nominal angle for the luma component, a second context is used as the context.
  4. The method of claim 1, wherein the context is further based on a second delta angle for the luma component.
  5. The method of claim 1, wherein identifying the context comprises identifying a cumulative density function.
  6. The method of claim 1, wherein the chroma component and the luma component are co-located.
  7. The method of claim 1, wherein the indicator comprises an entropy encoding of the delta angle for the chroma component.
  8. The method of claim 1, wherein the context is based on a second delta angle corresponding to the luma component and whether the nominal angle for the luma component is the same as a nominal angle for the chroma component.
  9. A computing system, comprising: control circuitry; memory; and one or more sets of instructions stored in the memory and configured for execution by the control circuitry, the one or more sets of instructions comprising instructions for: receiving video data that comprises a chroma component and a luma component; identifying a context based on a nominal angle corresponding to the luma component; entropy-encoding an indicator according to the context based on the nominal angle corresponding to the luma component, the indicator indicating a delta angle for the chroma component; and signaling the entropy-encoded indicator and the chroma component encoded using the delta angle.
  10. The computing system of claim 9, wherein the context is further based on a second nominal angle for the chroma component.
  11. The computing system of claim 10, wherein: when the second nominal angle for the chroma component is within a threshold distance of the nominal angle for the luma component, a first context is used as the context; and when the second nominal angle for the chroma component is not within the threshold distance of the nominal angle for the luma component, a second context is used as the context.
  12. The computing system of claim 9, wherein the context is further based on a second delta angle for the luma component.
  13. The computing system of claim 9, wherein identifying the context comprises identifying a cumulative density function.
  14. The computing system of claim 9, wherein the chroma component and the luma component are co-located.
  15. A method of generating a video bitstream, the video bitstream comprising: a plurality of encoded pictures including a current picture comprising a chroma component and a luma component; and an entropy-encoded indicator indicating a delta angle for the chroma component; the method comprising: entropy encoding the entropy-encoded indicator according to a context based on a nominal angle corresponding to the luma component; and transmitting the video bitstream, including the plurality of encoded pictures and the entropy-encoded indicator.
  16. The method of claim 15, wherein the context is further based on a second nominal angle for the chroma component.
  17. The method of claim 16, wherein: when the second nominal angle for the chroma component is within a threshold distance of the nominal angle for the luma component, a first context is used as the context; and when the second nominal angle for the chroma component is not within the threshold distance of the nominal angle for the luma component, a second context is used as the context.
  18. The method of claim 15, further comprising identifying the context using a cumulative density function.
  19. The method of claim 15, wherein the context is further based on a second delta angle for the luma component.
  20. The method of claim 15, wherein the chroma component and the luma component are co-located.
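The two-context rule recited in claims 2, 3, 10, 11, 16, and 17 can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the absolute-difference distance measure, the threshold value, and the function name `select_context` are all assumptions made for clarity.

```python
# Hypothetical sketch of the claimed context selection: use a first context
# when the chroma nominal angle lies within a threshold distance of the
# luma nominal angle, and a second context otherwise.

NOMINAL_ANGLES = [45, 67, 90, 113, 135, 157, 180, 203]  # AV1 nominal angles, in degrees

def select_context(luma_nominal: int, chroma_nominal: int, threshold: int = 22) -> int:
    """Return 0 (first context) when the two nominal angles are within
    `threshold` degrees of each other, else 1 (second context)."""
    return 0 if abs(chroma_nominal - luma_nominal) <= threshold else 1
```

In a real entropy coder, the returned index would select which adaptive probability model (CDF) is used to code the chroma delta-angle indicator.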

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 18/119,470, filed Mar. 9, 2023, which is a continuation of U.S. patent application Ser. No. 17/459,790, filed Aug. 27, 2021, now issued as U.S. Pat. No. 11,627,322, which is a continuation of U.S. patent application Ser. No. 17/004,176, filed Aug. 27, 2020, now issued as U.S. Pat. No. 11,140,394, each of which is hereby incorporated by reference in its entirety.

FIELD

This disclosure relates generally to the field of data processing, and more particularly to video encoding and decoding.

BACKGROUND

AOMedia Video 1 (AV1) is an open video coding format designed for video transmission over the Internet. It was developed as a successor to VP9 by the Alliance for Open Media (AOMedia), a consortium founded in 2015 that includes semiconductor firms, video-on-demand providers, video content producers, software development companies, and web browser vendors. In AV1, there are a total of 56 directional angles, of which 8 are nominal angles and the remainder are specified as deltas from the nominal angles.

SUMMARY

Embodiments relate to a method, system, and computer-readable medium for encoding and/or decoding video data. According to one aspect, a method for encoding and/or decoding video data is provided. The method may include receiving video data including (1) a chroma component having a first nominal angle and a first delta angle and (2) a luma component having a second nominal angle and a second delta angle. An index associated with the first delta angle may be parsed. The video data may be encoded and/or decoded using intra prediction based on the parsed index. According to another aspect, a computer system for encoding and/or decoding video data is provided.
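The background's count of 56 directional angles follows directly from AV1's design: 8 nominal angles, each refined by a delta index in [-3, 3] applied in 3-degree steps, giving 8 × 7 = 56 distinct prediction angles. A minimal sketch (function names are illustrative, not from the patent or the AV1 reference code):

```python
# Reconstructing AV1's directional prediction angles: each of the 8 nominal
# angles is refined by a delta index in [-3, 3], scaled by a 3-degree step.

ANGLE_STEP = 3  # degrees per delta unit in AV1
NOMINAL_ANGLES = [45, 67, 90, 113, 135, 157, 180, 203]  # degrees

def prediction_angle(nominal: int, delta: int) -> int:
    """Final prediction angle for a nominal angle and a delta index."""
    assert nominal in NOMINAL_ANGLES and -3 <= delta <= 3
    return nominal + delta * ANGLE_STEP

# All (nominal, delta) combinations yield 56 distinct angles, since adjacent
# nominal angles are at least 22 degrees apart while deltas span only +/-9.
all_angles = {prediction_angle(n, d) for n in NOMINAL_ANGLES for d in range(-3, 4)}
```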
The computer system may include one or more processors, one or more computer-readable memories, one or more computer-readable tangible storage devices, and program instructions stored on at least one of the one or more storage devices for execution by at least one of the one or more processors via at least one of the one or more memories, whereby the computer system is capable of performing a method. The method may include receiving video data including (1) a chroma component having a first nominal angle and a first delta angle and (2) a luma component having a second nominal angle and a second delta angle. An index associated with the first delta angle may be parsed. The video data may be encoded and/or decoded using intra prediction based on the parsed index.

According to yet another aspect, a computer-readable medium for encoding and/or decoding video data is provided. The computer-readable medium may include one or more computer-readable storage devices and program instructions stored on at least one of the one or more tangible storage devices, the program instructions executable by a processor for performing a method that may accordingly include receiving video data including (1) a chroma component having a first nominal angle and a first delta angle and (2) a luma component having a second nominal angle and a second delta angle. An index associated with the first delta angle may be parsed. The video data may be encoded and/or decoded using intra prediction based on the parsed index.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other objects, features, and advantages will become apparent from the following detailed description of illustrative embodiments, which is to be read in connection with the accompanying drawings.
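Where the summary refers to parsing an index via context-based entropy coding, the selected context chooses a cumulative distribution function (CDF) against which the decoder recovers the symbol. The following is a minimal, non-normative sketch of only the symbol-lookup step; a real AV1 decoder operates on scaled integer CDFs inside a multi-symbol arithmetic decoder, which this deliberately omits.

```python
import bisect

def decode_symbol(cdf, value):
    """Recover a symbol index from a cumulative distribution.

    cdf: nondecreasing cumulative counts ending at the total count,
         so symbol k covers the half-open range [cdf[k-1], cdf[k]).
    value: the decoder's current position in [0, total).
    """
    return bisect.bisect_right(cdf, value)
```

For example, with `cdf = [4, 6, 7, 8]` (total 8), symbol 0 occupies [0, 4), symbol 1 occupies [4, 6), and so on; a more probable symbol covers a wider range and therefore costs fewer bits in the arithmetic coder.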
The various features of the drawings are not to scale, as the illustrations are for clarity in facilitating the understanding of one skilled in the art in conjunction with the detailed description. In the drawings:

FIG. 1 illustrates a networked computer environment according to at least one embodiment;
FIG. 2 is a diagram of the nominal angles of AV1, according to at least one embodiment;
FIGS. 3A and 3B are exemplary predefined mapping tables, according to at least one embodiment;
FIG. 4 is an operational flowchart illustrating the steps carried out by a program that codes video data, according to at least one embodiment;
FIG. 5 is a block diagram of internal and external components of the computers and servers depicted in FIG. 1, according to at least one embodiment;
FIG. 6 is a block diagram of an illustrative cloud computing environment including the computer system depicted in FIG. 1, according to at least one embodiment; and
FIG. 7 is a block diagram of functional layers of the illustrative cloud computing environment of FIG. 6, according to at least one embodiment.

DETAILED DESCRIPTION

Detailed embodiments of the claimed structures and methods are disclosed herein; however, it can be understood that the disclosed embodiments are merely illustrative of the claimed structures and methods, which may be embodied in various forms. Those structures and