JP-7855119-B2 - Decoder, Encoder, and Bitstream Generator

JP 7855119 B2

Inventors

  • 安倍 清史
  • 西 孝啓
  • 遠間 正真
  • 加納 龍一
  • 橋本 隆

Assignees

  • Panasonic Intellectual Property Corporation of America

Dates

Publication Date
2026-05-07
Application Date
2025-05-12
Priority Date
2017-04-27

Claims (3)

  1. A decoding device that decodes a decoding target block included in a decoding target picture, the decoding device comprising: a processor; and memory, wherein, using the memory, the processor: for bidirectional prediction, obtains two predicted images by interpolation to fractional-pixel precision using two reference pictures associated with the decoding target block; using a plurality of pixel values of a plurality of first pixels included in the two predicted images, obtains a plurality of horizontal gradient values each corresponding to one of a plurality of second pixels included in a sub-block into which the decoding target block is divided; derives a motion correction value of the sub-block based on the plurality of horizontal gradient values; and, in inter prediction using the plurality of horizontal gradient values, generates a final predicted image corresponding to the sub-block using the motion correction value of the sub-block; wherein the two predicted images are identified using two motion vectors; a reference range used for the interpolation is included in a normal reference range that is referenced to obtain a fractional-pixel-precision predicted image corresponding to the decoding target block in normal inter prediction that does not use the plurality of horizontal gradient values; and an 8-tap filter is used in the interpolation to fractional-pixel precision. Decoding device.
  2. An encoding device that encodes an encoding target block included in an encoding target picture, the encoding device comprising: a processor; and memory, wherein, using the memory, the processor: for bidirectional prediction, obtains two predicted images by interpolation to fractional-pixel precision using two reference pictures associated with the encoding target block; using a plurality of pixel values of a plurality of first pixels included in the two predicted images, obtains a plurality of horizontal gradient values each corresponding to one of a plurality of second pixels included in a sub-block into which the encoding target block is divided; derives a motion correction value of the sub-block based on the plurality of horizontal gradient values; and, in inter prediction using the plurality of horizontal gradient values, generates a final predicted image corresponding to the sub-block using the motion correction value of the sub-block; wherein the two predicted images are identified using two motion vectors; a reference range used for the interpolation is included in a normal reference range that is referenced to obtain a fractional-pixel-precision predicted image corresponding to the encoding target block in normal inter prediction that does not use the plurality of horizontal gradient values; and an 8-tap filter is used in the interpolation to fractional-pixel precision. Encoding device.
  3. A bitstream generator that generates a bitstream, comprising: a processor; and memory, wherein the processor, using the memory, generates a bitstream containing motion information indicating a reference picture used in inter prediction, and the inter prediction includes: for bidirectional prediction, obtaining two predicted images by interpolation to fractional-pixel precision using two reference pictures associated with an encoding target block included in an encoding target picture; using a plurality of pixel values of a plurality of first pixels included in the two predicted images, obtaining a plurality of horizontal gradient values each corresponding to one of a plurality of second pixels included in a sub-block into which the encoding target block is divided; deriving a motion correction value of the sub-block based on the plurality of horizontal gradient values; and, in inter prediction using the plurality of horizontal gradient values, generating a final predicted image corresponding to the sub-block using the motion correction value of the sub-block; wherein the two predicted images are identified using two motion vectors; a reference range used for the interpolation is included in a normal reference range that is referenced to obtain a fractional-pixel-precision predicted image corresponding to the encoding target block in normal inter prediction that does not use the plurality of horizontal gradient values; and an 8-tap filter is used in the interpolation to fractional-pixel precision. Bitstream generator.
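The claims above describe a gradient-based refinement of bidirectional prediction: two interpolated predictions are combined with a per-sub-block motion correction value derived from their horizontal gradients. The sketch below is a simplified, non-normative illustration of this kind of process (in the style of bidirectional optical flow), not the patent's actual algorithm; the gradient operator, the least-squares derivation of the correction value, and the combination formula are all illustrative assumptions.

```python
import numpy as np

def horizontal_gradient(pred):
    """Central-difference horizontal gradient of a predicted image (assumption:
    the patent's gradient filter is not specified in this excerpt)."""
    g = np.zeros_like(pred, dtype=np.float64)
    g[:, 1:-1] = (pred[:, 2:] - pred[:, :-2]) / 2.0
    return g

def refine_subblock(pred0, pred1):
    """Combine two bidirectional predictions of a sub-block using a motion
    correction value derived from their horizontal gradients."""
    g0 = horizontal_gradient(pred0)
    g1 = horizontal_gradient(pred1)
    # Least-squares estimate of a horizontal displacement vx that best
    # explains the difference between the two predictions.
    s1 = np.sum((g0 + g1) ** 2)
    s3 = np.sum((pred1 - pred0) * (g0 + g1))
    vx = s3 / s1 if s1 > 0 else 0.0
    # Final predicted image: average of the two predictions plus a
    # gradient-based correction term scaled by vx.
    out = (pred0 + pred1 + vx * (g0 - g1)) / 2.0
    return out, vx
```

Because the gradients are computed only from pixels already inside the two interpolated predictions, the reference range of this refinement stays within the normal inter-prediction reference range, which is the property the claims emphasize.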

Description

This disclosure relates to image encoding and decoding using inter prediction.

The video coding standard known as HEVC (High Efficiency Video Coding) has been standardized by JCT-VC (Joint Collaborative Team on Video Coding); see H.265 (ISO/IEC 23008-2 HEVC (High Efficiency Video Coding)).

Figure 1 is a block diagram showing the functional configuration of the encoding device according to Embodiment 1.
Figure 2 shows an example of block division in Embodiment 1.
Figure 3 is a table showing the transform basis functions corresponding to each transform type.
Figure 4A shows an example of the shape of a filter used in ALF (adaptive loop filter).
Figure 4B shows another example of the filter shape used in ALF.
Figure 4C shows another example of the filter shape used in ALF.
Figure 5A shows the 67 intra prediction modes in intra prediction.
Figure 5B is a flowchart illustrating an overview of the predicted-image correction process using OBMC processing.
Figure 5C is a conceptual diagram illustrating an overview of the predicted-image correction process using OBMC processing.
Figure 5D shows an example of FRUC.
Figure 6 illustrates pattern matching (bilateral matching) between two blocks along a motion trajectory.
Figure 7 illustrates pattern matching (template matching) between a template in the current picture and a block in a reference picture.
Figure 8 illustrates a model that assumes uniform linear motion.
Figure 9A illustrates the derivation of sub-block-level motion vectors based on the motion vectors of multiple adjacent blocks.
Figure 9B illustrates an overview of the motion vector derivation process using merge mode.
Figure 9C is a conceptual diagram illustrating an overview of DMVR processing.
Figure 9D illustrates an outline of a predicted-image generation method using luminance correction by LIC processing.
Figure 10 is a block diagram showing the functional configuration of the decoding device according to Embodiment 1.
Figure 11 is a flowchart showing inter prediction in Embodiment 2.
Figure 12 is a conceptual diagram illustrating inter prediction in Embodiment 2.
Figure 13 is a conceptual diagram illustrating an example of the reference ranges of the motion compensation filter and the gradient filter in Embodiment 2.
Figure 14 is a conceptual diagram illustrating an example of the reference range of the motion compensation filter in Modification 1 of Embodiment 2.
Figure 15 is a conceptual diagram illustrating an example of the reference range of the gradient filter in Modification 1 of Embodiment 2.
Figure 16 shows an example of a pixel pattern referenced in the derivation of local motion estimates in Modification 2 of Embodiment 2.
Figure 17 is an overall diagram of a content supply system that realizes a content distribution service.
Figure 18 shows an example of a coding structure in scalable coding.
Figure 19 shows an example of a coding structure in scalable coding.
Figure 20 shows an example of a web page display screen.
Figure 21 shows an example of a web page display screen.
Figure 22 shows an example of a smartphone.
Figure 23 is a block diagram showing an example of a smartphone configuration.

The embodiments will be described in detail below with reference to the drawings. The embodiments described below are all generic or specific examples. The numerical values, shapes, materials, components, arrangements and connection configurations of components, steps, and order of steps shown in the following embodiments are examples only and are not intended to limit the scope of the claims. Furthermore, among the components in the following embodiments, components not recited in the independent claims representing the highest-level concepts are described as optional components.
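Figures 13 to 15 concern the reference range of the motion compensation filter, and the claims recite an 8-tap filter for interpolation to fractional-pixel precision. As a rough illustration of what an 8-tap interpolation filter implies for the reference range, the sketch below applies the HEVC half-pel luma filter coefficients to a 1-D row of integer samples; these particular coefficients are an assumption borrowed from HEVC, since the patent's actual filter is not given in this excerpt. Each fractional-position sample references the 8 integer pixels from x-3 to x+4, which is what bounds the reference range.

```python
import numpy as np

# HEVC half-pel luma interpolation filter (8 taps, coefficients sum to 64).
# Assumption: used here only to illustrate an 8-tap reference range.
HALF_PEL_8TAP = np.array([-1, 4, -11, 40, 40, -11, 4, -1], dtype=np.float64)

def interpolate_half_pel_row(row):
    """Half-pel samples between integer positions of a 1-D row of pixels.
    Each output sample needs the 8 integer samples at x-3 .. x+4."""
    out = []
    for x in range(3, len(row) - 4):
        taps = np.asarray(row[x - 3 : x + 5], dtype=np.float64)
        out.append(float(np.dot(taps, HALF_PEL_8TAP)) / 64.0)
    return out
```

On a flat region the filter reproduces the constant value exactly, because the coefficients sum to 64 before the normalization by 64.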
(Embodiment 1) First, an overview of Embodiment 1 is given as an example of an encoding device and a decoding device to which the processes and/or configurations described in each aspect of this disclosure, described later, can be applied. However, Embodiment 1 is merely one example of an encoding device and a decoding device to which those processes and/or configurations can be applied, and the processes and/or configurations described in each aspect of this disclosure can also be implemented in encoding and decoding devices different from those of Embodiment 1. When applying the processes and/or configurations described in each aspect of this disclosure to Embodiment 1, for example, one of the following may be performed: (1) replacing, among the plurality of components constituting the encoding device or decoding device of Embodiment 1, the components corresponding to the components described in each aspect of this disclosure with the components described in each aspect of this disclosure; (2) replacing the encoding or decoding device of Embodiment 1 with any modificati