KR-20260066091-A - Mesh data transmission device, mesh data transmission method, mesh data reception device and mesh data reception method
Abstract
A mesh data decoding method according to embodiments may include: receiving a base mesh bitstream, a displacement vector bitstream, a texture map bitstream, and signaling information; a base mesh processing step of restoring a base mesh from the base mesh bitstream; a displacement information processing step of restoring displacement information from the displacement vector bitstream; a restoration step of restoring a mesh based on the base mesh and the displacement information; and a texture map processing step of restoring texture maps from the texture map bitstream.
Inventors
- Daehyun Kim (김대현)
- Juhyung Byun (변주형)
- Naseong Kwon (권나성)
- Donggyu Sim (심동규)
Assignees
- LG Electronics Inc. (엘지전자 주식회사)
Dates
- Publication Date: 2026-05-12
- Application Date: 2024-08-30
- Priority Date: 2023-08-30
Claims (15)
- A mesh data decoding method comprising: receiving a base mesh bitstream, a displacement vector bitstream, a texture map bitstream, and signaling information; a base mesh processing step of restoring a base mesh from the base mesh bitstream; a displacement information processing step of restoring displacement information from the displacement vector bitstream; a restoration step of restoring a mesh based on the base mesh and the displacement information; and a texture map processing step of restoring texture maps from the texture map bitstream.
- The mesh data decoding method of claim 1, wherein the texture map processing step comprises: restoring texture maps by decoding the texture map bitstream based on the signaling information; determining, based on the signaling information, whether at least one texture map is omitted from the texture map bitstream; and generating, when it is determined that at least one texture map is omitted, the at least one omitted texture map based on the signaling information and at least one reference frame.
- The mesh data decoding method of claim 2, wherein the omission unit of the at least one texture map is at least one of a frame unit, a patch unit, a sub-mesh unit, a tile unit, and a slice unit.
- The mesh data decoding method of claim 2, wherein the signaling information comprises information identifying whether the at least one texture map is omitted and reference texture map information associated with the at least one reference frame.
- A mesh data decoding device comprising: a receiver that receives a base mesh bitstream, a displacement vector bitstream, a texture map bitstream, and signaling information; a base mesh processing unit that restores a base mesh from the base mesh bitstream; a displacement information processing unit that restores displacement information from the displacement vector bitstream; a restoration unit that restores a mesh based on the base mesh and the displacement information; and a texture map processing unit that restores a texture map from the texture map bitstream.
- The mesh data decoding device of claim 5, wherein the texture map processing unit decodes the texture map bitstream based on the signaling information to restore texture maps, checks, based on the signaling information, whether at least one texture map is omitted from the texture map bitstream, and, when it is determined that at least one texture map is omitted, generates the at least one omitted texture map based on the signaling information and at least one reference frame.
- The mesh data decoding device of claim 6, wherein the omission unit of the at least one texture map is at least one of a frame unit, a patch unit, a sub-mesh unit, a tile unit, and a slice unit.
- The mesh data decoding device of claim 6, wherein the signaling information comprises information identifying whether the at least one texture map is omitted and reference texture map information associated with the at least one reference frame.
- A mesh data encoding method comprising: encoding an original mesh; and transmitting a bitstream containing the encoded mesh and signaling information.
- The mesh data encoding method of claim 9, wherein the encoding comprises: a base mesh processing step of generating a base mesh bitstream by encoding a base mesh generated by simplifying the original mesh; a displacement information processing step of generating a displacement vector bitstream by encoding displacement information generated based on the base mesh; a mesh restoration step of restoring a mesh based on the encoded base mesh and the encoded displacement information; and a texture map processing step of determining whether to omit at least one texture map among texture maps generated based on the original mesh and the restored mesh, and encoding the texture maps that are not omitted to generate a texture map bitstream.
- The mesh data encoding method of claim 10, wherein the texture map processing step determines whether to omit a current texture map by comparing the similarity between the current texture map and a reference texture map before encoding the generated texture maps.
- The mesh data encoding method of claim 11, wherein the texture map processing step omits the coding and transmission of the current texture map if the difference between a peak signal-to-noise ratio (PSNR) value calculated from the current texture map and a PSNR value calculated from the reference texture map is smaller than a preset threshold value (see the sketch following the claims).
- The mesh data encoding method of claim 10, wherein the omission unit of the at least one texture map is at least one of a frame unit, a patch unit, a sub-mesh unit, a tile unit, and a slice unit.
- The mesh data encoding method of claim 13, wherein the signaling information comprises information identifying whether the at least one texture map is omitted and reference texture map information associated with at least one reference frame for generating the omitted texture map.
- The mesh data encoding method of claim 14, wherein the reference texture map information identifies the direction of the at least one reference frame.
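Claims 11 and 12 decide whether to skip coding of the current texture map by comparing PSNR values for the current and reference texture maps against a preset threshold. Below is a minimal sketch of that decision, assuming texture maps held as NumPy uint8 arrays; the `original` image both PSNRs are measured against and the `threshold_db` default are illustrative assumptions, not elements defined by the claims.

```python
import numpy as np

def psnr(img: np.ndarray, ref: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between two same-sized images."""
    mse = np.mean((img.astype(np.float64) - ref.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)

def should_skip_texture_map(current: np.ndarray,
                            reference: np.ndarray,
                            original: np.ndarray,
                            threshold_db: float = 0.5) -> bool:
    # Claim 12: omit coding/transmission of the current texture map when the
    # PSNR calculated for the current map and the PSNR calculated for the
    # reference map differ by less than a preset threshold.
    # `original` (the common signal both PSNRs are measured against) and
    # `threshold_db` are assumptions made for this illustration.
    return abs(psnr(current, original) - psnr(reference, original)) < threshold_db
```

When the decision returns True, only the omission flag and the reference texture map information would need to be signaled, allowing the decoder to regenerate the omitted map from the referenced frame.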
Description
Mesh data transmission device, mesh data transmission method, mesh data reception device and mesh data reception method

The embodiments provide methods for delivering 3D content in order to offer users various services such as VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), and autonomous driving. Among 3D content, point cloud data and mesh data are sets of points in 3D space. Because such data contains a very large number of points, it is difficult to generate, and a large amount of throughput is required to transmit and receive it.

The drawings are included to aid understanding of the embodiments and illustrate the embodiments together with the related description. For a better understanding of the various embodiments described below, reference should be made to the following drawings, in which like reference numerals denote corresponding parts throughout.

FIG. 1 shows a system for providing dynamic mesh content according to embodiments.
FIG. 2 shows a V-MESH compression method according to embodiments.
FIG. 3 shows the pre-processing of V-MESH compression according to embodiments.
FIG. 4 illustrates a mid-edge subdivision method according to embodiments.
FIG. 5 illustrates a displacement generation process according to embodiments.
FIG. 6 illustrates the intra-frame encoding process of V-MESH data according to embodiments.
FIG. 7 illustrates the inter-frame encoding process of V-MESH data according to embodiments.
FIG. 8 illustrates a lifting transform process for displacements according to embodiments.
FIG. 9 illustrates the process of packing transform coefficients into a 2D image according to embodiments.
FIG. 10 illustrates the attribute transfer process of the V-MESH compression method according to embodiments.
FIG. 11 illustrates the intra-frame decoding process of V-MESH data according to embodiments.
FIG. 12 illustrates the inter-frame decoding process of V-MESH data according to embodiments.
FIG. 13 is a drawing showing an example of a transmitting device according to embodiments.
FIG. 14 is a drawing showing an example of a receiving device according to embodiments.
FIG. 15 is a drawing showing another example of a transmitting device according to embodiments.
FIG. 16 is a flowchart showing an example of a method for determining whether to omit texture map coding according to embodiments.
FIG. 17 is a diagram showing an example of a method for determining whether to omit a texture map using a texture map maximum distance parameter according to embodiments.
FIG. 18 is a diagram showing an example of signaling whether a texture map is omitted and a reference frame index in sub-mesh units according to embodiments.
FIG. 19 is a diagram showing an example of the frame relationship of a referenced texture map when texture map coding is omitted in sub-mesh units according to embodiments.
FIG. 20 is a drawing showing another example of a receiving device according to embodiments.
FIG. 21 is a drawing showing an example of a detailed block diagram of a texture map decoder according to embodiments.
FIG. 22 is a drawing showing an example of deriving a texture map omission flag equal to 0 according to embodiments.
FIG. 23 is a flowchart showing an example of deriving a texture map omission flag in a texture map omission flag determination unit according to embodiments.
FIG. 24 is a diagram showing an example of signaling whether a texture map is omitted and a reference frame index in sub-mesh units according to embodiments.
FIG. 25 is a diagram showing an example of the frame relationship of a referenced texture map when texture map coding is omitted in sub-mesh units according to embodiments.
FIG. 26 is a diagram showing an example of the syntax structure of an atlas frame parameter set (AFPS) according to embodiments.
FIG. 27 is a table showing examples of texture map omission and the reference direction according to afps_texture_skip_refDirection_idx according to embodiments.
FIG. 28 is a drawing showing an example of the syntax structure of an atlas tile header according to embodiments.
FIG. 29 is a drawing showing an example of the syntax structure of an atlas tile data unit according to embodiments.
FIG. 30 is a table showing examples of determining patch modes according to an identifier when the coding type of the current atlas tile according to embodiments is I_TILE.
FIG. 31 is a table showing examples of determining patch modes according to an identifier when the coding type of the current atlas tile according to embodiments is P_TILE.
FIG. 32 is a drawing showing an example of the syntax structure of a submesh header according to embodiments.
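As context for FIGS. 4 and 5: subdivision refines the base mesh by inserting the midpoint of every edge, after which the decoded displacement vectors move the subdivided vertices back toward the original surface. A minimal sketch follows, assuming vertices as an (N, 3) NumPy array and faces as vertex-index triples; a common 1-to-4 midpoint scheme and the function names are assumptions for illustration, and the precise connectivity rule is defined by the embodiments.

```python
import numpy as np

def mid_edge_subdivide(vertices: np.ndarray, faces: np.ndarray):
    """One subdivision level: insert the midpoint of every edge and split
    each triangle (a, b, c) into four smaller triangles."""
    new_vertices = [v for v in vertices]      # copy base-mesh vertices
    midpoint_of = {}                          # edge (i, j), i < j -> new index

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_of:
            midpoint_of[key] = len(new_vertices)
            new_vertices.append((vertices[i] + vertices[j]) / 2.0)
        return midpoint_of[key]

    new_faces = []
    for a, b, c in faces:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_faces += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return np.asarray(new_vertices), np.asarray(new_faces)

def apply_displacements(subdivided_vertices: np.ndarray,
                        displacements: np.ndarray) -> np.ndarray:
    # FIG. 5 context: one decoded displacement vector per subdivided vertex
    # moves the smooth subdivided surface back toward the original mesh.
    return subdivided_vertices + displacements
```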
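FIGS. 21 through 25 concern the receiver side: when the signaling (an omission flag plus reference frame information, carried per frame, patch, sub-mesh, tile, or slice) indicates that a texture map was omitted, the decoder substitutes the texture map of the referenced frame instead of decoding one. The sketch below is a minimal illustration under those assumptions; the type and parameter names are hypothetical, and the actual syntax elements (e.g., in the AFPS of FIG. 26) also signal a reference direction, which is simplified here to a plain list index.

```python
from dataclasses import dataclass

@dataclass
class SubmeshTextureSignaling:
    # Hypothetical view of per-sub-mesh signaling: an omission flag plus the
    # index of the previously decoded frame whose texture map is reused.
    texture_skip_flag: bool
    ref_frame_idx: int = 0

def reconstruct_texture_map(sig: SubmeshTextureSignaling,
                            coded_map: bytes,
                            decoded_frames: list,
                            decode_fn):
    """Return the texture map for the current sub-mesh."""
    if sig.texture_skip_flag:
        # Texture map omitted at the encoder: reuse the referenced frame's
        # already reconstructed texture map.
        return decoded_frames[sig.ref_frame_idx]
    # Otherwise decode the texture map from the texture map bitstream.
    return decode_fn(coded_map)
```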