CN-122027753-A - Shot object edge tracing method and device, electronic equipment and medium

CN122027753A

Abstract

The application discloses a method, an apparatus, an electronic device, and a medium for tracing the edge of a shot object, in the technical field of image processing. The method comprises: obtaining a first image shot by a camera, together with the spatial position and direction information of the camera at the time of shooting, wherein the first image comprises the shot object and a virtual background; capturing, from the same position and direction in a 3D space based on that spatial position and direction information, a second image that likewise comprises the virtual background; comparing the first image with the second image and determining the pixels whose differences exceed a preset range; obtaining an edge-tracing image based on the determined pixels; and merging the edge-tracing image with the first image and storing the result. Because the comparison image is captured in 3D space from the same spatial position and direction as the camera had during shooting, the shot object is identified accurately and reliably, so its edge can be traced precisely. The edge-traced image is also marked, which simplifies subsequent foreground/background separation.

Inventors

  • XIA SIYU
  • LU QI
  • WANG XIN
  • LAN BIPU

Assignees

  • 重庆达瓦合志影像科技有限公司

Dates

Publication Date
2026-05-12
Application Date
2023-11-30

Claims (10)

  1. A method of tracing the edge of a shot object, applied in a virtual shooting system, the method comprising: acquiring a first image shot by a camera and the spatial position and direction information of the camera at the time of shooting, wherein the first image comprises a shot object and a virtual background; capturing a second image from the same position and direction in a 3D space based on the spatial position and direction information, wherein the second image likewise comprises the virtual background; comparing the first image with the second image and determining the pixels whose differences exceed a preset range; obtaining an edge-tracing image based on the determined pixels; and merging the edge-tracing image with the first image and storing the merged image.
  2. The method of claim 1, wherein comparing the first image with the second image to determine the pixels whose differences exceed a preset range comprises: converting the RGB color values of each pixel of the first image and of the second image into HSL (hue, saturation, lightness) values; comparing, pixel by pixel, the HSL values converted from the first image with those converted from the second image; for each pixel, judging whether the difference in hue exceeds a preset hue threshold, whether the difference in saturation exceeds a preset saturation threshold, and whether the difference in lightness exceeds a preset lightness threshold; and, if all three differences are below their thresholds, marking the transparency of the pixel as transparent, or, if at least one difference exceeds its threshold, marking the transparency of the pixel as opaque.
  3. The method of claim 2, wherein obtaining an edge-tracing image based on the determined pixels comprises: determining a color value c for the boundary pixels of the traced edge and a pixel thickness d for the traced line; and judging the boundary pixel by pixel according to a boundary-judging method to obtain the edge-tracing image; wherein the boundary-judging method comprises: for each pixel to be judged, acquiring the transparency of the 8 surrounding pixels located at distance d from the pixel in the 8 directions; computing the sum of the transparencies of the 8 surrounding pixels, counting a transparency marked transparent as 0 and a transparency marked opaque as 1; if the computed result is inconsistent with the transparency of the pixel being judged, judging that pixel to be a boundary; setting, on a new drawing board, the color value of each pixel position determined to be a boundary to c; setting each non-boundary pixel position to its current color; and marking the transparency of the pixels determined to be boundaries as opaque.
  4. The method of claim 3, wherein merging the edge-tracing image with the first image and storing the merged image comprises: blending the first image according to the transparency marks; placing the blended image onto the drawing board used to generate the edge-tracing image; and storing the image on the drawing board.
  5. A device for tracing the edge of a shot object, applied in a virtual shooting system, the device comprising: a first acquisition unit, configured to acquire a first image shot by a camera and the spatial position and direction information of the camera at the time of shooting, wherein the first image comprises a shot object and a virtual background; a second acquisition unit, configured to capture a second image from the same position and direction in a 3D space based on the spatial position and direction information, wherein the second image likewise comprises the virtual background; an image comparison unit, configured to compare the first image with the second image and determine the pixels whose differences exceed a preset range; an image edge-tracing unit, configured to obtain an edge-tracing image based on the determined pixels; and an image storage unit, configured to merge the edge-tracing image with the first image and store the merged image.
  6. The device of claim 5, wherein the image comparison unit is specifically configured to: convert the RGB color values of each pixel of the first image and of the second image into HSL (hue, saturation, lightness) values; compare, pixel by pixel, the HSL values converted from the first image with those converted from the second image; for each pixel, judge whether the difference in hue exceeds a preset hue threshold, whether the difference in saturation exceeds a preset saturation threshold, and whether the difference in lightness exceeds a preset lightness threshold; and, if all three differences are below their thresholds, mark the transparency of the pixel as transparent, or, if at least one difference exceeds its threshold, mark the transparency of the pixel as opaque.
  7. The device of claim 6, wherein the image edge-tracing unit is specifically configured to: determine a color value c for the boundary pixels of the traced edge and a pixel thickness d for the traced line; and judge the boundary pixel by pixel according to a boundary-judging method to obtain the edge-tracing image; wherein the boundary-judging method comprises: for each pixel to be judged, acquiring the transparency of the 8 surrounding pixels located at distance d from the pixel in the 8 directions; computing the sum of the transparencies of the 8 surrounding pixels, counting a transparency marked transparent as 0 and a transparency marked opaque as 1; if the computed result is inconsistent with the transparency of the pixel being judged, judging that pixel to be a boundary; setting, on a new drawing board, the color value of each pixel position determined to be a boundary to c; setting each non-boundary pixel position to its current color; and marking the transparency of the pixels determined to be boundaries as opaque.
  8. The device of claim 7, wherein the image storage unit is specifically configured to: blend the first image according to the transparency marks; place the blended image onto the drawing board used to generate the edge-tracing image; and store the image on the drawing board.
  9. An electronic device, comprising: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the method of any one of claims 1-4.
  10. A computer-readable storage medium having program code stored therein, the program code being callable by a processor to execute the method of any one of claims 1-4.
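The per-pixel comparison of claims 1, 2, and 6 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the images are assumed to be nested lists of 8-bit `(R, G, B)` tuples, and the threshold values and function names are the author's own assumptions.

```python
# Sketch of the HSL-based pixel comparison (claims 1-2): a pixel is marked
# opaque (1, i.e. part of the shot object) when any HSL channel differs
# from the rendered 3D-space image by more than its threshold, otherwise
# transparent (0, i.e. matching the virtual background).
# Thresholds are illustrative assumptions, not values from the patent.
import colorsys

def rgb_to_hls(pixel):
    r, g, b = (c / 255.0 for c in pixel)
    return colorsys.rgb_to_hls(r, g, b)  # returns (hue, lightness, saturation)

def transparency_mask(first, second, h_thr=0.05, s_thr=0.1, l_thr=0.1):
    """Compare two images pixel by pixel and return a 0/1 transparency mask."""
    mask = []
    for row_a, row_b in zip(first, second):
        mask_row = []
        for pa, pb in zip(row_a, row_b):
            ha, la, sa = rgb_to_hls(pa)
            hb, lb, sb = rgb_to_hls(pb)
            differs = (abs(ha - hb) > h_thr or
                       abs(sa - sb) > s_thr or
                       abs(la - lb) > l_thr)
            mask_row.append(1 if differs else 0)
        mask.append(mask_row)
    return mask
```

A camera pixel identical to the rendered background comes out transparent; a pixel where the shot object occludes the background differs in at least one channel and comes out opaque.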

Description

Shot object edge tracing method and device, electronic equipment and medium

Technical Field

The present application relates to the field of image processing technologies, and in particular to a method, an apparatus, an electronic device, and a medium for tracing the edge of a shot object.

Background

Virtual shooting is a technology based on computer graphics that allows content such as films and games to be shot and produced in a virtual scene. It helps the film, game, and related industries improve production efficiency and reduce cost, and it can also create more vivid and richer visual effects. As virtual shooting develops, under increasingly complex visual requirements and the uncertainties of shooting sites, a single shooting mode can no longer meet the needs of post-processing. Traditional LED-screen virtual shooting fuses the foreground (usually a person) well with the background, but the foreground and background cannot later be separated for independent processing. How to provide an edge-tracing method that distinguishes the shot object from the LED screen in virtual shooting is therefore a problem to be solved.

Disclosure of Invention

To solve the above problems, the application provides a method, a device, an electronic device, and a medium for tracing the edge of a shot object.
In a first aspect, the present application provides a method for tracing the edge of a shot object, applied in a virtual shooting system. The method includes: acquiring a first image shot by a camera and the spatial position and direction information of the camera at the time of shooting, wherein the first image comprises a shot object and a virtual background; capturing a second image from the same position and direction in a 3D space based on the spatial position and direction information, wherein the second image likewise comprises the virtual background; comparing the first image with the second image and determining the pixels whose differences exceed a preset range; obtaining an edge-tracing image based on the determined pixels; and merging the edge-tracing image with the first image and storing the merged image.

Optionally, comparing the first image with the second image to determine the pixels whose differences exceed a preset range specifically includes: converting the RGB color values of each pixel of the first image and of the second image into HSL (hue, saturation, lightness) values; comparing, pixel by pixel, the HSL values converted from the first image with those converted from the second image; for each pixel, judging whether the difference in hue exceeds a preset hue threshold, whether the difference in saturation exceeds a preset saturation threshold, and whether the difference in lightness exceeds a preset lightness threshold; and, if all three differences are below their thresholds, marking the transparency of the pixel as transparent, or, if at least one difference exceeds its threshold, marking the transparency of the pixel as opaque.
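The transparency mask produced by the comparison above is then scanned for boundary pixels using the 8-neighbour test of the boundary-judging method. The sketch below is one possible reading of that test, assuming the mask is a 2-D list of 0 (transparent) / 1 (opaque) values; the interpretation of "inconsistent", the handling of image borders, and all names are the author's assumptions rather than the patent's implementation.

```python
# Offsets of the 8 surrounding directions (up-left ... down-right).
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1),
              (0, -1),           (0, 1),
              (1, -1),  (1, 0),  (1, 1)]

def trace_edges(mask, d=1):
    """Return the set of (row, col) positions judged to be boundary pixels,
    where d is the pixel thickness of the traced line."""
    rows, cols = len(mask), len(mask[0])
    boundary = set()
    for row in range(rows):
        for col in range(cols):
            # Sum the transparencies of the 8 neighbours at distance d,
            # counting transparent as 0 and opaque as 1 (out-of-bounds
            # neighbours are treated as transparent here, an assumption).
            total = sum(mask[row + dy * d][col + dx * d]
                        for dy, dx in NEIGHBOURS
                        if 0 <= row + dy * d < rows
                        and 0 <= col + dx * d < cols)
            # "Inconsistent with the pixel's transparency" is read as: an
            # opaque pixel whose neighbourhood is not fully opaque, or a
            # transparent pixel with at least one opaque neighbour.
            if (mask[row][col] == 1 and total < 8) or \
               (mask[row][col] == 0 and total > 0):
                boundary.add((row, col))
    return boundary
```

Positions in the returned set would then be painted with the stroke color c on a new drawing board and marked opaque, as the boundary-judging method describes.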
Optionally, obtaining the edge-tracing image based on the determined pixels specifically includes: determining a color value c for the boundary pixels of the traced edge and a pixel thickness d for the traced line; and judging the boundary pixel by pixel according to a boundary-judging method to obtain the edge-tracing image. The boundary-judging method includes: for each pixel to be judged, acquiring the transparency of the 8 surrounding pixels located at distance d from the pixel in the 8 directions; computing the sum of the transparencies of the 8 surrounding pixels, counting a transparency marked transparent as 0 and a transparency marked opaque as 1; if the computed result is inconsistent with the transparency of the pixel being judged, judging that pixel to be a boundary; setting, on a new drawing board, the color value of each pixel position determined to be a boundary to c; setting each non-boundary pixel position to its current color; and marking the transparency of the pixels determined to be boundaries as opaque.

Optionally, merging the edge-tracing image with the first image and storing the merged image includes: blending the first image according to the transparency marks; placing the blended image onto the drawing board used to generate the edge-tracing image; and storing the image on the drawing board.

In a second aspect, the present application provides a device for tracing the edge of a shot object, the device comprising: a first acquisition unit, configured to acquire a first image shot by a camera and the spatial position and direction information of the camera at the time of shooting, wherein the first image comprises a shot object and a virtual background; a second acquisition unit
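The merge-and-store step can be reduced to painting the stroke color over the first image at the traced boundary positions. The sketch below is a hedged illustration under the same nested-list image representation as above; the function name, the default stroke color, and the omission of the actual file-writing step are all assumptions.

```python
# Sketch of the merge step (claim 4): compose the traced boundary onto the
# first (camera) image. Saving the drawing board to disk is omitted to keep
# the example self-contained.

def merge_stroke(first, boundary, color=(255, 0, 0)):
    """Return a copy of the first image with each boundary position
    overwritten by the stroke color; the input image is left untouched."""
    merged = [list(row) for row in first]
    for row, col in boundary:
        merged[row][col] = color
    return merged
```

In a real pipeline the `boundary` set would come from the boundary-judging step, and the merged drawing board would then be stored as the patent describes.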