US-12620054-B2 - Method and apparatus for image processing


Abstract

A method and apparatus for image processing. The method includes: in response to a first trigger operation from a user, obtaining screen coordinates of the first trigger operation; obtaining, for each first image obtained after the first trigger operation, coordinates of a trigger pixel on the first image based on the screen coordinates of the first trigger operation; and performing, based on the coordinates of the trigger pixel, effect processing on the first image to obtain an image with a magnifying glass effect.

Inventors

  • Jinyuan Wu
  • Yuanhuang Zhang
  • Yi Guo

Assignees

  • BEIJING ZITIAO NETWORK TECHNOLOGY CO., LTD.

Dates

Publication Date
2026-05-05
Application Date
2022-05-16
Priority Date
2021-07-30

Claims (18)

  1. A computer-implemented method for image processing, comprising: in response to a first trigger operation from a user, obtaining screen coordinates of the first trigger operation; obtaining, for each first image obtained after the first trigger operation, coordinates of a trigger pixel on the first image based on the screen coordinates of the first trigger operation; and performing, based on the coordinates of the trigger pixel, effect processing on the first image to obtain an image with a magnifying glass effect, comprising: obtaining a processing parameter corresponding to the first image based on a frame number of the first image and a first mapping, the processing parameter comprising at least one of a chromatic aberration intensity coefficient, a distortion coefficient, a scaling coefficient, and a blur coefficient, the first mapping being used for indicating a correspondence between the frame number and the processing parameter; and performing, based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image, the effect processing on the first image to obtain the image with the magnifying glass effect, wherein the effect processing corresponds to the processing parameter and comprises at least one of radial chromatic aberration processing, distortion processing, scaling processing, or radial blur processing.
  2. The method of claim 1, wherein the processing parameter comprises the chromatic aberration intensity coefficient, and the effect processing comprises the radial chromatic aberration processing, and wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: obtaining, for each pixel on the first image, a sum of color values of a plurality of sampling points corresponding to the pixel in each color channel based on the coordinates of the trigger pixel, coordinates of the pixel, a number of the sampling points, a step coefficient, an intensity coefficient corresponding to the color channel, a weight coefficient, a texture of the first image, and the chromatic aberration intensity coefficient; and determining a color value of the pixel in each color channel based on the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel.
  3. The method of claim 2, wherein obtaining the sum of color values of the plurality of sampling points corresponding to the pixel in each color channel based on the coordinates of the trigger pixel, the coordinates of the pixel, the number of the sampling points, the step coefficient, the intensity coefficient corresponding to the color channel, the weight coefficient, the texture of the first image, and the chromatic aberration intensity coefficient comprises: determining a direction from the trigger pixel to the pixel based on the coordinates of the trigger pixel and the coordinates of the pixel; determining a sampling step based on the direction from the trigger pixel to the pixel, the step coefficient, and the number of the sampling points; determining, for each color channel of RGB channels, an offset corresponding to the color channel based on the direction from the trigger pixel to the pixel, the step coefficient, the number of the sampling points, the chromatic aberration intensity coefficient, and the intensity coefficient corresponding to the color channel; and determining, for each color channel of the RGB channels, the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel based on the texture of the first image, the coordinates of the pixel, the offset corresponding to the color channel, the sampling step, the number of the sampling points, and the weight coefficient.
  4. The method of claim 2, wherein determining the color value of the pixel in each color channel based on the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel comprises: for each color channel of RGB channels, dividing the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel by the number of the sampling points to obtain the color value of the pixel in the color channel.
  5. The method of claim 1, wherein the processing parameter comprises the distortion coefficient, and the effect processing comprises the distortion processing, and wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: obtaining a distortion function based on the distortion coefficient; determining, for each pixel on the first image, a pixel before distortion on the first image based on the coordinates of the trigger pixel, the coordinates of the pixel, a distance from the trigger pixel to the pixel and the distortion function, the pixel before distortion corresponding to the pixel; and determining a color value of the pixel before distortion as the color value of the pixel.
  6. The method of claim 1, wherein the processing parameter comprises the scaling coefficient, and the effect processing comprises the scaling processing, and wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: determining scaled vertex coordinates based on the coordinates of the trigger pixel, current vertex coordinates of a quadrilateral model and the scaling coefficient, the quadrilateral model being used for changing a display size of the image; updating the vertex coordinates of the quadrilateral model to the scaled vertex coordinates; and mapping the first image to the quadrilateral model to obtain the image with the magnifying glass effect.
  7. The method of claim 1, wherein the processing parameter comprises the blur coefficient, and the effect processing comprises the radial blur processing, and wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: obtaining, for each pixel on the first image, a sum of color values of a plurality of sampling points corresponding to the pixel based on the coordinates of the trigger pixel, coordinates of the pixel, a number of the sampling points, a texture of the first image, and the blur coefficient; and obtaining a color value of the pixel based on the sum of the color values of the plurality of sampling points corresponding to the pixel.
  8. The method of claim 7, wherein obtaining the sum of color values of the plurality of sampling points corresponding to the pixel based on the coordinates of the trigger pixel, the coordinates of the pixel, the number of the sampling points, the texture of the first image, and the blur coefficient comprises: determining a direction from the trigger pixel to the pixel based on the coordinates of the trigger pixel and the coordinates of the pixel; and determining the sum of the color values of the plurality of sampling points corresponding to the pixel based on the coordinates of the pixel, the number of the sampling points, the blur coefficient, the texture of the first image and the direction from the trigger pixel to the pixel.
  9. The method of claim 7, wherein obtaining the color value of the pixel based on the sum of the color values of the plurality of sampling points corresponding to the pixel comprises: dividing the sum of the color values of the plurality of sampling points corresponding to the pixel by the number of the sampling points to obtain the color value of the pixel.
  10. The method of claim 1, wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: performing, based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image, the radial chromatic aberration processing, the distortion processing, the scaling processing, and the radial blur processing in sequence on the first image.
  11. The method of claim 1, wherein in the first mapping, the frame number is positively correlated with the chromatic aberration intensity coefficient, the scaling coefficient and the blur coefficient respectively, and is negatively correlated with the distortion coefficient.
  12. The method of claim 11, wherein the method further comprises: in response to a second trigger operation from the user, obtaining screen coordinates of the second trigger operation; obtaining, for each second image obtained after the second trigger operation, coordinates of a trigger pixel on the second image based on the screen coordinates of the second trigger operation; obtaining a processing parameter corresponding to the second image based on a frame number of the second image and a second mapping; and performing, based on the coordinates of the trigger pixel on the second image and the processing parameter corresponding to the second image, effect processing on the second image, wherein in the second mapping, the frame number is negatively correlated with the chromatic aberration intensity coefficient, the scaling coefficient and the blur coefficient respectively, and is positively correlated with the distortion coefficient.
  13. A non-transitory computer readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, causes the processor to carry out a method comprising: in response to a first trigger operation from a user, obtaining screen coordinates of the first trigger operation; obtaining, for each first image obtained after the first trigger operation, coordinates of a trigger pixel on the first image based on the screen coordinates of the first trigger operation; and performing, based on the coordinates of the trigger pixel, effect processing on the first image to obtain an image with a magnifying glass effect, comprising: obtaining a processing parameter corresponding to the first image based on a frame number of the first image and a first mapping, the processing parameter comprising at least one of a chromatic aberration intensity coefficient, a distortion coefficient, a scaling coefficient, and a blur coefficient, the first mapping being used for indicating a correspondence between the frame number and the processing parameter; and performing, based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image, the effect processing on the first image to obtain the image with the magnifying glass effect, wherein the effect processing corresponds to the processing parameter and comprises at least one of radial chromatic aberration processing, distortion processing, scaling processing, or radial blur processing.
  14. A terminal device comprising: a processor; and a non-transitory memory for storing instructions executable by the processor; wherein the processor is configured to carry out a method comprising: in response to a first trigger operation from a user, obtaining screen coordinates of the first trigger operation; obtaining, for each first image obtained after the first trigger operation, coordinates of a trigger pixel on the first image based on the screen coordinates of the first trigger operation; and performing, based on the coordinates of the trigger pixel, effect processing on the first image to obtain an image with a magnifying glass effect, comprising: obtaining a processing parameter corresponding to the first image based on a frame number of the first image and a first mapping, the processing parameter comprising at least one of a chromatic aberration intensity coefficient, a distortion coefficient, a scaling coefficient, and a blur coefficient, the first mapping being used for indicating a correspondence between the frame number and the processing parameter; and performing, based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image, the effect processing on the first image to obtain the image with the magnifying glass effect, wherein the effect processing corresponds to the processing parameter and comprises at least one of radial chromatic aberration processing, distortion processing, scaling processing, or radial blur processing.
  15. The terminal device of claim 14, wherein the processing parameter comprises the chromatic aberration intensity coefficient, and the effect processing comprises the radial chromatic aberration processing, and wherein the processor is configured to perform the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image by: obtaining, for each pixel on the first image, a sum of color values of a plurality of sampling points corresponding to the pixel in each color channel based on the coordinates of the trigger pixel, coordinates of the pixel, a number of the sampling points, a step coefficient, an intensity coefficient corresponding to the color channel, a weight coefficient, a texture of the first image, and the chromatic aberration intensity coefficient; and determining a color value of the pixel in each color channel based on the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel.
  16. The terminal device of claim 14, wherein the processing parameter comprises the distortion coefficient, and the effect processing comprises the distortion processing, and wherein the processor is configured to perform the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image by: obtaining a distortion function based on the distortion coefficient; determining, for each pixel on the first image, a pixel before distortion on the first image based on the coordinates of the trigger pixel, the coordinates of the pixel, a distance from the trigger pixel to the pixel and the distortion function, the pixel before distortion corresponding to the pixel; and determining a color value of the pixel before distortion as the color value of the pixel.
  17. The terminal device of claim 14, wherein the processing parameter comprises the scaling coefficient, and the effect processing comprises the scaling processing, and wherein the processor is configured to perform the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image by: determining scaled vertex coordinates based on the coordinates of the trigger pixel, current vertex coordinates of a quadrilateral model and the scaling coefficient, the quadrilateral model being used for changing a display size of the image; updating the vertex coordinates of the quadrilateral model to the scaled vertex coordinates; and mapping the first image to the quadrilateral model to obtain the image with the magnifying glass effect.
  18. The terminal device of claim 14, wherein the processing parameter comprises the blur coefficient, and the effect processing comprises the radial blur processing, and wherein the processor is configured to perform the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image by: obtaining, for each pixel on the first image, a sum of color values of a plurality of sampling points corresponding to the pixel based on the coordinates of the trigger pixel, coordinates of the pixel, a number of the sampling points, a texture of the first image, and the blur coefficient; and obtaining a color value of the pixel based on the sum of the color values of the plurality of sampling points corresponding to the pixel.
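The radial chromatic aberration pass described in claims 2-4 can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation: the function name, the parameter names (`step_coeff`, `channel_intensity`, `weight`), the per-channel intensity values, and the nearest-neighbour texture fetch are all assumptions; the claims only fix which inputs each step depends on.

```python
import numpy as np

def radial_chromatic_aberration(img, trigger_uv, num_samples=8,
                                step_coeff=0.02, chroma_intensity=1.0,
                                channel_intensity=(0.0, 0.5, 1.0),
                                weight=1.0):
    """Illustrative radial chromatic aberration (claims 2-4).

    img        : float array (H, W, 3) with values in [0, 1] (the "texture")
    trigger_uv : (u, v) of the trigger pixel in [0, 1] texture coordinates
    """
    h, w, _ = img.shape
    # UV coordinates of every pixel
    v, u = np.meshgrid(np.linspace(0, 1, h), np.linspace(0, 1, w), indexing="ij")
    uv = np.stack([u, v], axis=-1)                       # (H, W, 2)
    direction = uv - np.asarray(trigger_uv)              # trigger pixel -> pixel
    # sampling step along the radial direction (claim 3)
    step = direction * step_coeff / num_samples
    out = np.zeros_like(img)
    for c in range(3):
        # per-channel offset: channels with larger intensity shift further
        offset = direction * step_coeff * chroma_intensity * channel_intensity[c]
        acc = np.zeros((h, w), dtype=img.dtype)
        for i in range(num_samples):
            sample_uv = uv + offset + step * i
            # nearest-neighbour fetch with clamped coordinates
            px = np.clip((sample_uv[..., 0] * (w - 1)).astype(int), 0, w - 1)
            py = np.clip((sample_uv[..., 1] * (h - 1)).astype(int), 0, h - 1)
            acc += weight * img[py, px, c]
        # claim 4: divide the sampled sum by the number of samples
        out[..., c] = acc / num_samples
    return np.clip(out, 0.0, 1.0)
```

Because the red, green and blue channels are offset by different amounts along the same radial direction, pixels far from the trigger point acquire the colored fringes characteristic of a cheap magnifying lens.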
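The distortion pass of claims 5 and 16 is an inverse mapping: each output pixel looks up the color of a "pixel before distortion". The claims do not disclose the distortion function itself, so the Gaussian falloff below (and the `0.3` radius constant) is purely an assumed example of a function derived from the distortion coefficient `k`.

```python
import numpy as np

def magnify_distortion(img, trigger_uv, k=0.5):
    """Illustrative distortion pass (claim 5): inverse-map each pixel to a
    source pixel whose distance to the trigger pixel is shrunk near the
    trigger point, so that region appears magnified."""
    h, w, _ = img.shape
    v, u = np.meshgrid(np.linspace(0, 1, h), np.linspace(0, 1, w), indexing="ij")
    uv = np.stack([u, v], axis=-1)
    offset = uv - np.asarray(trigger_uv)
    # distance from the trigger pixel to each pixel (claim 5 input)
    r = np.linalg.norm(offset, axis=-1, keepdims=True)
    # assumed distortion function: sampling radius shrinks near the trigger
    scale = 1.0 - k * np.exp(-((r / 0.3) ** 2))
    src_uv = np.asarray(trigger_uv) + offset * scale
    px = np.clip((src_uv[..., 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((src_uv[..., 1] * (h - 1)).astype(int), 0, h - 1)
    # color of the pixel before distortion becomes the output color
    return img[py, px]
```

Any radially monotone function of `r` and `k` would satisfy the claim language; the inverse-mapping structure (output pixel pulls from source pixel) is what avoids holes in the distorted image.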
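The scaling pass of claims 6 and 17 rescales the vertices of a quadrilateral model about the trigger pixel before the image is mapped onto it. The claims only require that the scaled vertices depend on the trigger coordinates, the current vertices, and the scaling coefficient; the uniform scale-about-a-point rule below is an assumed concrete instance.

```python
import numpy as np

def scale_quad_vertices(vertices, trigger, scaling_coeff):
    """Illustrative vertex update (claims 6/17): scale each vertex of the
    quadrilateral model about the trigger point. A coefficient > 1 pushes
    the vertices outward, so the texture mapped onto the quad is enlarged
    around the trigger pixel."""
    vertices = np.asarray(vertices, dtype=np.float64)
    trigger = np.asarray(trigger, dtype=np.float64)
    return trigger + (vertices - trigger) * scaling_coeff
```

After the update, re-rendering the first image onto the enlarged quad yields the magnified view; the trigger point itself is the fixed point of the transform, so the magnifier stays centered on the user's touch.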
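The radial blur pass of claims 7-9 averages several samples taken along the line from the trigger pixel to each pixel. The sketch below follows that structure; the sampling schedule (how far each sample is pulled toward the trigger point) is an assumption, since the claims fix only the inputs and the final division by the sample count.

```python
import numpy as np

def radial_blur(img, trigger_uv, num_samples=8, blur_coeff=0.1):
    """Illustrative radial blur (claims 7-9): average samples along the
    direction from the trigger pixel to each pixel, then divide the sum
    by the number of samples (claim 9)."""
    h, w, _ = img.shape
    v, u = np.meshgrid(np.linspace(0, 1, h), np.linspace(0, 1, w), indexing="ij")
    uv = np.stack([u, v], axis=-1)
    direction = uv - np.asarray(trigger_uv)        # trigger pixel -> pixel (claim 8)
    acc = np.zeros_like(img)
    for i in range(num_samples):
        # assumed schedule: pull successive samples slightly toward the trigger
        t = 1.0 - blur_coeff * i / num_samples
        sample_uv = np.asarray(trigger_uv) + direction * t
        px = np.clip((sample_uv[..., 0] * (w - 1)).astype(int), 0, w - 1)
        py = np.clip((sample_uv[..., 1] * (h - 1)).astype(int), 0, h - 1)
        acc += img[py, px]
    return acc / num_samples
```

The blur strength grows with distance from the trigger pixel (the samples spread further apart there), which matches the streaked, motion-like look of a radial blur and leaves the magnified center sharp.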

Description

CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit of Chinese Patent Application No. 202110875574.3, filed on Jul. 30, 2021 and entitled “METHOD AND APPARATUS FOR IMAGE PROCESSING”, which is hereby incorporated by reference in its entirety.

FIELD

The present disclosure relates to the field of image processing and, in particular, to a method and apparatus for image processing.

BACKGROUND

With the development of software development technology, there are more and more types of applications (apps) on mobile terminals. Among them, video apps are popular with the public. Users can not only browse videos through video apps, but also create and post videos themselves. Users may add effects to videos, thereby enhancing user engagement. However, the effects currently available are not diverse enough to meet user needs.

SUMMARY

The present disclosure provides a method and apparatus for image processing. In a first aspect, the present disclosure provides a method for image processing, comprising: in response to a first trigger operation from a user, obtaining screen coordinates of the first trigger operation; obtaining, for each first image obtained after the first trigger operation, coordinates of a trigger pixel on the first image based on the screen coordinates of the first trigger operation; and performing, based on the coordinates of the trigger pixel, effect processing on the first image to obtain an image with a magnifying glass effect.
Optionally, performing the effect processing on the first image based on the coordinates of the trigger pixel to obtain the image with the magnifying glass effect comprises: obtaining a processing parameter corresponding to the first image based on a frame number of the first image and a first mapping, the processing parameter comprising at least one of a chromatic aberration intensity coefficient, a distortion coefficient, a scaling coefficient, and a blur coefficient, the first mapping being used for indicating a correspondence between the frame number and the processing parameter; and performing, based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image, the effect processing on the first image to obtain the image with the magnifying glass effect, wherein the effect processing corresponds to the processing parameter and comprises at least one of radial chromatic aberration processing, distortion processing, scaling processing, or radial blur processing.
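The first mapping above only needs to turn a frame number into a set of processing parameters with the correlations stated in claim 11 (chromatic aberration, scaling and blur grow with the frame number; distortion shrinks). A minimal sketch, in which the linear ramps, the 30-frame duration, and every numeric range are assumptions:

```python
def first_mapping(frame, total_frames=30):
    """Illustrative 'first mapping' from frame number to processing
    parameters. Only the signs of the correlations come from claim 11;
    the concrete curves and ranges here are assumed for illustration."""
    t = min(frame / total_frames, 1.0)  # normalized animation progress
    return {
        "chromatic_aberration": 1.0 * t,         # positively correlated
        "scaling": 1.0 + 0.5 * t,                # positively correlated
        "blur": 0.2 * t,                         # positively correlated
        "distortion": 0.8 * (1.0 - t),           # negatively correlated
    }
```

Claim 12's "second mapping" would simply reverse these correlations, producing the zoom-out half of the animation when the user triggers the effect a second time.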
Optionally, the processing parameter comprises the chromatic aberration intensity coefficient, and the effect processing comprises the radial chromatic aberration processing, and wherein performing the effect processing on the first image based on the coordinates of the trigger pixel and the processing parameter corresponding to the first image comprises: obtaining, for each pixel on the first image, a sum of color values of a plurality of sampling points corresponding to the pixel in each color channel based on the coordinates of the trigger pixel, coordinates of the pixel, a number of the sampling points, a step coefficient, an intensity coefficient corresponding to the color channel, a weight coefficient, a texture of the first image, and the chromatic aberration intensity coefficient; and determining a color value of the pixel in each color channel based on the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel.

Optionally, obtaining the sum of color values of the plurality of sampling points corresponding to the pixel in each color channel based on the coordinates of the trigger pixel, the coordinates of the pixel, the number of the sampling points, the step coefficient, the intensity coefficient corresponding to the color channel, the weight coefficient, the texture of the first image, and the chromatic aberration intensity coefficient comprises: determining a direction from the trigger pixel to the pixel based on the coordinates of the trigger pixel and the coordinates of the pixel; determining a sampling step based on the direction from the trigger pixel to the pixel, the step coefficient, and the number of the sampling points; determining, for each color channel of RGB channels, an offset corresponding to the color channel based on the direction from the trigger pixel to the pixel, the step coefficient, the number of the sampling points, the chromatic aberration intensity coefficient, and the intensity coefficient corresponding to the color channel; and determining, for each color channel of the RGB channels, the sum of the color values of the plurality of sampling points corresponding to the pixel in the color channel based on the texture of the first image, the coordinates of the pixel, the offset corresponding to the color channel, the sampling step, the number of the sampling points, and the weight coefficient.

Optionally, determining the color value of the pixel in each c