KR-20260062326-A - METHOD AND APPARATUS FOR PROCESSING IMAGES


Abstract

The present disclosure relates to a technology for processing images and provides an image processing device comprising an object identification unit that distinguishes and identifies a plurality of objects included in an image, and an object selection unit that filters the plurality of objects based on area information, border information, and curvature information for each of the plurality of objects to select misrecognition target objects corresponding to preset misrecognition criteria.

Inventors

  • 임희주
  • 이창환
  • 최호
  • 예인수

Assignees

  • 포스코홀딩스 주식회사 (POSCO Holdings Inc.)

Dates

Publication Date
2026-05-07
Application Date
2024-10-29

Claims (20)

  1. An image processing device comprising: an object identification unit that distinguishes and identifies a plurality of objects included in an image; and an object selection unit that filters the plurality of objects based on area information, border information, and curvature information for each of the plurality of objects, and selects a misrecognition target object corresponding to a preset misrecognition criterion.
  2. The image processing device of claim 1, wherein the object selection unit selects a first candidate object based on the area information, selects a second candidate object using the border information of the first candidate object, and selects the misrecognition target object using the curvature information of the second candidate object.
  3. The image processing device of claim 2, wherein the object selection unit calculates the area information for each of the plurality of objects through preset different algorithms, and selects the first candidate object based on an area comparison result obtained by comparing the calculated area information.
  4. The image processing device of claim 3, wherein the first candidate object is an object, among the plurality of objects, for which the area comparison result is calculated to be greater than or equal to a preset value.
  5. The image processing device of claim 3, wherein the preset different algorithms include a ConvexHull algorithm and a ConcaveHull algorithm.
  6. The image processing device of claim 5, wherein the first candidate object is an object for which a first area calculated by the ConvexHull algorithm is larger than a second area calculated by the ConcaveHull algorithm, and the ratio of the second area to the first area is smaller than a preset value.
  7. The image processing device of claim 2, wherein the object selection unit selects the second candidate object based on the border information of each first candidate object and pixel information specified based on the border information.
  8. The image processing device of claim 7, wherein the border information is set as the overlapping portion of the area outline used to calculate the area information of the first candidate object and the outline of the area-difference region obtained according to the different algorithms.
  9. The image processing device of claim 7, wherein the second candidate object is selected based on an average value of the pixel information specified at positions spaced apart by a preset number of pixels in the normal direction of the border information.
  10. The image processing device of claim 9, wherein the second candidate object is an object for which the average value of the pixel information exceeds a set value.
  11. The image processing device of claim 2, wherein the object selection unit calculates a plurality of pieces of curvature information for specific points on the border of the second candidate object, calculates a ratio at which the curvature information is less than or equal to a preset value, and selects the misrecognition target object by removing objects for which the ratio is greater than a preset reference ratio.
  12. The image processing device of claim 11, wherein the object selection unit estimates a polygon most similar to the shape of the border, treats the lines connecting the vertices of the polygon as vectors, and calculates the curvature information using the angle between adjacent vectors.
  13. The image processing device of claim 1, wherein the misrecognition target object is an object identified from overlapping ones of the plurality of objects.
  14. An image processing method for object selection, comprising: distinguishing and identifying a plurality of objects included in an image; and filtering the plurality of objects based on area information, border information, and curvature information for each of the plurality of objects to select a misrecognition target object corresponding to a preset misrecognition criterion.
  15. The image processing method of claim 14, wherein selecting the misrecognition target object comprises selecting a first candidate object based on the area information, selecting a second candidate object using the border information of the first candidate object, and selecting the misrecognition target object using the curvature information of the second candidate object.
  16. The image processing method of claim 15, wherein selecting the misrecognition target object comprises calculating the area information for each of the plurality of objects through preset different algorithms, and selecting the first candidate object based on an area comparison result obtained by comparing the calculated area information.
  17. The image processing method of claim 15, wherein selecting the misrecognition target object comprises selecting the second candidate object based on the border information of each first candidate object and pixel information specified based on the border information.
  18. The image processing method of claim 15, wherein selecting the misrecognition target object comprises calculating a plurality of pieces of curvature information for specific points on the border of the second candidate object, calculating a ratio at which the curvature information is less than or equal to a preset value, and selecting the misrecognition target object by removing objects for which the ratio is greater than a preset reference ratio.
  19. An image processing system comprising: an image processing device comprising an object selection unit that distinguishes and identifies a plurality of objects included in an image, and filters the plurality of objects based on area information, border information, and curvature information for each of the plurality of objects to select a misrecognition target object corresponding to a preset misrecognition criterion; and an evaluation device that calculates at least one of a foreign matter content ratio and a sphericity using information on the plurality of objects and the misrecognition target object.
  20. The image processing system of claim 19, wherein the evaluation device excludes the misrecognition target object from the plurality of objects, extracts foreign matter using an unsupervised clustering learning model, and calculates the foreign matter content ratio as the ratio of the area of the foreign matter to the area of the plurality of objects.
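
The area-based first filtering stage (claims 3 to 6) can be sketched as follows: an object whose convex-hull area greatly exceeds its actual outline area is likely two or more overlapping particles identified as one. This is an illustrative sketch, not the disclosed implementation — the object's own outline area (shoelace formula) stands in for the ConcaveHull result, and `RATIO_THRESHOLD` is a hypothetical preset value.

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(poly):
    """Shoelace formula for the area of a simple polygon."""
    n = len(poly)
    s = sum(poly[i][0]*poly[(i+1) % n][1] - poly[(i+1) % n][0]*poly[i][1]
            for i in range(n))
    return abs(s) / 2.0

RATIO_THRESHOLD = 0.8  # hypothetical preset value, not from the disclosure

def is_first_candidate(outline):
    """Flag an object when (outline area / convex-hull area) < threshold,
    i.e. when the convex hull is much larger than the actual shape."""
    hull_area = polygon_area(convex_hull(outline))
    concave_area = polygon_area(outline)  # proxy for the ConcaveHull area
    return hull_area > 0 and concave_area / hull_area < RATIO_THRESHOLD
```

For example, a deeply concave U-shaped outline is flagged as a first candidate, while a square (whose outline equals its convex hull) is not.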

Description

The present disclosure relates to a technique for selecting objects in an image.

Image analysis refers to the process of extracting meaningful information from images. It primarily involves using digital image processing techniques to extract specific objects, characteristics, and situations from an image, and then analyzing the image based on this information. As the importance of image analysis grows across industries, for example with the advancement of image-based artificial intelligence and camera-based autonomous driving technology, various image analysis technologies are being developed. Although object recognition and classification within images have been automated through computer processing, accurately recognizing objects remains difficult for various reasons when multiple objects of similar shape are present. In addition, with the advancement of secondary battery technology, techniques for studying the physical properties of materials (particles) by capturing their shapes in images using devices such as material microscopes are also developing. For fine particles, however, it is difficult to clearly distinguish and recognize individual particles in a two-dimensional image because of overlap, breakage, and foreign substances between particles.

FIG. 1 is a drawing for explaining the configuration of an image processing device according to one embodiment.
FIG. 2 is a diagram illustrating an operation for identifying a plurality of objects according to one embodiment.
FIG. 3 is a diagram illustrating an object selection operation according to one embodiment.
FIG. 4 is a diagram illustrating the operation of calculating area information according to different algorithms according to one embodiment.
FIG. 5 is a diagram illustrating an example in which first candidate objects are selected according to one embodiment.
FIG. 6 is a diagram illustrating the operation of selecting a second candidate object according to one embodiment.
FIG. 7 is a drawing for explaining an example in which a second candidate object is selected from first candidate objects according to one embodiment.
FIG. 8 is a diagram illustrating the operation of calculating curvature information according to one embodiment.
FIG. 9 is a diagram illustrating an example in which a misrecognition target object is selected according to one embodiment.
FIG. 10 is a flowchart illustrating an image processing method according to one embodiment.
FIG. 11 is a configuration diagram for explaining an image processing system according to one embodiment.
FIG. 12 is a diagram illustrating the operation of calculating the foreign matter content ratio according to one embodiment.
FIG. 13 is a diagram illustrating a sphericity calculation operation according to one embodiment.

Hereinafter, some embodiments of the present disclosure will be described in detail with reference to the exemplary drawings. In assigning reference numerals to the components of each drawing, the same components are given the same reference numeral as much as possible, even if they are shown in different drawings. Furthermore, in describing the embodiments, if it is determined that a detailed description of related known components or functions may obscure the essence of the technical concept, such detailed description may be omitted. Where terms such as "comprising," "having," or "consisting of" are used in this specification, other parts may be added unless "only" is used. Where a component is expressed in the singular, it may include the plural unless otherwise specified. Additionally, terms such as first, second, A, B, (a), and (b) may be used to describe the components of the present disclosure. These terms are used merely to distinguish a component from other components, and the nature, order, sequence, or number of the components is not limited by such terms.
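
The curvature-based filtering of claims 11 and 12 (illustrated in FIG. 8) can be sketched as follows: the border is approximated by a polygon, the lines connecting its vertices are treated as vectors, and the signed angle between adjacent vectors serves as the curvature measure; an object is flagged when too large a share of its border points turn inward. This is a hypothetical sketch, assuming a counter-clockwise vertex order; `ANGLE_LIMIT` and `REFERENCE_RATIO` are illustrative values not taken from the disclosure.

```python
import math

def turning_angles(vertices):
    """Signed turning angle (degrees) at each vertex of a closed polygon.
    Consecutive edges are treated as vectors; the angle between adjacent
    vectors is positive for left (convex) turns on a CCW polygon and
    negative for right (reflex) turns."""
    n = len(vertices)
    angles = []
    for i in range(n):
        ax = vertices[i][0] - vertices[i - 1][0]
        ay = vertices[i][1] - vertices[i - 1][1]
        bx = vertices[(i + 1) % n][0] - vertices[i][0]
        by = vertices[(i + 1) % n][1] - vertices[i][1]
        dot = ax * bx + ay * by
        cross = ax * by - ay * bx
        angles.append(math.degrees(math.atan2(cross, dot)))
    return angles

ANGLE_LIMIT = 0.0      # hypothetical preset value: non-positive turn = reflex/flat
REFERENCE_RATIO = 0.2  # hypothetical preset reference ratio

def is_misrecognized(vertices):
    """Flag the object when the ratio of border points whose curvature is
    at or below the limit exceeds the reference ratio."""
    angles = turning_angles(vertices)
    low = sum(1 for a in angles if a <= ANGLE_LIMIT)
    return low / len(angles) > REFERENCE_RATIO
```

On a convex polygon every turn is positive, so the ratio is zero and the object is kept; a U-shaped border has reflex vertices at the mouth of the U, pushing the ratio past the reference and flagging the object.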
In describing the positional relationship of components, where it is stated that two or more components are "connected," "combined," or "joined," it should be understood that while the two or more components may be directly "connected," "combined," or "joined," they may also be "connected," "combined," or "joined" with other components "intervened." Here, the other components may be included in one or more of the two or more components that are "connected," "combined," or "joined" with one another.

In describing the temporal flow relationship regarding components, methods of operation, or methods of production, for example, when the temporal or sequential relationship is described using "after," "following," "next," or "before," it may include cases where the relationship is not continuous unless "immediately" or "directly" is used.

Meanwhile, where numerical values or corresponding information regarding a component (e.g., levels, etc.) are mentioned, even without separate explicit notation, the numerical values or corresponding information may