KR-20260064089-A - METHOD AND APPARATUS FOR DETECTING A LESION IN AN ULTRASONIC IMAGE

KR 20260064089 A

Abstract

A method and apparatus for determining the location of a lesion in an ultrasound image of a body tissue are provided, through the steps of generating a plurality of parameter images based on an ultrasound RF signal reflected from the body tissue, inputting the plurality of parameter images into an artificial neural network, and determining the location of the lesion based on the output of the artificial neural network.

Inventors

  • 서준영
  • 문영민
  • 권기훈

Assignees

  • 재단법인 포항산업과학연구원

Dates

Publication Date
2026-05-07
Application Date
2024-10-31

Claims (11)

  1. A method for determining a location of a lesion in an ultrasound image of a body tissue, the method comprising: generating a plurality of parameter images based on an ultrasound RF signal reflected from the body tissue; inputting the plurality of parameter images into an artificial neural network; and determining the location of the lesion based on an output of the artificial neural network.
  2. The method of claim 1, wherein the plurality of parameter images include a B (brightness) mode image, an entropy image, a phase image, and an attenuation image.
  3. The method of claim 2, wherein generating the plurality of parameter images comprises: performing a local entropy calculation based on an envelope of the ultrasound RF signal; and generating the entropy image based on a result of the local entropy calculation.
  4. The method of claim 2, wherein generating the plurality of parameter images comprises: obtaining phase information of the ultrasound RF signal by performing a Hilbert transform on the ultrasound RF signal; and generating the phase image based on the phase information.
  5. The method of claim 2, wherein generating the plurality of parameter images comprises: performing a fast Fourier transform (FFT) on the ultrasound RF signal; and generating the attenuation image based on the ultrasound RF signal analyzed in the frequency domain by the FFT.
  6. The method of claim 2, wherein inputting the plurality of parameter images into the artificial neural network comprises inputting the plurality of parameter images into a plurality of backbone networks within the artificial neural network, respectively.
  7. The method of claim 6, wherein determining the location of the lesion comprises determining the location of the lesion by combining outputs of a plurality of linear layers respectively connected to the plurality of backbone networks.
  8. The method of claim 6, wherein the plurality of backbone networks include a convolutional neural network (CNN) architecture.
  9. The method of claim 8, wherein the CNN architecture includes at least one artificial intelligence (AI) model among VGG-16, ResNet-50, DenseNet-201, and EfficientNetV2-L.
  10. The method of claim 2, further comprising displaying the location of the lesion on the B-mode image.
  11. An apparatus for determining a location of a lesion in an ultrasound image of a body tissue, the apparatus comprising a processor and a memory, wherein the memory stores instructions that cause the processor to perform a plurality of steps, the plurality of steps comprising: generating a plurality of parameter images based on an ultrasound RF signal reflected from the body tissue; inputting the plurality of parameter images into an artificial neural network; and determining the location of the lesion based on an output of the artificial neural network.
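The four parameter images recited in claims 2 through 5 can be sketched in NumPy. This is an illustrative sketch only: the window sizes, histogram bin counts, log-compression constant, and the spectral-centroid attenuation proxy are assumptions for demonstration, not values or formulas taken from the patent. The Hilbert transform is implemented directly with the FFT so the example needs only NumPy.

```python
import numpy as np

def analytic_signal(rf):
    """Analytic signal along the depth axis (axis 0) via the FFT, i.e. a
    discrete Hilbert transform: zero negative frequencies, double positives."""
    n = rf.shape[0]
    spectrum = np.fft.fft(rf, axis=0)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h[:, None], axis=0)

def local_entropy(env, win=32, bins=16):
    """Shannon entropy of the envelope in a sliding depth window, computed
    independently per scan line (claim 3). Window/bin sizes are assumptions."""
    out = np.zeros_like(env)
    for j in range(env.shape[1]):
        col = env[:, j]
        for i in range(env.shape[0]):
            lo, hi = max(0, i - win // 2), min(len(col), i + win // 2)
            p, _ = np.histogram(col[lo:hi], bins=bins)
            p = p[p > 0] / p.sum()
            out[i, j] = -np.sum(p * np.log2(p))
    return out

def attenuation_image(rf, fs=40e6, win=64):
    """Crude attenuation proxy (claim 5): spectral centroid of each depth
    block via the FFT; the centroid drops as higher frequencies attenuate."""
    nseg = rf.shape[0] // win
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    att = np.zeros((nseg, rf.shape[1]))
    for k in range(nseg):
        spec = np.abs(np.fft.rfft(rf[k * win:(k + 1) * win], axis=0))
        att[k] = (freqs[:, None] * spec).sum(axis=0) / (spec.sum(axis=0) + 1e-12)
    return att

def parameter_images(rf, fs=40e6):
    """Build the four parameter images of claim 2 from beamformed RF data
    shaped (depth_samples, scan_lines)."""
    z = analytic_signal(rf)
    env = np.abs(z)
    return {
        "bmode": 20 * np.log10(env / env.max() + 1e-12),  # log-compressed envelope
        "phase": np.angle(z),                             # claim 4: Hilbert phase
        "entropy": local_entropy(env),                    # claim 3
        "attenuation": attenuation_image(rf, fs),         # claim 5
    }
```

The attenuation image here is coarser than the others (one value per depth block per scan line); a real pipeline would interpolate it back to the B-mode grid before stacking the four images as network input channels.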
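The multi-backbone architecture of claims 6 and 7 — one backbone per parameter image, with per-backbone linear layers whose outputs are combined — can be sketched as follows. This is a toy stand-in, not the patent's trained VGG-16/ResNet-50-class models: `tiny_backbone`, the random weights, and the (cx, cy, w, h) box parameterization are all hypothetical choices made for illustration.

```python
import numpy as np

def tiny_backbone(img, n_filters=4, feat_dim=8, seed=0):
    """Stand-in for one CNN backbone: a single valid 3x3 convolution with
    ReLU, global average pooling, and a fixed random projection to feat_dim."""
    r = np.random.default_rng(seed)
    kernels = r.standard_normal((n_filters, 3, 3))
    H, W = img.shape
    feat = np.zeros(n_filters)
    for f in range(n_filters):
        acc = 0.0
        for i in range(H - 2):
            for j in range(W - 2):
                acc += max(0.0, np.sum(img[i:i + 3, j:j + 3] * kernels[f]))
        feat[f] = acc / ((H - 2) * (W - 2))   # global average pooling
    proj = r.standard_normal((feat_dim, n_filters))
    return proj @ feat

def detect_lesion(param_images, seed=0):
    """Claims 6-7 sketch: feed each parameter image to its own backbone,
    pass each feature through its own linear layer, and combine (sum) the
    outputs to regress a normalized lesion box (cx, cy, w, h)."""
    r = np.random.default_rng(seed)
    outputs = []
    for b, img in enumerate(param_images):
        feat = tiny_backbone(img, seed=b)          # one backbone per image
        linear = r.standard_normal((4, feat.size)) * 0.01
        outputs.append(linear @ feat)              # one linear head per backbone
    combined = sum(outputs)                        # combine the linear outputs
    return 1.0 / (1.0 + np.exp(-combined))         # sigmoid -> [0, 1] coordinates
```

In practice each backbone would be a pretrained CNN (the claims name VGG-16, ResNet-50, DenseNet-201, and EfficientNetV2-L as candidates), and the combination of the linear-layer outputs could be a sum, a concatenation followed by another layer, or a learned weighting; the patent text as excerpted does not specify which.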

Description

Method and apparatus for detecting a lesion in an ultrasound image

This description relates to a method and apparatus for detecting lesions in ultrasound images. This work was supported by the National Research Foundation of Korea with funding from the Ministry of Science and ICT in 2024 (No. 2022M3A9B6082794).

Traditional Korean Medicine (TKM) has recently been actively integrating with modern medical technology, and this convergence is improving the accuracy and efficiency of diagnosis and treatment. In particular, the introduction of ultrasound diagnostic devices is enhancing diagnostic precision in Korean medicine. Ultrasound diagnostic devices can generate tomographic images of body tissues and blood-flow images by directing ultrasound signals from the skin toward tissue regions inside the body and amplifying and converting the ultrasound RF (radio frequency) signals reflected from body tissues with differing acoustic impedances. Compared with other imaging diagnostic devices such as X-ray devices, CT scanners, magnetic resonance imaging (MRI) devices, and nuclear medicine devices, ultrasound diagnostic devices have the advantages of being compact, inexpensive, and capable of generating and displaying images in real time. Furthermore, because they allow safe, non-invasive observation of human tissue without radiation exposure, they are widely used for cardiac, abdominal, urological, and gynecological diagnosis. Ultrasound imaging is generally inexpensive and available in real time for diagnosis, but it has the disadvantage of low image quality due to significant noise.

FIG. 1 is a diagram showing a lesion detection system according to one embodiment. FIG. 2 is a flowchart illustrating a lesion detection method according to one embodiment. FIG. 3 is a diagram illustrating a lesion detection process according to one embodiment. FIG. 4 is a diagram showing parameter images according to one embodiment. FIG. 5 is a block diagram showing a lesion detection system according to another embodiment.

The embodiments of this description are described below with reference to the attached drawings so that those skilled in the art can easily implement them. However, this description may be implemented in various different forms and is not limited to the embodiments described herein. Furthermore, to explain this description clearly, parts unrelated to the explanation have been omitted from the drawings, and similar parts are denoted by similar reference numerals throughout the specification.

In this description, each of the phrases "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one of A, B, or C" may include any one of the items listed in the corresponding phrase, or all possible combinations thereof. When a part is described as "including" a certain component, this means that, unless specifically stated otherwise, it does not exclude other components and may include additional components. Expressions written in the singular may be interpreted as singular or plural unless an explicit expression such as "one" or "singular" is used. "And/or" includes each of the mentioned components and every combination of one or more of them.

In this description, terms including ordinal numbers, such as "first" and "second", may be used to describe various components, but the components are not limited by such terms; the terms serve only to distinguish one component from another. For example, without departing from the scope of the present disclosure, a first component may be named a second component and, similarly, a second component may be named a first component.
In the flowcharts described herein with reference to the drawings, the order of operations may be changed, multiple operations may be merged, some operations may be divided, and certain operations may not be performed.

The artificial intelligence (AI) model of the present disclosure is a machine learning model that learns at least one task and may be implemented as a computer program executed by a processor. The task learned by the AI model may refer to a problem to be solved, or a task to be performed, through machine learning. The AI model may be implemented as a computer program executed on a computing device, downloaded via a network, or sold in the form of a product. Alternatively, the AI model may be linked with various devices via a network.

FIG. 1 is a diagram showing a lesion detection system according to one embodiment, and FIG. 2 is a flowchart showing a lesion detection method according to one embodiment. Referring to FIG. 1, a lesion detection system (100) according to one embodiment includes an image generating device (110)