KR-20260067409-A - APPARATUS AND METHOD FOR DIAGNOSTIC ASSISTANCE BASED ON ARTIFICIAL INTELLIGENCE PROCESSING FOR RADIOLOGICAL IMAGES

KR20260067409A

Abstract

The present invention relates to a diagnostic assistance device and method based on artificial intelligence processing of radiographic images. A disease diagnosis assistance device according to one embodiment of the present invention may include: an input processing unit that receives a training radiographic image, additional information regarding the training radiographic image, and a radiographic image to be diagnosed; an artificial intelligence processing unit that learns the training radiographic image and the additional information through a first feature processing unit, a second feature processing unit, and a feature fusion processing unit to construct a diagnostic assistance model, and that determines a bone mineral density (BMD) value and disease classification information for the radiographic image to be diagnosed based on the diagnostic assistance model; and an information providing unit that provides disease diagnosis assistance information including at least one of the bone mineral density value, the disease classification information, and diagnostic basis information.

Inventors

  • Lee Seong-min (이성민)
  • Kim Seong-tae (김성태)

Assignees

  • Kyung Hee University Industry-Academic Cooperation Foundation (경희대학교 산학협력단)

Dates

Publication Date
2026-05-13
Application Date
2024-11-04

Claims (12)

  1. A disease diagnosis assistance device comprising: an input processing unit that receives a training radiographic image, additional information regarding the training radiographic image, and a radiographic image to be diagnosed; an artificial intelligence processing unit that constructs a diagnostic assistance model by learning the training radiographic image and the additional information through a first feature processing unit, a second feature processing unit, and a feature fusion processing unit, and that determines a bone mineral density (BMD) value and disease classification information for the radiographic image to be diagnosed based on the diagnostic assistance model; and an information providing unit that provides disease diagnosis assistance information including at least one of the bone mineral density value, the disease classification information, and diagnostic basis information.
  2. The device of claim 1, wherein the first feature processing unit extracts a first feature by learning either the training radiographic image or the radiographic image to be diagnosed, and the second feature processing unit extracts a second feature by learning patient information, included in the additional information, comprising at least one of a bone density evaluation, sex, age, medical history, and whether surgery has been performed.
  3. The device of claim 2, wherein the feature fusion processing unit constructs, for the training radiographic image, a diagnostic assistance model that determines the bone mineral density value and the disease classification information determined on the basis of the bone mineral density value by fusion-learning the first feature and the second feature, and determines the bone mineral density value and disease classification information for the radiographic image to be diagnosed based on the diagnostic assistance model.
  4. The device of claim 3, wherein the feature fusion processing unit fuses the first feature and the second feature using at least one of a concatenation layer, an average pooling layer, and a fully-connected (FC) layer.
  5. The device of claim 2, wherein the first feature processing unit extracts the first feature using a convolutional neural network (CNN) and a vision transformer.
  6. The device of claim 2, wherein the second feature processing unit uses a number of fully-connected (FC) layers corresponding to the number of items of additional information.
  7. The device of claim 1, wherein the input processing unit normalizes the training radiographic image and the radiographic image to be diagnosed by adjusting their size and aligning them uniformly to either the left side or the right side.
  8. The device of claim 1, wherein the diagnostic basis information includes a radiographic image that indicates, within the radiographic image to be diagnosed, the portion from which the first feature was extracted in relation to the bone mineral density value and the disease classification information.
  9. A disease diagnosis assistance method comprising: receiving, in an input processing unit, a training radiographic image, additional information regarding the training radiographic image, and a radiographic image to be diagnosed; constructing, in an artificial intelligence processing unit, a diagnostic assistance model by learning the training radiographic image and the additional information through a first feature processing unit, a second feature processing unit, and a feature fusion processing unit, and determining a bone mineral density (BMD) value and disease classification information for the radiographic image to be diagnosed based on the diagnostic assistance model; and providing, in an information providing unit, disease diagnosis assistance information including at least one of the bone mineral density value, the disease classification information, and diagnostic basis information.
  10. The method of claim 9, wherein constructing the diagnostic assistance model and determining the bone mineral density (BMD) value and disease classification information comprises: extracting, in the first feature processing unit, a first feature by learning either the training radiographic image or the radiographic image to be diagnosed; extracting, in the second feature processing unit, a second feature by learning patient information, included in the additional information, comprising at least one of a bone density evaluation, sex, age, medical history, and whether surgery has been performed; and constructing, in the feature fusion processing unit, for the training radiographic image, a diagnostic assistance model that determines the bone mineral density value and the disease classification information determined on the basis of the bone mineral density value by fusion-learning the first feature and the second feature, and determining the bone mineral density value and disease classification information for the radiographic image to be diagnosed based on the diagnostic assistance model.
  11. The method of claim 9, wherein receiving the training radiographic image, the additional information, and the radiographic image to be diagnosed comprises normalizing the training radiographic image and the radiographic image to be diagnosed by adjusting their size and aligning them uniformly to either the left side or the right side.
  12. The method of claim 9, wherein the diagnostic basis information includes a radiographic image that indicates, within the radiographic image to be diagnosed, the portion from which the first feature was extracted in relation to the bone mineral density value and the disease classification information.
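The two-branch architecture described in claims 1 to 6 (an image branch, a patient-information branch, and a fusion stage using concatenation, average pooling, and FC layers) can be sketched numerically. This is a minimal illustration, not the patented implementation: the feature widths, the random stand-in for the CNN/vision-transformer embedding, and the three-way classification head are all assumptions, since the patent does not specify layer sizes or class labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the patent does not specify layer sizes.
IMG_FEAT = 64      # width of the image embedding (CNN + vision transformer)
TAB_FIELDS = 5     # bone density evaluation, sex, age, history, surgery flag
TAB_FEAT = 16      # output width of the patient-information FC branch

def fc(x, w, b):
    """A fully-connected layer: affine transform followed by ReLU."""
    return np.maximum(0.0, x @ w + b)

# First feature: in the patent this comes from a CNN and a vision
# transformer; here a random vector stands in for that image embedding.
first_feature = rng.standard_normal(IMG_FEAT)

# Second feature: patient information (one value per additional-info field)
# passed through an FC layer, as in claim 6.
patient_info = rng.standard_normal(TAB_FIELDS)
w_tab = rng.standard_normal((TAB_FIELDS, TAB_FEAT))
second_feature = fc(patient_info, w_tab, np.zeros(TAB_FEAT))

# Feature fusion (claim 4): concatenation, average pooling over adjacent
# pairs, then a fully-connected head.
fused = np.concatenate([first_feature, second_feature])   # concatenation layer
pooled = fused.reshape(-1, 2).mean(axis=1)                # average pooling
w_head = rng.standard_normal((pooled.size, 1 + 3))
head = pooled @ w_head                                    # FC head

bmd_value = head[0]       # regression output: bone mineral density value
class_logits = head[1:]   # disease classification (3 classes assumed here)
print(bmd_value, class_logits.shape)
```

In a trained system the weights would of course be learned jointly on the training radiographic images and their additional information; the point here is only the data flow from the two feature branches through fusion to the BMD and classification outputs.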
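The normalization step in claims 7 and 11 (resizing the images and aligning them all to one side) can be illustrated as follows. The nearest-neighbour resize, the 8x8 target resolution, and the assumption that each image carries "left"/"right" side metadata are simplifications for this sketch only.

```python
import numpy as np

TARGET = (8, 8)  # illustrative target size; the patent fixes no resolution

def resize_nearest(img, shape):
    """Nearest-neighbour resize -- a minimal stand-in for real interpolation."""
    rows = (np.arange(shape[0]) * img.shape[0] / shape[0]).astype(int)
    cols = (np.arange(shape[1]) * img.shape[1] / shape[1]).astype(int)
    return img[np.ix_(rows, cols)]

def normalize(img, side):
    """Resize, then mirror so every image is aligned to the same (left)
    side, as in claim 7. `side` is assumed metadata: 'left' or 'right'."""
    img = resize_nearest(img, TARGET)
    if side == "right":          # mirror right-side images onto the left
        img = img[:, ::-1]
    return img

left = normalize(np.arange(36).reshape(6, 6), "left")
right = normalize(np.arange(36).reshape(6, 6), "right")
print(left.shape, right.shape)
```

Aligning all images to one side means the model sees anatomically consistent inputs, so it does not have to learn left and right variants of the same structure separately.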
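Claims 8 and 12 describe diagnostic basis information as a radiographic image marking the portion from which the first feature was extracted. The patent does not say how that region is computed; one common way to produce such a marking from a convolutional image branch is a class-activation-style heat map, sketched here under that assumption with an illustrative 2x2 activation map.

```python
import numpy as np

# A coarse activation map from the image branch (e.g. its last stage);
# the values are purely illustrative.
activation = np.array([[0.1, 0.2],
                       [0.9, 0.3]])

def upsample(m, factor):
    """Repeat each cell so the map matches the radiograph's resolution."""
    return m.repeat(factor, axis=0).repeat(factor, axis=1)

heat = upsample(activation, 4)       # 8x8 map over the (toy) image
mask = heat >= heat.max() * 0.5      # highlight the strongest region
print(mask.sum())
```

The resulting mask would be overlaid on the radiographic image to be diagnosed, indicating which region drove the bone mineral density value and disease classification.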

Description

Apparatus and Method for Diagnostic Assistance Based on Artificial Intelligence Processing of Radiologic Images

The present invention relates to a diagnostic assistance device and method based on artificial intelligence processing of radiographic images and, more specifically, to a technology that provides diagnostic assistance information for preventing osteoporotic fractures and complications after bone-related surgery by constructing a diagnostic assistance model that predicts bone density through machine learning on radiographic images.

As people age, bone density gradually decreases; as bone mineral density (BMD) declines, osteoporosis develops, in which the bones become porous. Osteoporosis can have very serious consequences, such as bones breaking from even minor impacts and failing to heal readily, so periodic checkups are required to prevent osteoporosis or the worsening of its symptoms. Osteoporosis refers to a condition in which bone mass is excessively reduced compared with that of normal individuals; it is a clinical condition accompanied by fractures and deformation of bone shape. In other words, osteoporosis is a pathological condition characterized by an abnormal decrease in bone mass, accompanied by fractures of the spine and femur as well as bone deformities.

Conventionally, bone density information is obtained by a medical professional interpreting images of the patient, such as X-rays or ultrasound scans. However, if the professional interpreting the X-ray images has low proficiency, there is a risk of misdiagnosis. Moreover, conventional bone density measurement primarily targets the lumbar spine and hip joints, making it difficult to accurately predict shoulder bone density. Low shoulder bone density can lead to osteoporotic humeral fractures and postoperative complications, yet technical means to predict these in advance have been lacking.
With the development of artificial intelligence, there is a need for technology that obtains bone density information by having AI interpret patients' X-ray images.

FIG. 1 illustrates a diagnostic assistance device based on artificial intelligence processing of radiographic images according to an embodiment of the present invention. FIG. 2 illustrates the image preprocessing configuration of the device. FIGS. 3a to 3c illustrate the artificial intelligence processing unit of the device. FIGS. 4 and 5 illustrate the diagnostic basis information provided by the device. FIGS. 6 and 7 illustrate a diagnostic assistance method based on artificial intelligence processing of radiographic images according to an embodiment of the present invention.

Hereinafter, various embodiments of this document are described with reference to the attached drawings. The embodiments and the terms used in them are not intended to limit the technology described in this document to specific embodiments, and should be understood to include various modifications, equivalents, and/or substitutions of those embodiments. In the description of the embodiments below, detailed descriptions of related known functions or configurations are omitted where they would unnecessarily obscure the essence of the invention. The terms used below are defined in view of their functions in the various embodiments and may vary depending on the intentions or practices of users or operators.
Their definitions should therefore be based on the content of this specification as a whole. In the description of the drawings, similar reference numerals may be used for similar components. A singular expression may include a plural expression unless the context clearly indicates otherwise. In this document, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. Expressions such as "first" and "second" may modify the corresponding components regardless of order or importance and are used merely to distinguish one component from another without limiting them. Where a certain (e.g., first) component is said to be "(functionally or communicatively) connected" or "coupled" to another (e.g., second) component, it may be connected to the other component directly or through a further component (e.g., a third component). In this specification, "configured to" may be used interchangeably with, depending on