US-12620135-B2 - Endoscope device, endoscopic medical assistance system and endoscopic image processing method

US 12620135 B2

Abstract

An endoscope device, an endoscopic medical assistance system and an endoscopic image processing method are provided. An endoscope in the endoscope device and the endoscopic medical assistance system senses light beams reflected by an object to generate initial images. After the initial images are aligned with each other to form a superimposed image, an image fusion device of the endoscopic medical assistance system analyzes features of the superimposed image to convert the superimposed image into a feature image. The image fusion device adjusts a color depth of each of a plurality of pixel points of the feature image according to a relationship between the color depth of each of the pixel points and a maximum color depth that are classified in the same color tone in each of pixel regions of the feature image. The image fusion device generates a fusion image according to the adjusted feature image.

Inventors

  • Yuan-Ting Fang
  • Ming-Huang Hsiao

Assignees

  • MORNING DEW CREATIVE TECHNICAL CO., LTD.

Dates

Publication Date
2026-05-05
Application Date
2023-05-28
Priority Date
2023-03-31

Claims (8)

  1. An endoscopic medical assistance system, comprising: an endoscope, wherein, after a plurality of ambient lights are irradiated on an object and then are reflected by the object to form a plurality of test lights, the endoscope senses parts of a plurality of colored light beams of the plurality of test lights that fall within a same one of a plurality of imaging ranges to generate each of a plurality of initial images, so as to generate the plurality of initial images within the plurality of imaging ranges, respectively; and an image fusion device connected to the endoscope, wherein the image fusion device aligns the plurality of initial images with each other to form a superimposed image, the image fusion device analyzes a plurality of features of the superimposed image and accordingly converts the superimposed image into a feature image, the image fusion device reads and identifies colors of a plurality of pixel points in a plurality of pixel regions of the feature image, the image fusion device compares color depths of the plurality of pixel points in each of the plurality of pixel regions of the feature image with each other, the image fusion device determines a largest one of the color depths of the plurality of pixel points that are classified in a same one of a plurality of color tones in each of the plurality of pixel regions as a maximum color depth of the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image, the image fusion device analyzes a relationship between the color depth of each of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image, the image fusion device adjusts the color depth of each of the plurality of pixel points of the feature image according to the relationship between the color depth of each of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image, and the image fusion device generates a fusion image according to the feature image on which the color depths of all of the plurality of pixel regions are adjusted.
  2. The endoscopic medical assistance system according to claim 1, wherein the endoscope includes: a beam splitter configured to split each of the plurality of test lights into a number of the plurality of colored light beams, and to reflect the plurality of colored light beams of the plurality of test lights respectively along a plurality of light reflecting paths; and a plurality of image sensors disposed respectively within the plurality of imaging ranges, wherein each of the plurality of image sensors is disposed in a number of the plurality of light reflecting paths; wherein the plurality of image sensors sense the plurality of colored light beams in the plurality of light reflecting paths to generate the plurality of initial images within the plurality of imaging ranges, respectively.
  3. The endoscopic medical assistance system according to claim 2, wherein the plurality of colored light beams sensed by one of the plurality of image sensors are a green light beam and a blue light beam, and the plurality of colored light beams sensed by another one of the plurality of image sensors are a red light beam, a green light beam and a blue light beam.
  4. The endoscopic medical assistance system according to claim 1, wherein, when the relationship between the color depth of any one of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in a same one of the plurality of pixel regions of the feature image does not meet a preset relationship, the image fusion device adjusts the color depth of the one of the plurality of pixel points according to the maximum color depth, and the image fusion device generates the fusion image according to the feature image on which the color depths of all of the plurality of pixel regions are adjusted according to the maximum color depths of the plurality of pixel regions.
  5. The endoscopic medical assistance system according to claim 1, wherein, when a color difference between the color depth of any one of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in a same one of the plurality of pixel regions of the feature image is larger than a color difference threshold, the image fusion device adjusts the color depth of the one of the plurality of pixel points according to the maximum color depth.
  6. The endoscopic medical assistance system according to claim 1, wherein the image fusion device sets a plurality of weight values according to the relationships between the color depths of the plurality of pixel points and the maximum color depths that are classified in the plurality of color tones in each of the plurality of pixel regions of the feature image, and the image fusion device determines adjustment degrees of the color depths of the plurality of pixel points of the feature image according to the plurality of weight values of the plurality of pixel points of the feature image, respectively.
  7. The endoscopic medical assistance system according to claim 6, wherein the image fusion device establishes a weight chart of the feature image according to the plurality of weight values of the plurality of pixel points of the feature image, and the image fusion device determines the adjustment degrees of the color depths of the plurality of pixel points of the feature image according to the plurality of weight values on the weight chart.
  8. An endoscopic image processing method, comprising the following steps: sensing a plurality of test lights after a plurality of ambient lights are irradiated on an object and then are reflected by the object to form the plurality of test lights; sensing parts of a plurality of colored light beams of the plurality of test lights that fall within a same one of a plurality of imaging ranges to generate each of a plurality of initial images, so as to generate the plurality of initial images respectively within the plurality of imaging ranges; aligning the plurality of initial images with each other to form a superimposed image; analyzing a plurality of features of the superimposed image to convert the superimposed image into a feature image; reading and identifying colors of a plurality of pixel points in a plurality of pixel regions of the feature image; comparing color depths of the plurality of pixel points in each of the plurality of pixel regions of the feature image with each other; determining a largest one of the color depths of the plurality of pixel points that are classified in a same one of a plurality of color tones in each of the plurality of pixel regions, as a maximum color depth of the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image; analyzing a relationship between the color depth of each of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image; adjusting the color depth of each of the plurality of pixel points according to the relationship between the color depth of each of the plurality of pixel points and the maximum color depth that are classified in the same one of the plurality of color tones in each of the plurality of pixel regions of the feature image; and generating a fusion image according to the feature image on which the color depths of all of the plurality of pixel regions are adjusted.
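The region-wise adjustment recited in claims 1, 4 and 5 can be illustrated with a short sketch. This is not the patented implementation: the region size, the color-difference threshold, and the tone/depth classifiers (here, a pixel's dominant channel and that channel's intensity) are all hypothetical stand-ins chosen only to make the steps concrete.

```python
import numpy as np

COLOR_DIFF_THRESHOLD = 40   # hypothetical threshold from claim 5
REGION = 4                  # hypothetical pixel-region size (4x4 pixels)

def tone_of(pixel):
    """Classify a pixel into a coarse color tone (here: its dominant channel)."""
    return int(np.argmax(pixel))

def depth_of(pixel):
    """Treat the dominant channel's intensity as the pixel's color depth."""
    return int(np.max(pixel))

def adjust_feature_image(feature):
    """For each pixel region and each color tone, find the maximum color depth;
    any pixel whose depth differs from its tone's maximum by more than the
    threshold is adjusted according to that maximum (claims 4 and 5)."""
    out = feature.copy()
    h, w, _ = feature.shape
    for y0 in range(0, h, REGION):
        for x0 in range(0, w, REGION):
            region = out[y0:y0 + REGION, x0:x0 + REGION]
            # step 1: maximum color depth per tone within this region
            max_depth = {}
            for px in region.reshape(-1, 3):
                t = tone_of(px)
                max_depth[t] = max(max_depth.get(t, 0), depth_of(px))
            # step 2: adjust pixels that fall too far below their tone's maximum
            for j in range(region.shape[0]):
                for i in range(region.shape[1]):
                    px = region[j, i]
                    t = tone_of(px)
                    if max_depth[t] - depth_of(px) > COLOR_DIFF_THRESHOLD:
                        scale = max_depth[t] / max(depth_of(px), 1)
                        region[j, i] = np.clip(px * scale, 0, 255)
    return out
```

A weight chart (claims 6 and 7) could be realized by replacing the hard threshold with a per-pixel weight derived from the same depth-to-maximum relationship, but that variant is omitted here for brevity.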

Description

CROSS-REFERENCE TO RELATED PATENT APPLICATION

This application claims the benefit of priority to Taiwan Patent Application No. 112112441, filed on Mar. 31, 2023. The entire content of the above identified application is incorporated herein by reference.

Some references, which may include patents, patent applications and various publications, may be cited and discussed in the description of this disclosure. The citation and/or discussion of such references is provided merely to clarify the description of the present disclosure and is not an admission that any such reference is “prior art” to the disclosure described herein. All references cited and discussed in this specification are incorporated herein by reference in their entireties and to the same extent as if each reference was individually incorporated by reference.

FIELD OF THE DISCLOSURE

The present disclosure relates to an endoscope device, and more particularly to an endoscope device, an endoscopic medical assistance system and an endoscopic image processing method.

BACKGROUND OF THE DISCLOSURE

In recent years, medical capsules, such as capsule endoscopes, have been widely used in the medical field. After the medical capsule is swallowed into a human body, the medical capsule sequentially flows through a plurality of body parts inside the human body. At the same time, the medical capsule captures a plurality of images inside the plurality of body parts (such as organs, tissues and so on) of the human body. However, the medical capsules such as the capsule endoscopes cannot capture the plurality of images of the same one of the body parts of the human body respectively under different colored lights inside the body parts of the human body. Furthermore, the images captured by the medical capsules such as the capsule endoscopes are blurred images. However, medical personnel must accurately diagnose health of the human body based on a clear image.
Therefore, the plurality of images captured by the medical capsules must be further fused by a system over a long period of time to form the clear image. The medical personnel cannot instantly obtain the clear image for an accurate diagnosis of human health.

SUMMARY OF THE DISCLOSURE

In response to the above-referenced technical inadequacies, the present disclosure provides an endoscope device. After a plurality of ambient lights are irradiated on an object and then are reflected by the object to form a plurality of test lights, the endoscope senses parts of a plurality of colored light beams of the plurality of test lights that fall within a same one of a plurality of imaging ranges to generate each of a plurality of initial images, so as to generate the plurality of initial images respectively within the plurality of imaging ranges.

In certain embodiments, the endoscope includes a beam splitter and a plurality of image sensors. The beam splitter is configured to split each of the plurality of test lights into some of the plurality of colored light beams. The beam splitter is configured to reflect the plurality of colored light beams of the plurality of test lights respectively along a plurality of light reflecting paths. The plurality of image sensors are disposed respectively within the plurality of imaging ranges. Each of the plurality of image sensors is disposed in a number of the plurality of light reflecting paths. The plurality of image sensors sense the plurality of colored light beams in the plurality of light reflecting paths to respectively generate the plurality of initial images within the plurality of imaging ranges.

In addition, the present disclosure provides an endoscopic medical assistance system. The endoscopic medical assistance system includes an endoscope and an image fusion device.
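The beam-splitter arrangement described above can be sketched as follows. This is only an illustrative model, not the optical hardware itself: the sensor names and the channel assignments (one sensor receiving the green and blue beams, another receiving all three, per claim 3) are hypothetical, and each "sensor" simply keeps the color channels routed to it along its light reflecting paths.

```python
import numpy as np

# Hypothetical per-sensor channel masks (R, G, B): which colored light
# beams the beam splitter reflects toward each image sensor (claim 3).
SENSOR_CHANNELS = {
    "sensor_gb":  (False, True, True),   # green + blue beams
    "sensor_rgb": (True, True, True),    # red + green + blue beams
}

def split_and_sense(test_light):
    """Return one initial image per image sensor, keeping only the colored
    light beams reflected along that sensor's light reflecting paths."""
    initial_images = {}
    for name, mask in SENSOR_CHANNELS.items():
        img = test_light.copy()
        for ch, keep in enumerate(mask):
            if not keep:
                img[..., ch] = 0   # this beam never reaches the sensor
        initial_images[name] = img
    return initial_images
```

The resulting per-sensor initial images are what the image fusion device would then align into the superimposed image.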
After a plurality of ambient lights are irradiated on an object and then are reflected by the object to form a plurality of test lights, the endoscope senses parts of a plurality of colored light beams of the plurality of test lights that fall within a same one of a plurality of imaging ranges to generate each of a plurality of initial images, so as to generate the plurality of initial images respectively within the plurality of imaging ranges. The image fusion device is connected to the endoscope. The image fusion device aligns the plurality of initial images with each other to form a superimposed image. The image fusion device analyzes a plurality of features of the superimposed image and accordingly converts the superimposed image into a feature image. The image fusion device reads and identifies colors of a plurality of pixel points in a plurality of pixel regions of the feature image. The image fusion device compares color depths of the plurality of pixel points in each of the plurality of pixel regions of the feature image with each other. The image fusion device determines a largest one of the color depths of the plurality of pixel points that are classified in a same one of a plurality of color tones i