US-20260123999-A1 - ELECTRONICALLY GUIDED PRECISION MEDICAL TARGETING USING NEAR INFRARED FLUORESCENCE IMAGING


Abstract

A medical targeting system facilitates guidance of a real-time imaging system and intervention tool to a calibrated position of a target identified in preprocedural images. Real-time images from an imaging system are registered with preprocedural images. The imaging device is guided to the vicinity of a virtual target identified from the registered preprocedural images until the physical target is identified in the real-time images. The virtual target position can then be calibrated to correspond to the physical target. Once calibrated, an intervention tool can be guided to the calibrated position to perform the procedure, even if the real-time imaging no longer shows the physical target. The imaging device may include a near-infrared fluorescence (NIRF) fiberscope that enables imaging of targets outside of the anatomical channel or other pathway being traversed by the imaging device.

Inventors

  • Alexandru Paunescu
  • Matthew Krever
  • Michelle Mahoney
  • Chad E. Eckert
  • Christopher Sramek
  • Charles Scheib
  • Ravi Patel

Assignees

  • JOHNSON & JOHNSON ENTERPRISE INNOVATION INC.

Dates

Publication Date
2026-05-07
Application Date
2023-11-10

Claims (20)

  1. A method for performing guidance to a physical target, the method comprising: obtaining preprocedural images representing an anatomy; obtaining an initial virtual target position of the physical target in the preprocedural images; obtaining, from a real-time imaging device, real-time intraprocedural images of the anatomy; registering the real-time intraprocedural images of the anatomy to the preprocedural images to generate a mapping between coordinates of the real-time intraprocedural images and coordinates of the preprocedural images; facilitating electronic guidance of the real-time imaging device to a vicinity of the initial virtual target position using the mapping; obtaining a position of the physical target; determining a calibrated virtual target position corresponding to the obtained position of the physical target; and facilitating guidance of a medical tool to the physical target based at least in part on the calibrated virtual target position.
  2. The method of claim 1, wherein determining the calibrated virtual target position comprises: presenting, to a display device, a representation of a virtual target at the initial virtual target position overlaid on the real-time intraprocedural images; receiving at least one user input for repositioning the virtual target and updating the display device to indicate the repositioning; and determining the position of the physical target responsive to detecting alignment between a repositioned virtual target and the physical target.
  3. (canceled)
  4. The method of claim 1, wherein determining the calibrated virtual target position comprises: generating a representation of a virtual target at the initial virtual target position overlaid on the real-time intraprocedural images; automatically generating adjustments to the initial virtual target position and applying an image analysis until the virtual target substantially overlaps the physical target; and determining the position of the physical target based at least in part on the adjustments.
  5. The method of claim 1, wherein determining the calibrated virtual target position comprises: receiving user inputs to control position of the real-time imaging device to capture at least two images of the physical target from different viewpoints; and determining the position of the physical target based at least in part on the at least two images.
  6-7. (canceled)
  8. The method of claim 1, wherein facilitating the guidance comprises: generating electronic navigation guidance for an electronically assisted intervention tool to the calibrated virtual target position.
  9. The method of claim 1, wherein facilitating the guidance comprises: displaying, on a display screen, the preprocedural images and a representation of a virtual target at the calibrated virtual target position overlaid on the preprocedural images; and displaying, on the display screen, a position of an intervention tool overlaid on the preprocedural images to enable guidance of the intervention tool to the calibrated virtual target position.
  10. The method of claim 1, wherein facilitating the guidance comprises: displaying, on a display screen, the real-time intraprocedural images and a representation of a virtual target at the calibrated virtual target position overlaid on the real-time intraprocedural images.
  11. The method of claim 1, wherein facilitating the guidance comprises: determining a distance from an intervention tool to the physical target; and generating control signals to guide the intervention tool based at least in part on the distance.
  12. (canceled)
  13. The method of claim 1, wherein the real-time imaging device comprises an endoscope for obtaining visible light images and wherein the real-time imaging device further includes a fiberscope for obtaining near-infrared fluorescent images, wherein registering the real-time intraprocedural images to the preprocedural images comprises: registering the visible light images from the endoscope to the preprocedural images; and registering the near-infrared fluorescent images from the fiberscope to the visible light images from the endoscope.
  14. (canceled)
  15. The method of claim 13, wherein the fiberscope has a substantially fixed relative position to the endoscope, and wherein registering the near-infrared fluorescent images to the visible light images is based at least in part on the substantially fixed relative position.
  16. The method of claim 13, wherein the fiberscope is positioned within a field of view of the endoscope, and wherein registering the near-infrared fluorescent images to the visible light images is based at least in part on detecting the fiberscope in the visible light images or the near-infrared fluorescent images.
  17. (canceled)
  18. The method of claim 13, wherein the fiberscope is configured to pass through a working channel of the endoscope.
  19. The method of claim 1, wherein obtaining the real-time intraprocedural images of the anatomy further comprises: obtaining a time-of-flight of near-infrared fluorescent light projected by a fiberscope to determine a depth component of near-infrared fluorescent images.
  20. The method of claim 1, wherein determining the calibrated virtual target position corresponding to the detected position of the physical target comprises: emitting, by a fiberscope of the real-time imaging device, near-infrared light of one or more wavelengths; detecting, by the fiberscope, reflectance properties of tissue in a path of the near-infrared light; and discerning when the physical target is in the path of the near-infrared light based at least in part on the reflectance properties.
  21. The method of claim 20, further comprising: determining the detected position of the physical target based at least in part on a time-of-flight of the near-infrared light between the fiberscope and a surface of the physical target.
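Claims 19-21 describe estimating a depth component from the time-of-flight of near-infrared light between the fiberscope and the target surface. As a rough illustration only (not the claimed implementation), the one-way distance is half the round-trip travel time scaled by the propagation speed in the medium; the function name and the tissue refractive-index value below are illustrative assumptions:

```python
# Hypothetical sketch of a time-of-flight depth estimate. The refractive
# index of soft tissue (~1.37) is an assumed illustrative value.

SPEED_OF_LIGHT_M_PER_S = 2.99792458e8

def tof_distance_m(round_trip_time_s: float, refractive_index: float = 1.37) -> float:
    """Estimate fiberscope-to-target distance from a NIR round-trip time.

    The light travels to the target surface and back, so the one-way
    distance is half the round-trip path, using the speed of light
    divided by the medium's refractive index.
    """
    speed_in_medium = SPEED_OF_LIGHT_M_PER_S / refractive_index
    return speed_in_medium * round_trip_time_s / 2.0
```

For example, a 1 ns round trip in vacuum corresponds to roughly 15 cm of one-way distance; in tissue the estimate shrinks by the refractive index.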

Description

BACKGROUND

Technical Field

The described embodiments relate to a system and method for precision guidance of medical tools to an anatomical target.

Description of Related Art

Successful robotic-guided diagnosis and therapy depend on the ability to obtain precise anatomical positions of a tumor or other anatomical target. Typically, such systems rely on identification of the target position in preprocedural images (e.g., computerized tomography (CT) scan images) obtained from a preprocedural imaging system. The preprocedural images are then registered to the real-time anatomy of the patient during the procedure based at least in part on anatomical landmarks, electromagnetic sensors, or other techniques. However, this registration process may suffer from inaccuracies due to anatomical changes in the patient between the preprocedural image scan and the procedure, noise in the electromagnetic field affecting the electromagnetic sensors, or other factors. Such inaccuracies complicate the procedure and may reduce the likelihood of success.

SUMMARY

A medical targeting system performs calibrated guidance to a physical target. Preprocedural images representing an anatomy are obtained. An initial virtual target position of the physical target in the preprocedural images is determined. Real-time intraprocedural images of the anatomy are furthermore obtained from a real-time imaging device. The real-time intraprocedural images of the anatomy are registered to the preprocedural images to generate a mapping between coordinates of the real-time intraprocedural images and coordinates of the preprocedural images. The real-time imaging device is electronically guided to a vicinity of the initial virtual target position using the mapping. A position of the physical target is obtained, and a calibrated virtual target position corresponding to the obtained position of the physical target is determined.
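The registration step summarized above, which generates a mapping between intraprocedural and preprocedural coordinates, can be illustrated with a deliberately simplified sketch. A real system would use full rigid or deformable image registration; here only a translation is estimated, from the centroids of matched anatomical landmarks, and all names are illustrative assumptions:

```python
# Simplified stand-in for image registration: estimate a translation that
# maps preprocedural coordinates to intraprocedural coordinates from
# matched landmark pairs, then apply it to a virtual target position.

from typing import List, Tuple

Point = Tuple[float, float, float]

def estimate_translation(pre_landmarks: List[Point],
                         intra_landmarks: List[Point]) -> Point:
    """Translation mapping preprocedural to intraprocedural coordinates,
    estimated as the difference of the two landmark centroids."""
    n = len(pre_landmarks)
    cp = [sum(p[i] for p in pre_landmarks) / n for i in range(3)]
    ci = [sum(p[i] for p in intra_landmarks) / n for i in range(3)]
    return (ci[0] - cp[0], ci[1] - cp[1], ci[2] - cp[2])

def map_to_intraprocedural(point: Point, t: Point) -> Point:
    """Apply the mapping to a virtual target from the preprocedural images."""
    return (point[0] + t[0], point[1] + t[1], point[2] + t[2])
```

With the mapping in hand, the initial virtual target position from the preprocedural images can be expressed in intraprocedural coordinates to guide the imaging device toward its vicinity.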
Guidance of a medical tool to the physical target is facilitated based at least in part on the calibrated virtual target position.

In an embodiment, determining the calibrated virtual target position comprises presenting, to a display device, a representation of a virtual target at the initial virtual target position overlaid on the real-time intraprocedural images, receiving at least one user input for repositioning the virtual target and updating the display device to indicate the repositioning, and determining the position of the physical target responsive to detecting alignment between a repositioned virtual target and the physical target. In an embodiment, the at least one user input comprises at least one of a first input comprising a first adjustment of the virtual target along a first axis, a second input comprising a second adjustment of the virtual target along a second axis, and a third input comprising a third adjustment of the virtual target along a third axis.

In an embodiment, determining the calibrated virtual target position comprises generating a representation of a virtual target at the initial virtual target position overlaid on the real-time intraprocedural images, automatically generating adjustments to the initial virtual target position and applying an image analysis until the virtual target substantially overlaps the physical target, and determining the position of the physical target based at least in part on the adjustments.

In an embodiment, determining the calibrated virtual target position comprises receiving user inputs to control position of the real-time imaging device to capture at least two images of the physical target from different viewpoints, and determining the position of the physical target based at least in part on the at least two images. In an embodiment, determining the position of the physical target comprises receiving user inputs marking the physical target in the at least two images.
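The user-driven calibration described above, in which the virtual target is nudged along three axes until it overlaps the physical target in the real-time view, can be sketched as accumulating the user's per-axis adjustments onto the initial position. The function and data shapes below are illustrative assumptions, not the claimed implementation:

```python
# Sketch of calibration by user repositioning: each user input is an
# (axis, delta) nudge along one of the three image axes; the accumulated
# nudges applied to the initial virtual target yield the calibrated position.

from typing import Iterable, Tuple

Point = Tuple[float, float, float]

def calibrate_virtual_target(initial: Point,
                             adjustments: Iterable[Tuple[int, float]]) -> Point:
    """Apply a sequence of (axis, delta) adjustments to the initial
    virtual target position; axis is 0, 1, or 2."""
    pos = list(initial)
    for axis, delta in adjustments:
        pos[axis] += delta
    return (pos[0], pos[1], pos[2])
```

Once the repositioned virtual target aligns with the physical target on screen, the accumulated offset doubles as a correction that can be carried over to guide the intervention tool.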
In an embodiment, determining the position of the physical target comprises automatically identifying the physical target in the at least two images using image analysis.

In an embodiment, facilitating the guidance comprises generating electronic navigation guidance for an electronically assisted intervention tool to the calibrated virtual target position.

In an embodiment, facilitating the guidance comprises displaying, on a display screen, the preprocedural images and a representation of a virtual target at the calibrated virtual target position overlaid on the preprocedural images and displaying, on the display screen, a position of an intervention tool overlaid on the preprocedural images to enable guidance of the intervention tool to the calibrated virtual target position.

In an embodiment, facilitating the guidance comprises displaying, on a display screen, the real-time intraprocedural images and a representation of a virtual target at the calibrated virtual target position overlaid on the real-time intraprocedural images.

In an embodiment, facilitating the guidance comprises determi