EP-4110158-B1 - SYSTEMS AND METHODS FOR OBJECT MEASUREMENT IN MINIMALLY INVASIVE ROBOTIC SURGERY

EP 4110158 B1

Inventors

  • BONN, Kenlyn S.
  • SNOW, Joshua R.
  • BAGROSKY, Tyler J.
  • MAZZARO, Luciano
  • MUCILLI, Jason M.
  • KINGSLEY, Dylan R.
  • RAINIERI, Christopher V.
  • LENNARTZ, Amanda H.
  • LINHARES VIEIRA, Alexandre
  • HOWE, Kent L.
  • SLATER, Dustin O.
  • ROGERS, Justin
  • FAGAN, James R.
  • DHIMAN, Anjali
  • LINEBARGER, John W.
  • ROBERTSON, Rayne
  • MALANG, Keith W.
  • SAVARY, Matthew

Dates

Publication Date
2026-05-06
Application Date
2021-02-16

Claims (7)

  1. A system (10) for measuring an object in a surgical site, comprising:
     an imaging device (210) configured to capture an image of an object within a surgical operative site;
     an imaging device control unit (250) configured to control the imaging device, the imaging device control unit (250) including:
     a processor (252); and
     a memory (254) storing instructions which, when executed by the processor (252), cause the system to:
     capture (302) an image of an object within a surgical operative site via the imaging device;
     determine (306) a size of the object based on the captured image of the object;
     display (308) the captured image of the object; and
     display (310), on the displayed captured image of the object, a representation of the determined size of the object;
     wherein the processor (252) is further configured to determine the size of the object based on:
     a geometry of a surgical instrument captured in the image;
     a depth of each of a plurality of pixels in the captured image;
     a first location of the surgical instrument in a first frame of the captured image;
     a second location of the surgical instrument in a second frame of the captured image; and
     a difference between the first location and the second location.
  2. The system (10) of any preceding claim, wherein the processor (252) is configured to provide at least one of a visual warning or an audio warning when the object is disposed outside of a field of view of the captured image.
  3. The system (10) of claim 2, wherein the object is the surgical instrument.
  4. The system (10) of any preceding claim, wherein the object is the surgical instrument and the processor (252) is configured to disable the surgical instrument when the surgical instrument is outside of a field of view of the captured image.
  5. The system (10) of claim 4, wherein the instructions, when executed, further cause the system to highlight the surgical instrument on the display when the surgical instrument is in the field of view of the captured image.
  6. A non-transitory storage medium that stores a program causing a computer to execute a computer-implemented method for measuring an object in a surgical site, comprising:
     capturing an image of an object within a surgical operative site via an imaging device;
     determining a size of the object based on the captured image of the object;
     displaying the captured image of the object; and
     displaying, on the displayed captured image of the object, a representation of the determined size of the object;
     further comprising determining the size of the object based on:
     a geometry of a surgical instrument captured in the image;
     a depth of each of a plurality of pixels in the captured image;
     a first location of the surgical instrument in a first frame of the captured image;
     a second location of the surgical instrument in a second frame of the captured image; and
     a difference between the first location and the second location.
  7. The non-transitory storage medium of claim 6, wherein the computer-implemented method further comprises providing at least one of a visual warning, an audio warning, or a tactile warning when the object is disposed outside of a field of view of the captured image.
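The measurement technique recited in claims 1 and 6 relies in part on the known geometry of a surgical instrument visible in the image. A minimal sketch of that idea, in Python, uses the instrument's known physical dimension as an in-image scale reference; all function names and values below are illustrative assumptions, not part of the claimed system:

```python
# Hypothetical sketch: using a surgical instrument of known geometry as an
# in-image scale reference to estimate an object's real-world size.
# Names and values are illustrative only.

def estimate_object_size_mm(instrument_length_mm: float,
                            instrument_length_px: float,
                            object_length_px: float) -> float:
    """Scale the object's pixel extent by the instrument-derived mm/px ratio."""
    mm_per_px = instrument_length_mm / instrument_length_px
    return object_length_px * mm_per_px

# Example: a 20 mm instrument jaw spans 100 px; the object spans 250 px.
size_mm = estimate_object_size_mm(20.0, 100.0, 250.0)  # -> 50.0 mm
```

This assumes the instrument and object lie at roughly the same depth in the scene; the claims additionally use per-pixel depth and the instrument's displacement between frames to refine the estimate.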

Description

FIELD

The disclosure relates to robotics, and more specifically to robotic surgical devices, assemblies, and/or systems for performing endoscopic surgical procedures. Provided herein is a system and a non-transitory storage medium storing steps of a computer-implemented method for measuring an object in a surgical site.

BACKGROUND

Endoscopic instruments have become widely used by surgeons in endoscopic surgical procedures because they enable surgery to be less invasive as compared to conventional open surgical procedures in which the surgeon is required to cut open large areas of body tissue. As a direct result thereof, endoscopic surgery minimizes trauma to the patient and reduces patient recovery time and hospital costs. Some endoscopic instruments incorporate rotation and/or articulation features, thus enabling rotation and/or articulation of an end effector assembly of the endoscopic surgical instrument, disposed within the surgical site, relative to a handle assembly of the endoscopic surgical instrument, which remains externally disposed, to better position the end effector assembly for performing a surgical task within the surgical site. An endoscopic camera communicating with an operating room display is also often utilized in endoscopic surgery to enable the surgeon to visualize the surgical site as the end effector assembly is maneuvered into position and operated to perform the desired surgical task.

US2019/175009A1 describes an object sizing system for an object positioned within a patient. US2019/328349A1 describes stone identification methods and systems. US2019/192237A1 describes a medical system for use in a lithotripsy procedure. US2019/058836A1 describes an apparatus and method for composing objects using a depth map. WO2015/149041A1 describes quantitative three-dimensional visualization of instruments in a field of view.

SUMMARY OF THE INVENTION

The invention is defined by the appended independent claims.
Optional features are set out in the appended dependent claims.

SUMMARY OF THE DISCLOSURE

The disclosure relates to devices, systems, and methods for surgical instrument identification in images. In accordance with aspects of the disclosure, a system for measuring an object in a surgical site is presented. The system includes an imaging device and an imaging device control unit. The imaging device control unit includes a processor and a memory storing instructions. The instructions, when executed by the processor, cause the system to: capture an image of an object within a surgical operative site via the imaging device; determine a size of the object based on the captured image of the object; display the captured image of the object; and display, on the displayed captured image of the object, a representation of the determined size of the object.

In an aspect of the present disclosure, the processor may be configured to determine the size of the object based on a depth of each of a plurality of pixels in the captured image, a focal length of the imaging device, and a field of view of the imaging device.

In another aspect of the present disclosure, the processor may be further configured to determine the size of the object based on: a geometry of a surgical instrument captured in the image; a depth of each of a plurality of pixels in the captured image; a first location of the surgical instrument in a first frame of the captured image; a second location of the surgical instrument in a second frame of the captured image; and a difference between the first location and the second location.

In an aspect of the present disclosure, the processor may be configured to determine the size of the object by providing, as input to a trained neural network stored in the memory: a depth of each of a plurality of pixels in the captured image; a focal length of the imaging device; and a field of view of the imaging device.
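The depth-based aspect above (per-pixel depth, focal length, and field of view) corresponds to the standard pinhole-camera relationship between pixel extent and physical size. A minimal sketch under that assumption, with illustrative values not drawn from the patent:

```python
# Hypothetical pinhole-camera sketch: physical size from pixel extent,
# scene depth, and focal length; focal length derived from image width
# and horizontal field of view. Values are illustrative only.
import math

def focal_px_from_fov(image_width_px: float, hfov_deg: float) -> float:
    """Focal length in pixels from image width and horizontal field of view."""
    return (image_width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

def size_from_depth(extent_px: float, depth_mm: float, focal_px: float) -> float:
    """Pinhole model: physical size = pixel extent * depth / focal length."""
    return extent_px * depth_mm / focal_px

f = focal_px_from_fov(1920, 90.0)          # ~960 px for a 90-degree HFOV
size_mm = size_from_depth(120, 80.0, f)    # object spanning 120 px at 80 mm depth
```

In practice the per-pixel depth map (e.g. from stereo imaging or structured light, as discussed below) supplies `depth_mm` for the pixels covering the object, so size can vary across the object's surface rather than being a single scalar.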
In yet another aspect of the present disclosure, the system may further include a light source configured to emit structured light within the surgical operative site.

In a further aspect of the present disclosure, the instructions, when executed, may further cause the system to: capture an image of a surgical instrument within the surgical operative site by the imaging device; and determine a location of the surgical instrument within a field of view of the captured image of the surgical instrument based on the structured light.

In yet a further aspect of the present disclosure, the instructions, when executed, may further cause the system to: re-center the imaging device based on the determined location of the surgical instrument; generate a re-centered image based on the re-centered imaging device; and display the re-centered image on the display.

In an aspect of the present disclosure, the processor may be further configured to provide a visual warning and/or audio warning when the object is disposed outside of a field of view of the captured image.

In a further aspect of the present disclosure, the object may be a surgical instrument. In yet another aspect