
EP-4225194-B1 - DEVICE AND SYSTEM FOR IMAGE GENERATION BASED ON CALCULATED ROBOTIC ARM POSITIONS

EP 4225194 B1

Inventors

  • WEISS, Noam
  • SHMAYAHU, Yizhaq

Dates

Publication Date
2026-05-06
Application Date
2021-07-15

Claims (10)

  1. A device (102) for obtaining time of flight images based on a surgical plan comprising: at least one processor (104); and at least one memory (106) storing instructions for execution by the at least one processor (104) that, when executed, cause the at least one processor (104) to: receive a surgical plan; determine, based on the surgical plan, a first path for a first robotic arm and a second path for a second robotic arm; cause the first robotic arm to move on the first path, the first robotic arm holding a transducer; cause the second robotic arm to move on the second path, the second robotic arm holding a receiver; and receive at least one image from the receiver, the image depicting patient anatomy and generated using time-of-flight measurements.
  2. The device (102) of claim 1, wherein the memory (106) stores further data for processing by the processor (104) that, when processed, causes the processor (104) to calculate a required pressure amplitude setting for the receiver.
  3. The device of claim 1, wherein the first robotic arm moves on the first path synchronously to the second robotic arm moving on the second path.
  4. The device of claim 1, wherein the at least one image is a three-dimensional model.
  5. The device of claim 1, wherein the surgical plan includes information about a region of interest of the patient anatomy, and further wherein determining the first path and the second path is based on the information.
  6. The device of claim 1, wherein the transducer is a first transducer, and wherein the memory (106) stores further data for processing by the processor (104) that, when processed, causes the processor (104) to: determine a third path for a third robotic arm; and cause the third robotic arm to move on the third path, the third robotic arm holding a second transducer, wherein the first transducer has an image setting different from the second transducer.
  7. The device of claim 6, wherein the image setting is at least one of a frequency and/or an amplitude.
  8. The device of claim 6, wherein the image is an elastographic image.
  9. The device of claim 1, wherein each of the first path and the second path is at least one of or a combination of an intra-corporeal path or an extra-corporeal path.
  10. A system (100) for obtaining time of flight images based on a surgical plan comprising: an imaging device (112) comprising a transducer and a receiver; a plurality of robotic arms (132), a first robotic arm of the plurality of robotic arms configured to hold the transducer and a second robotic arm of the plurality of robotic arms configured to hold the receiver; and a device (102) according to claim 1.
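The control flow recited in claims 1 and 3 — derive one path per arm from the surgical plan, then drive the transducer arm and the receiver arm synchronously — can be sketched roughly as below. All names (`SurgicalPlan`, `RoboticArm`, `plan_paths`) and the path geometry are illustrative assumptions for exposition only; they are not part of the claimed invention or any actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SurgicalPlan:
    # Region of interest as an (x, y, z) target in patient coordinates
    # (hypothetical representation of the plan information in claim 5).
    region_of_interest: tuple

@dataclass
class RoboticArm:
    name: str
    path: list = field(default_factory=list)

def plan_paths(plan, transducer_arm, receiver_arm):
    """Derive a path for each arm so the transducer and receiver face the
    region of interest from opposite sides (illustrative geometry only)."""
    x, y, z = plan.region_of_interest
    transducer_arm.path = [(x, y - 1.0, z), (x, y - 0.5, z)]
    receiver_arm.path = [(x, y + 1.0, z), (x, y + 0.5, z)]
    return transducer_arm.path, receiver_arm.path

def move_synchronously(arm_a, arm_b):
    """Step both arms through their paths in lock step, as in claim 3."""
    return list(zip(arm_a.path, arm_b.path))

plan = SurgicalPlan(region_of_interest=(0.0, 0.0, 0.0))
t_arm, r_arm = RoboticArm("transducer"), RoboticArm("receiver")
plan_paths(plan, t_arm, r_arm)
steps = move_synchronously(t_arm, r_arm)  # paired waypoints per time step
```

Pairing the waypoints per time step keeps the transducer and receiver geometrically aligned at every instant, which is what makes time-of-flight measurement between the two arms meaningful.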

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Non-Provisional Application No. 17/375,834, filed on July 14, 2021, and entitled "Method, Device, and System for Image Generation Based on Calculated Robotic Arm Positions", and U.S. Provisional Application No. 63/088,372, filed on October 6, 2020, and entitled "Method, Device, and System for Image Generation Based on Calculated Robotic Arm Positions".

FIELD

The present technology relates generally to obtaining images and, more particularly, to obtaining images based on calculated or known robotic arm positions.

BACKGROUND

Images obtained prior to and/or during a surgical operation may be used to perform registration prior to the surgical operation, to verify registration during the surgical operation, and/or to determine movement of an anatomical feature of the patient. The images may be obtained throughout the surgical operation and may require activation of an imaging device multiple times during the operation.

US 2020/015923 A1 discloses a surgical visualization system. The surgical visualization system is configured to identify one or more structure(s) and/or determine one or more distances with respect to obscuring tissue and/or the identified structure(s). The surgical visualization system can facilitate avoidance of the identified structure(s) by a surgical device. The surgical visualization system can comprise a first emitter configured to emit a plurality of tissue-penetrating light waves and a second emitter configured to emit structured light onto the surface of tissue. The surgical visualization system can also include an image sensor configured to detect reflected visible light, tissue-penetrating light, and/or structured light. The surgical visualization system can convey information to one or more clinicians regarding the position of one or more hidden identified structures and/or provide one or more proximity indicators.
According to US 2015/297177 A1, a robot-assisted ultrasound system includes a first ultrasound probe, a robot comprising a manipulator arm having a tool end, and a second ultrasound probe attached to the tool end of the manipulator arm. The robot-assisted ultrasound system further includes a robot control system configured to control at least one of a position or a pose of the second ultrasound probe based on a contemporaneous position and pose of the first ultrasound probe, and an ultrasound processing and display system configured to communicate with at least one of the first and second ultrasound probes to receive and display an ultrasound image based on the first and second ultrasound probes acting in conjunction with each other.

SUMMARY

The invention provides a device for obtaining time of flight images based on a surgical plan according to claim 1. Further embodiments are described in the dependent claims. Any methods disclosed herein do not form part of the claimed invention.

Embodiments of the present disclosure advantageously provide for obtaining images from known positions and orientations, such that the images may be used to update an existing 3D model and/or verify the accuracy of a registration throughout a surgical operation, so that up-to-date information is provided to a surgeon and/or a surgical robot. Further, embodiments of the present disclosure beneficially prevent harmful exposure of the patient to multiple iterations of radiation.

The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.

The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation.
For example, each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C" and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. When each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or a class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2), as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).

The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more", and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising", "including", and "having" can be used interchangeably. The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure.
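The summary above concerns images generated from time-of-flight measurements between a transducer held by one arm and a receiver held by another. As a rough illustration of the underlying relationship, a one-way acoustic path length is the product of the propagation speed and the measured flight time; the function name and the tissue sound-speed value below are illustrative assumptions, not values taken from the patent.

```python
# Typical speed of sound in soft tissue, in metres per second
# (an assumed reference value, not specified by the patent).
SPEED_OF_SOUND_TISSUE_M_S = 1540.0

def tof_to_distance(time_of_flight_s, c=SPEED_OF_SOUND_TISSUE_M_S):
    """One-way acoustic path length between transducer and receiver,
    given the measured time of flight in seconds."""
    return c * time_of_flight_s

# A pulse arriving 65 microseconds after emission has crossed
# roughly 0.1001 m (about 10 cm) of tissue.
d = tof_to_distance(65e-6)
```

With the two arms at known, planned positions, many such path-length measurements taken along the synchronized paths can be assembled into an image or three-dimensional model of the intervening anatomy.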