EP-3648663-B1 - SURGICAL INSTRUMENT WITH REAL TIME NAVIGATION ASSISTANCE
Inventors
- MCGINLEY, Joseph C.
- SWARTZ, Michael Andrew
- STOUT, Thomas Edward
- ZELINA, Mike
- STONER, Collin T.
Dates
- Publication Date
- 20260513
- Application Date
- 20181002
Claims (14)
- A navigation assistance system (100) for use with a surgical instrument (136), comprising: a sensor configuration comprising a plurality of sensors, at least one sensor of the plurality of sensors being a three-dimensional imager and infrared time-of-flight sensor (120), the sensor configuration for locating on the surgical instrument (136) and operable to be disposed relative to a patient on which an operation is to be performed, wherein the sensor configuration is operative to continually detect at least one reference anatomical feature (142) of the patient adjacent to the surgical instrument (136) to generate positioning data regarding a location of the surgical instrument (136) relative to the at least one reference anatomical feature (142), the positioning data comprising image field depth and surgical site image data regarding the at least one reference anatomical feature (142) adjacent to the location of the surgical instrument (136), wherein the sensor configuration is configured to be disposed at the surgical instrument (136) to have a collective observable field (122) that extends entirely about the surgical instrument (136), wherein the sensor configuration is operable to determine a position of the instrument (136) relative to the patient; a data repository (150) comprising imaging data corresponding to an imaging study of the patient, wherein the imaging data includes at least one imaged anatomical feature (144) of the patient; and a navigation module (110) that is operative to continually correlate the at least one reference anatomical feature (142) of the patient from the positioning data with at least one corresponding reference anatomical feature appearing in the imaging data by aligning the at least one corresponding reference anatomical feature of the imaging data with the at least one reference anatomical feature (142) of the positioning data and continually provide navigation data corresponding to a relative position of the surgical 
instrument (136) with respect to the at least one imaged anatomical feature (144) of the patient for navigation assistance of the surgical instrument (136) in real-time during the operation.
- The navigation assistance system (100) according to claim 1, wherein the imaging data comprises three-dimensional data.
- The navigation assistance system (100) according to any one of claims 1-2, wherein the at least one reference anatomical feature (142) comprises a dimensionally stable structure.
- The navigation assistance system (100) according to any one of claims 1-3, wherein the at least one reference anatomical feature (142) comprises an internal anatomical feature.
- The navigation assistance system (100) according to claim 4, wherein the at least one reference anatomical feature (142) comprises a bone.
- The navigation assistance system (100) of any one of claims 1-5, wherein the at least one reference anatomical feature (142) comprises an external anatomical feature.
- The navigation assistance system (100) according to claim 6, wherein the at least one reference anatomical feature (142) comprises a contour of the skin of the patient.
- The navigation assistance system (100) according to any one of claims 1-7, wherein the at least one reference anatomical feature (142) comprises one or more of an arm, a leg, a hand, a foot, a finger, a toe, a head, a torso, a spine, a pelvis, or other dimensionally stable anatomical landmark detectable by the three-dimensional imager and infrared time-of-flight sensor (120).
- The navigation assistance system (100) according to any one of claims 1-8, wherein the at least one imaged anatomical feature (144) comprises at least one subcutaneous structure.
- The navigation assistance system (100) according to any one of claims 1-9, wherein the at least one imaged anatomical feature (144) comprises at least one of a bone, a blood vessel, or a nerve.
- The navigation assistance system (100) according to any one of claims 1-10, wherein the navigation data is operable to be displayed in relation to the imaging data.
- The navigation assistance system (100) according to claim 11, wherein the navigation data is further operable to be displayed in an augmented reality display (170) positioned relative to a user.
- The navigation assistance system (100) according to claim 12, wherein the navigation data is at least partially based on the position of the augmented reality display (170) relative to the patient.
- The navigation assistance system (100) according to any one of claims 1-13, wherein the navigation data comprises trajectory information regarding the surgical instrument (136) relative to the patient.
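The alignment recited in claim 1 — registering the reference anatomical features detected by the sensor configuration against the corresponding features in the pre-operative imaging data — reduces, for a dimensionally stable feature such as bone, to a rigid point-set registration problem. The sketch below illustrates one standard way to estimate such an alignment, the SVD-based Kabsch method. It is a minimal illustration only: the function names are hypothetical, and it assumes corresponding feature points have already been extracted from both the positioning data and the imaging data, which the claims do not specify.

```python
import numpy as np

def rigid_registration(sensed, imaged):
    """Estimate the rigid transform (R, t) that best maps the sensed
    reference-feature points onto the corresponding imaged points,
    using the SVD-based Kabsch least-squares method.
    sensed, imaged: (N, 3) arrays of corresponding 3-D points."""
    # Centroids of each point set
    c_s = sensed.mean(axis=0)
    c_i = imaged.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (sensed - c_s).T @ (imaged - c_i)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (determinant -1) in the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_i - R @ c_s
    return R, t

def locate_instrument(tip_in_sensor_frame, R, t):
    """Map an instrument point from the sensor frame into the
    imaging-study frame using the estimated transform."""
    return R @ tip_in_sensor_frame + t
```

In a continuously updating system of the kind claimed, such an estimate would be recomputed as new positioning data arrives, so that the instrument's position relative to the imaged anatomical features tracks the patient in real time.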
Description
FIELD
The present disclosure relates to systems for use in surgical operations to monitor a surgical instrument prior to or during an operation.
BACKGROUND
The use of powered surgical instruments is common in many surgical procedures. Examples of such instruments may include drills, saws, grinders, or the like that may be electric, pneumatic, hydraulic, or otherwise powered. Oftentimes, use of such powered surgical instruments may allow for more efficient surgical operations, thus resulting in reduced risk to the patient, improved efficiency for the surgeon, lower costs, and improved outcomes. However, while such powered surgical instruments may provide advantages over human-powered instruments, there may also be a risk of inadvertent damage to the anatomy of the patient when using powered instruments. Specifically, certain anatomical structures of a patient may be vulnerable to being inadvertently damaged by powered surgical instruments. In this regard, it is important for a surgeon to accurately determine the placement and location of the surgical instrument before and during an operation.
In addition, medical imaging technology has improved the ability of a surgeon to accurately image the anatomy of the patient. However, despite the advances in imaging technology, it remains difficult for a surgeon to utilize such imaging during an operation. For instance, a pre-surgical imaging study may be conducted and reviewed by a surgeon prior to an operation. This imaging study may provide the surgeon with valuable information regarding patient anatomy to assist the surgeon in planning the operation. However, beyond use in planning, the imaging study may provide little or no detail to the surgeon when actually performing a surgery using a powered instrument without subsequent imaging during the operation.
Requiring further imaging during an operation to assist with instrument placement potentially exposes the patient to increased radiation, requires additional time, and may increase the overall risk associated with the operation. As such, there remains a need to improve the ability to provide guidance data to a surgeon using a surgical instrument in an operation.
U.S. Patent Publication 2017/0231718 A1 to Wohrle et al. describes a medical system having an instrument configured to fit in a patient's anatomy, an optical sensor associated with the instrument, a processing unit receiving data from the optical sensor, and a display displaying the position of the instrument in the patient's anatomy based on the data from the optical sensor.
U.S. Patent Publication 2013/0237811 to Mihailescu et al. describes a system having an ultrasound transducer and a machine-vision camera system for registering the transducer's x, y, z position with respect to an object, such as a patient's body. The position and orientation are correlated with transducer scan data, and scans of the same region of the object are compared in order to reduce ultrasound artifacts and speckles. The system can be extended to medical instruments.
U.S. Patent Publication 2014/0200440 to Iannotti et al. describes a method for aligning a manipulable sensor assembly that includes determination of a replica surface, which represents a field of view of a manipulable sensor assembly associated with an object. Electromagnetic radiation and/or sound reflected from a surface of the region of interest is detected to provide a surface map of the region of interest, which is compared to the replica surface to determine a rotation and/or a translation for the manipulable sensor assembly to bring the surface map into alignment with the replica surface.
SUMMARY
In view of the foregoing, the invention is as defined in the appended claims.
The present disclosure relates to the use of sensors to detect the position of a surgical instrument relative to a patient to generate navigation data for use in operation of the surgical instrument. A plurality of sensors is provided relative to the surgical instrument that senses a position of the surgical instrument relative to the patient to provide positioning data. The positioning data provides information regarding the position and orientation of the surgical instrument relative to the patient. Specifically, the sensors are operative to sense one or more reference anatomical features of the patient that allow the relative position of the surgical instrument to the patient to be determined. For instance, the reference anatomical features of the patient may correspond to visible portions of the patient. Such visible portions may correspond to internal or external anatomy. Preferably, the reference anatomical feature is dimensionally stable such that the reference anatomical feature may be reliably and repeatably sensed by the sensors. Accordingly, imaging data corresponding to an imaging study of the patient may also be retrieved. The imaging data may be correlated with the positioning data. This allows a navigation module or the like to determine