EP-3858281-B1 - POSE MEASUREMENT CHAINING FOR EXTENDED REALITY SURGICAL NAVIGATION IN VISIBLE AND NEAR INFRARED SPECTRUMS

EP 3858281 B1

Inventors

  • CALLOWAY, Thomas

Dates

Publication Date
2026-05-06
Application Date
2021-01-28

Claims (10)

  1. A surgical system comprising: - a first tracking camera (N3), - a second tracking camera (N2); - a camera tracking system configured to - receive (2100) tracking information related to tracked objects (E, R) from the first tracking camera (N3) and the second tracking camera (N2) during a surgical procedure, - determine (2102) a first pose transform (T^N3_R) between a first object (R) coordinate system and the first tracking camera coordinate system based on first object tracking information from the first tracking camera (N3) which indicates pose of the first object (R), - determine (2104) a second pose transform (T^R_N2) between the first object (R) coordinate system and the second tracking camera coordinate system based on first object tracking information from the second tracking camera (N2) which indicates pose of the first object (R), - determine (2106) a third pose transform (T^N2_E) between a second object (E) coordinate system and the second tracking camera coordinate system based on second object tracking information from the second tracking camera (N2) which indicates pose of the second object (E), and - determine (2108) a fourth pose transform (T^N3_E) between the second object coordinate system and the first tracking camera coordinate system, characterized in that the determination of the fourth pose transform is based on combining the first, second, and third pose transforms (T^N3_R, T^R_N2, T^N2_E) so as to indirectly determine (2108) the fourth pose transform (T^N3_E) in a case where the second object (E) is at least partially hidden from the first tracking camera (N3).
  2. The surgical system of Claim 1, wherein the camera tracking system is further configured to determine (2110) the fourth pose transform (T^N3_E) between the second object coordinate system and the first tracking camera coordinate system without use of any tracking information from the first tracking camera (N3) indicating pose of the second object (E).
  3. The surgical system of Claim 1, wherein the camera tracking system is further configured to determine pose of the second object (E) in the first tracking camera coordinate system based on processing through the fourth pose transform (T^N3_E) the tracking information from the first tracking camera which indicates pose of the first object (R), based on processing through the fourth pose transform (T^N3_E) the tracking information from the second tracking camera (N2) which indicates pose of the first object (R), and based on processing through the fourth pose transform the tracking information from the second tracking camera which indicates pose of the second object (E).
  4. The surgical system of Claim 1, further comprising an extended reality (XR) headset (1200) including the first tracking camera and a see-through display screen, the XR headset (1200) being configured to be worn by a user during the surgical procedure, wherein the camera tracking system is further configured to: display on the see-through display screen of the XR headset an XR image having a pose that is determined based on the fourth pose transform (T^N3_E).
  5. The surgical system of Claim 4, wherein the camera tracking system is further configured to: generate the XR image as a graphical representation of the second object that is posed on the see-through display screen based on processing through the fourth pose transform the first object tracking information from the first and second tracking cameras and the second object tracking information from the second tracking camera.
  6. The surgical system of Claim 4, wherein the camera tracking system comprises a navigation controller communicatively connected to the first and second tracking cameras to receive the tracking information and configured to perform the determination of the first, second, third, and fourth pose transforms (T^N3_R, T^R_N2, T^N2_E, T^N3_E).
  7. The surgical system of Claim 1, wherein: - dynamic reference arrays (DRA) including spaced apart fiducials are attached to each of the first object (R), the second object (E), the first tracking camera (N3), and the second tracking camera (N2); and - the camera tracking system is further configured, while a first partially hidden fiducial (a, b, c, d) of the dynamic reference array attached to the second object (E) can be viewed by the second tracking camera (N2) but cannot be viewed by the first tracking camera (N3), to determine pose of the first partially hidden fiducial (a, b, c, d) relative to the fiducials attached to the first object (R) based on a combination of a determined pose of the first partially hidden fiducial (a, b, c, d) relative to the fiducials attached to the second tracking camera (N2) and a determined pose of the fiducials attached to the second tracking camera (N2) relative to the fiducials attached to the first object (R).
  8. The surgical system of Claim 7, wherein: the camera tracking system is further configured, while a second partially hidden fiducial (a, b, c, d) of the dynamic reference array attached to the second object (E) can be viewed by the second tracking camera (N2) but cannot be viewed by the first tracking camera (N3), to determine pose of the second partially hidden fiducial (a, b, c, d) relative to the fiducials attached to the first object (R) based on a combination of a determined pose of the second partially hidden fiducial (a, b, c, d) relative to the fiducials attached to the second tracking camera (N2) and a determined pose of the fiducials attached to the second tracking camera (N2) relative to the fiducials (a, b, c, d) attached to the first object (R).
  9. The surgical system of Claim 7, wherein: the camera tracking system is further configured, while a third partially hidden fiducial of the dynamic reference array attached to the second object cannot be viewed by the second tracking camera (N2) but can be viewed by the first tracking camera (N3), to determine pose of the third partially hidden fiducial relative to the fiducials attached to the first object (R) based on a combination of a determined pose of the third partially hidden fiducial relative to the fiducials attached to the first tracking camera (N3) and a determined pose of the fiducials attached to the first tracking camera (N3) relative to the fiducials attached to the first object (R).
  10. The surgical system of Claim 9, wherein the camera tracking system is further configured to process the determined pose of the first partially hidden fiducial relative to the fiducials attached to the first object (R) and to process the determined pose of the third partially hidden fiducial relative to the fiducials attached to the first object (R) through a pose recovery operation to determine pose of the second object (E) relative to the first object (R), and wherein the camera tracking system is preferably further configured, while a fourth partially hidden fiducial of the dynamic reference array attached to the second object (E) cannot be viewed by the second tracking camera (N2) but can be viewed by the first tracking camera (N3), to determine pose of the fourth partially hidden fiducial relative to the fiducials attached to the first object (R) based on a combination of a determined pose of the fourth partially hidden fiducial relative to the fiducials attached to the first tracking camera (N3) and a determined pose of the fiducials attached to the first tracking camera (N3) relative to the fiducials attached to the first object (R).
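The per-fiducial recovery recited in Claims 7-9 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the representation of poses as 4x4 homogeneous matrices (nested lists) and all function and variable names are assumptions for the example.

```python
# Illustrative sketch of Claim 7: a fiducial hidden from the first tracking
# camera (N3) but visible to the second tracking camera (N2) is located
# relative to the first object (R) by applying the transform from N2's
# coordinate system to R's coordinate system. All names are hypothetical.

def apply_transform(T, point):
    """Apply a 4x4 homogeneous transform (nested lists) to a 3D point."""
    x, y, z = point
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

def recover_hidden_fiducial(p_fiducial_in_N2, T_R_N2):
    """Position of a partially hidden fiducial relative to the first object (R),
    combining its observed position in N2's frame with the N2-to-R transform."""
    return apply_transform(T_R_N2, p_fiducial_in_N2)

# Example: if N2's frame is offset 5 units along x from R's frame,
# a fiducial observed at (1, 1, 1) in N2's frame lies at (6, 1, 1) in R's frame.
T_R_N2 = [[1, 0, 0, 5], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(recover_hidden_fiducial((1, 1, 1), T_R_N2))  # (6, 1, 1)
```

The same composition runs in the opposite direction for Claim 9, where a fiducial hidden from N2 but visible to N3 is mapped through N3's frame instead.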

Description

FIELD

The present disclosure relates to medical devices and systems, and more particularly, to computer assisted navigation in surgery.

BACKGROUND

Computer assisted navigation in surgery provides surgeons with enhanced visualization of surgical instruments with respect to radiographic images of the patient's anatomy. Navigated surgeries typically include components for tracking the position and orientation of surgical instruments via arrays of disks or spheres using a single near infrared (NIR) stereo camera setup. In this scenario, three parameters jointly compete for optimization: (1) accuracy, (2) robustness, and (3) ergonomics. Navigated surgery procedures using existing navigation systems are prone to events triggering intermittent pauses while personnel and/or objects obstruct the ability of a tracking component to track poses of the patient, the robot, and surgical instruments. There is a need to improve the tracking performance of navigation systems. US6006126A relates to a system for computer graphic determination and display of a patient's anatomy, as from CT or MR scanning, stored along with associated equipment in an object field including the patient's anatomy. A first digitizing camera structure produces a signal representative of its field-of-view which defines coordinates of index points in its field-of-view. A second digitizing camera structure produces similar output for an offset field-of-view. The two camera positions are defined with respect to the patient's anatomy so that the fields-of-view of the cameras include both the patient's anatomy and the equipment, but are taken from different directions. Index markers fix points in the fields-of-view and accordingly locate equipment relative to the patient anatomy. The index markers are provided by a variety of structures, including light sources in various forms such as reflectors, diodes, and laser scanner structures that provide a visible grid, mesh, or cloud of points.
SUMMARY

The present invention is solely defined in the appended claims. Various embodiments disclosed herein are directed to improvements in computer assisted navigation during surgery. One or more extended reality (XR) headsets can be equipped with tracking cameras that provide tracking information to a camera tracking system for combining with tracking information from other tracking cameras, which may be part of another XR headset, an auxiliary tracking bar, or other equipment. Through various pose chaining operations disclosed herein, the camera tracking system may be able to track tools and other objects with greater robustness and through a wide range of motion. In one embodiment, a surgical system includes a camera tracking system that is configured to receive tracking information related to tracked objects from a first tracking camera and a second tracking camera during a surgical procedure. The camera tracking system is configured to determine a first pose transform between a first object coordinate system and the first tracking camera coordinate system based on first object tracking information from the first tracking camera which indicates pose of the first object. The camera tracking system is configured to determine a second pose transform between the first object coordinate system and the second tracking camera coordinate system based on first object tracking information from the second tracking camera which indicates pose of the first object, and to determine a third pose transform between a second object coordinate system and the second tracking camera coordinate system based on second object tracking information from the second tracking camera which indicates pose of the second object. The camera tracking system is configured to determine a fourth pose transform between the second object coordinate system and the first tracking camera coordinate system based on combining the first, second, and third pose transforms.
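The four-transform chaining described above can be sketched with 4x4 homogeneous matrices: the fourth pose transform is the product of the three directly measured ones. This is a minimal illustration, not the patented implementation; the pure-Python matrix representation and all names are assumptions for the example.

```python
# Minimal sketch of the pose chaining in the summary: the fourth pose
# transform (second object E -> first camera N3) is obtained by composing
# the three directly measured transforms so that inner frames cancel:
#     T_N3_E = T_N3_R * T_R_N2 * T_N2_E
# Poses are 4x4 homogeneous matrices as nested lists; names are illustrative.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def chain_pose(T_N3_R, T_R_N2, T_N2_E):
    """Indirectly determine the fourth pose transform T_N3_E."""
    return mat_mul(mat_mul(T_N3_R, T_R_N2), T_N2_E)

def translation(x, y, z):
    """Pure-translation homogeneous transform, for the example below."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Pure translations compose additively, so the chained transform should
# carry a translation of (1, 2, 3):
T_N3_E = chain_pose(translation(1, 0, 0), translation(0, 2, 0),
                    translation(0, 0, 3))
print([row[3] for row in T_N3_E])  # [1, 2, 3, 1]
```

In practice each measured transform would come from stereo triangulation of a tracked dynamic reference array, and the same composition lets the system report the second object's pose to a camera that cannot currently see it.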
Related methods by a camera tracking system and related computer program products are disclosed. Other surgical systems, methods, and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such surgical systems, methods, and computer program products be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiments of inventive concepts. In the drawings: Figure 1 illustrates an embodiment of a surgical system according to some embodiments of the present disclosure;