EP-3474285-B1 - WORKFLOW ASSISTANT FOR IMAGE GUIDED PROCEDURES
Inventors
- KOTIAN, FRANCOIS
- MC CARTHY, THOMAS
- SUREDA, FRANCISCO
- DESNOUX, VALERIE
Dates
- Publication Date: 2026-05-06
- Application Date: 2018-09-21
Claims (12)
- A workflow assistance system (50) for image guided surgical procedures, comprising: an interventional imaging system (20) that operates to move relative to a patient and to acquire interventional images during a surgical procedure; a surveillance system (70) arranged about a surgical suite, the surveillance system producing surveillance data; and a workflow controller that processes the surveillance data to identify and locate surveillance subjects, the surveillance subjects comprising personnel, the patient, the interventional imaging system, and medical equipment, within the surgical suite based upon the surveillance data and to provide operational commands to the interventional imaging system based upon analysis of the surveillance data, wherein the surveillance system comprises a plurality of cameras and at least one wearable device configured to be worn by a clinician and to provide an additional contextual component for interpreting clinician movements or actions, wherein the surveillance data comprises image data from the plurality of cameras and data from the wearable device, and wherein the workflow controller further receives static configuration data regarding the surgical suite, the medical equipment, the interventional imaging system, or the personnel and processes the surveillance data and the static configuration data to identify interactions between two or more surveillance subjects.
- The workflow assistance system (50) of claim 1, wherein the operational commands to the interventional imaging system are movement commands to avoid collisions between the interventional imaging system and the patient, personnel, and medical equipment in the surgical suite.
- The workflow assistance system (50) of any preceding claim, wherein the plurality of cameras comprises at least one pan-tilt-zoom camera, a depth camera, or a thermal camera.
- The workflow assistance system (50) of any preceding claim, wherein the surveillance system further comprises a plurality of microphones and the surveillance data comprises audio data.
- The workflow assistance system (50) of any preceding claim, wherein the workflow controller identifies locations and actions of the interventional imaging system and the personnel from the image data.
- The workflow assistance system (50) of any preceding claim, wherein the workflow controller comprises a learning engine that processes surveillance data from previous surgical procedures to interpret the surveillance data acquired during the surgical procedure.
- The workflow assistance system (50) of any preceding claim, wherein the workflow controller identifies locations and actions of the interventional imaging system and the clinicians from the surveillance data and identifies surgical procedure events from the identified locations and actions.
- The workflow assistance system (50) of claim 7, further comprising a link to a hospital information system, wherein the workflow controller automatically records a surgical event log of the identified surgical procedure events and identified actions.
- The workflow assistance system (50) of claim 8, wherein the hospital information system predicts a surgical procedure duration from the surgical event log.
- The workflow assistance system (50) of claim 9, wherein the hospital information system adjusts a schedule of use of the surgical suite based upon the predicted surgical procedure duration.
- The workflow assistance system (50) of any of claims 8-10, wherein the workflow controller obtains at least one procedural guideline from the hospital information system and compares the procedural guideline to the identified locations and actions of the clinicians and the identified surgical procedure events to evaluate the surgical procedure and create clinician guidance.
- The workflow assistance system (50) of any preceding claim, further comprising a wearable dose meter, wherein the workflow controller obtains radiation level data of the interventional imaging system, calculates a radiation dose received by the clinician, and produces guidance to the clinician regarding dose mitigation.
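The claims define the system functionally rather than by implementation. Purely as an illustrative sketch of the collision-avoidance behaviour recited in claim 2 (the class and function names, the point-position model, and the 0.5 m clearance threshold are assumptions made here for illustration, not taken from the patent), a workflow controller might gate imaging-system motion on the tracked positions of the surveillance subjects as follows:

```python
from dataclasses import dataclass
import math

@dataclass
class SurveillanceSubject:
    """A tracked entity in the surgical suite, reduced to a labelled 3D point."""
    label: str  # e.g. "patient", "clinician", "equipment", "gantry"
    x: float
    y: float
    z: float

def distance(a: SurveillanceSubject, b: SurveillanceSubject) -> float:
    """Euclidean distance between two tracked subjects, in metres."""
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))

def collision_command(gantry: SurveillanceSubject,
                      subjects: list[SurveillanceSubject],
                      clearance_m: float = 0.5) -> str:
    """Return an operational command for the interventional imaging system:
    'halt' if any other tracked subject lies inside the clearance envelope
    around the gantry, otherwise 'proceed'. The 0.5 m default is an
    assumed value, not one specified by the patent."""
    for s in subjects:
        if s.label != "gantry" and distance(gantry, s) < clearance_m:
            return "halt"
    return "proceed"

# Example: a clinician standing 0.3 m from the imaging gantry triggers a halt.
gantry = SurveillanceSubject("gantry", 0.0, 0.0, 1.0)
subjects = [
    SurveillanceSubject("patient", 2.0, 0.0, 1.0),
    SurveillanceSubject("clinician", 0.3, 0.0, 1.0),
]
print(collision_command(gantry, subjects))  # halt
```

In an actual embodiment the subject positions would be fused from the camera and wearable-device surveillance data, and the resulting command would feed the imaging system's motion controller; the sketch shows only the decision shape the claims describe: locate subjects, test a clearance envelope, emit an operational command.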
Description
The present disclosure relates generally to image guided procedures. More specifically, the present disclosure relates to automated workflow and procedure guidance in an image guided procedure.

Surgical suites are often complex and crowded environments in which multiple caregivers and multiple pieces of complex equipment surround a sedated and immobilized patient to perform the surgical care. This is particularly the case during minimally invasive surgical procedures, wherein sophisticated medical imaging devices and graphical displays are relied upon by the caregivers to monitor the progress and performance of the procedure without the need for a large incision into the patient's body for direct visual inspection and guidance.

Audio and video systems in a surgical suite have been known or suggested, although these have been limited in function and use. Video cameras in surgical suites have been used for external communication, for example for educational, training, or demonstration purposes. Video cameras have also been used for passive documentation of the procedure for reference in the event of a later claim of liability. Audio and video systems have further been used for intra-procedure video conferencing for consultation purposes. Audio-based command systems have further been suggested as a means of promoting sterility in the surgical suite by limiting physical interaction between care providers and medical devices.

Comparetti, M. et al., "Safe surgical robotic system and workflow design in the ACTIVE project for awake neurosurgery", 31 October 2012, describes that the Active Constraints Technologies for Ill-defined or Volatile Environments (ACTIVE) project is aimed at designing an integrated robotic system to support awake neurosurgery interventions. Three robots actively operate on the patient with a modular control providing three modes (autonomous, "hands-on", or tele-operated) depending on the particular surgical action and the surgeon's preferences.
Two robots are used as extensions of the surgeon's right and left hands, while the third robot acts as an active damper restraining the patient's head. The robot movements are actively and dynamically constrained to pre-defined areas that are updated intra-operatively according to the current surgical situation and to environmental sensors that are used in order to avoid collisions between the user and the robot arms.

Nicolai, P. et al., "A Novel 3D Camera Based Supervision System for Safe Human-Robot Interaction in the Operating Room", Journal of Automation and Control Engineering, vol. 3, 1 October 2015, pages 410-417, describes that, in anticipation of upcoming technological advances in the operating room, it is necessary to already give thought to how humans and robots can safely interact and cooperate in the operating room of the future. Described is a supervision system, consisting of seven 3D cameras, together with a corresponding shape-cropping algorithm, which allows verifying the correct setup of surgical robots, detecting potential collisions between robots and their surroundings, and monitoring the correctness of the robots' motions.

Mönnich, H. et al., "A Supervision System for Intuitive Usage of a Telemanipulated Surgical Robotic Setup", IEEE International Conference on Robotics and Biomimetics, 7 December 2011, describes a system that is able to track objects and humans. Described is a complete surgical robotic system that can be used for telemanipulation as well as for autonomous tasks, e.g. cutting or needle insertion. Two KUKA lightweight robots that feature seven degrees of freedom and allow variable stiffness and damping due to an integrated impedance controller are used as actuators. The system includes two haptic input devices providing haptic feedback in telemanipulation mode, as well as virtual fixtures to guide the surgeon even during telemanipulation.
The supervision system consists of a marker-based optical tracking system, Photonic Mixer Device (PMD) cameras, and RGB-D cameras (Microsoft Kinect). A simulation environment is constantly updated with the model of the environment, the model of the robots and tracked objects, the occupied space, and tracked models of humans.

However, improved surgical procedure performance, integration between medical devices and the clinician users of those devices, as well as management of personnel and resources within the hospital of which the surgical suite is a part, can be achieved with greater leverage and integration of surgical suite audio and video resources. The invention to which this European patent relates is defined by the appended claims.

In an exemplary embodiment of a workflow assistance system for image guided surgical procedures, an interventional imaging system operates to move relative to a patient. The interventional imaging system further operates to acquire interventional images of the patient during a surgical procedure. A surveillance system is arranged about a surgical suite. The s