EP-3602567-B1 - VIRTUAL REALITY TRAINING, SIMULATION, AND COLLABORATION IN A ROBOTIC SURGICAL SYSTEM

Inventors

  • GARCIA KILROY, Pablo Eduardo
  • JOHNSON, Eric Mark
  • SIU, Bernard Fai Kin
  • YU, Haoran

Dates

Publication Date
2026-05-06
Application Date
2018-06-28

Claims (15)

  1. A virtual reality system (200) for visualizing a virtual robotic surgery, comprising: a processor (210) configured to generate a virtual operating room comprising one or more virtual robotic arms mounted on a virtual operating table, one or more virtual surgical instruments each coupled to a distal end of a virtual robotic arm, and a virtual patient on top of the virtual operating table; and a handheld device (230) communicatively coupled to the processor and configured to a) select a number and locations of ports for entry of the virtual surgical instruments, b) determine a number and positions and orientation of the virtual robotic arms for the virtual surgery, and c) manipulate the virtual robotic arms and the virtual surgical instruments to perform a virtual surgery to the virtual patient; wherein the processor is configured to render the virtual surgery to the virtual patient in the virtual operating room on a display (240).
  2. The system of claim 1, wherein generating the virtual operating room is based on predetermined models for the virtual operating room, the virtual robotic arms, the virtual operating table, the virtual surgical instruments, and the virtual patient.
  3. The system of claim 2, wherein each of the one or more virtual surgical instruments passes through a virtual cannula and has a distal end positioned within the abdomen of the virtual patient.
  4. The system of claim 1, wherein, based on user input through the handheld device, a portal (910) is created at a location in the virtual operating room, the portal allowing a quick navigation to the location upon selection of the portal.
  5. The system of claim 4, wherein the portal is positioned inside or outside the virtual patient.
  6. The system of claim 1, wherein the virtual surgical instruments comprise a virtual endoscope having a virtual camera positioned within the abdomen of the virtual patient and providing a view of a surgical workspace within the abdomen of the virtual patient.
  7. The system of claim 6, wherein the processor is configured to render the view of the surgical workspace from the virtual endoscope on the display.
  8. The system of claim 6, wherein, based on user input through the handheld device, the virtual endoscope and other virtual surgical instruments are moved coordinately to another region of the abdomen of the virtual patient in a coordinated relocation mode.
  9. The system of claim 8, wherein in the coordinated relocation mode, the virtual camera zooms out along an axis of the virtual endoscope to include the other regions of the abdomen in the view of the surgical workspace.
  10. A computer-implemented method for visualizing a virtual robotic surgery, comprising: generating a virtual operating room comprising one or more virtual robotic arms mounted on a virtual operating table, one or more virtual surgical instruments each coupled to a distal end of a virtual robotic arm, and a virtual patient positioned on top of the virtual operating table; receiving user input from a handheld device configured to a) select a number and locations of ports for entry of the virtual surgical instruments, b) determine a number and positions and orientation of the virtual robotic arms for the virtual surgery, and c) manipulate the virtual robotic arms and the virtual surgical instruments for performing the virtual surgery to the virtual patient; and rendering the virtual surgery to the virtual patient in the virtual operating room on a display.
  11. The method of claim 10, wherein generating the virtual operating room is based on predetermined models for the virtual operating room, the virtual robotic arms, the virtual operating table, the virtual surgical instruments, the virtual cannula, and the virtual patient.
  12. The method of claim 11, wherein each of the one or more virtual surgical instruments passes through a virtual cannula and has a distal end positioned within the abdomen of the virtual patient.
  13. The method of claim 10, further comprising creating a portal at a location inside or outside the virtual patient in the virtual operating room, the portal allowing a quick navigation to the location upon selection of the portal.
  14. The method of claim 10, further comprising moving the one or more virtual surgical instruments, including a virtual endoscope and other virtual instruments, coordinately to another region of the abdomen of the virtual patient in a coordinated relocation mode.
  15. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method of any of claims 10 to 14.
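The system of claim 1 can be illustrated with a minimal data-model sketch. This is a hypothetical illustration only (all class and method names are invented for exposition, not taken from the patent): a virtual operating room holds virtual robotic arms, instruments, and ports, and exposes the three handheld-device operations of claim 1 plus the portal creation and quick navigation of claims 4-5.

```python
# Hypothetical sketch of the claimed system's data model; names are
# illustrative, not from the patent or any real implementation.
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # simple 3D coordinate

@dataclass
class VirtualRoboticArm:
    position: Vec3       # where the arm is mounted on the virtual table
    orientation: Vec3    # arm orientation for the virtual surgery

@dataclass
class VirtualInstrument:
    kind: str            # e.g. "endoscope", "grasper"
    arm_index: int       # arm whose distal end carries this instrument

@dataclass
class Portal:
    location: Vec3       # inside or outside the virtual patient (claims 4-5)

@dataclass
class VirtualOperatingRoom:
    arms: List[VirtualRoboticArm] = field(default_factory=list)
    instruments: List[VirtualInstrument] = field(default_factory=list)
    ports: List[Vec3] = field(default_factory=list)
    portals: List[Portal] = field(default_factory=list)

    # Handheld-device operation (a): select number and locations of ports.
    def select_ports(self, locations: List[Vec3]) -> None:
        self.ports = list(locations)

    # Handheld-device operation (b): determine number, positions, and
    # orientations of the virtual robotic arms.
    def place_arms(self, poses: List[Tuple[Vec3, Vec3]]) -> None:
        self.arms = [VirtualRoboticArm(p, o) for p, o in poses]

    # Claim 4: create a portal at a location in the virtual operating room.
    def create_portal(self, location: Vec3) -> Portal:
        portal = Portal(location)
        self.portals.append(portal)
        return portal

    # Quick navigation: selecting a portal yields its viewpoint location.
    def navigate_to(self, portal: Portal) -> Vec3:
        return portal.location
```

For example, selecting two ports and a portal outside the virtual patient, then navigating, would return the portal's location as the new viewpoint.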

Description

TECHNICAL FIELD

This invention relates generally to the field of robotic surgery, and more specifically to new and useful systems and methods for providing virtual robotic surgical environments.

BACKGROUND

Minimally-invasive surgery (MIS), such as laparoscopic surgery, involves techniques intended to reduce tissue damage during a surgical procedure. For example, laparoscopic procedures typically involve creating a number of small incisions in the patient (e.g., in the abdomen) and introducing one or more surgical instruments (e.g., an end effector, at least one camera, etc.) through the incisions into the patient. The surgical procedure may then be performed using the introduced surgical instruments, with visualization aid provided by the camera. Generally, MIS provides multiple benefits, such as reduced patient scarring, less patient pain, shorter patient recovery periods, and lower medical treatment costs associated with patient recovery.

In some embodiments, MIS may be performed with robotic systems that include one or more robotic arms for manipulating surgical instruments based on commands from an operator. A robotic arm may, for example, support at its distal end various devices such as surgical end effectors, imaging devices, and cannulae for providing access to the patient's body cavity and organs.

Robotic surgical systems are generally complex systems performing complex procedures. Accordingly, a user (e.g., a surgeon) generally requires significant training and experience to successfully operate a robotic surgical system. Such training and experience is advantageous for effectively planning the specifics of MIS procedures (e.g., determining the optimal number, location, and orientation of robotic arms; the optimal number and location of incisions; the optimal types and sizes of surgical instruments; and the order of actions in a procedure).

Additionally, the design process of robotic surgical systems may also be complicated. For example, improvements in hardware (e.g., robotic arms) are prototyped as physical embodiments and physically tested. Improvements in software (e.g., control algorithms for robotic arms) may also require physical embodiments. Such cyclical prototyping and testing is generally cumulatively expensive and time-consuming.

US 2016/314717 A1 describes a telerobotic surgery system for remote surgeon training that includes a robotic surgery station at a first location in a first structure at a first geographic point. The robotic surgery station holds harvested animal tissue with at least one animating device coupled thereto. A remote surgeon trainee station at a second location in a second structure at a second geographic point is remote from the first geographic point. A remote surgeon instructor station is also included. A communications network couples the stations.

SUMMARY

The underlying invention is defined by the appended claims. Generally, a virtual reality system for providing a virtual robotic surgical environment may include a virtual reality processor (e.g., a processor in a computer implementing instructions stored in memory) for generating the virtual robotic surgical environment, a head-mounted display wearable by a user, and one or more handheld controllers manipulable by the user for interacting with the virtual robotic surgical environment. The virtual reality processor may, in some variations, be configured to generate the virtual robotic surgical environment based on at least one predetermined configuration file describing a virtual component (e.g., a virtual robotic component) in the virtual environment. The head-mounted display may include an immersive display for displaying the virtual robotic surgical environment to the user (e.g., with a first-person perspective view of the virtual environment).
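The summary states only that environment generation may be driven by "at least one predetermined configuration file"; the following sketch shows one plausible way that could look. The JSON schema and all field names here are invented for illustration and are not specified by the patent:

```python
# Hypothetical configuration-driven environment generation; the JSON schema
# and field names are illustrative assumptions, not from the patent.
import json

CONFIG = """
{
  "operating_table": {"height_m": 0.9},
  "robotic_arms": [
    {"mount": "table_left",  "orientation_deg": [0, 45, 0]},
    {"mount": "table_right", "orientation_deg": [0, -45, 0]}
  ],
  "instruments": [
    {"type": "endoscope", "arm": 0},
    {"type": "grasper",   "arm": 1}
  ]
}
"""

def generate_environment(config_text: str) -> dict:
    """Instantiate virtual components from a predetermined configuration file."""
    cfg = json.loads(config_text)
    return {
        "table": cfg["operating_table"],
        "arms": cfg["robotic_arms"],          # one entry per virtual robotic arm
        "instruments": cfg["instruments"],    # each coupled to an arm's distal end
    }

env = generate_environment(CONFIG)
```

Keeping the room layout in a configuration file rather than in code would let the same generator render different operating-room setups, which fits the training and simulation use cases described below.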
In some variations, the virtual reality system may additionally or alternatively include an external display for displaying the virtual robotic surgical environment. The immersive display and the external display, if both are present, may be synchronized to show the same or similar content.

The virtual reality system may be configured to generate a virtual robotic surgical environment within which a user may navigate around a virtual operating room and interact with virtual objects via the head-mounted display and/or handheld controllers. The virtual reality system (and variations thereof, as further described herein) may serve as a useful tool with respect to robotic surgery, in applications including but not limited to training, simulation, and/or collaboration among multiple persons.

In some variations, a virtual reality system may interface with a real or actual (non-virtual) operating room. The virtual reality system may enable visualization of a robotic surgical environment, and may include a virtual reality processor configured to generate a virtual robotic surgical environment comprising at least one v