
JP-7855578-B2 - Monitoring the user's visual gaze to control which display system shows primary information.

JP7855578B2

Inventors

  • Frederick E. Shelton IV
  • Jason L. Harris

Assignees

  • Cilag GmbH International

Dates

Publication Date
2026-05-08
Application Date
2021-09-29
Priority Date
2020-10-02

Claims (19)

  1. A surgical hub for displaying information on a display based on a user's visual focus, the surgical hub comprising a processor configured to: determine a display within the user's visual focus; determine a surgical task that will use a medical instrument during a medical procedure; determine, based on contextual data and the surgical task, display data relevant to the user; and send a message instructing the display to display the display data; wherein the display is a first display, the display data is first display data, and the message is a first message, and the processor is further configured to: determine that a second display is within the user's visual focus; determine, based on the surgical task, second display data from the contextual data, the second display data having a lower priority for the user than the first display data; and send a second message instructing the second display to display the second display data.
  2. The surgical hub of claim 1, wherein the processor is further configured to determine the user's visual focus using one or more of: wearable device data, sensor data associated with the user, an image from a camera in the operating room, and a video from the camera in the operating room.
  3. The surgical hub of claim 1 or 2, wherein the processor is further configured to: receive an image or a video from the camera; generate a geometric three-dimensional dataset from the image or video; determine, using the geometric three-dimensional dataset, one or more of the user's head orientation and the user's line of sight; and determine the user's visual focus using the one or more of the user's head orientation and the user's line of sight.
  4. The surgical hub of any one of claims 1 to 3, wherein the processor is further configured to: determine that the display is displaying second display data; determine, based on the user's identity, the surgical task, and the contextual data, that the first display data has a higher priority than the second display data; and send a second message instructing the second display to display the second display data.
  5. The surgical hub of any one of claims 1 to 3, wherein the processor is further configured to determine that the display is displaying second display data associated with a second user, and wherein the message instructing the display to display the display data further includes an instruction to display the first display data together with the second display data.
  6. The surgical hub of any one of claims 1 to 4, wherein the processor is configured to determine the display data relevant to the user based on the contextual data and the surgical task by: determining a ranked dataset by ranking the contextual data based on a likelihood of being requested by the user during the surgical task; determining an amount of display space for the display; and assigning, based on the amount of display space for the display, a subset of the ranked dataset as the display data.
  7. The surgical hub of any one of claims 1 to 6, wherein the processor is configured to determine the display data relevant to the user based on the contextual data and the surgical task by: determining that a second user is looking at the display; determining an amount of available display space for the display; determining, based on the surgical task and a relationship between the user and the second user, a data priority of the contextual data; and assigning, based on the data priority of the contextual data, a subset of the contextual data as the display data.
  8. The surgical hub of any one of claims 1 to 7, wherein the display data includes one or more of: instrument data, a device error, a device proximity that may cause an impact, biometric data, an image, a video, and a camera display.
  9. A surgical hub for displaying information on a display based on a user's visual focus, the surgical hub comprising a processor configured to: determine that a display is within the visual focus of a first user and the visual focus of a second user; determine display data for the display based on a first surgical task for the first user and a second surgical task for the second user; and send a message instructing the display to display the display data; wherein the display is a first display, the display data is first display data, and the message is a first message, and the processor is further configured to: determine that a second display is within the visual focus of the second user; determine second display data based on the second surgical task, the second display data having a lower priority for the first user than the first display data; and send a second message instructing the second display to display the second display data.
  10. The surgical hub of claim 9, wherein the first surgical task indicates that a first medical instrument is being used by the first user during a medical procedure, and the second surgical task indicates that a second medical instrument is being used by the second user during the medical procedure.
  11. The surgical hub of claim 9 or 10, wherein the processor is configured to determine the display data for the display based on the first surgical task for the first user and the second surgical task for the second user by: determining a priority between the first surgical task and the second surgical task; and determining the display data from contextual data using the priority, the first surgical task, and the second surgical task.
  12. The surgical hub of any one of claims 9 to 11, wherein the processor is configured to determine the display data for the display based on the first surgical task for the first user and the second surgical task for the second user by: determining a priority between the first user and the second user; and determining the display data from contextual data using the priority, the first surgical task, and the second surgical task.
  13. The surgical hub of any one of claims 9 to 12, wherein the display data includes one or more of: instrument data, a device error, a device proximity that may cause an impact, biometric data, an image, a video, and a camera display.
  14. A surgical hub for displaying information on a display based on a user's visual focus, the surgical hub comprising a processor configured to: determine a first display and a second display that are within a first visual focus of a first user and a second visual focus of a second user; determine that a first surgical task associated with the first user has a higher priority than a second surgical task associated with the second user; determine first contextual data based on the first surgical task and second contextual data based on the second surgical task; and send a first message instructing the first display to display the first contextual data and a second message instructing the second display to display the second contextual data.
  15. The surgical hub of claim 14, wherein the first surgical task indicates that a first medical instrument is being used by the first user during a medical procedure, and the second surgical task indicates that a second medical instrument is being used by the second user during the medical procedure.
  16. The surgical hub of claim 14 or 15, wherein the first message further instructs the first display to remove display data associated with the second user.
  17. The surgical hub of any one of claims 14 to 16, wherein the processor is configured to determine that the first surgical task associated with the first user has a higher priority than the second surgical task associated with the second user by: determining that the first surgical task indicates that the first medical instrument is being used on a patient; determining that the second surgical task indicates that the second medical instrument is being cleaned, reloaded, or prepared; and assigning a priority to the first surgical task such that the first surgical task is given a higher priority than the second surgical task.
  18. The surgical hub of any one of claims 14 to 17, wherein the processor is configured to determine that the first surgical task associated with the first user has a higher priority than the second surgical task associated with the second user by: determining a medical procedure; determining a first priority of the first surgical task based on the medical procedure; determining a second priority of the second surgical task based on the medical procedure; and determining that the first priority of the first surgical task is higher than the second priority of the second surgical task.
  19. The surgical hub of any one of claims 14 to 18, wherein the processor is configured to determine that the first surgical task associated with the first user has a higher priority than the second surgical task associated with the second user by determining that the first surgical task is associated with a higher level of risk than the second surgical task.
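To make the claimed logic concrete, the following is an illustrative sketch (not from the patent itself) of the ranking-and-assignment behavior recited in claim 6: contextual data items are ranked by the likelihood that the user will request them during the surgical task, and the highest-ranked subset that fits the display's available space is assigned as the display data. All class and function names here are hypothetical.

```python
# Hypothetical model of the claim-6 logic: rank contextual data by
# likelihood of being requested, then fill the display's space in
# rank order. Names and structure are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ContextItem:
    name: str          # e.g. "instrument data", "biometric data"
    likelihood: float  # likelihood the user requests it during the task
    size: int          # display space the item would occupy

@dataclass
class Display:
    capacity: int                      # available display space
    items: list = field(default_factory=list)

def assign_display_data(context_data: list, display: Display) -> list:
    """Rank context data by likelihood (descending) and assign the
    subset that fits within the display's available space."""
    ranked = sorted(context_data, key=lambda d: d.likelihood, reverse=True)
    used = 0
    for item in ranked:
        if used + item.size <= display.capacity:
            display.items.append(item)
            used += item.size
    return display.items

# Usage: a display with room for two unit-sized items.
ctx = [
    ContextItem("device error", 0.9, 1),
    ContextItem("biometric data", 0.7, 1),
    ContextItem("camera display", 0.5, 2),
]
primary = Display(capacity=2)
shown = assign_display_data(ctx, primary)
# The two highest-likelihood items fit; the camera display does not.
```

A real surgical hub would additionally weigh the factors from claims 7 and 14 (which user is looking at the display, the relationship between users, and the relative priority of their surgical tasks) before assigning data; this sketch covers only the space-constrained ranking step.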

Description

(Cross-reference to related applications) This application is related to the following applications, the contents of which are incorporated herein by reference:

  • A U.S. patent application, filed concurrently herewith, entitled "METHOD FOR OPERATING TIERED OPERATION MODES IN A SURGICAL SYSTEM," attorney docket number END9287USNP1;
  • A U.S. patent application, filed concurrently herewith, entitled "SITUATIONAL AWARENESS - RELATIONSHIP OF INSTRUMENTS LOCATION AND INDIVIDUALIZATION OF USERS TO CONTROL DISPLAYS," attorney docket number END9288USNP1;
  • A U.S. patent application, filed concurrently herewith, entitled "SHARED SITUATIONAL AWARENESS OF THE DEVICE ACTUATOR ACTIVITY TO PRIORITIZE CERTAIN ASPECTS OF DISPLAYED INFORMATION," attorney docket number END9288USNP2;
  • A U.S. patent application, filed concurrently herewith, entitled "RECONFIGURATION OF DISPLAY SHARING," attorney docket number END9288USNP4; and
  • A U.S. patent application, filed concurrently herewith, entitled "CONTROL A DISPLAY OUTSIDE THE STERILE FIELD FROM A DEVICE WITHIN THE STERILE FIELD," attorney docket number END9288USNP5.

Surgical systems often incorporate imaging systems that allow clinicians to view the surgical site and/or one or more parts thereof on one or more displays, such as monitors. The displays may be local to the surgical theater and/or remote. The imaging system may include scopes equipped with cameras that view the surgical site and transmit the view to displays visible to the clinician. Examples of scopes include, but are not limited to, arthroscopes, angioscopes, bronchoscopes, cholangioscopes, colonoscopes, cystoscopes, esophagogastroduodenoscopes, enteroscopes, esophagoduodenoscopes (gastroscopes), endoscopes, laryngoscopes, nasopharyngolaryngoscopes, sigmoidoscopes, thoracoscopes, ureteroscopes, and exoscopes.
The imaging system may be limited by the information that can be recognized by and/or communicated to the clinician. For example, certain hidden structures, physical contours, and/or dimensions in three-dimensional space may not be recognizable during surgery by certain imaging systems. Additionally, certain imaging systems may be unable to communicate and/or transmit specific information to the clinician during surgery.

The drawings show the following, each according to at least one aspect of the present disclosure:

  • A block diagram of a computer-implemented interactive surgical system.
  • A surgical system used to perform a surgical procedure in an operating room.
  • A visualization system, a robotic system, and a surgical hub paired with an intelligent instrument.
  • A surgical data network comprising a modular communication hub configured to connect modular devices, located in one or more operating rooms of a medical facility or in any room of a medical facility specially equipped for surgical procedures, to the cloud.
  • A computer-implemented interactive surgical system.
  • A surgical hub comprising a plurality of modules coupled to a modular control tower.
  • A logic diagram of a control system for a surgical instrument or tool.
  • A surgical instrument or tool comprising a plurality of motors that can be activated to perform various functions.
  • A diagram of a situationally aware surgical system.
  • A timeline of an exemplary surgical procedure and the inferences that a surgical hub can draw from data detected at each step of the surgical procedure.
  • A block diagram of a computer-implemented interactive surgical system.
  • A block diagram showing the functional architecture of a computer-implemented interactive surgical system.
  • A block diagram of a computer-implemented interactive surgical system configured to adaptively generate control program updates for modular devices.
  • A surgical system comprising a handle having a controller and a motor, an adapter releasably coupled to the handle, and a loading unit releasably coupled to the adapter.
  • An exemplary flow for determining an operating mode and operating in the determined mode.
  • An exemplary flow for changing the operating mode.
  • The primary display of th