
JP-7855612-B2 - Augmented reality headsets and probes for medical imaging


Inventors

  • Spaas, Cédric
  • Annaiyan, Arun
  • Pérez-Pachón, Laura
  • Legros, Arnaud

Assignees

  • Arspectra S.à r.l.

Dates

Publication Date
2026-05-08
Application Date
2022-05-04
Priority Date
2021-05-04

Claims (15)

  1. An augmented reality (AR) system for acquiring and displaying measurements from a patient during a medical procedure, comprising: an AR headset equipped with a near-eye display for displaying the measurements superimposed on the surgeon's view of the patient's tissue at the position, relative to the patient, from which the measurements were collected; at least one sensor for tracking the position of the patient; a memory for storing the measurements and the locations associated with them; and a processor configured to perform the following steps during the medical procedure: (a) receiving measurements collected from the patient's tissue, wherein the measurements are collected by a probe positioned adjacent to various regions of the patient's tissue; (b) tracking the position of the probe while it collects measurements, to determine the locations at which the measurements are collected; (c) simultaneously with performing steps (a) and (b), using the at least one sensor to track the location of a biological landmark on the patient, so as to determine the relationship between the location of the biological landmark and the locations at which the measurements are collected; (d) displaying the measurements at a position on the near-eye display; continuously tracking the location of the biological landmark throughout the medical procedure to obtain its current position; and updating the position of the measurements on the display based on the current position of the biological landmark, so that throughout the medical procedure the measurements remain aligned, in the surgeon's view of the patient's tissue, with the position relative to the patient from which they were collected.
  2. The AR system according to claim 1, wherein the steps further comprise creating a virtual map of the measurements, and wherein displaying the measurements on the near-eye display includes displaying the virtual map.
  3. The AR system according to claim 1 or 2, wherein tracking the location of the biological landmark on the patient further includes illuminating an area of the patient with light of a specific wavelength, and detecting, with the sensor, a fluorescent signal emitted by the biological landmark in response to the illumination.
  4. The AR system according to claim 1 or 2, further comprising a probe.
  5. The AR system according to claim 4, wherein the probe is a Raman probe.
  6. The AR system according to claim 1 or 2, wherein the biological landmark is tracked via one or more features of the biological landmark.
  7. The AR system according to claim 6, wherein continuously tracking the position of the features throughout the medical procedure to obtain their current position comprises: capturing an image of the patient with the sensor; extracting the one or more features of the biological landmark from the image; determining the correspondence between the one or more features across consecutive frames of the image; and estimating the motion of the one or more features across consecutive frames of the image using a transformation matrix.
  8. The AR system according to claim 7, wherein estimating the motion of the one or more features across consecutive frames of the image using a transformation matrix comprises applying rigid or non-rigid registration.
  9. The AR system according to claim 6, wherein the one or more features of the biological landmark include an edge or intersection of the biological landmark.
  10. The AR system according to claim 1 or 2, wherein the biological landmark includes at least one of the patient's blood vessels, the patient's lymph nodes, the patient's nervous system, and the surface of the patient's organs.
  11. The AR system according to claim 1 or 2, wherein tracking the position of the probe comprises: detecting a marker on the probe via the sensor and calculating the position of the probe's tip, which represents the location at which the measurements are collected; or detecting a change in a magnetic field caused by the movement of a magnetic marker on the probe, wherein the change in the magnetic field identifies the position of the probe.
  12. The AR system according to claim 1 or 2, wherein the relationship between the location of the biological landmark and the locations at which the measurements are collected is determined by transforming both locations into a common coordinate system.
  13. The AR system according to claim 1 or 2, wherein displaying the measurements at a position on the near-eye display, so that throughout the medical procedure the measurements remain aligned in the surgeon's view of the patient's tissue with the position relative to the patient from which they were collected, further comprises tracking the position of the surgeon's eyes and adjusting the position of the measurements on the near-eye display based on the current position of the surgeon's eyes.
  14. A method for acquiring and displaying measurements from a patient to a surgeon during a medical procedure using an augmented reality (AR) system, the method comprising: (a) receiving measurements collected from the patient's tissue, wherein the measurements are collected by a probe positioned adjacent to various regions of the patient's tissue; (b) tracking the position of the probe while it collects measurements, to determine the locations at which the measurements are collected; (c) simultaneously with performing steps (a) and (b), using at least one sensor to track the location of a biological landmark on the patient, so as to determine the relationship between the location of the biological landmark and the locations at which the measurements are collected; (d) displaying the measurements at a position on the near-eye display of an AR headset worn by the surgeon; continuously tracking the location of the biological landmark throughout the medical procedure to obtain its current position; and updating the position of the measurements on the near-eye display based on the current position of the biological landmark, so that throughout the medical procedure the measurements remain aligned, in the surgeon's view of the patient's tissue, with the position relative to the patient from which they were collected.
  15. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to perform the following steps: (a) receiving measurements collected from the patient's tissue, wherein the measurements are collected by a probe positioned adjacent to various regions of the patient's tissue; (b) tracking the position of the probe while it collects measurements, to determine the locations at which the measurements are collected; (c) simultaneously with performing steps (a) and (b), using at least one sensor to track the location of a biological landmark on the patient, so as to determine the relationship between the location of the biological landmark and the locations at which the measurements are collected; (d) displaying the measurements at a position on the near-eye display of an AR headset worn by the surgeon; continuously tracking the location of the biological landmark throughout the medical procedure to obtain its current position; and updating the position of the measurements on the near-eye display based on the current position of the biological landmark, so that throughout the medical procedure the measurements remain aligned, in the surgeon's view of the patient's tissue, with the position relative to the patient from which they were collected.
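The registration described in claims 8 and 12 — relating the landmark's position at acquisition time to its current position in a common coordinate system, then re-projecting stored measurement locations — can be sketched with a standard rigid (Kabsch) alignment. This is a minimal illustrative sketch, not the patent's implementation; all function names are assumptions:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t mapping landmark points
    src (N x 3, positions at acquisition time) onto dst (N x 3, current
    positions), using the Kabsch / SVD method."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def update_measurement_positions(measurements, R, t):
    """Re-project stored measurement locations (N x 3) with the motion
    estimated from the landmark, so the overlay stays aligned."""
    return measurements @ R.T + t
```

In practice the landmark features would come from the sensor images frame by frame (claim 7), and a non-rigid method would replace the Kabsch step for deforming tissue (claim 8).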

Description

This invention relates to an augmented reality system for use in medical procedures.

Cancer surgery consists of removing the tumor. To ensure complete removal, the surgeon removes a portion of the healthy tissue surrounding the tumor. Complete removal is crucial to prevent tumor recurrence, but it comes with increased surgical costs as well as higher morbidity and mortality rates. To ensure complete removal, the tumor must be correctly identified and accurately mapped. Current techniques for tumor identification and mapping include the following:
- Biopsy, i.e., obtaining a tissue sample with a needle for subsequent pathological analysis in the laboratory.
- Patient scans, i.e., evaluation of the tumor margins using the patient's medical imaging.
- Detecting cancer using thermal imaging, specifically infrared thermal imaging.

Raman spectroscopy, a technique that allows the chemical composition of biological tissues to be analyzed, has been widely used over the past decade for cancer screening, diagnosis, and intraoperative surgical guidance. Raman spectroscopy is a powerful technique for tumor identification and mapping because it allows surgeons to distinguish between healthy and cancerous tissue during surgery while being less invasive than biopsy. However, known Raman techniques have significant limitations because the area of tissue analyzed at one time is very small (approximately 1 mm), requiring surgeons to perform several measurements with a probe to obtain a complete map of the cancerous tissue. This means that surgeons must either recall the multiple locations of cancerous tissue determined by probe measurements during surgery, or mark the detected areas of cancerous tissue using physical markers such as stickers. The first method reduces accuracy, while the latter increases surgical time because surgeons must stop and position the physical markers correctly.
A further solution to this problem is to observe the location of cancerous tissue on an image of the patient's body displayed on a monitor in the operating room. However, the surgeon must then compare this information with the actual patient's body. This can reduce surgical accuracy, as the comparison is prone to human error and depends on the surgeon's ability to match the location shown on the monitor to the patient. It can also prolong surgical time, as the surgeon must repeatedly look away from the patient to view the monitor.

The drawings are described as follows:
- a schematic diagram of a conventional Raman system used to analyze the chemical composition of biological tissues;
- a schematic diagram of an exemplary augmented reality (AR) headset according to an embodiment of the present invention;
- a schematic diagram of an exemplary augmented reality (AR) system according to an embodiment of the present invention for acquiring and visualizing measurements;
- a flowchart (Figure 3) illustrating the steps taken to acquire measurements from a patient and display them accurately in the user's field of view using an exemplary AR system; and
- an example virtual map displayed on the AR headset shown in Figure 2, acquired using the AR system shown in Figure 3.

Figure 1 shows a conventional Raman system 200 for analyzing the chemical composition of biological tissues, such as for detecting and mapping tumors. The system 200 comprises a diode laser 204 attached to a Raman probe 202 via an optical fiber cable 206. The Raman probe 202 is also attached to a spectrometer 216 via an optical fiber cable 212. The spectrometer 216 may have a volume phase technology (VPT) grating coupled to a CCD camera 220. The CCD camera 220 communicates with a computer processing unit 222.
During the Raman spectroscopy procedure, the probe 202 is positioned adjacent to the tissue 210 of the patient 208 being analyzed, as shown in Figure 1, so that a low-power laser beam from the laser 204 is incident on the area of tissue 210 under analysis. The light incident on the tissue causes it to emit light through Raman scattering, which is detected by the probe 202 and sent to the spectrometer 216 (i.e., a device that detects and analyzes incident light according to its wavelength and records the resulting spectrum). The output data from the spectrometer 216 is processed by the CCD camera 220 and the computer processing unit 222. Because cancerous tissue emits light within a specific spectrum, values that match cancerous tissue can be converted into a graphic representation (i.e., a virtual map). This virtual map can be aligned with an image of the patient's tissue 208, thereby highlighting the areas of cancerous tissue. The images are merged with the virtual map and then displayed on a monitor set up in the operating room. The surgeon uses this visualization as guidance to locate areas of cancerous tissue during surgery. However, displaying images on a monitor far from the patient in the operating room has drawbacks. The surgeon must compare this information with the actual patient's body.
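The conversion from per-point spectra to a virtual map described above can be sketched as follows. The cosine-similarity classifier, the threshold value, and all names are illustrative assumptions, not the patent's method (real systems typically use trained classifiers on preprocessed spectra):

```python
import numpy as np

def classify_spectrum(spectrum, reference, threshold=0.9):
    """Label a probed spectrum as matching cancerous tissue when it
    correlates strongly (cosine similarity) with a reference cancer
    spectrum. Threshold of 0.9 is an arbitrary illustrative choice."""
    s = spectrum / np.linalg.norm(spectrum)
    r = reference / np.linalg.norm(reference)
    return float(s @ r) >= threshold

def build_virtual_map(probe_points, reference, threshold=0.9):
    """probe_points: list of (position, spectrum) pairs collected by the
    probe. Returns (position, is_cancerous) pairs, i.e. the virtual map
    to be rendered as colored markers over the tissue image."""
    return [(pos, classify_spectrum(spec, reference, threshold))
            for pos, spec in probe_points]
```

Each entry of the resulting map carries the probe position at which the spectrum was taken, which is what allows the overlay to be anchored to the tissue rather than to the screen.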