EP-4256421-B1 - OPTICAL ASSEMBLY FOR USE IN AN OBJECT HANDLING STATION

Inventors

  • CAPRARA, ALESSANDRO

Dates

Publication Date
2026-05-06
Application Date
2021-12-06

Claims (13)

  1. Wearable augmented reality optical assembly (10) for use in a center for washing and sterilization of objects (200) for the recognition, by means of computer vision techniques, of said objects (200), comprising: electromagnetic acquisition means (11) configured to detect at least images obtained with thermal imaging cameras and data being identification codes (202) of the objects (200) obtained with RFID or NFC tag readers; a processor (12) configured to perform image and data processing algorithms and also to generate visual content (16a); and a viewing device (13) capable of reproducing said visual content (16a) to make it visually available to an operator (300), wherein said processor (12) comprises a program for managing and tracking objects (200) adapted to recognize said objects (200) and provide the operator (300) with indications for the recomposition of kits or specialized groups of objects.
  2. Optical assembly (10) as in claim 1, characterized in that said processor (12) is configured to perform computer vision algorithms based on Deep Learning techniques and/or neural networks.
  3. Optical assembly (10) as in claims 1 or 2, characterized in that said viewing device (13) comprises a virtual screen (16) and reproduction means (17) capable of reproducing said visual content (16a).
  4. Optical assembly (10) as in any one of the preceding claims, characterized in that it is provided with a communication module (14) configured to transmit/receive said data to an external control unit (101).
  5. Optical assembly (10) as in any one of the preceding claims, characterized in that said electromagnetic acquisition means (11) further comprise one or more devices, chosen from video cameras.
  6. Optical apparatus (100) for use in a center for treating objects (200), characterized in that it comprises an optical assembly (10) as in any one of claims 1 to 5 and a control unit (101) configured at least to cooperate with a processor (12) in the steps of storing and/or processing data relating to said objects (200) and/or to interface the optical assembly (10) with said equipment for treating (400) and storing (401) objects (200) in said center, wherein said control unit (101) comprises a program for managing and tracking objects (200) suitable for the recognition of the object (200) and possibly its conformity, to provide useful indications for the recomposition of surgical kits (201) and possibly their final verification.
  7. Program for managing and tracking objects (200) comprising instructions that determine the execution of a method for managing objects (200) by means of an optical assembly (10) as in any of claims 1 to 5, said program instructions being adapted, once executed, to allow recognition of the object (200) and to provide indications for the recomposition of surgical kits (201).
  8. Management and tracking program as in claim 7, characterized in that it comprises instructions for processing an image, acquired by electromagnetic acquisition means (11), of said optical assembly (10), to recognize the objects (200) and possibly their conformity by means of a comparison with stored images.
  9. Management and tracking program as in any one of claims 7 or 8, characterized in that it comprises instructions for processing an image, acquired by electromagnetic acquisition means (11), of said optical assembly (10) to perform a step of recognition of an identification code (202) uniquely associated with an object (200).
  10. Management and tracking program as in any one of claims 7 to 9, characterized in that it includes computer vision algorithms based on Deep Learning techniques and/or neural networks.
  11. Method for handling objects (200) in a center for treating objects (200) adapted to allow the processing of said objects (200) by means of an optical assembly (10) as in any one of claims 1 to 5, wherein the method provides for the steps of: - detecting at least images and/or identification codes (202) of objects (200); - processing images and data relating to said objects (200) and generating visual content (16a) relating to said objects (200) from said images and data; - reproducing said visual content (16a) to make it visually available to an operator (300) of the treatment center.
  12. Method for managing a treatment cycle of objects (200) to be sanitized, by means of which parameters of said treatment cycle are defined, characterized in that said method comprises the determination of parameters of the treatment cycle by means of information obtained through the method as in claim 11.
  13. Center for washing and sterilization of objects (200), comprising equipment for treating (400) and storing (401) objects (200) and an optical apparatus (100) as in claim 6.
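The recognition-and-recomposition behaviour recited in claims 1, 7 and 11 (detect identification codes of the objects, process them against stored kit data, and generate visual content guiding the operator) can be sketched as follows. This is a minimal illustration under assumed data structures, not the patented implementation; the names `KitManifest` and `recompose_kit` and the guidance format are inventions of this sketch.

```python
# Hypothetical sketch of the handling method of claim 11: compare detected
# identification codes (e.g. read from RFID/NFC tags) with a stored kit
# manifest and produce guidance for the operator. Illustrative names only.
from dataclasses import dataclass, field


@dataclass
class KitManifest:
    """Expected composition of a surgical kit (stored kit data)."""
    kit_id: str
    expected_codes: set[str] = field(default_factory=set)


def recompose_kit(manifest: KitManifest, detected_codes: set[str]) -> dict:
    """Return recomposition guidance: which instruments are still missing
    from the kit and which detected instruments do not belong to it."""
    missing = manifest.expected_codes - detected_codes
    unexpected = detected_codes - manifest.expected_codes
    return {
        "kit_id": manifest.kit_id,
        "complete": not missing and not unexpected,
        "missing": sorted(missing),        # still to be added to the kit
        "unexpected": sorted(unexpected),  # do not belong in this kit
    }


if __name__ == "__main__":
    manifest = KitManifest("ORTHO-01", {"scalpel-12", "forceps-07", "drill-03"})
    print(recompose_kit(manifest, {"scalpel-12", "forceps-07", "retractor-99"}))
```

In an assembly as claimed, the returned guidance would feed the visual content (16a) rendered on the viewing device (13); here it is simply printed.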

Description

FIELD OF APPLICATION

Embodiments disclosed herein relate to a wearable augmented reality optical assembly for use in a center for treating objects, in particular a sterilization unit. In particular, the optical assembly allows an operator to be supported in managing surgical instruments and in recomposing kits formed by a plurality of surgical instruments suitable for use during a specific surgical operation. Embodiments disclosed herein further relate to an apparatus comprising the optical assembly according to the invention and a method for handling objects. Embodiments disclosed herein also relate to a method for managing an object treatment cycle, by means of which, in particular, the parameters of a treatment cycle, e.g. sterilization of surgical instruments, can be defined.

PRIOR ART

It is known that in facilities in which operating theatres suitable for surgery are provided, such as hospitals, clinics or outpatient clinics, several hundred thousand surgical instruments are usually assembled and sterilized each year, including scissors and pliers, drills, orthopedic cutters, scalpels, endoscopic instruments, trays, surgical instrument cases or the like. A sterilization cycle consists of three main phases: reception and washing; packaging; and sterilization with final check. These phases are usually conducted in adjacent but separate areas, for better control over the washing and sterilization process. In the reception and washing phase, the used instruments coming from the operating theatres are positioned in bulk and subjected to different washing and disinfection steps. In the second phase, each individual instrument is identified and checked for integrity, wear and cleanliness. When necessary, for example if the presence of organic residues is suspected, the level of cleaning is evaluated visually with the help of a magnifying glass. The instruments are then individually packaged or reassembled into kits.
Even today, checking and assembly are mainly done manually; however, some technologies assist the operator in recognizing an instrument by reading an optical or radio-frequency identification code, or by displaying on-screen images of the instruments that make up the kit. Checklists and graphical representations of the precise positions of the instruments inside the container can then be used in the recomposition of the kit. The packaged instruments or kits are then sterilized and, after a final check, returned to the user circuit. A drawback of prior art solutions for supporting the operator in the recomposition of the surgical kit is that they usually require fixed workstations at which to operate. The need to separate the various instrument preparation sectors, so as to maintain the conditions of cleanliness and sterility gradually obtained, therefore makes it impossible to use the same workstations in subsequent stages of kit checking, for example verification of the kit after the sterilization phase. If support for the operator is to be ensured at all stages of surgical instrument treatment, the number of fixed workstations must be multiplied. Conversely, portable instruments, such as optical or radio-frequency identification code readers, must be handled by the operators. Besides failing to solve the problem of verifying cleanliness and sterility, since such instruments are handled manually, their use complicates the work of the operator, who must handle the surgical instruments and/or kit containers as well as the portable instruments at the same time. The use of sterile gloves makes the various operations even more difficult.
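The checklist-plus-position-map aid mentioned above can be illustrated with a small sketch, assuming a tray layout stored as a slot-to-instrument mapping. The function name `position_checklist`, the slot labels and the status strings are all assumptions of this example, not taken from any cited document.

```python
# Illustrative sketch (assumed data model) of a checklist combined with a
# position map: each tray slot maps to the instrument expected there, and
# the checklist reports which slots are filled correctly.
def position_checklist(layout: dict[str, str], placed: dict[str, str]) -> list[str]:
    """layout: slot -> expected instrument code; placed: slot -> instrument found.

    Returns one human-readable status line per slot, sorted by slot label."""
    lines = []
    for slot, expected in sorted(layout.items()):
        found = placed.get(slot)
        if found == expected:
            status = "OK"
        elif found is None:
            status = f"EMPTY (place {expected})"
        else:
            status = f"WRONG ({found}, expected {expected})"
        lines.append(f"{slot}: {status}")
    return lines


if __name__ == "__main__":
    layout = {"A1": "scalpel-12", "A2": "forceps-07"}
    placed = {"A1": "scalpel-12"}
    for line in position_checklist(layout, placed):
        print(line)
```

In a fixed-workstation solution this checklist would be shown on a screen; the invention's point is to deliver the same guidance hands-free through a wearable viewing device.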
Known optical assemblies which, however, do not solve the above problems, in particular with regard to the traceability and recomposition of kits or specialized groups of objects, are described, for instance, in the following documents: BABICHENKO DMITRIY ET AL: "SterileAR: Exploration of Augmented Reality and Computer Vision Approaches for Real-Time Feedback in Sterile Compounding Training", 2020 6TH INTERNATIONAL CONFERENCE OF THE IMMERSIVE LEARNING RESEARCH NETWORK (ILRN), IMMERSIVE LEARNING RESEARCH NETWORK, 21 June 2020 (2020-06-21), pages 62-69, XP033807343; US-A-2019/328921; and VERONIKA KRAUSS ET AL: "Smartglasses in the Sterile Supply Process", PROCEEDINGS OF THE 10TH AUGMENTED HUMAN INTERNATIONAL CONFERENCE 2019, ACM, 2 Penn Plaza, Suite 701, New York, NY 10121-0701, USA, 8 September 2019 (2019-09-08), pages 859-861, XP058440986. Hence, there is a need to perfect an optical assembly which can overcome at least one of the drawbacks of the prior art. In particular, one aim of the present invention is to provide an optical assembly that supports the operator in the recomposition of specialized groups of objects, in particular surgical kits. In this way, it is possible to reduce the likelihood of errors during recomposition, which are time-consuming and a cause of econom