US-12623349-B1 - System and method for automated optical deviation detection and correction procedure
Abstract
A system and method for automated optical deviation detection and correction is disclosed. The system includes a scanner, wherein the scanner includes one or more sensors positioned on the scanner, wherein the one or more sensors include at least an optical sensor and a force-torque sensor; and a stage configured to hold a slide; at least a processor; and a memory communicatively connected to the at least a processor, wherein the memory contains instructions configuring the at least a processor to: receive sensor data from the one or more sensors, wherein the sensor data includes image data received from the optical sensor and associated with at least a slide, and a tactile datum received from the force-torque sensor; detect a positional deviation in the image data relative to an alignment baseline; determine whether the positional deviation is within a correction threshold; and initiate a correction procedure associated with the at least a slide.
Inventors
- Mohammad Abdul Sulaiman
- Omkar Appasaheb Kabadagi
- Vaishnavi K B
- Prasanth Perugupalli
Assignees
- PRAMANA, INC.
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2025-06-19
Claims (20)
- 1 . A system for automated optical deviation detection and correction procedure, the system comprising: a scanner, wherein the scanner comprises: at least two sensors positioned on the scanner, wherein the at least two sensors comprise at least an optical sensor and a force-torque sensor; and a stage configured to hold a slide; at least a processor; and a non-transitory memory communicatively connected to the at least a processor, wherein the non-transitory memory contains instructions configuring the at least a processor to: receive sensor data from the at least two sensors, wherein the sensor data comprises: image data received from the optical sensor and associated with at least a slide; and a tactile datum received from the force-torque sensor; detect a positional deviation in the image data relative to an alignment baseline; determine whether the positional deviation is within a correction threshold; and initiate a correction procedure associated with the at least a slide, wherein initiating the correction procedure comprises: generating a correction datum if the positional deviation is within the correction threshold; and generating a deviation flag if the positional deviation exceeds the correction threshold.
- 2 . The system of claim 1 , wherein the positional deviation is associated with at least one of a scanner misalignment or a slide misalignment.
- 3 . The system of claim 1 , wherein detecting the positional deviation comprises: identifying at least one boundary of the slide in the image data using an edge detection algorithm; and determining the positional deviation as a function of a difference between the at least one boundary and a reference boundary of the alignment baseline.
- 4 . The system of claim 1 , wherein detecting the positional deviation comprises spatially aligning the tactile datum to the image data, wherein spatially aligning the tactile datum to the image data comprises: mapping the image data into a coordinate system; and transforming coordinates of the tactile datum into the coordinate system to spatially align the tactile datum to the image data.
- 5 . The system of claim 1 , wherein generating the correction datum comprises executing the correction datum by generating and transmitting, to a robotic arm, an actuator control signal to reposition the at least a slide to be in closer alignment with the alignment baseline.
- 6 . The system of claim 1 , wherein generating the deviation flag comprises disabling scanning operations upon repeated generation of the deviation flag exceeding a flag threshold count over a defined time interval.
- 7 . The system of claim 1 , wherein generating the correction datum comprises: identifying a type of the positional deviation as a function of deviation patterns of historical positional deviations; and generating the correction datum as a function of the type of the positional deviation.
- 8 . The system of claim 1 , wherein the non-transitory memory contains instructions further configuring the at least a processor to: generate a correction data structure summarizing the correction procedure; and generate a user interface comprising the correction data structure.
- 9 . The system of claim 1 , wherein the non-transitory memory contains instructions further configuring the at least a processor to: record deviation metrics over a plurality of scan operations; compare the recorded deviation metrics to a defined statistical operating range; and generate a maintenance alert based on a trend analysis of the recorded deviation metrics.
- 10 . The system of claim 9 , wherein determining the defined statistical operating range comprises: determining a standard deviation of prior system performance data; and determining the defined statistical operating range as a function of the standard deviation.
- 11 . A method for automated optical deviation detection and correction procedure, the method comprising: receiving, using at least a processor, sensor data from at least two sensors positioned on a scanner, wherein the sensor data comprises image data received from at least an optical sensor of the at least two sensors and associated with at least a slide placed on the scanner, and a tactile datum received from a force-torque sensor of the at least two sensors, wherein the scanner includes a stage configured to hold the slide; detecting, using the at least a processor, a positional deviation in the image data relative to an alignment baseline, wherein detecting the positional deviation comprises spatially aligning the tactile datum to the image data; determining, using the at least a processor, whether the positional deviation is within a correction threshold; and initiating, using the at least a processor, a correction procedure associated with the at least a slide, wherein initiating the correction procedure comprises: generating a correction datum if the positional deviation is within the correction threshold; and generating a deviation flag if the positional deviation exceeds the correction threshold.
- 12 . The method of claim 11 , wherein the positional deviation is associated with at least one of a scanner misalignment or a slide misalignment.
- 13 . The method of claim 11 , wherein detecting the positional deviation comprises: identifying at least one boundary of the slide in the image data using an edge detection algorithm; and determining the positional deviation as a function of a difference between the at least one boundary and a reference boundary of an alignment baseline.
- 14 . The method of claim 11 , wherein detecting the positional deviation comprises spatially aligning the tactile datum to the image data, wherein spatially aligning the tactile datum to the image data comprises: mapping the image data into a coordinate system; and transforming coordinates of the tactile datum into the coordinate system to spatially align the tactile datum to the image data.
- 15 . The method of claim 11 , wherein generating the correction datum comprises executing the correction datum by generating and transmitting, to a robotic arm, an actuator control signal to reposition the at least a slide to be in closer alignment with an alignment baseline.
- 16 . The method of claim 11 , wherein generating the deviation flag comprises disabling scanning operations upon repeated generation of the deviation flag exceeding a flag threshold count over a defined time interval.
- 17 . The method of claim 11 , wherein generating the correction datum comprises: identifying a type of the positional deviation as a function of deviation patterns of historical positional deviations; and generating the correction datum as a function of the type of the positional deviation.
- 18 . The method of claim 11 , further comprising: generating, using the at least a processor, a correction data structure summarizing the correction procedure; and generating, using the at least a processor, a user interface comprising the correction data structure.
- 19 . The method of claim 11 , further comprising: recording, using the at least a processor, deviation metrics over a plurality of scan operations; comparing, using the at least a processor, the recorded deviation metrics to a defined statistical operating range; and generating, using the at least a processor, a maintenance alert based on a trend analysis of the recorded deviation metrics.
- 20 . The method of claim 19 , wherein determining the defined statistical operating range comprises: determining a standard deviation of prior system performance data; and determining the defined statistical operating range as a function of the standard deviation.
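The decision logic recited in claims 1 and 11 (correction datum when the deviation is within the correction threshold, deviation flag otherwise) can be sketched as follows. This is an illustrative, non-limiting example; names such as `initiate_correction` and the millimeter units are the editor's assumptions, not part of the claims.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CorrectionResult:
    correction_datum: Optional[float]  # signed offset to apply, or None
    deviation_flag: bool               # True when the deviation exceeds the threshold

def initiate_correction(deviation_mm: float, correction_threshold_mm: float) -> CorrectionResult:
    """Generate a correction datum when the deviation is correctable,
    or a deviation flag when it exceeds the correction threshold."""
    if abs(deviation_mm) <= correction_threshold_mm:
        # Within threshold: the correction datum is the offset that would
        # restore the slide toward the alignment baseline.
        return CorrectionResult(correction_datum=-deviation_mm, deviation_flag=False)
    # Beyond threshold: flag the deviation instead of attempting correction.
    return CorrectionResult(correction_datum=None, deviation_flag=True)
```

In a full system, the correction datum would feed the actuator control signal of claims 5 and 15, while the flag would feed the monitoring path of claims 6 and 16.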
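Claims 3 and 13 detect the deviation by locating a slide boundary with an edge detection algorithm and comparing it to a reference boundary. A minimal gradient-based sketch (assuming a grayscale image array and a vertical left edge; the function name and pixel units are illustrative, and a production system might use a full edge detector such as Canny):

```python
import numpy as np

def boundary_deviation(image: np.ndarray, reference_col: int) -> float:
    """Locate the slide's left boundary as the column with the strongest
    mean horizontal intensity gradient, and return its offset (in pixels)
    from the reference boundary of the alignment baseline."""
    grad = np.abs(np.diff(image.astype(float), axis=1))  # horizontal gradient
    edge_col = int(np.argmax(grad.mean(axis=0)))         # strongest vertical edge
    return float(edge_col - reference_col)
```

A positive return value would indicate the slide edge sits to the right of the baseline; the sign convention is an assumption of this sketch.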
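Claims 4 and 14 spatially align the tactile datum to the image data by mapping both into a common coordinate system. One common way to do this, shown here as a hedged sketch, is a 2D homogeneous transform (rotation plus translation) followed by a millimeter-to-pixel scaling; the parameters `theta_rad`, `tx`, `ty`, and `px_per_mm` are assumed calibration values, not quantities named in the patent.

```python
import numpy as np

def tactile_to_image(point_xy, theta_rad, tx, ty, px_per_mm):
    """Transform a contact point from the force-torque sensor frame (mm)
    into image pixel coordinates via a homogeneous 2D transform."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    T = np.array([[c, -s, tx],
                  [s,  c, ty],
                  [0.0, 0.0, 1.0]])
    x, y = point_xy
    u, v, _ = T @ np.array([x, y, 1.0])
    return u * px_per_mm, v * px_per_mm
```

Once both data streams share one coordinate system, a tactile contact can be compared directly against the optically detected slide boundary.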
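Claims 6 and 16 disable scanning when repeated deviation flags exceed a flag threshold count within a defined time interval. A sliding-window counter is one natural realization; this sketch and its class name `FlagMonitor` are the editor's illustration, not the patent's implementation.

```python
from collections import deque

class FlagMonitor:
    """Disable scanning when more than `flag_threshold` deviation flags
    occur within a sliding `window_s`-second time interval."""
    def __init__(self, flag_threshold: int, window_s: float):
        self.flag_threshold = flag_threshold
        self.window_s = window_s
        self._timestamps = deque()

    def record_flag(self, t: float) -> bool:
        """Record a deviation flag at time t (seconds); return True when
        scanning operations should be disabled."""
        self._timestamps.append(t)
        # Drop flags that have aged out of the defined time interval.
        while self._timestamps and t - self._timestamps[0] > self.window_s:
            self._timestamps.popleft()
        return len(self._timestamps) > self.flag_threshold
```

Flags separated by more than the window do not accumulate, so isolated deviations never halt scanning.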
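Claims 9-10 and 19-20 derive a statistical operating range from the standard deviation of prior performance data and raise a maintenance alert from a trend analysis of recorded deviation metrics. A minimal sketch, assuming the range is mean ± k standard deviations and the "trend" is the mean of recent metrics (both are assumptions of this example, as the claims do not fix a specific statistic):

```python
import statistics

def operating_range(history, k=3.0):
    """Define the statistical operating range as mean +/- k standard
    deviations of prior system performance data."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def maintenance_alert(recent, history, k=3.0):
    """Alert when recent deviation metrics drift outside the defined
    operating range (a simple trend analysis)."""
    lo, hi = operating_range(history, k)
    trend = statistics.mean(recent)
    return not (lo <= trend <= hi)
```

Averaging the recent metrics before comparing suppresses one-off outliers, so the alert reflects a sustained drift rather than a single noisy scan.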
Description
FIELD OF THE INVENTION

The present invention generally relates to the field of image processing. In particular, the present invention is directed to a system and method for an automated optical deviation detection and correction procedure.

BACKGROUND

In the field of optical scanning and imaging systems, various methods have been developed to address the issue of positional deviation of scanned images. Traditional approaches often involve manual calibration and alignment procedures, which require significant human intervention and are prone to errors. These methods typically rely on visual inspection and manual adjustments. While effective to some extent, these manual processes are time-consuming and can lead to inconsistencies in image quality and alignment.

SUMMARY OF THE DISCLOSURE

In some aspects, the techniques described herein relate to a system for automated optical deviation detection and correction procedure, the system including: a scanner, wherein the scanner includes: one or more sensors positioned on the scanner, wherein the one or more sensors include at least an optical sensor and a force-torque sensor; and a stage configured to hold a slide; at least a processor; and a memory communicatively connected to the at least a processor, wherein the memory contains instructions configuring the at least a processor to: receive sensor data from the one or more sensors, wherein the sensor data includes: image data received from the optical sensor and associated with at least a slide; and a tactile datum received from the force-torque sensor; detect a positional deviation in the image data relative to an alignment baseline; determine whether the positional deviation is within a correction threshold; and initiate a correction procedure associated with the at least a slide, wherein initiating the correction procedure includes: generating a correction datum if the positional deviation is within the correction threshold; and generating a deviation flag if the positional deviation exceeds the correction threshold.

In some aspects, the techniques described herein relate to a method for automated optical deviation detection and correction procedure, the method including: receiving, using at least a processor, sensor data from one or more sensors positioned on a scanner, wherein the sensor data includes image data received from at least an optical sensor of the one or more sensors and associated with at least a slide placed on the scanner, and a tactile datum received from a force-torque sensor of the one or more sensors, wherein the scanner includes a stage configured to hold the slide; detecting, using the at least a processor, a positional deviation in the image data relative to an alignment baseline, wherein detecting the positional deviation includes spatially aligning the tactile datum to the image data; determining, using the at least a processor, whether the positional deviation is within a correction threshold; and initiating, using the at least a processor, a correction procedure associated with the at least a slide, wherein initiating the correction procedure includes: generating a correction datum if the positional deviation is within the correction threshold; and generating a deviation flag if the positional deviation exceeds the correction threshold.

These and other aspects and features of non-limiting embodiments of the present invention will become apparent to those skilled in the art upon review of the following description of specific non-limiting embodiments of the invention in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

For the purpose of illustrating the invention, the drawings show aspects of one or more embodiments of the invention. However, it should be understood that the present invention is not limited to the precise arrangements and instrumentalities shown in the drawings, wherein:

- FIG. 1 illustrates a block diagram of an exemplary system for automated optical deviation detection and correction procedure;
- FIG. 2 illustrates a configuration of an exemplary system including a cluster of scanners;
- FIG. 3 illustrates an exemplary user interface;
- FIGS. 4A-C illustrate exemplary image data containing exemplary positional deviations;
- FIG. 5 illustrates a block diagram of an exemplary machine-learning module;
- FIG. 6 illustrates a diagram of an exemplary neural network;
- FIG. 7 illustrates a block diagram of an exemplary node in a neural network;
- FIG. 8 illustrates a flow diagram of an exemplary method for automated optical deviation detection and correction procedure; and
- FIG. 9 illustrates a block diagram of a computing system that can be used to implement any one or more of the methodologies disclosed herein and any one or more portions thereof.

The drawings are not necessarily to scale and may be illustrated by phantom lines, diagrammatic representations and fragmentary views. In certain instances, details that are not necessary for an understanding of the embodiments or that render