EP-4740188-A1 - SYSTEM AND METHOD FOR INDUSTRIAL RISK ASSESSMENT VIA COMPUTER VISION
Abstract
A device, system, and method comprising computer vision techniques for fire prevention/detection and risk assessment, as well as for determining deviations from an ideal operational state. The present invention includes, for example, systems and methods which leverage data collected by camera systems composed of infrared and visible light sensors to detect and/or prevent a fire from starting and, additionally, use this data to determine a risk assessment for the building. The present invention also provides, for example, a system and method for monitoring and controlling safety risks in indoor industrial environments by determining deviations from an ideal operational state using computer vision techniques and game-theoretic competitive ranking frameworks.
Inventors
- HANOVER, Drew
- LAENGLE, Thomas
- TRAUTWEILER, Florian
Assignees
- Innovire AG
Dates
- Publication Date
- 2026-05-13
- Application Date
- 2025-04-17
Claims (1)
- WHAT IS CLAIMED: 1. A system for monitoring an indoor industrial environment by determining deviations from an ideal operational state, the system comprising: a plurality of cameras physically positioned throughout the indoor industrial environment for capturing real-time image data; a processor; and a memory storing instructions that, when executed by the processor, cause the processor to process the image data from the plurality of cameras; characterized in that the instructions stored in the memory, when executed by the processor, cause the processor to: (a) establish a game-theoretic competitive ranking framework wherein each camera functions as a player and each captured image represents a move made by the respective camera, the framework configured to identify physical deviations from the ideal operational state; (b) perform head-to-head matches between: (i) images from different cameras for assessing relative physical deviations across different zones of the industrial environment, and (ii) current and past images from the same camera for tracking absolute physical deviations over time; (c) evaluate, using a trained artificial intelligence model capable of recognizing industrial environment features, which image in each match is closer to the ideal operational state, wherein the ideal operational state comprises one or more of: (i) physical arrangement showing a presence of only material, machines, and humans directly involved in operational processes of the industrial environment, (ii) spatial organization of materials and equipment according to predefined safety parameters, and/or (iii) an absence of foreign objects; the evaluation yielding a match result comprising one of: a win when one camera's image is judged closer to the ideal operational state, a loss when the other camera's image is judged farther from the ideal operational state, or a draw when no clear difference is detected; (d) update a rating for each camera based on the match results according to
a competitive ranking algorithm that adjusts each camera's rating based on the expected and actual outcomes of matches, wherein the rating quantifies the degree of physical deviation from the ideal operational state in each camera's monitored area; (e) determine if the physical deviations from the ideal operational state exceed a predetermined threshold based on the updated ratings; and (f) automatically trigger a physical alarm device or activate a safety mitigation system, or both, if the physical deviations exceed the threshold, thereby implementing automated control of safety conditions in the industrial environment by identifying and responding to conditions that increase risk. 2. The system of claim 1, wherein the ideal operational state further comprises: (a) predefined safety parameters for proper spatial organization of materials and equipment, wherein the predefined safety parameters include: (i) maintaining designated clearances around machinery, equipment, and electrical installations according to manufacturer specifications and regulatory standards; (ii) unobstructed emergency egress paths; (iii) designated material storage zones with maximum height restrictions and minimum clearances from sprinkler heads and heating elements; and (iv) demarcated traffic lanes for personnel and material handling equipment; and (b) wherein the trained artificial intelligence model is configured to identify deviations from these predefined safety parameters by detecting encroachment on clearance zones, obstructed pathways, or improperly positioned equipment. 3.
The system of claims 1 or 2, wherein the absence of foreign objects comprises the absence of: (a) process-unrelated items; (b) displaced operational materials; (c) maintenance debris; (d) production waste; and (e) transient equipment; wherein the trained artificial intelligence model is configured to distinguish between items that belong in the industrial environment and those that constitute deviations from the ideal operational state. 4. The system of claim 3, wherein the absence of foreign objects comprises the absence of: (a) process-unrelated items including personal belongings, packaging materials, and tools from unrelated work processes; (b) displaced operational materials comprising raw materials, components, or finished products that have fallen, spilled, or been improperly stored outside their designated containers, conveyors, or storage areas; (c) maintenance debris including used parts, packaging from replacement components, cleaning materials, or maintenance tools left behind after repair or service activities; (d) production waste comprising scrap materials, byproducts, shavings, dust, or other process residues that have accumulated beyond acceptable levels or outside designated collection areas; and (e) transient equipment including temporary machinery, dollies, carts, or auxiliary equipment left in walkways, emergency exit paths, or production areas after their immediate purpose has been fulfilled; wherein the trained artificial intelligence model is configured to distinguish between items that belong in the industrial environment and those that constitute deviations from the ideal operational state. 5. The system of any of claims 1-4, wherein updating the rating for each camera comprises: calculating an expected outcome for each match based on current ratings; comparing the expected outcome to the actual match result; and adjusting the ratings based on the difference between expected and actual outcomes. 6.
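The rating-update step recited above (expected outcome from current ratings, comparison with the actual match result, adjustment by the difference) is the classic Elo scheme, which the application later names among the supported rating systems. A minimal Python sketch; the K-factor of 32 and the 400-point logistic scale are conventional Elo defaults, not values taken from this application:

```python
def expected_outcome(rating_a: float, rating_b: float) -> float:
    """Expected score of camera A against camera B (Elo logistic curve)."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def update_ratings(rating_a: float, rating_b: float,
                   result_a: float, k: float = 32.0):
    """result_a encodes the match result: 1.0 win, 0.5 draw, 0.0 loss."""
    exp_a = expected_outcome(rating_a, rating_b)
    new_a = rating_a + k * (result_a - exp_a)
    new_b = rating_b + k * ((1.0 - result_a) - (1.0 - exp_a))
    return new_a, new_b

# Equal ratings, camera A's image judged closer to the ideal state:
a, b = update_ratings(1500.0, 1500.0, 1.0)
# A gains and B loses the same 16 points (expected score was 0.5).
```

Under this update, a camera whose zone repeatedly loses matches (its images are judged farther from the ideal state) drifts to a lower rating, which is what lets a fixed threshold on ratings drive the alarm step.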
The system of any of the above claims, wherein the plurality of cameras comprises at least one of visible light cameras, infrared cameras, thermal cameras, or multispectral cameras, or a combination thereof. 7. The system of any of the above claims, wherein the instructions further cause the processor to: generate a cleanliness score for each camera based on its updated rating; and display the cleanliness scores on a user interface. 8. The system of claim 7, wherein the instructions further cause the processor to: track changes in the cleanliness scores over time; and generate alerts if the cleanliness scores fall below a predetermined threshold. 9. The system of any of the above claims, wherein the instructions further cause the processor to: identify specific objects or conditions contributing to deviations from the ideal operational state; and include information about the identified objects or conditions in the triggered alarm. 10. The system of any of the above claims, wherein the instructions further cause the processor to: adjust the threshold for triggering the alarm based on historical data and patterns of deviations. 11. The system of any of the above claims, wherein the instructions further cause the processor to: perform image segmentation on the received image data to isolate different elements within the industrial environment. 12. The system of claim 11, wherein the instructions further cause the processor to: classify the isolated elements as either contributing to or deviating from the ideal operational state. 13. The system of any of the above claims, wherein the instructions further cause the processor to: apply different weightings to different types of physical deviations from the ideal operational state when updating the ratings. 14.
The system of any of the above claims, wherein the instructions further cause the processor to: generate heatmaps visualizing the spatial distribution of deviations from the ideal operational state across at least a portion of the industrial environment. 15. The system of claim 14, wherein said portion of the industrial environment is defined according to a visual field of at least one camera in the system, and wherein said ideal operational state is defined according to data learned by an AI model according to a plurality of images received by said at least one camera over a period of time. 16. The system of any of the above claims, wherein the instructions further cause the processor to: implement a temporal analysis to detect gradual changes in the industrial environment over time that contribute to deviations from the ideal operational state. 17. The system of any of the above claims, wherein the instructions further cause the processor to: generate recommended actions to address identified deviations from the ideal operational state; and display the recommended actions on a user interface. 18. The system of any of the above claims, wherein the instructions further cause the processor to: adjust the frequency of head-to-head matches based on the rate of change in deviations from the ideal operational state. 19. The system of any of the above claims, wherein the safety mitigation system comprises at least one of an automated ventilation system, an emergency shutdown sequence, a fire suppression system, or an access control system; and wherein activation of the safety mitigation system comprises automatically controlling the operation of the corresponding system based on the type and severity of the detected physical deviations. 20.
The system of any of the above claims, wherein the artificial intelligence model comprises a convolutional neural network, a visual transformer model, or a variant thereof trained on images of ideal and non-ideal industrial environments. 21. The system of any of the above claims, wherein the game-theoretic competitive ranking framework implements an Elo rating system, Glicko-2 rating system, TrueSkill™ system, or Bayesian rating system. 22. The system of any of the above claims, wherein the instructions further cause the system to: implement a federated learning approach, an agent-based learning approach, or a combined approach thereof, to improve the artificial intelligence model using data from multiple industrial environments, from one industrial environment over a plurality of time periods, or a combination thereof. 23. The system of any of the above claims, wherein the instructions further cause the processor to: generate a risk assessment score based on the updated ratings of the cameras; and incorporate the risk assessment score into an overall risk determination for the industrial environment. 24.
A method for monitoring an indoor industrial environment by determining deviations from an ideal operational state by operating the system according to any of the above claims, the method comprising: (a) capturing real-time image data using the plurality of cameras physically positioned throughout the indoor industrial environment; (b) processing the image data to establish a game-theoretic competitive ranking framework wherein each camera functions as a player and each captured image represents a move made by the respective camera; (c) performing head-to-head matches between: (i) images from different cameras for assessing relative physical deviations across different zones of the industrial environment, and (ii) current and past images from the same camera for tracking absolute physical deviations over time; (d) evaluating, using the trained artificial intelligence model, which image in each match is closer to the ideal operational state; (e) updating a rating for each camera based on the match results according to the competitive ranking algorithm; (f) determining if the physical deviations from the ideal operational state exceed the predetermined threshold based on the updated ratings; and (g) automatically triggering the physical alarm device or activating the safety mitigation system, or both, if the physical deviations exceed the threshold. 25.
A system for monitoring and controlling safety risks in an indoor industrial environment by determining deviations from an ideal operational state, the system comprising: a plurality of cameras physically positioned throughout the indoor industrial environment for capturing real-time image data; a processor; and a memory storing instructions that, when executed by the processor, cause the processor to process the image data from the plurality of cameras; characterized in that the instructions stored in the memory, when executed by the processor, cause the processor to: (a) establish a competitive technical ranking framework wherein each camera functions as a monitoring node and each captured image represents a data point for spatial-temporal evaluation, the framework configured to identify physical deviations from the ideal operational state; (b) perform comparative evaluations between: (i) images from different cameras for assessing relative physical deviations across different zones of the industrial environment, and (ii) current and past images from the same camera for tracking absolute physical deviations over time; (c) evaluate, using a trained convolutional neural network implemented on the processor and configured to process the image data for recognizing industrial environment features, which image in each comparative evaluation is closer to the ideal operational state, wherein the ideal operational state comprises one or more of: (i) physical arrangement showing a presence of only material, machines, and humans directly involved in operational processes of the industrial environment, (ii) spatial organization of materials and equipment according to predefined safety parameters, and/or (iii) an absence of foreign objects; the evaluation yielding a comparison result comprising one of: a positive status when one camera's image is judged closer to the ideal operational state, a negative status when the other camera's image is judged farther from the ideal
operational state, or a neutral status when no clear difference is detected; (d) update a technical safety rating for each camera based on the comparison results according to a competitive ranking algorithm that adjusts each camera's rating based on the expected and actual outcomes of comparisons, wherein the rating quantifies the degree of physical deviation from the ideal operational state in each camera's monitored area; (e) determine if the physical deviations from the ideal operational state exceed a predetermined safety threshold based on the updated technical safety ratings; and (f) automatically trigger a physical alarm device or activate a safety mitigation system, or both, if the physical deviations exceed the threshold, thereby implementing automated control of safety conditions in the industrial environment by identifying and responding to conditions that increase risk. 26. The system of claim 25, wherein the ideal operational state further comprises: (a) predefined safety parameters for proper spatial organization of materials and equipment, wherein the predefined safety parameters include: (i) maintaining designated clearances around machinery, equipment, and electrical installations according to manufacturer specifications and regulatory standards, wherein the clearances range from 0.6 to 1.2 meters depending on equipment type; (ii) unobstructed emergency egress paths with minimum widths of 0.7 to 1.1 meters; (iii) designated material storage zones with maximum height restrictions and minimum clearances of 0.5 meters from sprinkler heads and 1 meter from heating elements; and (iv) demarcated traffic lanes for personnel and material handling equipment with minimum widths of 0.8 meters for pedestrian-only paths and 1.8 to 3.5 meters for vehicular routes; and (b) wherein the trained convolutional neural network is configured to identify deviations from these predefined safety parameters by detecting encroachment on clearance zones, obstructed pathways,
or improperly positioned equipment. 27. The system of claim 25, wherein the absence of foreign objects comprises the absence of: (a) process-unrelated items including personal belongings, packaging materials, and tools from unrelated work processes; (b) displaced operational materials comprising raw materials, components, or finished products that have fallen, spilled, or been improperly stored outside their designated containers, conveyors, or storage areas; (c) maintenance debris including used parts, packaging from replacement components, cleaning materials, or maintenance tools left behind after repair or service activities; (d) production waste comprising scrap materials, byproducts, shavings, dust, or other process residues that have accumulated beyond acceptable levels or outside designated collection areas; and (e) transient equipment including temporary machinery, dollies, carts, or auxiliary equipment left in walkways, emergency exit paths, or production areas after their immediate purpose has been fulfilled; wherein the trained convolutional neural network is configured to distinguish between items that legitimately belong in the industrial environment and those that constitute deviations from the ideal operational state. 28. The system of claim 25, wherein processing the visible light image data to detect smoke comprises: calculating image energy using Sobel filtering to identify regions in the visible light image data where texture and contrast have been reduced; comparing a current frame's energy map to a reference background frame to identify regions showing significant energy drops; generating bounding boxes around the identified regions; and classifying the identified regions using a trained neural network model and performing temporal consistency checks to reduce false positives. 29.
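Claim 28 describes the smoke path concretely: Sobel-based image energy, comparison against a reference background, then neural-network classification with temporal checks. The classifier and bounding-box stages are omitted below; this numpy-only sketch covers just the energy-drop step, and the 50% drop ratio is an illustrative parameter (the claims name no value):

```python
import numpy as np

# Standard Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def filter2d(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive valid-mode cross-correlation (kernel not flipped;
    equivalent to convolution for gradient-magnitude purposes)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def energy_map(gray: np.ndarray) -> np.ndarray:
    """Gradient magnitude; smoke blurs texture, so energy drops."""
    return np.hypot(filter2d(gray, SOBEL_X), filter2d(gray, SOBEL_Y))

def smoke_candidates(current: np.ndarray, background: np.ndarray,
                     drop_ratio: float = 0.5) -> np.ndarray:
    """Boolean mask where energy fell below drop_ratio x background energy."""
    bg = energy_map(background)
    cur = energy_map(current)
    return (bg > 0) & (cur < drop_ratio * bg)
```

In a full pipeline, connected regions of this mask would receive bounding boxes and be passed to the trained classifier, as the claim recites.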
The system of claim 25, wherein processing the visible light image data to detect fire comprises: performing motion detection via background subtraction on the visible light image data; applying color filtering to identify regions with fire-characteristic color patterns; extracting regions of interest from the filtered frames; and classifying the extracted regions using a neural network trained on fire and non-fire images. 30. The system of claim 25, wherein the instructions further configure the system to assess cleanliness levels within the industrial environment by: comparing current visible light images with historical images from the same locations; quantifying differences in visual characteristics between the current and historical images; and identifying areas requiring maintenance based on the quantified differences, wherein the identified areas are incorporated into the physical deviations for safety risk assessment. 31. The system of claim 25, wherein the instructions further configure the system to: detect unauthorized personnel in the industrial environment based on motion detection and human detection algorithms applied to the visible light image data; and include the detection of unauthorized personnel in the physical deviations for safety risk assessment. 32. The system of claim 25, wherein the instructions further configure the system to: store the thermal image data and the visible light image data in a database; analyze historical trends in the temperature metrics; and automatically adjust the predefined temperature thresholds based on the historical trends. 33. The system of claim 25, wherein the control signals include: alarm signals transmitted to response personnel when temperature anomalies exceed critical thresholds; automatic shutdown signals for equipment in areas where fire or smoke is detected; and notification signals indicating areas requiring maintenance based on the identified physical deviations. 34. 
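Claim 29's fire path is a staged filter before classification: motion via background subtraction, fire-characteristic color filtering, then a neural network on the surviving regions. A sketch of the first two stages assuming RGB inputs; the intensity threshold of 25 and the color rule (red dominant, red > green > blue) are illustrative stand-ins, and the final classifier is stubbed out:

```python
import numpy as np

def motion_mask(frame: np.ndarray, background: np.ndarray,
                thresh: float = 25.0) -> np.ndarray:
    """Background subtraction on grayscale intensity."""
    gray_f = frame.mean(axis=2)
    gray_b = background.mean(axis=2)
    return np.abs(gray_f - gray_b) > thresh

def fire_color_mask(frame: np.ndarray) -> np.ndarray:
    """Crude rule for fire-characteristic hues, assuming RGB channel order."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r > 180) & (r > g) & (g > b)

def fire_candidates(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Pixels that are both moving and fire-colored; a neural network
    trained on fire / non-fire images (omitted here) would make the
    final call on the extracted regions of interest."""
    return motion_mask(frame, background) & fire_color_mask(frame)
```

The staging matters for cost: the cheap motion and color tests discard most of each frame, so the expensive classifier only sees a few candidate regions.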
A computer-implemented method for monitoring and controlling safety risks in an industrial environment comprising a plurality of cameras including at least one infrared thermal camera and at least one visible light camera, the method comprising: acquiring thermal image data from the at least one infrared thermal camera; calculating temperature metrics based on the thermal image data, wherein the temperature metrics comprise minimum temperature, maximum temperature, and average temperature values for defined regions within the industrial environment; detecting temperature anomalies by comparing the calculated temperature metrics with predefined temperature thresholds for the defined regions; acquiring visible light image data from the at least one visible light camera; processing the visible light image data using computer vision algorithms to detect at least one physical condition selected from: smoke, fire, occupancy, and presence of hazardous materials; combining the detected temperature anomalies and the detected physical conditions to identify physical deviations from predetermined operational safety parameters; and generating control signals based on the identified physical deviations to mitigate detected safety risks. 35. The method of claim 34, wherein processing the visible light image data to detect smoke comprises: calculating image energy using Sobel filtering to identify regions in the visible light image data where texture and contrast have been reduced; comparing a current frame's energy map to a reference background frame to identify regions showing significant energy drops; generating bounding boxes around the identified regions; and classifying the identified regions using a trained neural network model and performing temporal consistency checks to reduce false positives. 36. 
The method of claim 34, wherein processing the visible light image data to detect fire comprises: performing motion detection via background subtraction on the visible light image data; applying color filtering to identify regions with fire-characteristic color patterns; extracting regions of interest from the filtered frames; and classifying the extracted regions using a neural network trained on fire and non-fire images. 37. The method of claim 34, further comprising: storing the thermal image data and the visible light image data in a database; analyzing historical trends in the temperature metrics; and automatically adjusting the predefined temperature thresholds based on the historical trends. 38. The method of claim 34, further comprising: implementing a competitive ranking system to assess cleanliness levels within the industrial environment, wherein: a) current images of different facility locations are compared to historical images of the same locations; b) each location is assigned a cleanliness rating; and c) areas with declining cleanliness ratings are identified and incorporated into the physical deviations for safety risk assessment. 39. The method of claim 34, wherein generating control signals comprises at least one of: transmitting alarm signals to response personnel when temperature anomalies exceed critical thresholds; automatically shutting down equipment in areas where fire or smoke is detected; and generating notification signals indicating areas requiring maintenance based on the identified physical deviations. 40. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method of any one of claims 34 to 39. 41.
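The thermal step of claim 34 reduces to region statistics (minimum, maximum, average) compared against predefined per-region thresholds. A sketch assuming the thermal frame arrives as a 2-D array of temperatures in degrees Celsius; the (y0, y1, x0, x1) region format and the 80 °C threshold are hypothetical choices for illustration:

```python
import numpy as np

def temperature_metrics(thermal: np.ndarray, region: tuple) -> dict:
    """Min / max / mean temperature for a rectangular region (y0, y1, x0, x1),
    matching the three metrics recited in claim 34."""
    y0, y1, x0, x1 = region
    patch = thermal[y0:y1, x0:x1]
    return {"min": float(patch.min()),
            "max": float(patch.max()),
            "mean": float(patch.mean())}

def detect_anomaly(metrics: dict, max_allowed: float) -> bool:
    """Flag a region whose peak temperature exceeds its predefined threshold."""
    return metrics["max"] > max_allowed

# Hypothetical 4x4 thermal frame in deg C with one hot spot:
frame = np.full((4, 4), 35.0)
frame[1, 2] = 92.0
m = temperature_metrics(frame, (0, 4, 0, 4))
# m["max"] == 92.0, so this region is anomalous against an 80 deg C threshold.
```

Claims 37 and 32 then make the thresholds adaptive: logged metrics feed a trend analysis that adjusts `max_allowed` per region over time.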
A method for monitoring and controlling safety risks in an indoor industrial environment by determining deviations from an ideal operational state, the method comprising: capturing real-time image data using a plurality of cameras physically positioned throughout the indoor industrial environment; processing the captured image data using a processor executing instructions stored in a memory; establishing a game-theoretic competitive ranking framework wherein each camera functions as a player and each captured image represents a move made by the respective camera, the framework configured to identify physical deviations from predetermined operational safety parameters; performing head-to-head matches between: (i) images from different cameras for assessing relative physical deviations across different zones of the industrial environment, and (ii) current and past images from the same camera for tracking absolute physical deviations over time; evaluating, using a trained artificial intelligence model specifically adapted to recognize industrial environment safety conditions, which image in each match is closer to the ideal operational state, wherein the ideal operational state comprises one or more of: (i) physical arrangement showing only material, machines, and humans directly involved in operational processes of the industrial environment, (ii) proper spatial organization of materials and equipment according to predefined safety parameters, and (iii) an environmental parameter comprising at least temperature within process-specific optimal ranges; generating a match result comprising one of: a win when one camera's image is judged closer to the ideal operational state, a loss when the other camera's image is judged farther from the ideal operational state, or a draw when no clear difference is detected; updating a rating for each camera based on the match results according to a competitive ranking algorithm that adjusts each camera's rating based on the expected 
and actual outcomes of matches, wherein the rating quantifies the degree of physical deviation from safety parameters in each camera's monitored area; determining if the physical deviations from the ideal operational state exceed a predetermined threshold based on the updated ratings; and automatically triggering a physical alarm device or activating a safety mitigation system, or both, if the physical deviations exceed the threshold, thereby implementing automated control of safety conditions in the industrial environment by identifying and responding to conditions that increase risk. 42. The method of claim 41, wherein the trained artificial intelligence model comprises a convolutional neural network, a visual transformer model, or a variant thereof trained on a dataset comprising labeled examples of ideal and non-ideal industrial environments. 43. The method of claim 41, wherein the game-theoretic competitive ranking framework implements one or more of: an Elo rating system, a Glicko-2 rating system, a TrueSkill™ system, or a Bayesian rating system. 44. The method of any of claims 41 to 43, wherein updating the rating for each camera comprises: calculating an expected outcome for each match based on current ratings; comparing the expected outcome to the actual match result; and adjusting the ratings based on the difference between expected and actual outcomes. 45. The method of any of the above claims, wherein the plurality of cameras comprises at least one of visible light cameras, infrared cameras, thermal cameras, or multispectral cameras, or a combination thereof. 46. The method of any of the above claims, further comprising: measuring one or more environmental parameters using at least one sensor; incorporating the measured environmental parameters into the evaluation of the ideal operational state; and adjusting the rating updates based on the measured environmental parameters. 47. 
The method of any of the above claims, further comprising: generating a cleanliness score for each camera based on its updated rating; and displaying the cleanliness scores on a user interface. 48. The method of claim 47, further comprising: tracking changes in the cleanliness scores over time; and generating alerts if the cleanliness scores fall below a predetermined threshold. 49. The method of any of the above claims, further comprising: identifying specific objects or conditions contributing to deviations from the ideal operational state; and including information about the identified objects or conditions in the triggered alarm. 50. The method of any of the above claims, further comprising: dynamically adjusting the threshold for triggering the alarm based on historical data and patterns of deviations. 51. The method of any of the above claims, further comprising: performing image segmentation on the received image data to isolate different elements within the industrial environment. 52. The method of any of the above claims, further comprising: classifying the isolated elements as either contributing to or deviating from the ideal operational state. 53. The method of any of the above claims, further comprising: applying different weightings to different types of physical deviations from the ideal operational state when updating the ratings. 54. The method of any of the above claims, further comprising: generating heatmaps visualizing the spatial distribution of deviations from the ideal operational state across at least a portion of the industrial environment. 55. The method of claim 54, wherein said portion of the industrial environment is defined according to a visual field of at least one camera in the system, and wherein said ideal operational state is defined according to data learned by the artificial intelligence model from a plurality of images received by said at least one camera over a period of time. 56.
The method of any of the above claims, further comprising: implementing a temporal analysis to detect gradual changes in the industrial environment over time that contribute to deviations from the ideal operational state. 57. The method of any of the above claims, further comprising: generating recommended actions to address identified deviations from the ideal operational state; and displaying the recommended actions on a user interface. 58. The method of any of the above claims, further comprising: adjusting the frequency of head-to-head matches based on the rate of change in deviations from the ideal operational state. 59. The method of any of the above claims, wherein the safety mitigation system comprises at least one of an automated ventilation system, an emergency shutdown sequence, a fire suppression system, or an access control system; and wherein activation of the safety mitigation system comprises automatically controlling the operation of the corresponding system based on the type and severity of the detected physical deviations. 60. The method of any of the above claims, further comprising: implementing a federated learning approach, an agent-based learning approach, or a combined approach thereof, to improve the artificial intelligence model using data from multiple industrial environments, from one industrial environment over a plurality of time periods, or a combination thereof. 61. The method of any of the above claims, further comprising: generating a risk assessment score based on the updated ratings of the cameras; and incorporating the risk assessment score into an overall risk determination for the industrial environment. 62. The method of claim 61, further comprising: communicating the risk assessment score to an insurance underwriting system for dynamic adjustment of insurance premiums based on real-time safety conditions. 63.
The method of any of the above claims, further comprising training the artificial intelligence model using a dataset comprising: images scraped from internet sources showing various industrial environments; synthetic images generated using generative models with prompting; and images with visual effects overlaid to simulate smoke, fire, or other hazardous conditions. 64. The method of claim 63, wherein training the artificial intelligence model further comprises applying data augmentation techniques including one or more of: artificial noise, geometric transformations, color adjustments, and exposure adjustments. 65. The method of any of the above claims, wherein evaluating which image is closer to the ideal operational state comprises: for fire detection, performing background subtraction to detect motion, applying color filtering for fire-typical hues, and extracting regions of interest for neural network analysis; and for smoke detection, calculating image energy using Sobel filtering, detecting regions with energy drops, and classifying these regions using a neural network. 66. The method of any of the above claims, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to verify potential hazard detections using a foundation model that provides secondary analysis before triggering alerts. 67. The method of any of the above claims, further comprising storing confidence scores for detected hazards in a temporal buffer to track persistence of detections over time and reduce false positives. 68. The method of any of the above claims, wherein the competitive ranking algorithm incorporates volatility tracking to identify cameras showing increased variation in performance, which may indicate degrading conditions or sensor malfunction. 69.
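Claim 64's augmentation list (noise, geometric transformations, color and exposure adjustments) maps directly onto array operations. A sketch with illustrative magnitudes; the noise standard deviation of 5 and the ±20% exposure range are assumptions, not values from the application:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """One augmented copy of a grayscale training image (values 0-255),
    applying the technique classes named in claim 64."""
    out = image.astype(float)
    out = out + rng.normal(0.0, 5.0, size=out.shape)    # artificial noise
    out = out[:, ::-1]                                  # geometric: horizontal flip
    out = np.clip(out * rng.uniform(0.8, 1.2), 0, 255)  # exposure adjustment
    return out

# Each training image would typically yield several augmented variants,
# e.g. variants = [augment(img) for _ in range(4)].
```

Such augmentation matters here because the training set mixes scraped, synthetic, and effect-overlaid images; perturbing them reduces the model's chance of keying on source-specific artifacts.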
The method of any of the above claims, wherein performing head-to-head matches comprises conducting both cross-sensor matches between different cameras at the same moment in time and historical self-matches comparing a camera's current reading to its historical norms. 70. The method of any of the above claims, wherein the game-theoretic competitive ranking framework is configured to handle non-transitive outcomes where if Camera A outperforms Camera B, and Camera B outperforms Camera C, Camera A might not necessarily outperform Camera C. 71. A computer-implemented system for fire risk assessment in an industrial environment, the system comprising: at least one processor; and a memory coupled to the at least one processor, the memory comprising instructions that, when executed by the at least one processor, cause the system to: receive image data from a plurality of cameras monitoring the industrial environment; implement a game-theoretic ranking framework wherein cameras function as players and images function as moves in a competitive assessment of operational safety; generate head-to-head match results by comparing pairs of images to determine which image is closer to an ideal operational state representing safe operational conditions; update competitive rankings for each camera based on the match results using a rating algorithm that quantifies deviation from safety parameters; calculate a fire risk score based on the updated competitive rankings; and initiate preventive actions when the fire risk score exceeds a predetermined threshold. 72. The system of claim 71, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to display a dashboard visualization presenting the calculated fire risk score along with contributing components including temperature scoring, fire scoring, smoke scoring, and cleanliness scoring. 73.
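The application leaves the exact rating-update formula open (claim 75 refers only to "a mathematical formula that accounts for relative performance in pairwise comparisons"). A classical Elo update is one plausible instantiation of such a competitive ranking, sketched below; the K-factor of 32 and the win/draw/loss encoding are conventional assumptions, not values from the application.

```python
def expected_score(r_a, r_b):
    """Expected win probability of player (camera) A against B under the
    standard Elo logistic model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def update_ratings(r_a, r_b, result, k=32.0):
    """Update both cameras' ratings after one head-to-head match.

    result: 1.0 = camera A's image judged closer to the ideal state,
            0.0 = camera B's image judged closer, 0.5 = draw."""
    e_a = expected_score(r_a, r_b)
    new_a = r_a + k * (result - e_a)
    new_b = r_b + k * ((1.0 - result) - (1.0 - e_a))
    return new_a, new_b
```

A draw between equally rated cameras leaves both ratings unchanged, while a decisive result shifts rating from the "farther from ideal" camera to the other, which is what lets a camera's declining rating signal a deteriorating zone.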
The system of claims 71 or 72, wherein the method is implemented on an edge computing system comprising an on-premise server equipped with specialized GPUs that processes video feeds from the cameras in real-time and periodically uploads snapshot images for longer-term analysis. 74. The system of any of claims 71 to 73, wherein the system is configured to integrate with existing IP camera networks in the industrial environment, providing a plug-in upgrade that adds risk assessment capabilities without requiring replacement of existing camera infrastructure. 75. The system of any of claims 71 to 74, wherein the rating algorithm implements a competitive ranking method that adjusts ratings according to a mathematical formula that accounts for relative performance in pairwise comparisons. 76. The system of any of claims 71 to 75, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to detect fire or smoke by analyzing the image data using a multi-stage detection pipeline comprising motion detection, color filtering, and neural network classification. 77. The system of any of claims 71 to 76, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to detect smoke by performing energy-based image analysis to identify regions where texture and contrast have been reduced, followed by verification using a neural network classifier. 78. The system of any of claims 71 to 77, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to: store a historical record of competitive rankings; and analyze trends in the competitive rankings to identify areas with deteriorating safety conditions. 79. 
The system of any of claims 71 to 78, wherein the memory further comprises instructions that, when executed by the at least one processor, cause the system to calculate the fire risk score by weighting different components according to industry-specific risk factors determined based on historical loss data. 80. A non-transitory computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to: implement a game-theoretic competitive ranking framework for evaluating safety conditions in an industrial environment using image data from multiple cameras; perform pairwise comparisons between images to determine relative deviations from an ideal operational state; update competitive rankings for monitored areas based on the pairwise comparison results; identify potential safety risks based on changes in the competitive rankings; and initiate automated risk mitigation measures when identified risks exceed predetermined thresholds. 81.
A system for automatic fire risk assessment, comprising: an infrared sensor configured to monitor high-risk areas within a facility and to provide temperature data for every pixel in a thermal image; a visible light sensor configured to oversee day-to-day operations within the facility and to visualize elements including at least one of dust, smoke, fire, occupancy, cleanliness, safety equipment, hazardous materials, or potential intruders; a computer processing system operatively coupled to the infrared sensor and the visible light sensor, configured to process information from the sensors and to relay this information to a data storage system; a data storage system configured to store long-term data of the facility, including temperature data from the infrared sensor and events detected by the visible light sensor; and a fire risk scoring mechanism configured to quantify the level of fire risk based on a long-term temperature trend analysis across a multitude of thermal cameras, events from standard cameras, and data from in-person risk assessments, wherein the fire risk scoring mechanism is further configured to dynamically adjust insurance premiums based on the quantified level of fire risk. 82. The system of claim 81, wherein the infrared sensor is further configured to calculate minimum, maximum, and average temperatures for various regions within an image and to trigger alarms in case of temperature threshold exceedance. 83. The system of claim 81, wherein the visible light sensor is further configured to process images using computer vision algorithms to determine the presence of a sufficient fire risk. 84. The system of claim 81, wherein the data storage system is further configured to store manufacturer-specific data affecting fire risk, including the location of the facility and ambient temperatures. 85.
The system of claim 81, wherein the fire risk scoring mechanism is further configured to take into account events detected from images processed by the visible light sensor for calculating the fire risk score. 86. A method for automatic fire risk assessment, comprising the steps of: monitoring a facility using an infrared sensor and a visible light sensor; processing data from the sensors using a computer processing system; storing the processed data in a data storage system; and calculating a fire risk score using a fire risk scoring mechanism based on long-term temperature trend analysis, events detected from images, and data from in-person risk assessments, wherein the calculated fire risk score is used to dynamically adjust insurance premiums. 87. The method of claim 86, wherein the step of monitoring the facility includes capturing temperature data and images of the facility. 88. The method of claims 86 or 87, wherein the step of processing the data includes calculating metrics from the temperature data and analyzing the images to detect fire risks. 89. The method of any of the above claims, wherein the step of storing the processed data includes storing the data in a cloud or on-premise data storage system for later analysis. 90. The method of any of the above claims, wherein the step of calculating the fire risk score includes taking into account events detected from the images processed by the visible light sensor. 91. The method of any of the above claims, further comprising the step of configuring the infrared sensor to calculate minimum, maximum, and average temperatures for various regions within an image and to compare these metrics against predefined or automatically learned thresholds to trigger alarms in case of temperature threshold exceedance. 92.
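The region-metric logic recited in claim 91 (minimum, maximum, and average temperatures per region of interest, compared against limits) might be sketched as follows; the metric names and alarm labels are illustrative assumptions.

```python
def region_metrics(temps):
    """Minimum, maximum, and average temperature for one region of interest,
    given its per-pixel temperature readings."""
    return {"min": min(temps), "max": max(temps), "avg": sum(temps) / len(temps)}

def check_thresholds(metrics, max_limit, avg_limit):
    """Compare region metrics against limits (predefined or learned) and
    return the list of triggered alarms; an empty list means no alarm."""
    alarms = []
    if metrics["max"] > max_limit:
        alarms.append("max_temperature_exceeded")
    if metrics["avg"] > avg_limit:
        alarms.append("avg_temperature_exceeded")
    return alarms
```

Per claims 93 and 136, the same metrics would also be stored and compared against historical data for trend and anomaly analysis.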
The method of any of the above claims, further comprising the step of configuring the visible light sensor to utilize computer vision algorithms to process images for the detection of elements indicative of fire risk, including dust, smoke, fire, and hazardous materials. 93. The method of any of the above claims, further comprising the step of utilizing the computer processing system to compare calculated temperature metrics against historical temperature data stored in the data storage system to identify temperature anomalies indicative of fire risk. 94. The method of any of the above claims, further comprising the step of utilizing the data storage system to store long-term facility data, including temperature trends and events detected by the visible light sensor, for use in subsequent fire risk score calculations. 95. The method of any of the above claims, further comprising the step of dynamically adjusting insurance premiums based on the fire risk score calculated by the fire risk scoring mechanism, wherein the adjustment rewards facilities with lower risk scores and penalizes those with higher risk scores. 96. The method of any of the above claims, further comprising the step of utilizing the fire risk scoring mechanism to integrate data from in-person risk assessments with sensor data to enhance the accuracy of the fire risk score. 97. The method of any of the above claims, further comprising the step of configuring the fire risk scoring mechanism to adjust the fire risk score based on manufacturer-specific data, including the location of the facility and ambient temperatures, which may affect the fire risk. 98. The method of any of the above claims, further comprising the step of employing the computer processing system to relay processed sensor data to a higher-level data storage system, wherein the data is analyzed for long-term trend analysis contributing to the fire risk score. 99.
The method of any of the above claims, further comprising the step of configuring the data storage system to store data from a multitude of thermal cameras and standard cameras, wherein the stored data is analyzed for detecting trends and events that contribute to the overall fire risk assessment. 100. The method of any of the above claims, further comprising the step of implementing the fire risk scoring mechanism to utilize a weighted formula that aggregates quantified measures of detected risk factors, including temperature anomalies and events from standard cameras, to produce a unified fire risk score. 101. The method of any of the above claims, wherein the facility monitored is within the wood products industry, and the monitoring includes capturing temperature data and images specifically for the detection of dust accumulation that may pose a fire risk. 102. The method of any of the above claims, further comprising the step of analyzing the images captured by the visible light sensor to detect the presence of fine-grained wood by-products, wherein the detection of an accumulation beyond a predetermined threshold triggers an alert. 103. The method of any of the above claims, wherein the facility monitored is within the food processing industry, and the monitoring includes the use of the infrared sensor to detect temperature anomalies associated with processing equipment that may result in dust ignition. 104. The method of any of the above claims, wherein the facility monitored includes grain storage facilities, and the monitoring includes the use of the visible light sensor to detect changes in dust levels that may indicate a risk of combustion. 105. The method of any of the above claims, wherein the facility monitored includes flour storage facilities, and the monitoring includes the use of the computer processing system to analyze temperature and visual data for the early detection of smoldering or hot spots indicative of fire risk. 106. 
The method of any of the above claims, wherein the facility monitored involves storage of hazardous materials, and the monitoring includes the use of the infrared sensor to detect abnormal temperature readings indicative of chemical reactions that may pose a fire risk. 107. The method of any of the above claims, wherein the facility monitored is susceptible to electrical faults, and the monitoring includes the use of the infrared sensor to detect overheating electrical components that may lead to fires. 108. The method of any of the above claims, wherein the facility monitored is susceptible to arson, and the monitoring includes the use of the visible light sensor to detect unauthorized presence or activity within the facility during non-operational hours. 109. The method of any of the above claims, wherein the facility monitored involves hot work operations, and the monitoring includes the use of the infrared sensor to detect and track hot spots created during welding or cutting processes that may pose a fire risk. 110. The method of any of the above claims, wherein the facility monitored includes areas where hazardous materials are handled or stored, and the monitoring includes the use of the visible light sensor to detect spills or leaks of flammable or combustible materials. 111. 
A system for monitoring an industrial environment by detecting deviations from an ideal operational state, the system comprising: a camera network comprising a plurality of cameras positioned throughout the industrial environment for capturing image data; a computer vision processor configured to process the image data from the camera network; and a memory storing instructions that, when executed by the computer vision processor, cause the computer vision processor to: (a) analyze the image data using a plurality of monitoring modules to generate individual risk scores; (b) combine the individual risk scores using a weighted formula to generate a computer vision score; (c) receive a classical risk assessment score from a risk assessment module; and (d) provide the computer vision score and the classical risk assessment score to a combined scoring module that generates a unified risk score. 112. The system of claim 111, wherein the plurality of monitoring modules comprises at least one of: a temperature monitoring module configured to process temperature-related data; a fire/smoke/sparks detection module configured to detect fire, smoke, or spark events; a cleanliness monitoring module configured to assess cleanliness levels; a hazardous elements monitoring module configured to detect hazardous materials or conditions; and an intruder detection module configured to detect unauthorized personnel. 113. The system of claim 111 or 112, wherein the plurality of cameras comprises at least one infrared light sensor for capturing thermal information and at least one visible light sensor for capturing visible spectrum data. 114. 
The system of any one of claims 111 to 113, wherein the computer vision processor is configured to implement algorithms comprising at least one of: convolutional neural networks, vision transformer networks, isolation forest, semantic segmentation, generative adversarial networks, reinforcement learning with human feedback, autoencoders, or K-Nearest Neighbors. 115. The system of any one of claims 111 to 114, wherein the weighted formula for generating the computer vision score comprises weights (alpha), factors (sigma), and a bias term (beta). 116. The system of any one of claims 111 to 115, wherein the system further comprises an underwriting module configured to receive the unified risk score and adjust insurance premium prices based on the unified risk score. 117. The system of any one of claims 111 to 116, wherein the risk assessment module comprises components for evaluating different aspects of risk including at least one of: a geographic location component; a past incidents component; a fire safety component; a cleanliness component; and a building code component. 118. The system of claim 112, wherein the temperature monitoring module is configured to: measure minimum, maximum, and average temperatures; analyze temperature distributions; conduct trend analysis; trigger alarms; and detect anomalies based on the infrared data received. 119. The system of claim 118, wherein the temperature monitoring module is further configured to determine temperature anomalies over time by comparing information from the same sensor or group of sensors, or over space by comparing information provided from different sensors or sensor groups. 120. The system of claim 112, wherein the fire/smoke/sparks detection module implements a multi-stage fire detection pipeline including motion detection, color filtering, and region proposal. 121. 
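Claim 115's weighted formula with weights (alpha), factors (sigma), and a bias term (beta) is, on its face, an affine combination of the individual module scores. A minimal sketch, assuming the 0-to-100 risk scale recited in claim 125 and clamping the combined result to that range (the clamping is an assumption, not claimed):

```python
def computer_vision_score(factors, weights, bias=0.0):
    """Affine combination of per-module risk factors:
    score = beta + sum_i(alpha_i * sigma_i), clamped to [0, 100]."""
    if len(factors) != len(weights):
        raise ValueError("one weight per factor is required")
    raw = bias + sum(a * s for a, s in zip(weights, factors))
    return max(0.0, min(100.0, raw))
```

For example, module scores of 50 (temperature), 80 (fire/smoke), and 20 (cleanliness) with weights 0.5, 0.3, 0.2 and zero bias combine to 53.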
The system of claim 120, wherein: the motion detection is implemented via background subtraction to identify areas of activity; the color filtering is applied to isolate pixels matching fire or smoke characteristics; and the region proposal techniques generate candidate areas for further analysis. 122. The system of claim 112, wherein the cleanliness monitoring module is configured to analyze visual imagery to detect dust, debris, or other indicators of cleanliness status. 123. The system of claim 112, wherein the hazardous elements monitoring module is configured to use computer vision techniques to identify potentially dangerous objects or situations within monitored areas. 124. The system of claim 112, wherein the intruder detection module is configured to use motion detection and object classification to identify human presence in restricted areas. 125. The system of any one of claims 112 to 124, wherein the system is configured to generate individual risk scores for each monitoring module, and wherein each risk score comprises a numerical value on a scale from 0 to 100, with higher values indicating higher levels of risk. 126. The system of any one of claims 111 to 125, wherein the camera network is connected to a network switch that distributes data to a compute server and a video management system. 127. The system of claim 126, wherein the video management system is connected to recording storage for storing video data and to display devices for visualization of camera feeds and recorded content. 128. The system of any one of claims 111 to 127, wherein the system further comprises a microphone for audio capture capabilities. 129. The system of any one of claims 111 to 128, wherein the system further comprises an alarm for generating alerts or notifications based on detected conditions. 130. The system of claim 129, wherein the alarm is triggered by the computer vision processor based on analysis of sensor data or received instructions from a network. 131. 
A method for fire detection in an industrial environment using the system of any one of claims 111 to 130, the method comprising: receiving video images; performing motion detection on the video images via background subtraction; performing color filtering by applying HSV/RGB thresholds to identify fire-like colors; generating a region proposal by extracting connected components from combined masks; and applying a neural network to analyze the regions to detect fire. 132. A method for smoke detection in an industrial environment using the system of any one of claims 111 to 130, the method comprising: receiving an input image; calculating image energy using Sobel filtering; detecting energy drops by comparing the current frame's energy map to a reference background frame; generating bounding boxes around regions where significant energy drops have been detected; applying a neural network classifier to analyze the regions; and tracking classification results in a temporal buffer. 133. The method of claim 132, further comprising: determining if energy remains low; identifying a pre-alarm candidate if the energy remains low; and continuing monitoring if the energy does not remain low. 134. A method for analyzing temperature data in an industrial environment using the system of any one of claims 111 to 130, the method comprising: obtaining infrared data; converting the infrared data to temperature measurements; obtaining regions of interest and temperature limits; calculating metrics based on the temperature measurements and regions of interest; determining whether a temperature anomaly is detected; and sending an alarm if a temperature anomaly is detected. 135. The method of claim 134, further comprising storing the temperature data, calculated metrics, and any detected anomalies for future reference or analysis. 136. 
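The fire-detection pipeline of claim 131 (background subtraction, color filtering for fire-like colors, region proposal) can be approximated as below. The RGB thresholds standing in for fire-like hues are invented placeholders, and a real system would pass the returned bounding box to the neural network stage rather than stop here.

```python
import numpy as np

def motion_mask(frame, background, diff_thresh=25.0):
    """Background subtraction: mark pixels whose intensity change (in any
    channel) exceeds the threshold."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    return diff.max(axis=2) > diff_thresh

def fire_color_mask(frame, r_min=180, g_max=170, b_max=120):
    """Crude RGB threshold for fire-like hues (strong red, moderate green,
    low blue); the threshold values are illustrative only."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r > r_min) & (g < g_max) & (b < b_max)

def candidate_region(frame, background):
    """Combine the motion and color masks and return the bounding box
    (y0, x0, y1, x1) of the candidate area, or None if nothing matched."""
    mask = motion_mask(frame, background) & fire_color_mask(frame)
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```

Claim 131 extracts connected components rather than a single box; a single combined bounding box is used here only to keep the sketch short.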
A method for analyzing long-term temperature data in an industrial environment using the system of any one of claims 111 to 130, the method comprising: obtaining long-term infrared data from a database; running statistical analysis on the long-term data; evaluating delta variance in temperature readings over time; running outlier and trend deviation detection; determining if anomalies are present; sending an alarm if anomalies are detected; updating temperature limits; and updating a temperature risk score. 137. A method for training a neural network for fire detection using the system of any one of claims 111 to 130, the method comprising: collecting image data from multiple sources including internet scraped images, synthetic images, and visual effects overlays; compiling the images into a unified training dataset; applying data augmentation techniques to the training dataset; tuning hyperparameters; setting training configuration; and training the neural network using the augmented dataset. 138. The method of claim 137, wherein the data augmentation techniques include at least one of: artificial noise, geometric transformations, color adjustments, compression artifact simulation, and Mixup/Cutmix augmentations. 139. The method of claim 137 or 138, wherein training the neural network comprises using an EfficientNet family model with an Adam optimizer and a cosine annealing learning rate scheduler. 140. A computer vision scoring system implemented on the system of any one of claims 111 to 130, the scoring system deriving a computer vision score from multiple individual scoring components comprising: a temperature score evaluating temperature-related data; a fire/smoke/spark score assessing detection of smoke, fire, and spark events; a cleanliness score quantifying cleanliness levels; a hazardous score evaluating hazardous conditions; and an intruders score tracking unauthorized access detection.
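Claim 139 names a cosine annealing learning rate scheduler for training the EfficientNet model. The standard schedule decays the learning rate smoothly from a maximum to a minimum over the training horizon; the endpoint values below are assumptions for illustration.

```python
import math

def cosine_annealing_lr(step, total_steps, lr_max=1e-3, lr_min=1e-5):
    """Standard cosine annealing: lr(t) = lr_min +
    (lr_max - lr_min) * (1 + cos(pi * t / T)) / 2, for t in [0, T]."""
    cos_term = (1.0 + math.cos(math.pi * step / total_steps)) / 2.0
    return lr_min + (lr_max - lr_min) * cos_term
```

At step 0 the schedule yields lr_max, at the final step lr_min, and at the midpoint the arithmetic mean of the two, which gives the characteristic slow-fast-slow decay often paired with the Adam optimizer.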
Description
Title: SYSTEM AND METHOD FOR INDUSTRIAL RISK ASSESSMENT VIA COMPUTER VISION

Inventors: Drew Hanover, Thomas Laengle, Florian Trautweiler

Field of the Invention

The present invention relates to the field of automated risk assessment in industrial environments and also risk reduction, including computer vision techniques for fire prevention/detection and risk assessment.

Background

In industrial environments, maintaining optimal operating conditions is crucial for both safety and operational efficiency. Traditional risk assessment methods typically involve periodic inspections by qualified risk engineers who evaluate various factors including environmental state, cleanliness, occupancy, electrical installations, and other safety-related parameters. These assessments form the basis for insurance premium pricing and risk management protocols. However, conventional periodic assessments suffer from significant limitations. First, they provide only a snapshot of conditions at a specific moment in time, rather than a continuous representation of day-to-day operations. Second, facilities are often notified in advance of inspections, allowing them to temporarily optimize conditions in ways that do not reflect normal operations. This advance notification may lead to inaccurate risk assessments and potentially dangerous operating conditions between inspections. Furthermore, risk assessments have a direct impact on underwriting, and therefore act as a pricing mechanism for insurance. If the facility is arbitrarily different (read: cleaner) on the day of an inspection, it can positively affect the risk assessment report, resulting in lower premiums for the insured and non-representative risk conditions for the insurer. Traditional fire detection technologies also have limitations, for example in high-ceiling industrial buildings where ceiling-mounted smoke detectors may only detect hazards after substantial development.
These limitations create a need for more comprehensive, continuous monitoring systems that can detect risks at their source and provide more accurate real-time assessment of safety conditions. As a specific example of a type of risk, fire risk assessment is a procedure mandated by many insurance providers to understand the risk of catastrophic loss due to fire damage. Typically, these assessments are conducted once or twice per year, with qualified risk engineers representing an underwriter. The risk engineer’s job is to determine the likelihood and potential scale of a fire based on many factors. These factors may include, but are not limited to, environmental state, cleanliness, occupancy, electrical installations, sprinklers, alarming systems, fire doors, geographical location and of course the intended purpose of the building, i.e. manufacturing, residential, occupational and so on. The risk engineer takes these factors into consideration and provides risk scores to the underwriter. The underwriter then uses these risk scores as a primary factor in insurance premium prices. One challenge with these risk assessments is their periodic nature. Insurance clients are often notified in advance when a risk assessment is to occur. The risk engineers may encourage clients to make the building as prepared as possible before, or on the day of inspection. This means that when the risk engineer arrives to conduct the assessment, the actual condition of the building does not accurately reflect the normal day-to-day operating characteristics. Taking a sawmill or pellet manufacturing plant for example, the owner of the mill may make significant efforts to clean the facility, removing highly flammable sawdust from dangerous areas which would otherwise exist during normal operations. They may also repair dangerous, yet still functional equipment to avoid any demerits to their risk score in order to ultimately receive the lowest possible insurance premiums. 
Furthermore, traditional smoke and fire detection technologies have significant limitations, particularly in industrial buildings with high ceilings. Conventional smoke detectors and sprinklers are typically ceiling-mounted, which can delay detection in large-volume spaces. In such environments, smoke or heat may take considerable time to reach the sensor level, and by the time an alarm is triggered, the fire may already be well developed. This delay can result in extensive damage and contributes to higher insurance premiums across the entire customer segment.

Brief Summary of the Invention

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The present invention relates to a device, system and method comprising computer vision techniques for fire prevention/detection and