EP-4740193-A2 - FIRE AND/OR EVENT DETECTION SYSTEM AND METHOD OF USE THEREOF

Abstract

The present invention provides a fire and/or event detection system. The system includes at least first camera means, configured to operate in a range between 700nm and 15,000nm, and at least second camera means, configured to operate in a range between 380nm and 750nm. The system includes computing means provided in communication with at least the second camera means, wherein the computing means includes a machine learning algorithm arranged to process and classify data obtained from at least the second camera means, and the computing means is arranged to receive and process said data and discern and detect the occurrence of an event within the field of vision of at least the second camera means.

Inventors

  • BROWN, LINCOLN
  • HULME, ROBERT
  • RICHARDSON, JAMES

Assignees

  • Fire Camera Limited

Dates

Publication Date
2026-05-13
Application Date
2024-07-29

Claims (20)

  1. A fire and/or event detection system, said system including: at least first camera means, configured to operate in a range between 700nm and 15,000nm; at least second camera means, configured to operate in a range between 380nm and 750nm; characterized in that the system includes computing means provided in communication with at least the second camera means, said computing means including a machine learning algorithm arranged to process and classify data obtained from at least the second camera means, the computing means arranged to receive and process said data and discern and detect the occurrence of an event within the field of vision of at least the second camera means.
  2. A system according to claim 1, wherein said computing means is in communication with both the first and the second camera means and data from the first camera means is processed alongside data from the second camera means, to discern and detect the occurrence of an event within the field of vision of the first and second camera means.
  3. A system according to claim 1, wherein said machine learning algorithm is programmed to recognize a standard set of working conditions and/or parameters, and thus detect events and/or occurrences falling outside of said conditions and/or parameters.
  4. A system according to claim 1, wherein said first camera means is configured to operate in a range between 700nm and 1,400nm and/or between 8,000nm and 15,000nm.
  5. A system according to claim 1, wherein the second camera means is provided as a lens-less camera means.
  6. A system according to claim 1, wherein said machine learning algorithm is programmed to divide the field of view of the first and/or second camera means into a grid comprising two or more sections and subsequently analyse each section sequentially, in use.
  7. A system according to claim 2, wherein, where a thermal event is detected by the first camera means, the machine learning algorithm is programmed to clarify and/or confirm said event using the second camera means, which will analyse/review the event at the specific location in its field of view, in use.
  8. A system according to claim 1, wherein the first camera means and the second camera means are provided to be located in a single camera head.
  9. A system according to claim 8, wherein the system includes a plurality of camera heads located in a plurality of positions in the environment being monitored, each camera head comprising first camera means and second camera means therein.
  10. A system according to claim 8, wherein the or each camera head may include further sensing means, said further sensing means including temperature sensing means and/or air flow sensing means.
  11. A system according to claim 8, wherein the or each camera head includes air supply, air circulation and/or air blowing means provided therewith.
  12. A system according to claim 11, wherein actuation means are provided which are arranged, in use, to activate the supply of air on to one or more lenses of the first and/or second camera means, and/or circulate air through a housing of the or each camera head, said actuation means provided in communication with temperature and/or air flow sensing means, such that when a predetermined threshold is met, the actuation means activate a supply of air, in use.
  13. A system according to claim 1, wherein the system further includes communication means, arranged to enable communication between at least the first and second camera means, and the computing means.
  14. A system according to claim 13, wherein said communication means are provided to communicate live visual and/or thermal data acquired from the first and/or second camera means to the computing means, which is arranged to analyse and detect events and/or occurrences falling outside of a standard set of working conditions and/or parameters, in use.
  15. A system according to claim 1, wherein the system further includes display means, provided to relay live visual data from the first and/or second camera means.
  16. A system according to claim 15, wherein the display means are located on a camera head in which the first and second camera means are located.
  17. A system according to claim 1, wherein the system further includes alert means, said alert means in communication with communication means of the system and arranged to initiate an alert, in use, when an event and/or occurrence falling outside of a standard set of working conditions and/or parameters has been detected.
  18. A system according to claim 1, wherein said computing means are arranged to permit images/data from the first camera means to be overlaid and/or paralleled with data from the second camera means.
  19. A system according to claim 1, wherein said computing means includes machine vision software incorporated therewith, said software including an artificial neural network model, such that as data is captured and received by the second camera means, in use, said data is compared with the neural network model and any anomalies/discrepancies are identified.
  20. A method of detecting a fire and/or event using a fire and/or event detection system as defined above, said method including the steps of: providing at least first camera means, configured to operate in a range between 700nm and 15,000nm, and at least second camera means, configured to operate in a range between 380nm and 750nm, of the system to observe and monitor a location or environment; and receiving and processing data from at least the second camera means and, utilising the machine learning algorithm incorporated into the computing means of the system, processing and classifying the data to discern and detect the occurrence of an event within the field of vision of the at least second camera means.
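As an illustration only (not part of the patent), the grid division and sequential section-by-section analysis described in claim 6 can be sketched as follows. The frame is modelled as a 2D list of pixel intensities, and the per-section mean-intensity check is a hypothetical stand-in for whatever analysis the machine learning algorithm performs.

```python
# Illustrative sketch of claim 6: divide a camera frame into a grid of
# sections, then analyse each section in sequence. The mean-intensity
# check is a hypothetical stand-in for the real per-section analysis.

def split_into_grid(frame, rows, cols):
    """Divide a frame (list of rows of values) into rows*cols sections."""
    h, w = len(frame), len(frame[0])
    sh, sw = h // rows, w // cols
    sections = []
    for r in range(rows):
        for c in range(cols):
            section = [row[c * sw:(c + 1) * sw]
                       for row in frame[r * sh:(r + 1) * sh]]
            sections.append(section)
    return sections

def analyse_sequentially(sections, threshold):
    """Return indices of sections whose mean intensity exceeds threshold."""
    flagged = []
    for i, section in enumerate(sections):
        values = [v for row in section for v in row]
        if sum(values) / len(values) > threshold:
            flagged.append(i)
    return flagged

# Example: a 4x4 frame with a hot spot in the bottom-right quadrant.
frame = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 90, 90],
    [10, 10, 90, 90],
]
sections = split_into_grid(frame, 2, 2)
print(analyse_sequentially(sections, threshold=50))  # -> [3]
```

Sequential analysis of small sections, rather than the whole frame at once, also makes it straightforward to apply different thresholds or models per section, which relates to the zone-based approach discussed in the description.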

Description

Fire and/or event detection system and method of use thereof

The invention to which this application relates is a fire and/or event detection system and a method of using the same. In particular, the invention relates to a fire detection system for use in an industrial environment.

Fire detection systems exist in a variety of forms and are applied in a variety of environments. One of the most common systems employs one or more infrared (IR) camera systems, which can detect differences in temperature across the environment. Such systems work best in cool or ambient environments, where heat may be readily detected against a cooler backdrop in instances where it should not occur. In industrial environments, such as metalworks, glassworks, or other such “hot” environments, thermal camera systems are also commonly used and are successful in reliably detecting flames or other hot objects (such as a glass gob). However, this can frequently be problematic given the already high ambient temperature in the environment: it can become difficult to discern a region of higher temperature which should not ordinarily be so high. One solution employed by users of such systems is to “blank out” one or more of the observable regions of the camera system, for example those regions where the temperature is expected to be constantly high, such as furnaces, moulding machinery and the like. This is achieved by splitting the observable region into zones and setting a higher temperature threshold (for example, for prompting an alert) in the region(s) of higher temperature.
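The zone-based thresholding described above can be sketched as follows. This is an illustration only, not part of the patent; all zone names and temperature values are hypothetical.

```python
# Illustrative sketch (not from the patent): split the observable region
# into named zones, each with its own alert threshold, so that known-hot
# regions (e.g. a furnace) do not raise constant alerts while the rest of
# the environment keeps a lower default threshold.

def check_zones(zone_readings, zone_thresholds, default_threshold):
    """Return names of zones whose reading exceeds their threshold.

    zone_readings:   dict of zone name -> measured temperature (deg C)
    zone_thresholds: dict of zone name -> per-zone alert threshold;
                     zones absent from this dict use default_threshold.
    """
    alerts = []
    for zone, temp in zone_readings.items():
        threshold = zone_thresholds.get(zone, default_threshold)
        if temp > threshold:
            alerts.append(zone)
    return alerts

readings = {"furnace": 450.0, "conveyor": 95.0, "floor": 30.0}
thresholds = {"furnace": 600.0}   # the furnace is expected to be hot
print(check_zones(readings, thresholds, default_threshold=60.0))
# the conveyor exceeds the default threshold; the furnace does not alert
```

Note the trade-off the description goes on to discuss: raising the threshold in a “blanked-out” zone suppresses nuisance alerts, but any genuine event in that zone whose temperature stays under the raised threshold will be missed.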
While the above method and system serves to prevent constant alerts arising in the regions of higher temperature, and allows the system to focus on other areas in that environment, problems do arise. For example, if an accident occurs within the blanked-out region, such as a stray flame emanating from a furnace, or a glass gob getting stuck in or falling from one of the troughs it is designed to slide down after cutting and scooping, and subsequently catching fire, this would not necessarily be picked up by the system, given that the temperature will most likely fall within the temperature threshold set in place for the blanked-out region. This “blind spot” can therefore have serious consequences.

Further, the thermal camera systems conventionally provided in such environments are inadequate when it comes to detecting other accidents or faults in the system which may not give off an increased heat signature. For example, in glassworks environments it is possible that conveying/delivery systems carrying newly formed glass gobs, which are still hot, may suffer from jamming, or one or more of the gobs may inadvertently fall from the conveyor/delivery system. While in some circumstances a thermal camera may be able to detect the thermal indication of a stray gob having fallen from the conveyor/delivery system, or the flame/fire which may ignite as a result, it is possible that the gob could simply fall out of the line of sight of the camera, so that the heat signature is not detected until the flames have grown substantially, creating more danger. Visual cameras may be used to focus on particularly hot areas/regions, but this requires constant monitoring by personnel, which can be costly; an event may also still be missed, or not spotted in a timely manner, if the person monitoring is not fully paying attention or is not viewing the camera feed at certain points in time.
Similarly, visual cameras may be used to monitor the progress of, for example, glass gobs on a conveyor/delivery system. However, if monitoring personnel simply miss, or do not see, a gob fall from the trough or another section, or they miss a jam occurring in these or other such automated systems (due to poor resolution of the video feed, or to not observing the exact location at the exact moment of the event), the situation may worsen quickly before it draws the attention of monitoring personnel.

It is therefore an aim of the present invention to provide an improved fire detection system which overcomes the aforementioned problems associated with the prior art. It is a further aim of the present invention to provide a method of using an improved fire detection system which overcomes the aforementioned problems associated with the prior art.

According to a first aspect of the invention there is provided a fire and/or event detection system, said system including: at least first camera means, configured to operate in a range between 700nm and 15,000nm; at least second camera means, configured to operate in a range between 380nm and 750nm; characterized in that the system includes computing means provided in communication with at least the second camera means, said computing means including a machine learning algorithm arranged to process and classify data obtained from at least the second camera means, the computing means arranged to receive and process said data and discern and detect the occurrence of an event within the field of vision of at least the second camera means.
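The combination the claims describe — a thermal camera flagging a candidate event and the visual camera's data at the same location then being classified to confirm or dismiss it (claims 2 and 7) — can be sketched as a two-stage pipeline. This is an illustration only, not part of the patent; the simple colour test stands in for the machine learning classifier the claims describe, and all values are hypothetical.

```python
# Illustrative sketch (not from the patent): a two-stage pipeline in the
# spirit of claims 2 and 7. Stage 1: the thermal (first) camera flags
# locations exceeding a temperature threshold. Stage 2: the visual
# (second) camera's pixels at those locations are classified to confirm
# or dismiss the event. looks_like_flame is a hypothetical stand-in for
# the claimed machine learning classifier.

def thermal_candidates(thermal_frame, threshold):
    """Return (row, col) locations whose temperature exceeds threshold."""
    return [(r, c)
            for r, row in enumerate(thermal_frame)
            for c, temp in enumerate(row)
            if temp > threshold]

def confirm_with_visual(candidates, visual_frame, classify):
    """Keep only candidates the visual classifier labels as an event."""
    return [loc for loc in candidates
            if classify(visual_frame[loc[0]][loc[1]])]

def looks_like_flame(pixel):
    """Hypothetical classifier: bright orange-ish pixels count as flame."""
    r, g, b = pixel
    return r > 200 and g > 80 and b < 100

thermal = [[20, 20], [20, 500]]                  # one hot reading
visual = [[(30, 30, 30), (30, 30, 30)],
          [(30, 30, 30), (255, 120, 40)]]        # orange pixel at same spot
candidates = thermal_candidates(thermal, threshold=100)
print(confirm_with_visual(candidates, visual, looks_like_flame))  # -> [(1, 1)]
```

Confirming a thermal trigger against independent visual data is what lets such a system keep a low thermal threshold without drowning operators in false alerts, which is the failure mode of the zone-blanking approach the description criticises.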