
CN-121982794-A - Unmanned remote water plant inspection method and system based on multi-source data fusion and intelligent analysis

CN 121982794 A

Abstract

The application relates to the technical field of intelligent inspection, and in particular to an unmanned remote water plant inspection method and system based on multi-source data fusion and intelligent analysis. In the method, routine monitoring data are collected by fixed video cameras and a sensor network deployed in key areas of the water plant. When the routine data meet a preset abnormality trigger condition, a mobile robot carrying a multi-modal payload is automatically dispatched to the target position to perform multi-angle, fine-grained review collection (visible light, infrared, audio, and the like). The routine data and the review data are then fed together into an equipment relationship model built from the water plant's process flow; by fusing the multi-source heterogeneous data with graph neural network analysis, the root-cause equipment of the abnormality is located on the basis of the physical, electrical, and spatial association relationships among the equipment. A structured expert inspection report containing abnormality details, a multi-dimensional evidence chain, and maintenance suggestions is then generated automatically from the analysis results, realizing unmanned, precise remote diagnosis and decision support.

Inventors

  • Dou Cunkai
  • Wang Fulei
  • Zhou Yujie
  • Tian Jinzan
  • Zhang Qidong

Assignees

  • Inspur Genersoft Co., Ltd. (浪潮通用软件有限公司)

Dates

Publication Date
2026-05-05
Application Date
2025-12-22

Claims (10)

  1. An unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis, characterized by comprising the following steps: S1, acquiring routine monitoring data at a plurality of preset positions through a video monitoring subsystem and an Internet of Things sensor subsystem deployed in the water plant; S2, comparing the routine monitoring data with a preset abnormality trigger condition; when the comparison result meets the trigger condition, determining the target monitoring point that triggered the abnormality, automatically dispatching an autonomous mobile robot to the area where the target monitoring point is located, and acquiring multi-angle refined data of the target monitoring point with the robot's multi-modal sensing payload to obtain review data comprising at least one of a visible-light image, an infrared thermal image, and equipment sound; S3, inputting the routine monitoring data and the review data into an equipment relationship model built from the water plant process flow, performing fusion analysis of the multi-source heterogeneous data, and locating the abnormal equipment on the basis of the association relationships among the equipment; S4, automatically generating, from the analysis and localization results, an expert inspection report containing the abnormality position, type, multi-dimensional evidence chain, and maintenance suggestions.
  2. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 1, wherein in S2 the abnormality trigger condition comprises at least one of the following: an image-matching similarity threshold condition, in which an image collected by the video monitoring subsystem is matched against a normal-state image template and it is judged whether the matching similarity is lower than a preset first threshold; and a sensor-data threshold/range condition, in which data acquired by the Internet of Things sensor subsystem are compared with a preset safe operating threshold or with the historical normal fluctuation range, and it is judged whether the data exceed that threshold or range.
  3. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 1, wherein in S2, when the comparison result meets the abnormality trigger condition and is associated with a plurality of target monitoring points, the method further comprises: generating an inspection queue for the plurality of target monitoring points according to a preset scheduling strategy, wherein the optimization objective of the scheduling strategy comprises at least one of abnormality risk level, process criticality of the equipment, shortest geographic path, or task urgency; and controlling the autonomous mobile robot to visit the area of each target monitoring point in turn according to the inspection queue and acquire the multi-angle refined data.
  4. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 1, wherein in S3 the equipment relationship model built from the water plant process flow is a model whose topology is represented as a directed heterogeneous graph and analyzed by a graph neural network, and the fusion analysis and localization of the multi-source heterogeneous data comprise: S31, extracting feature vectors related to the equipment state from the routine monitoring data and the review data respectively to form a feature vector set; S32, taking key equipment or monitoring points as nodes on the basis of the water plant process flow chart, wherein the target monitoring point that triggered the abnormality is marked as the suspected abnormal node to be analyzed, and establishing a directed heterogeneous graph with edges of different types according to physical connection, electrical control, and spatial proximity relationships; S33, performing information propagation and aggregation on the directed heterogeneous graph using a graph attention network that assigns independent weight parameters to each edge type, updating the node representations and thereby fusing the multi-source heterogeneous data; S34, for the suspected abnormal node, computing the difference between the updated node representation and the historical normal-state node representation, and locating the root-cause equipment of the abnormality by analyzing the propagation of influence among the nodes of the graph.
  5. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 4, wherein S31 comprises: extracting at least one of the following feature vectors from the routine monitoring data: video-image features, i.e. visual features reflecting equipment appearance, state, or meter readings, extracted from visible-light or infrared video streams collected by the video monitoring subsystem; and sensor-reading features, i.e. numerical feature vectors reflecting pressure, flow, liquid level, vibration, gas concentration, or water quality parameters, extracted from data acquired by the Internet of Things sensor subsystem; and extracting at least one of the following feature vectors from the review data: visible-light image features, i.e. visual features of fine texture, structural integrity, or leakage signs at the target monitoring point, extracted from high-definition zoom images acquired by the autonomous mobile robot; infrared thermal-image features, obtained by semantically segmenting the equipment region of the target monitoring point in the infrared thermal image acquired by the autonomous mobile robot and computing the region's maximum temperature, temperature standard deviation, and hot-spot area ratio to form an infrared feature vector; acoustic spectrum features reflecting the mechanical state, bearing wear, or abnormal noise of the equipment, extracted from equipment operating sounds acquired by the autonomous mobile robot; and contour-scan features, i.e. geometric shape features reflecting deformation, displacement, or corrosion of the equipment structure, extracted from laser contour scanning data acquired by the autonomous mobile robot.
  6. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 5, wherein S32 comprises: S321, parsing the process flow chart of the water plant, defining each key physical device or independent monitoring point instance in the flow chart as a unique graph node, and marking the graph node corresponding to the target monitoring point that triggered the abnormality as the suspected abnormal node of the current round of analysis; S322, on the basis of the process flow chart and the actual layout of the equipment, automatically establishing directed or undirected edges of at least two semantic types among the graph nodes: physical connection edges, i.e. directed edges established between equipment nodes with a direct physical connection according to the material flow direction shown in the process flow chart, the direction of each edge following the main material flow; electrical control edges, i.e. directed edges established between a controller node and a controlled-device node according to the control logic diagram or signal connections, each edge pointing from the controller to the controlled device; and spatial proximity edges, i.e. undirected edges established, after computing the Euclidean distance between all pairs of equipment nodes from their actual coordinates in the plant, between node pairs whose distance is smaller than a preset threshold, to represent spatial adjacency; S323, for each graph node, searching the feature vector set for the feature vector matching the actual equipment or monitoring point that the node represents and taking it as the node's initial feature vector; for equipment nodes not covered by a target monitoring point, initializing their initial feature vectors with the feature vectors corresponding to the routine monitoring data.
  7. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 6, wherein S33 comprises: S331, for a node i and a node j connected in the directed heterogeneous graph by an edge of the r-th semantic type, computing an attention coefficient using learnable parameters dedicated to edges of that type, to characterize the relative importance of node j's influence on node i under that type of relationship; S332, on the basis of the computed attention coefficients, propagating information between nodes through a multi-layer network, wherein in each layer the updated feature representation of a target node is obtained by aggregating information within each semantic edge type and then combining the aggregation results across edge types; after all network layers, the final node representations used for root-cause localization are formed, realizing deep fusion of the multi-source heterogeneous data.
  8. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 6, wherein in S331 the attention coefficient e_ij^(r) is computed by the following formula: e_ij^(r) = a_r^T [ W_r h_i ‖ W_r h_j ], wherein r denotes the semantic type of the edge, W_r is the learnable weight matrix dedicated to edges of the r-th type, a_r is the learnable attention vector dedicated to edges of the r-th type, h_i and h_j are the feature representations of node i and node j respectively, and ‖ denotes the vector concatenation (splicing) operation; in S332, the updated feature representation h_i' of the target node i is obtained by the following node update formula: h_i' = σ( Σ_r Σ_{j ∈ N_i^r} α_ij^(r) W_r h_j ), wherein N_i^r is the set of neighbor nodes connected to node i by edges of the r-th type, α_ij^(r) is the attention weight obtained by normalizing e_ij^(r) with the softmax function, and σ denotes a nonlinear activation function.
  9. The unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis according to claim 6, wherein S34 comprises the following operations: S341, computing the Euclidean distance between the final representation h_v of the suspected abnormal node marked in S32, as updated by S33, and the mean h_v^normal of that node's representations over its historical normal operating states, and taking this distance as the abnormality influence score S(v) of the node, i.e. S(v) = || h_v − h_v^normal ||_2; S342, on the basis of the trained graph attention network, computing by gradient back-propagation the gradient of the abnormality influence score S(v) with respect to the other nodes in the directed heterogeneous graph, and quantifying the absolute or squared value of the gradient as each node's influence contribution to the current abnormal state; S343, traversing all nodes for which an influence contribution has been computed, screening out the nodes whose contribution exceeds a preset contribution threshold, and judging those among them that lie upstream of, or in a controlling position relative to, the suspected abnormal node in the process relationship graph to be the root-cause equipment of the current abnormality.
  10. An unmanned remote expert water plant inspection system based on multi-source data fusion and intelligent analysis, characterized in that the system is used for realizing the method of any one of claims 1 to 9 and comprises: a routine acquisition subsystem, deployed at a plurality of preset positions in the water plant and used for acquiring routine monitoring data; an autonomous mobile robot, used for navigating autonomously to the target area according to a scheduling instruction and performing multi-angle, close-range data acquisition at the target monitoring points; an edge computing gateway, deployed on the water plant site and communicatively connected to the routine acquisition subsystem, used for receiving and caching the routine monitoring data, comparing them with a preset abnormality trigger condition, and generating a robot scheduling instruction containing the target monitoring point identifier when the comparison result meets the trigger condition; a cloud intelligent analysis platform, communicatively connected to the edge computing gateway and the autonomous mobile robot, used for receiving and storing the routine monitoring data from the routine acquisition subsystem and the review data from the autonomous mobile robot, running the equipment relationship model built from the water plant process flow to perform fusion analysis of the multi-source heterogeneous data and locate the root-cause equipment of the abnormality, and automatically generating an expert inspection report from the fusion analysis and localization results; and an inspection management subsystem, providing a visual human-machine interface, connected to the cloud intelligent analysis platform, and used for configuring inspection plans, presetting abnormality trigger conditions and scheduling strategies, displaying inspection task states, real-time video, equipment states, and historical data, and receiving and presenting the expert inspection reports generated by the cloud intelligent analysis platform.
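The two trigger conditions of claim 2 reduce to simple predicates. A minimal sketch follows; the `TriggerConfig` container, the threshold value, and the sensor identifier are illustrative assumptions, not part of the claim.

```python
from dataclasses import dataclass, field

@dataclass
class TriggerConfig:
    # "Preset first threshold" for image-template similarity (assumed value).
    similarity_threshold: float = 0.85
    # Safe operating range per sensor: sensor_id -> (low, high) (assumed values).
    safe_ranges: dict = field(default_factory=lambda: {"pressure_01": (0.2, 0.6)})

def image_triggered(similarity: float, cfg: TriggerConfig) -> bool:
    """Trigger when similarity to the normal-state image template falls below the threshold."""
    return similarity < cfg.similarity_threshold

def sensor_triggered(sensor_id: str, value: float, cfg: TriggerConfig) -> bool:
    """Trigger when a sensor reading leaves its safe range (or historical normal band)."""
    low, high = cfg.safe_ranges[sensor_id]
    return not (low <= value <= high)
```

In the claimed system this check runs on the edge computing gateway, which then emits the robot scheduling instruction.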
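When several points trigger at once, claim 3 orders them by at least one of risk level, process criticality, path length, or urgency. A sketch using one assumed composite key (risk first, then criticality, then distance) — the claim permits other orderings:

```python
def build_patrol_queue(targets):
    """Order target monitoring points for one robot run.

    targets: list of dicts with 'id', 'risk' (higher = more urgent),
    'criticality' (process importance), and 'distance' (metres to travel).
    The composite key below is one illustrative choice among the
    optimization objectives the claim allows.
    """
    return sorted(targets, key=lambda t: (-t["risk"], -t["criticality"], t["distance"]))
```
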
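The infrared feature vector of claim 5 (maximum temperature, temperature standard deviation, and hot-spot area ratio over the segmented device region) can be computed directly from a temperature map. The hot-spot definition below (mean plus a 10 °C margin) is an assumption, since the claim does not fix one.

```python
import numpy as np

def infrared_features(thermal, mask, hotspot_delta=10.0):
    """Claim-5 infrared feature vector for a segmented device region.

    thermal: 2-D temperature map (deg C); mask: boolean device-region mask
    (the output of the semantic segmentation step).
    hotspot_delta: assumed margin over the region mean defining a hot spot.
    Returns [T_max, T_std, hot-spot area ratio].
    """
    region = thermal[mask]
    t_max = float(region.max())
    t_std = float(region.std())
    ratio = float(np.mean(region > region.mean() + hotspot_delta))
    return np.array([t_max, t_std, ratio])
```
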
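Claims 7 and 8 describe a per-edge-type attention layer: a dedicated W_r and a_r for each semantic type, softmax normalization over neighbours, and summation across edge types. A NumPy sketch under those definitions; the choice of ELU for the nonlinearity σ is an assumption, and the edge orientation (information flowing src → dst) is taken from claim 6's edge semantics.

```python
import numpy as np

def gat_layer(H, edges_by_type, W, a):
    """One relation-aware graph-attention layer in the spirit of claims 7-8.

    H: (N, d) node features; edges_by_type: {r: [(src, dst), ...]};
    W[r]: (d_out, d) weight matrix dedicated to edge type r;
    a[r]: (2*d_out,) attention vector dedicated to edge type r.
    """
    n, d_out = H.shape[0], next(iter(W.values())).shape[0]
    out = np.zeros((n, d_out))
    for r, edges in edges_by_type.items():
        Wh = H @ W[r].T                          # W_r h for every node
        incoming = {}                            # dst -> list of (src, logit)
        for src, dst in edges:
            # e_ij^(r) = a_r . [W_r h_i || W_r h_j], i = dst, j = src
            logit = a[r] @ np.concatenate([Wh[dst], Wh[src]])
            incoming.setdefault(dst, []).append((src, logit))
        for dst, pairs in incoming.items():
            logits = np.array([l for _, l in pairs])
            alpha = np.exp(logits - logits.max())
            alpha /= alpha.sum()                 # softmax over dst's neighbours
            for (src, _), w in zip(pairs, alpha):
                out[dst] += w * Wh[src]          # sum over r and j of alpha * W_r h_j
    return np.where(out > 0, out, np.expm1(out)) # ELU as the assumed sigma
```

Stacking several such layers and reading off the final node representations corresponds to the "after all network layers" step of S332.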
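Claim 9's scoring (S341) and screening (S343) steps reduce to a distance computation plus a filtered traversal. The sketch below takes the gradient-derived contribution values of S342 as precomputed inputs, since that step needs the trained network; all names are illustrative.

```python
import numpy as np

def anomaly_score(h_final, h_normal_mean):
    """S341: Euclidean distance between the updated representation of the
    suspected abnormal node and its historical normal-state mean."""
    return float(np.linalg.norm(np.asarray(h_final) - np.asarray(h_normal_mean)))

def screen_root_causes(contributions, upstream_of, suspect, threshold):
    """S343: keep nodes whose influence contribution (assumed already
    quantified, e.g. as |gradient|) exceeds the threshold AND that lie
    upstream of, or control, the suspect node in the process graph."""
    return sorted(n for n, c in contributions.items()
                  if c > threshold and n in upstream_of[suspect])
```
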

Description

Unmanned remote water plant inspection method and system based on multi-source data fusion and intelligent analysis

Technical Field

The application relates to the technical field of intelligent inspection, in particular to an unmanned remote water plant inspection method and system based on multi-source data fusion and intelligent analysis.

Background

With the ongoing construction of intelligent water plants, higher demands are placed on safe, stable, and efficient operation and maintenance. Traditional manual inspection has inherent drawbacks such as low efficiency, high cost, strong subjectivity, and high safety risk, and inspection quality depends heavily on personnel experience. In the prior art, although some water plants have introduced fixed video monitoring and Internet of Things sensors, the individual systems operate independently and form data islands: they can provide only single-point threshold alarms or video security monitoring, and lack comprehensive diagnostic capability for equipment appearance abnormalities (such as leakage and corrosion), structural problems, and multi-factor coupled faults. The rich visual information in the video monitoring is not deeply mined for equipment state analysis, and the fixed sensor data are one-dimensional, making it difficult to reflect the overall health of the equipment. Meanwhile, existing schemes cannot effectively integrate fixed monitoring, mobile robot inspection, and equipment process knowledge; the response cycle from abnormality discovery to expert diagnosis is long, early warning and root-cause localization cannot be achieved, and small problems easily evolve into large faults.
Therefore, a water plant inspection system capable of unmanned, remote, intelligent, expert-level diagnosis is needed to break down the data barriers and realize deep fusion and intelligent analysis of multi-source information, thereby raising the level of operation and maintenance modernization.

Disclosure of Invention

To solve the above problems, the invention provides an unmanned remote water plant inspection method and system based on multi-source data fusion and intelligent analysis. By deeply fusing video monitoring, Internet of Things sensors, and an autonomous mobile robot, combined with advanced multi-source data fusion and artificial intelligence algorithms, comprehensive, accurate, efficient, and safe remote intelligent inspection of the water plant's operating state is achieved, and a remote expert can complete high-level inspection work without physically visiting the site.

In a first aspect, the technical scheme of the invention is an unmanned remote water plant inspection method based on multi-source data fusion and intelligent analysis, comprising the following steps: S1, acquiring routine monitoring data at a plurality of preset positions through a video monitoring subsystem and an Internet of Things sensor subsystem deployed in the water plant; S2, comparing the routine monitoring data with a preset abnormality trigger condition; when the comparison result meets the trigger condition, determining the target monitoring point that triggered the abnormality, automatically dispatching an autonomous mobile robot to the area where the target monitoring point is located, and acquiring multi-angle refined data of the target monitoring point with the robot's multi-modal sensing payload to obtain review data comprising at least one of a visible-light image, an infrared thermal image, and equipment sound; S3, inputting the routine monitoring data and the review data into an equipment relationship model built from the water plant process flow, performing fusion analysis of the multi-source heterogeneous data, and locating the abnormal equipment on the basis of the association relationships among the equipment; S4, automatically generating, from the analysis and localization results, an expert inspection report containing the abnormality position, type, multi-dimensional evidence chain, and maintenance suggestions.

By synthesizing the multi-source heterogeneous data, the system can capture how faults propagate differently across the different types of association relationships. By computing abnormality influence scores and quantifying contributions via gradient back-propagation, the system can achieve accurate root-cause localization, greatly reducing false alarms and missed alarms.

As a preferable aspect of the present invention, in S2 the abnormality trigger condition includes at least one of the following: an image-matching similarity threshold condition, in which the image collected by the video monitoring subsystem is matched against a normal-state image template and it is judged whether the matching similarity is lower than a preset first threshold; and comparing the data acquired by t