
CN-121278787-B - Integrity auditing method and system in edge computing environment


Abstract

The invention relates to the field of edge computing data security, and in particular to an integrity auditing method and system in an edge computing environment. It mainly addresses the pain points of the current cloud audit model in edge scenarios: high latency and large bandwidth consumption. A data block aggregation operation is performed on the edge node and Merkle hash tree path evidence is extracted, forming dual-verification integrity evidence; the aggregation evidence and the path evidence are then verified using a bilinear mapping to realize efficient and credible auditing. Dynamic data updates and precise localization of damaged blocks are supported, communication overhead is significantly reduced, and the real-time credibility of edge data is ensured.

Inventors

  • YAN WENYI

Assignees

  • Sheyang High-Tech Research Institute of Nanjing University (南京大学射阳高新技术研究院)

Dates

Publication Date
2026-05-08
Application Date
2025-09-26

Claims (10)

  1. An integrity auditing method in an edge computing environment, characterized by the steps of: S1, performing a block hash operation on an original data file using a cryptographic hash function to generate a Merkle hash tree structure and a file tag for the original data file; S2, selecting a subset of the block index set of the original data file by random sampling, and generating an audit challenge request containing random factors; S3, performing an aggregation operation on the target data blocks stored on the edge server according to the audit challenge request to generate an aggregation component for the target data blocks; S4, extracting the authentication path information corresponding to the target data block indices from the Merkle hash tree structure to generate a path evidence set; S5, merging the aggregation component and the path evidence set to form an integrity proof; S6, performing a double verification calculation on the integrity proof using the file tag and the random factors in the audit challenge request, and determining the integrity state of the original data file; S7, in response to an update request and after signature verification, performing the operation on the target data block of the original data file, updating the Merkle hash tree structure, and generating a new file tag; and S8, performing anomaly detection and analysis verification on the target data blocks based on an integrity state of verification failure, and generating an audit report containing location information for damaged data blocks.
  2. The integrity auditing method in an edge computing environment according to claim 1, wherein performing the block hash operation on the original data file using the cryptographic hash function in S1 comprises: after partitioning the original data file into blocks, computing the hash value of each data block using the SHA-256 algorithm; constructing the Merkle hash tree structure with the hash values as leaf nodes; extracting the root hash value of the Merkle hash tree structure; and signing a message containing the root hash value with the data owner's private key to generate the file tag.
  3. The integrity auditing method in an edge computing environment according to claim 1, wherein selecting a subset of the block index set of the original data file by random sampling in S2 comprises: randomly selecting a subset from the block index set of the original data file; generating a random factor for each index in the subset; combining the subset indices with their corresponding random factors; and adding a timestamp to generate the audit challenge request.
  4. The integrity auditing method in an edge computing environment according to claim 1, wherein performing the aggregation operation on the target data blocks stored on the edge server according to the audit challenge request in S3 comprises: parsing the target data block indices from the audit challenge request; retrieving the contents of the target data blocks from the edge server; extracting the random factors from the audit challenge request; and performing a homomorphic aggregation operation on the target data block contents and the random factors to generate the aggregation component.
  5. The integrity auditing method in an edge computing environment according to claim 1, wherein extracting the authentication path information corresponding to the target data block indices from the Merkle hash tree structure in S4 comprises: locating the leaf nodes in the Merkle hash tree according to the target data block indices in the audit challenge request; extracting the path from each leaf node to the root node from the Merkle hash tree structure; collecting the hash values of all sibling nodes along the path; and ordering the sibling node hash values by level to generate the path evidence set.
  6. The integrity auditing method in an edge computing environment according to claim 1, wherein merging the aggregation component with the path evidence set in S5 comprises: encoding the aggregation component into a first verification field; encoding the hash value sequence of the path evidence set into a second verification field; combining the first verification field with the second verification field; and adding an edge node identifier to form the integrity proof.
  7. The method according to claim 1, wherein performing the double verification calculation on the integrity proof using the file tag and the random factors in the audit challenge request in S6 comprises: parsing the data owner's public key and the signed root hash value from the file tag; verifying the aggregation component in the integrity proof using the random factors in the audit challenge request; reconstructing a root hash value from the path evidence set and comparing it with the signed root hash value; and determining that the original data file is complete when both verifications pass.
  8. The method according to claim 1, wherein, in response to the update request and after signature verification, performing the operation on the target data block of the original data file in S7 comprises: receiving an update request containing the target data block index; after verifying the signature of the update request, performing the operation on the target data block; updating the node hash values along the affected path in the Merkle hash tree structure; and generating a new root hash value and updating the file tag.
  9. The method according to claim 1, wherein performing anomaly detection and analysis verification on the target data blocks based on the integrity state of verification failure in S8 comprises: when the integrity state verification fails, analyzing the result of the double verification calculation; if the aggregation verification fails, marking all target data blocks in the audit challenge request as suspicious; if the path verification fails, locating the anomalous node in the path evidence set; and outputting an audit report containing the suspicious data block indices or the anomalous node positions.
  10. An integrity auditing system in an edge computing environment, comprising: a data preprocessing module for performing the block hash operation on the original data file using a cryptographic hash function to generate a Merkle hash tree structure and a file tag for the original data file; an audit challenge module for selecting a subset of the block index set of the original data file by random sampling and generating an audit challenge request containing random factors; an edge computing module for performing an aggregation operation on the target data blocks stored on the edge server according to the audit challenge request to generate an aggregation component for the target data blocks; an evidence synthesis module for merging the aggregation component and the path evidence set to form an integrity proof; a double verification module for performing a double verification calculation on the integrity proof using the file tag and the random factors in the audit challenge request and determining the integrity state of the original data file; a dynamic update module for, in response to the update request and after signature verification, performing the operation on the target data block of the original data file, updating the Merkle hash tree structure, and generating a new file tag; and an error localization module for performing anomaly detection and analysis verification on the target data blocks based on an integrity state of verification failure and generating an audit report containing location information for damaged data blocks.
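The Merkle-tree mechanics that claims 2, 5, and 7 rely on (leaf hashing, sibling-path extraction, root reconstruction) can be sketched as follows. The patent does not disclose concrete tree-building details, so this is a minimal illustration under stated assumptions: SHA-256 leaves, and odd levels padded by duplicating the last node. All function names here are illustrative, not taken from the patent.

```python
import hashlib

def h(data: bytes) -> bytes:
    # SHA-256, the hash function named in claim 2
    return hashlib.sha256(data).digest()

def build_tree(blocks):
    """Build a Merkle hash tree; returns a list of levels, leaves first (claim 2)."""
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                      # assumption: duplicate last node on odd levels
            level = level + [level[-1]]
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def auth_path(levels, index):
    """Collect sibling hashes from leaf `index` up to the root: the path evidence of claim 5."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sib = index ^ 1                         # sibling position at this level
        path.append((sib < index, level[sib]))  # (sibling-is-on-the-left, sibling hash)
        index //= 2
    return path

def reconstruct_root(leaf_hash, path):
    """Recompute the root from a leaf and its path evidence, as in claim 7's second check."""
    node = leaf_hash
    for is_left, sib in path:
        node = h(sib + node) if is_left else h(node + sib)
    return node
```

The verifier would compare the reconstructed root against the root hash value signed in the file tag; a mismatch at this step corresponds to the "path verification fails" branch of claim 9.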

Description

Integrity auditing method and system in an edge computing environment

Technical Field

The invention relates to the field of edge computing data security, and in particular to an integrity auditing method and system in an edge computing environment.

Background

In an edge computing environment, data is stored in a scattered manner on heterogeneous nodes near the user side, and the traditional cloud audit model faces a fundamental architectural defect. Existing integrity verification schemes rely on a centralized processing mechanism that requires edge nodes to transmit massive amounts of raw data to a remote audit center, causing a severe transmission bottleneck in low-bandwidth wide-area network environments. In high-frequency audit scenarios such as the industrial Internet of Things, continuous data backhaul saturates network channels and sharply increases service data transmission delay, so the millisecond response requirements of real-time control applications cannot be met.

Existing audit protocols are seriously deficient in supporting dynamic data updates. The mainstream static verification model assumes that data never changes; when data insertion, deletion, or modification occurs at the edge, global parameter reconstruction or full data re-signing is triggered. In real-time stream processing scenarios such as intelligent transportation, this high-overhead update mechanism widens the service interruption window, so the auditing system loses its adaptability to a dynamic edge environment.

Fault localization capability is a significant weak point of distributed architectures. Traditional schemes can only return a binary judgment of data integrity and cannot precisely locate a specific damaged data block within a multi-node network. When an edge gateway suffers silent data corruption, operations personnel are forced to inspect nodes one by one, and the fault recovery time far exceeds service-level agreement requirements. Multi-replica peer enhancement schemes incur a great deal of redundant storage overhead, which conflicts sharply with the resource constraints of edge devices. Therefore, we propose an integrity auditing method and system in an edge computing environment to solve the above problems.

Disclosure of Invention

The invention provides an integrity auditing method and system in an edge computing environment, the core of which is to construct a lightweight double-verification architecture. After the integrity proof is built by combining the two components, the consistency of the aggregation evidence and the path evidence is verified using a bilinear mapping, thereby realizing efficient and credible auditing. The aim of the invention is achieved by the following technical scheme. An integrity auditing method in an edge computing environment comprises the following steps: S1, performing a block hash operation on an original data file using a cryptographic hash function to generate a Merkle hash tree structure and a file tag for the original data file; S2, selecting a subset of the block index set of the original data file by random sampling, and generating an audit challenge request containing random factors; S3, performing an aggregation operation on the target data blocks stored on the edge server according to the audit challenge request to generate an aggregation component for the target data blocks; S4, extracting the authentication path information corresponding to the target data block indices from the Merkle hash tree structure to generate a path evidence set; S5, merging the aggregation component and the path evidence set to form an integrity proof; S6, performing a double verification calculation on the integrity proof using the file tag and the random factors in the audit challenge request, and determining the integrity state of the original data file; S7, in response to an update request and after signature verification, performing the operation on the target data block of the original data file, updating the Merkle hash tree structure, and generating a new file tag; and S8, performing anomaly detection and analysis verification on the target data blocks based on an integrity state of verification failure, and generating an audit report containing location information for damaged data blocks. Preferably, when the block hash operation is performed on the original data file using the cryptographic hash function in S1, the method comprises: after the original data file is partitioned into blocks, computing the hash value of each data block using the SHA-256 algorithm; constructing the Merkle hash tree structure with the hash values as leaf nodes; extracting the root hash value of the Merkle hash tree structure; and signing a message containing the root hash value with the data owner's private key to generate the file tag. Preferably, the selecting a
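The challenge-and-aggregate exchange of steps S2, S3, and S6 can be sketched as below. The patent specifies bilinear-pairing verification of homomorphically aggregated tags but discloses no concrete algebra, so this pairing-free stand-in only illustrates the structure: a random subset of indices with per-index random factors and a timestamp (claim 3), a linear aggregation of the challenged blocks (claim 4), and a verifier-side check against tags stored at setup. The modulus Q, the tag construction, and all names are assumptions for illustration.

```python
import hashlib
import secrets
import time

Q = 2**61 - 1  # illustrative prime modulus (assumption, not from the patent)

def make_challenge(num_blocks, sample_size):
    """S2 / claim 3: random index subset, one random factor per index, plus a timestamp."""
    idx = secrets.SystemRandom().sample(range(num_blocks), sample_size)
    return {"pairs": [(i, secrets.randbelow(Q - 1) + 1) for i in idx],
            "timestamp": time.time()}

def block_tag(block: bytes) -> int:
    """Setup-time per-block tag held by the verifier (simplified stand-in for a signed tag)."""
    return int.from_bytes(hashlib.sha256(block).digest(), "big") % Q

def aggregate(blocks, challenge):
    """S3 / claim 4: linear (homomorphic-style) aggregation of the challenged blocks."""
    mu = 0
    for i, nu in challenge["pairs"]:
        m_i = int.from_bytes(hashlib.sha256(blocks[i]).digest(), "big")
        mu = (mu + nu * m_i) % Q
    return mu

def verify_aggregate(tags, challenge, mu):
    """S6, first check: recompute the expected aggregate from the stored tags."""
    return sum(nu * tags[i] for i, nu in challenge["pairs"]) % Q == mu
```

In the patented scheme this aggregation check is paired with the Merkle path check (claim 7); a failure here corresponds to the "aggregation verification fails" branch of claim 9, which marks every challenged block as suspicious.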