CN-121979873-A - Enterprise informatization management integrated platform based on big data
Abstract
The invention belongs to the technical field of enterprise informatization management and big data integration, and particularly relates to an enterprise informatization management integrated platform based on big data, which comprises an audit element unified structured acquisition module, a main body unique identification analysis and failure closed-loop module, a caliber version and field mapping version joint locking module, and an evidence index binding and consistency verification packaging module. The platform reads interfaces and field mappings from business system logs according to a configuration table, writes normalized records into an event ledger and sets their processing state, performs main body gating on original account entries to generate main body identifiers and write them back, writes a reason code and blocks the record when gating fails while returning a failure evidence index, deterministically serializes the caliber script nodes to generate a lineage version chain fingerprint, deterministically serializes event fragments to compute digests and build a batch digest root, locates the first inconsistent sequence number when recalculation disagrees, and schedules re-collection, so as to output a rechecked audit evidence package. The invention achieves consistent calibers and traceable evidence.
Inventors
- JIANG DONGPO
Assignees
- 淮安东创兴科科技有限公司
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2026-01-22
Claims (10)
- 1. An enterprise informatization management integrated platform based on big data, characterized in that: the audit element unified structured acquisition module M1 performs element standardization on the operation logs generated by each business system of an enterprise, generates audit event records with a uniform structure, writes them into an event ledger, and sets the processing state identifier to original account entry; the main body unique identification analysis and failure closed-loop module M2 performs identity gating, based on the structured audit event records, on records whose processing state identifier is original account entry, to generate a unique main body identifier and write it back to the event ledger; the caliber version and field mapping version joint locking module M3 performs caliber locking on enterprise index calibers and field mappings based on the unique main body identifier, generates caliber lineage version chains, and writes them into a caliber lineage library; and the evidence index binding and consistency verification packaging module M4 performs evidence solidification, based on the caliber lineage version chains, on event ledger records that passed main body gating and caliber lineage version chains that passed caliber locking, generates a rechecked audit evidence package, and outputs it.
- 2. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the audit element unified structured acquisition module are as follows: a data source access parameter set is generated by reading data source configuration records; the interface calling and authentication unit initiates interface calls to the interface addresses and verifies the authentication credentials to generate an original operation log record set; when an interface returns a success status code, the interface calling and authentication unit writes the original operation log record set into a to-be-mapped cache region to generate to-be-mapped cache records; when an interface returns an unsuccessful status code, the data source identifier, failure time and failure reason code are written into an acquisition exception record and the data source identifier is written into a retry queue to generate an exception handling record; when the retry count reaches the retry upper limit, the data source identifier is written into a skip queue to generate a skip handling record; the field mapping and value domain normalization unit performs unified field renaming and fixed enumeration value mapping table conversion on the to-be-mapped cache records to generate a uniformly structured audit event record set; and when a record returned by the interface is missing the operation time field or the operation account number field, the field mapping and value domain normalization unit writes the record into a missing-field ledger to generate a reject list, and writes the remaining records into the event ledger as original account entries.
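The field renaming, enumeration value conversion, and missing-field rejection described above can be sketched as follows. This is a minimal illustration, not the claimed implementation; the mapping tables, required-field set, and reason code format are all assumptions.

```python
# Hypothetical sketch of the field mapping and value-domain normalization step:
# raw log fields are renamed to unified names, enumeration values are converted
# through a fixed mapping table, and records missing required fields are rejected.
FIELD_MAP = {"op_ts": "operation_timestamp", "acct": "operation_account", "act": "operation_type"}
ENUM_MAP = {"operation_type": {"R": "read", "W": "write", "D": "delete"}}
REQUIRED = {"operation_timestamp", "operation_account"}

def normalize(raw_record):
    """Return (unified_record, None) on success or (None, reason_code) on rejection."""
    unified = {FIELD_MAP.get(k, k): v for k, v in raw_record.items()}
    for field, table in ENUM_MAP.items():
        if field in unified:
            unified[field] = table.get(unified[field], unified[field])
    missing = REQUIRED - {k for k, v in unified.items() if v is not None}
    if missing:
        return None, "MISSING_FIELD:" + ",".join(sorted(missing))
    return unified, None

ok, err = normalize({"op_ts": "2026-01-22T10:00:00", "acct": "u001", "act": "R"})
bad, reason = normalize({"act": "W"})  # missing timestamp and account: goes to reject list
```

A real platform would drive `FIELD_MAP` and `ENUM_MAP` from the data source configuration records rather than hard-coding them.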
- 3. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the audit element unified structured acquisition module further comprise: the event unique identifier generation unit performs deterministic serialization on the uniformly structured audit event record set according to a fixed field order, connecting adjacent fields with a fixed separator and writing a fixed placeholder for null fields, to generate an event fragment string as the unique string input; performs secure hash calculation on the event fragment string to generate an event unique identifier; and, when an audit event record is marked as a reject list record, does not generate an event unique identifier, retains the missing-field reason code to generate a traceable reject list record, and blocks the record from entering the event ledger writing and state setting flow and the subsequent main body analysis flow.
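The deterministic serialization and identifier generation in the step above can be sketched as follows; the field order, separator, and placeholder are illustrative assumptions, and SHA-256 stands in for the unspecified secure hash.

```python
# Sketch of event fragment serialization and event unique identifier generation:
# fields are joined in a fixed order with a fixed separator, null fields become
# a fixed placeholder, and the fragment string is hashed.
import hashlib

FIELD_ORDER = ["event_time", "operation_account", "operation_type", "object_id"]
SEP, PLACEHOLDER = "\x1f", "<NULL>"  # assumed separator and null placeholder

def event_fragment(record):
    return SEP.join(PLACEHOLDER if record.get(f) is None else str(record[f])
                    for f in FIELD_ORDER)

def event_uid(record):
    return hashlib.sha256(event_fragment(record).encode("utf-8")).hexdigest()

r = {"event_time": "t1", "operation_account": "u001",
     "operation_type": "read", "object_id": None}
uid1 = event_uid(r)
uid2 = event_uid(dict(r))  # same content yields the same identifier: serialization is deterministic
```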
- 4. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the main body unique identification analysis and failure closed-loop module are as follows: the catalog equivalence matching unit reads the operation account number from a record whose processing state identifier in the event ledger is original account entry, and performs equivalence comparison with the account number index of the unified identity catalog to generate a matching hit flag and a first candidate main body identifier; when the matching hit flag is a miss, the hash consistency feature generation unit reads the mobile phone number field, mailbox field and job number field from the event ledger record and performs standardization to generate an event-side identity element set, searches the unified identity catalog according to the event-side identity element set to generate a catalog-side candidate identity element set, and performs secure hash calculation and equivalence comparison on the event-side and catalog-side candidate identity elements respectively to generate a mobile phone number consistency flag, a mailbox consistency flag and a job number consistency flag; the main body matching score calculation unit performs weighted summation on the consistency flags according to weight coefficients to generate a main body matching score and writes it into the matching score field of the event ledger record; and the threshold reading and gating judgment unit reads the threshold configuration record to obtain an identity mapping threshold, and performs numerical comparison between the main body matching score and the identity mapping threshold to generate a gating judgment result and a failure reason code.
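The weighted consistency scoring and threshold gating above can be sketched as follows. The weight coefficients, threshold value, and normalization rule are assumptions for illustration; the claim leaves them to configuration records.

```python
# Sketch of main body matching: identity elements are normalized, hashed, and
# compared; consistency flags are weight-summed and gated against a threshold.
import hashlib

WEIGHTS = {"phone": 0.5, "mail": 0.3, "job_no": 0.2}   # assumed weight coefficients
THRESHOLD = 0.6                                        # assumed identity mapping threshold

def h(x):
    """Standardize (trim, lowercase) then securely hash an identity element."""
    return hashlib.sha256(x.strip().lower().encode()).hexdigest()

def match_score(event_side, catalog_side):
    """Compare hashed identity elements and return the weighted matching score."""
    flags = {k: int(h(event_side[k]) == h(catalog_side[k]))
             for k in WEIGHTS if event_side.get(k) and catalog_side.get(k)}
    return sum(WEIGHTS[k] * v for k, v in flags.items())

def gate(score):
    return ("pass", None) if score >= THRESHOLD else ("fail", "SCORE_BELOW_THRESHOLD")

s = match_score({"phone": "1380000", "mail": "A@x.com", "job_no": "J1"},
                {"phone": "1380000", "mail": "a@x.com", "job_no": "J2"})
```

Here phone and mailbox match (0.5 + 0.3) while the job number does not, so the score clears the assumed threshold and the record would proceed to main body write-back.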
- 5. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the main body unique identification analysis and failure closed-loop module further comprise: when the gating judgment result is a pass and the matching hit flag is a hit, the main body write-back unit writes the first candidate main body identifier into the main body identifier field of the event ledger record and updates the processing state identifier to main body passed, generating a main-body-passed event ledger record; when the gating judgment result is a pass and the matching hit flag is a miss, the candidate main body identifier set is sorted in descending order of main body matching score and the candidate with the first sequence number is selected as a second candidate main body identifier, which is written into the main body identifier field of the event ledger record, and the processing state identifier is updated to main body passed; when the gating judgment result is a failure, the event unique identifier, main body matching score, failure reason code, data source identifier and record time are written into the evidence index to generate a main body failure evidence record, and the processing state identifier of the event ledger record is updated to main body failed; and when the processing state identifier is main body failed, the packaging input blocking unit sets the packaging permission flag to forbidden and writes the event unique identifier into a packaging exclusion list to generate a blocking result.
- 6. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the caliber version and field mapping version joint locking module are as follows: the node analysis unit performs token sequence generation and bracket pairing checks on the caliber script text to generate a caliber node set and a topologically ordered node sequence; when a bracket pairing check fails or a character segment cannot be recognized, it writes the failure reason code and script text position information into the evidence index, sets the caliber processing state identifier to caliber failed, and sets the caliber packaging permission flag to forbidden; the rule normalization serialization unit performs rule normalization on each node's rule text by replacing field names with unified field names according to the field mapping rule set, sorting exchangeable rule sets into dictionary order, and serializing constant values and time configuration records in a fixed order, and then performs deterministic serialization on the normalized result, joining with the fixed separator registered in the serialization configuration and writing a fixed placeholder for null values, thereby generating a normalized rule string and a node fingerprint string and writing them into the fingerprint cache region; when a field name cannot be matched in the field mapping rule set, the rule normalization serialization unit writes the missing-field reason code into the evidence index, sets the caliber processing state identifier to caliber failed, and sets the caliber packaging permission flag to forbidden; the caliber lineage deterministic fingerprint generation algorithm executed by the rule normalization serialization unit comprises the following formula:

S_i = F_ser(id_i, type_i, sort_dict(U_i), R_i; C)

wherein S_i is the node fingerprint string of the i-th node, i is the sequence number index of the caliber node, F_ser is the deterministic serialization function, id_i is the node identifier of the i-th node, type_i is the node type of the i-th node, sort_dict(U_i) represents the result of sorting the upstream node identifier set in dictionary order, U_i is the upstream node identifier set of the i-th node, R_i is the normalized rule string of the i-th node, and C is the serialization configuration.
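The node fingerprint formula S_i = F_ser(id_i, type_i, sort_dict(U_i), R_i; C) can be sketched as follows, with an assumed separator and null placeholder standing in for the serialization configuration C.

```python
# Sketch of caliber lineage node fingerprinting: the node identifier, type,
# dictionary-sorted upstream identifiers, and normalized rule string are joined
# deterministically, then hashed into a node transformation digest.
import hashlib

SEP, PLACEHOLDER = "|", "<NULL>"  # assumed serialization configuration C

def node_fingerprint(node_id, node_type, upstream_ids, normalized_rule):
    upstream = SEP.join(sorted(upstream_ids))           # sort_dict(U_i)
    parts = [node_id, node_type, upstream, normalized_rule or PLACEHOLDER]
    s = SEP.join(parts)                                 # deterministic serialization F_ser
    return s, hashlib.sha256(s.encode()).hexdigest()    # fingerprint string and digest

s1, d1 = node_fingerprint("n2", "join", ["n1", "n0"], "a = b")
s2, d2 = node_fingerprint("n2", "join", ["n0", "n1"], "a = b")  # upstream order is irrelevant
```

Sorting the upstream set is what makes the fingerprint insensitive to the order in which upstream edges happen to be listed in the script.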
- 7. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the caliber version and field mapping version joint locking module further comprise: the node fingerprint string generation unit performs secure hash calculation on the node fingerprint strings to generate node transformation digest values; the conversion digest generation unit performs chained aggregation of the node transformation digest values in the topologically ordered node sequence to generate a caliber lineage version chain fingerprint and writes it into the caliber lineage library; the version conflict checking unit constructs a joint key from the index identifier, the caliber version number and the field mapping version number, searches the caliber lineage library to generate a historical chain fingerprint set, and performs equivalence comparison between the current caliber lineage version chain fingerprint and the historical chain fingerprint set to generate a conflict judgment result; when the conflict judgment result is no conflict, the caliber blocking and evidence output unit sets the caliber processing state identifier to caliber passed and the caliber packaging permission flag to permitted, and outputs the caliber lineage version chain as subsequent packaging input; and when the conflict judgment result is conflict, the caliber blocking and evidence output unit sets the caliber processing state identifier to caliber failed, writes the joint key, the caliber lineage version chain fingerprint and the conflict reason code into the evidence index, and blocks the chain from the packaging input set.
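The chained aggregation and joint-key conflict check above can be sketched as follows; the chain seed, in-memory lineage store, and state strings are illustrative assumptions.

```python
# Sketch of version joint locking: node digests are folded in topological order
# into one chain fingerprint; a joint key (index id, caliber version, field
# mapping version) detects conflicting fingerprints for the same key.
import hashlib

def chain_fingerprint(node_digests):
    acc = hashlib.sha256(b"chain-start").hexdigest()    # assumed chain seed
    for d in node_digests:                              # topologically ordered nodes
        acc = hashlib.sha256((acc + d).encode()).hexdigest()
    return acc

lineage_db = {}                                         # joint key -> known chain fingerprint

def lock_version(index_id, caliber_ver, map_ver, node_digests):
    key = (index_id, caliber_ver, map_ver)
    fp = chain_fingerprint(node_digests)
    if key in lineage_db and lineage_db[key] != fp:
        return "caliber_failed", "VERSION_CONFLICT"     # block packaging input
    lineage_db[key] = fp
    return "caliber_passed", None

st1, _ = lock_version("idx1", "v1", "m1", ["d1", "d2"])
st2, code = lock_version("idx1", "v1", "m1", ["d1", "dX"])  # same key, different chain
```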
- 8. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the evidence index binding and consistency verification packaging module are as follows: the event fragment serialization unit reads a main-body-passed event ledger record to obtain the event unique identifier, main body unique identifier, index identifier, operation timestamp, operation type, object identifier and processing purpose, and reads a caliber-passed caliber lineage version chain to obtain the caliber version number, field mapping version number and caliber lineage version chain fingerprint, so as to generate an event fragment field set; the event fragment serialization unit performs deterministic serialization on the event fragment field set according to the field order registered in the packaging configuration table to generate an event fragment string; the fragment digest calculation unit performs secure hash calculation on the event fragment string according to the hash algorithm type registered in the hash algorithm configuration item to generate a fragment digest value; and the evidence index binding unit writes the event unique identifier, fragment digest value, caliber lineage version chain fingerprint, index identifier, main body unique identifier and operation timestamp into the evidence index to generate an evidence index record, and writes the evidence index identifier back into the evidence index field of the event ledger record to establish a retrievable binding relation.
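The fragment serialization, digesting, and bidirectional binding above can be sketched as follows. The field order stands in for the one registered in the packaging configuration table; all record keys are assumptions.

```python
# Sketch of evidence index binding: event-side and caliber-side fields are
# serialized in a registered order, hashed into a fragment digest, and an
# evidence index record is created with a write-back link into the event record.
import hashlib

FRAGMENT_ORDER = ["event_uid", "subject_uid", "index_id", "op_ts", "op_type",
                  "object_id", "purpose", "caliber_ver", "map_ver", "chain_fp"]

def bind_evidence(event, chain):
    fields = {**event, **chain}                         # merged event fragment field set
    fragment = "|".join(str(fields.get(f, "<NULL>")) for f in FRAGMENT_ORDER)
    digest = hashlib.sha256(fragment.encode()).hexdigest()
    evidence = {"event_uid": event["event_uid"], "fragment_digest": digest,
                "chain_fp": chain["chain_fp"], "index_id": event["index_id"],
                "subject_uid": event["subject_uid"], "op_ts": event["op_ts"]}
    event["evidence_index"] = evidence["event_uid"]     # write-back: retrievable binding
    return evidence

ev = {"event_uid": "e1", "subject_uid": "s1", "index_id": "i1",
      "op_ts": 100, "op_type": "read", "object_id": "o1", "purpose": "audit"}
rec = bind_evidence(ev, {"caliber_ver": "v1", "map_ver": "m1", "chain_fp": "fp1"})
```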
- 9. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the evidence index binding and consistency verification packaging module further comprise: the batch grouping unit reads the batch window duration from the packaging configuration table, with the batch window duration limited to five to sixty minutes, performs time window grouping on the evidence index records according to the operation timestamp to generate a batch set, and writes the batch identifier, window start time and window end time to generate a batch index record; the in-batch ordering unit sorts the evidence index records within the same batch in ascending order of operation timestamp, and in dictionary order of event unique identifier when timestamps are equal, to generate an in-batch ordered record sequence; the preceding digest chain generation unit performs secure hash calculation on the batch identifier and window start time to generate a chain starting digest, iteratively aggregates the preceding digest value with the current fragment digest value in the order of the in-batch ordered record sequence to generate a preceding digest chain value sequence, and writes each chain value into the preceding digest chain field of the corresponding evidence index record to generate a chain value storage result; the digest tree construction unit performs secure hash calculation on the fragment digest values layer by layer by splicing adjacent pairs until a single root is obtained, duplicating the last fragment digest value to make the count even when the number of fragment digest values in a layer is odd, thereby generating a batch digest root; and the digest root signature unit reads the signature key identifier and signature algorithm type from the key configuration table, performs a signature operation on the batch digest root to generate a digest root signature value, and writes the digest root signature value into the signature field of the batch index record.
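The preceding digest chain and the pairwise digest tree above can be sketched as follows; SHA-256 stands in for the configured hash algorithm, and the chain seed format is an assumption.

```python
# Sketch of claim 9's batch aggregation: fragment digests are chained (each link
# hashes the preceding chain value with the current digest) and separately
# aggregated pairwise into a tree root, duplicating the last digest when a
# layer's count is odd.
import hashlib

def sha(x):
    return hashlib.sha256(x.encode()).hexdigest()

def preceding_chain(batch_id, window_start, digests):
    chain, acc = [], sha(f"{batch_id}|{window_start}")  # chain starting digest
    for d in digests:                                   # in-batch ordered sequence
        acc = sha(acc + d)
        chain.append(acc)
    return chain

def batch_root(digests):
    level = list(digests)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])                     # duplicate last when odd
        level = [sha(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

ds = [sha(str(i)) for i in range(3)]                    # odd number of fragments
root = batch_root(ds)
chain = preceding_chain("b1", "t0", ds)
```

The chain makes each record's integrity depend on every record before it, which is what later allows the first inconsistent position to be located; the tree root gives a single signable value per batch.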
- 10. The enterprise informatization management integrated platform based on big data according to claim 1, wherein the specific steps of the evidence index binding and consistency verification packaging module further comprise: the consistency recalculation verification unit reads the event fragment strings one by one for the in-batch ordered record sequence, recalculates the fragment digest values, the preceding digest chain values and the batch digest root, and performs equivalence comparison between the recalculated digest root and the batch digest root in the batch index record to generate a consistency verification result; when the consistency verification result is a pass, the consistency recalculation verification unit sets the batch state identifier to packaging passed, triggering the evidence package generation unit to generate a rechecked audit evidence package; when the consistency verification result is a failure, the consistency recalculation verification unit sets the batch state identifier to packaging failed, writes the failure reason code, compares the stored preceding digest chain values of the evidence index records with the recalculated preceding digest chain values one by one to determine the first inconsistent position sequence number, and writes it into the failure positioning field to generate minimal re-collection positioning information; the inconsistency positioning algorithm is as follows:

H_j = Hash(H_(j-1) || d_j), j* = min{ j : H_j_stored ≠ H_j_recalc }

wherein H_j_stored is the preceding digest chain value stored for the j-th evidence index record, H_j_recalc is the preceding digest chain value obtained by recalculation, j is the sequence number index of the in-batch ordered record sequence, Hash is the secure hash calculation function of the configured hash algorithm type, d_j is the fragment digest value obtained by the consistency recalculation verification unit by recalculating the event fragment string corresponding to the j-th evidence index record, and j* is the first inconsistent position sequence number; the failure re-collection scheduling unit reads the operation timestamps corresponding to the records of the failure positioning field and, combined with the re-collection time expansion amount registered in the packaging configuration table, with the re-collection time expansion amount limited to ten to three hundred seconds, calculates a re-collection start boundary and a re-collection end boundary to generate a re-collection task, and writes the batch identifier, re-collection start time, re-collection end time, failure reason code and data source identifier into a re-collection queue, triggering the acquisition side to re-collect according to the re-collection time boundary and re-enter the event fragment serialization processing flow.
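The inconsistency positioning algorithm H_j = Hash(H_(j-1) || d_j) with j* = min{ j : H_j_stored ≠ H_j_recalc } can be sketched as follows; the 1-based index and the chain seed are assumptions.

```python
# Sketch of claim 10's locator: the stored preceding digest chain is compared
# link by link with a recalculated one; the first divergent index is the
# minimal re-collection boundary.
import hashlib

def sha(x):
    return hashlib.sha256(x.encode()).hexdigest()

def recalc_chain(seed, fragment_strings):
    chain, acc = [], seed
    for frag in fragment_strings:
        acc = sha(acc + sha(frag))                      # H_j = Hash(H_(j-1) || d_j)
        chain.append(acc)
    return chain

def first_inconsistent(stored_chain, recalculated):
    for j, (a, b) in enumerate(zip(stored_chain, recalculated), start=1):
        if a != b:
            return j                                    # j*: minimal re-collection position
    return None                                         # chains agree: packaging passes

seed = sha("b1|t0")
frags = ["f1", "f2", "f3"]
stored = recalc_chain(seed, frags)
tampered = recalc_chain(seed, ["f1", "fX", "f3"])       # second fragment altered
pos = first_inconsistent(stored, tampered)
```

Because each chain value depends on all preceding ones, every position from j* onward diverges, so only records at and after j* need re-collection.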
Description
Enterprise informatization management integrated platform based on big data
Technical Field
The invention belongs to the technical field of enterprise informatization management and big data integration, and particularly relates to an enterprise informatization management integrated platform based on big data.
Background
Chinese application CN202411785976.4 discloses an enterprise informatization management integrated platform based on big data. That platform is built on the Hadoop distributed computing framework and integrates MapReduce parallel processing, Spark real-time computing, and HDFS distributed storage. It realizes parallel data processing through a swarm intelligence algorithm and a pheromone transmission mechanism, performs heterogeneous data mapping with a hierarchical auto-encoder network, realizes dynamic resource scheduling by combining a biological cell division mechanism with an entropy balance mechanism, optimizes storage management through adaptive grid partitioning and data temperature layering, and supports decision analysis with multi-scale feature fusion and knowledge reasoning based on Spark MLlib.
Although that invention solves problems of traditional enterprise informatization management such as low data processing efficiency, limited system scalability, and difficult data sharing, and provides reliable technical support for enterprise digital transformation, it does not address the scenario of personal information protection compliance audit, which requires traceable audit materials for audit elements and processing links when integrating cross-system data calibers and access links. In operation management, an enterprise deploys multiple types of business systems that continuously generate operation logs and data processing records, and these must form traceable audit materials when the enterprise performs a compliance audit of personal information processing activities. To this end, prior-art enterprise informatization management integration platforms perform field mapping and data synchronization by interfacing with business system interfaces and write multi-system data into a data warehouse or data lake, deploy a unified identity catalog and single sign-on to associate system accounts with personnel files, deploy a log acquisition component that collects operation logs and generates audit lists according to sampling rules, and deploy a data catalog and lineage management tool that registers caliber information and exports it through reports to form audit materials. These approaches have the following problems: first, accounts and main bodies are associated by static binding or a single rule, making it difficult to generate a unique main body identifier within the same event; second, different systems are inconsistent in the record calibers of processing purposes, processing objects and sensitive labels, making it difficult to form uniformly structured records of audit elements; third, caliber versions and field mapping versions lack joint constraints, making it difficult to reproduce historical caliber calculation paths; and fourth, materials are mainly exported and packaged, lacking structured indexes, consistency checks and cross-system association rules, making it difficult to verify the consistency of main body information under a compliance audit scenario. To solve these problems, the invention provides an enterprise informatization management integrated platform based on big data. The platform performs element standardization and field mapping normalization on the operation logs of each business system of an enterprise to generate uniformly structured audit event records and write them into an event ledger; performs main body unique identification analysis and gating blocking on original account entries to generate main body identifiers and form a trace-retaining closed loop on failure; performs node standardization, rule normalization and version joint locking on index caliber scripts to generate caliber lineage version chains and block packaging input on conflict; and performs deterministic serialization, digest calculation and consistency recalculation checking on event fragments to generate a rechecked audit evidence package and locate a minimal re-collection boundary on disagreement.
Disclosure of Invention
The present invention has been made in view of the above-described problems occurring in the prior art. The invention provides an enterprise informatization management integrated platform based on big data, and aims to solve the problems that under the parallel condition of an enterprise multi-service system, events