
CN-122021021-A - General algorithm modeling method, system, platform, electronic equipment and storage medium

CN 122021021 A

Abstract

The application discloses a general algorithm modeling method, system, platform, electronic device, and storage medium, belonging to the technical field of computer software. The method comprises: parsing a predefined protocol file in an algorithm package to obtain configuration information of the algorithm; dynamically generating a visual configuration interface based on the configuration information to receive configuration input from a user; automatically building a runnable container image of the algorithm according to the configuration information and the configuration input, and allocating runtime resources; and loading and executing the image in the runtime resources, responding to data and parameter requests during algorithm execution through a proxy component embedded in the algorithm, to complete the algorithm run. The application decouples the algorithm from the engineering system through the protocol file, and uses containerization and dynamic scheduling to automate and standardize the full algorithm lifecycle from development, configuration, and release to operation, thereby markedly improving algorithm delivery efficiency and reusability.
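The patent does not publish a concrete protocol-file format. As an illustration only, assuming a JSON protocol file with the sections the method relies on (inputs, outputs, parameters, runtime), the parsing step that extracts the algorithm's configuration information might look like this; every field name here is a hypothetical example, not taken from the patent:

```python
import json

# Hypothetical protocol file content; the patent does not specify a format.
PROTOCOL_JSON = """
{
  "name": "demand_forecast",
  "inputs":  [{"table": "sales", "columns": ["date", "qty"]}],
  "outputs": [{"table": "forecast", "columns": ["date", "pred_qty"]}],
  "parameters": [
    {"name": "horizon", "type": "integer", "default": 7}
  ],
  "runtime": {"python": "3.10", "cpu": 2, "memory_gb": 4}
}
"""

def parse_protocol(text: str) -> dict:
    """Parse the protocol file and return the algorithm's configuration info."""
    cfg = json.loads(text)
    # Minimal validation of the sections the modeling method relies on.
    for section in ("inputs", "outputs", "parameters", "runtime"):
        if section not in cfg:
            raise ValueError(f"protocol file missing section: {section}")
    return cfg

config = parse_protocol(PROTOCOL_JSON)
print(config["name"], [p["name"] for p in config["parameters"]])
```

The returned dictionary would then drive the later steps (interface generation, image build, resource allocation) described in the claims.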

Inventors

  • Zhu Guanghui
  • Zhao Boci
  • Zhang Zhenyu
  • Li Haibo
  • Zheng Jiansong
  • Zhou Xiaodong
  • Jia Dezhou
  • Wu Haoran

Assignees

  • 杉数科技(北京)有限公司
  • 上海杉数网络科技有限公司
  • 北京杉数智能科技有限公司
  • 广州杉数科技有限公司

Dates

Publication Date
2026-05-12
Application Date
2026-01-30

Claims (11)

  1. A general algorithm modeling method, comprising: parsing a predefined protocol file in an algorithm package to obtain configuration information of the algorithm; dynamically generating a visual configuration interface based on the configuration information to receive configuration input from a user; automatically building a runnable container image of the algorithm according to the configuration information and the configuration input, and allocating runtime resources; and loading and executing the image in the runtime resources, responding to data and parameter requests during algorithm execution through a proxy component embedded in the algorithm, to complete the algorithm run.
  2. The method of claim 1, wherein the configuration information includes one or more of: the algorithm's input-output data structure, operating parameter definitions, and operating context definition.
  3. The method of claim 1, wherein the step of dynamically generating a visual configuration interface based on the configuration information to accept configuration input from a user comprises: converting the parameter schemas defined in the protocol file into a front-end form description language, and rendering the description to generate a graphical user interface.
  4. The method of claim 1, wherein the steps of automatically building a runnable image of the algorithm and allocating runtime resources based on the configuration information and the configuration input comprise: analyzing the dependencies of the algorithm package, and dynamically generating an image build file from the analysis result; and automatically building the runnable algorithm image from the image build file.
  5. The method of claim 1, wherein the step of loading and executing the image in the runtime resources and responding to data and parameter requests during algorithm execution through a proxy component embedded in the algorithm comprises: intercepting logical data query statements issued by the algorithm; and converting each query statement into a physical data query statement based on the user configuration, then executing it to respond to data requests during algorithm execution.
  6. The method of claim 1, further comprising, after the steps of automatically building a runnable image of the algorithm based on the configuration information and the configuration input and allocating runtime resources: performing global scheduling and lifecycle management of the runtime resources, starting and stopping algorithm runtime environments according to resource utilization.
  7. The method of claim 6, wherein the global scheduling and lifecycle management are implemented with a resource queue and a task queue, comprising: receiving a new algorithm-run task; determining whether a running instance of the algorithm in the new task already exists; if so, adding the new task to that instance's task queue to reuse the existing runtime environment; if not, allocating new runtime resources for the new task and creating a new runtime environment; and after the task finishes, destroying the corresponding runtime environment after a predetermined delay to reclaim resources.
  8. A general algorithm modeling system, comprising: a parsing module for parsing the predefined protocol file in an algorithm package and obtaining configuration information of the algorithm; a configuration interface generation module for dynamically generating a visual configuration interface based on the configuration information to receive configuration input from a user; an environment construction module for automatically building a runnable image of the algorithm according to the configuration information and the configuration input, and allocating runtime resources; and an execution module for loading and executing the image in the runtime resources, responding to data and parameter requests during algorithm execution through the proxy component embedded in the algorithm, to complete the algorithm run.
  9. An algorithm modeling platform, comprising: an application system component for converting business data of an external application system into input data conforming to a predefined protocol file, according to the structure and constraints of the protocol file; an algorithm component comprising algorithm logic developed to follow the structure and constraints of the protocol file, to process the input data; and the general algorithm modeling system of claim 8, which connects the application system component with the algorithm component for decoupled collaborative interaction.
  10. An electronic device comprising a processor and a memory, the memory storing program code that, when executed by the processor, causes the processor to perform the method of any of claims 1-7.
  11. A computer-readable storage medium storing program code that, when run on an electronic device, causes the electronic device to perform the method of any of claims 1-7.
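Claim 4's two sub-steps (dependency analysis, then dynamic generation of an image build file) can be sketched as follows. This is a minimal illustration under assumptions the patent does not state: the patent names no container technology, so a Dockerfile-style build file and a requirements-style dependency list are used here purely as familiar stand-ins:

```python
# Sketch of claim 4: analyze the algorithm package's dependencies and emit an
# image build file. The package layout and base image are assumptions.

def analyze_dependencies(requirements_text: str) -> list[str]:
    """Extract dependency specifiers from a requirements-style listing."""
    deps = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):  # skip blanks and comments
            deps.append(line)
    return deps

def generate_build_file(python_version: str, deps: list[str]) -> str:
    """Dynamically generate a Dockerfile-style image build file."""
    lines = [
        f"FROM python:{python_version}-slim",
        "WORKDIR /app",
        "COPY . /app",
    ]
    if deps:
        lines.append("RUN pip install " + " ".join(deps))
    lines.append('CMD ["python", "main.py"]')
    return "\n".join(lines)

deps = analyze_dependencies("numpy==1.26\n# comment\npandas\n")
dockerfile = generate_build_file("3.10", deps)
print(dockerfile)
```

The generated file would then be handed to an image builder to produce the runnable algorithm image, as the second half of claim 4 describes.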
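Claim 5's proxy component intercepts logical queries and rewrites them into physical queries using the user's configuration. A minimal sketch, assuming a simple logical-to-physical table-name mapping as the user configuration (the patent does not specify the mapping's form, and `TABLE_MAPPING` and the table names are invented for illustration):

```python
# Sketch of claim 5: the embedded proxy rewrites a logical data query issued
# by the algorithm into a physical query before executing it.
import re

# Hypothetical user configuration: logical table name -> physical table name.
TABLE_MAPPING = {"sales": "dw.fact_sales_daily", "forecast": "dw.fact_forecast"}

def to_physical_query(logical_sql: str, mapping: dict[str, str]) -> str:
    """Replace logical table identifiers with their physical counterparts."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        return mapping.get(word, word)  # leave unmapped identifiers unchanged
    return re.sub(r"\b\w+\b", repl, logical_sql)

physical = to_physical_query("SELECT date, qty FROM sales", TABLE_MAPPING)
print(physical)  # SELECT date, qty FROM dw.fact_sales_daily
```

A production proxy would parse the SQL properly rather than rewrite tokens, but the token rewrite is enough to show the interception-and-translation idea of the claim.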
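Claim 7's scheduling logic (reuse a running instance's task queue if one exists, otherwise allocate a new environment, and destroy the environment after a delay once the task finishes) can be sketched in a few lines. All class and method names are illustrative; the patent does not specify data structures, and the destruction delay is only simulated here:

```python
# Sketch of claim 7: resource-queue/task-queue scheduling with instance reuse.
from collections import deque

class Scheduler:
    def __init__(self) -> None:
        # algorithm id -> task queue of its running instance
        self.instances: dict[str, deque] = {}

    def submit(self, algorithm_id: str, task: str) -> str:
        """Reuse a running instance if one exists, else create a new one."""
        if algorithm_id in self.instances:
            self.instances[algorithm_id].append(task)
            return "reused"
        # No running instance: allocate resources and create an environment.
        self.instances[algorithm_id] = deque([task])
        return "created"

    def finish(self, algorithm_id: str, delay_s: int = 60) -> None:
        """Destroy the runtime environment to reclaim resources.

        In a real system the destruction would be deferred by `delay_s`
        seconds, per the claim's predetermined delay.
        """
        self.instances.pop(algorithm_id, None)

sched = Scheduler()
print(sched.submit("forecast", "task-1"))  # created
print(sched.submit("forecast", "task-2"))  # reused
sched.finish("forecast")
```

The delayed destruction lets a burst of follow-up tasks reuse a warm environment instead of paying the environment-creation cost again.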

Description

General algorithm modeling method, system, platform, electronic equipment and storage medium

Technical Field

The present application relates to the field of computer software technologies, and in particular to a general algorithm modeling method, system, platform, electronic device, and storage medium.

Background

With the development of artificial intelligence and big data technology, algorithm projects are being applied ever more deeply across industries. However, conventional algorithm project delivery faces serious challenges: long cycles, high cost, and tight coupling. A traditional delivery flow requires close coordination of several engineering roles: an algorithm engineer defines the data items and structures the algorithm requires; a data engineer then integrates, processes, and converts data to match those requirements; and a system development engineer develops a corresponding business system to provide parameter-configuration and data interfaces for the algorithm. This process involves deep coupling and frequent joint debugging among algorithm engineers, data engineers, and system development engineers, and a delay or change in any one link can block overall project progress. More importantly, the algorithm logic is deeply bound to a specific engineering environment (data sources, parameter-passing mechanisms, runtime environment), making the algorithm hard to reuse and migrate. When a business scenario changes, or the algorithm must be deployed to a new environment, a great deal of repetitive engineering work is usually required. The prior art lacks a generic framework that decouples the algorithm's core logic from the underlying engineering implementation.
Therefore, an innovative solution is urgently needed to simplify the algorithm delivery flow, improve development efficiency, and enhance algorithm portability.

Disclosure of Invention

The embodiments of the application provide a general algorithm modeling method, system, platform, electronic device, and storage medium. Through a predefined protocol file, algorithm development can focus on business logic while the algorithm modeling platform automatically handles the engineering details, decoupling the algorithm from the platform and overcoming the defects of the prior art.

In one aspect, an embodiment of the present application provides a general algorithm modeling method, including: parsing a predefined protocol file in an algorithm package to obtain configuration information of the algorithm; dynamically generating a visual configuration interface based on the configuration information to receive configuration input from a user; automatically building a runnable container image of the algorithm according to the configuration information and the configuration input, and allocating runtime resources; and loading and executing the image in the runtime resources, responding to data and parameter requests during algorithm execution through a proxy component embedded in the algorithm, to complete the algorithm run.

In a possible embodiment, the configuration information includes one or more of: the algorithm's input-output data structure, operating parameter definitions, and operating context definition.

In a possible embodiment, the step of dynamically generating a visual configuration interface based on the configuration information to accept configuration input from a user includes: converting the parameter schemas defined in the protocol file into a front-end form description language, and rendering the description to generate a graphical user interface.
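The conversion of protocol-defined parameter schemas into a front-end form description could look like the sketch below. The widget names, field keys, and example parameters are all assumptions for illustration; the patent does not name a form description language:

```python
# Sketch: map each parameter schema from the protocol file to a renderable
# form-field description that a UI layer could turn into a GUI.

WIDGET_BY_TYPE = {"integer": "number-input", "number": "number-input",
                  "string": "text-input", "boolean": "checkbox",
                  "enum": "select"}

def to_form_description(parameters: list[dict]) -> list[dict]:
    """Translate parameter schemas into form-field descriptions."""
    fields = []
    for p in parameters:
        field = {
            "label": p.get("label", p["name"]),
            "name": p["name"],
            "widget": WIDGET_BY_TYPE.get(p["type"], "text-input"),
            "default": p.get("default"),
        }
        if p["type"] == "enum":
            field["options"] = p.get("values", [])
        fields.append(field)
    return fields

form = to_form_description([
    {"name": "horizon", "type": "integer", "default": 7},
    {"name": "model", "type": "enum", "values": ["arima", "lgbm"]},
])
print(form)
```

Because the form is derived from the protocol file rather than hand-built, a new algorithm package gets a configuration GUI with no front-end development, which is the point of this embodiment.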
In a possible embodiment, the steps of automatically building a runnable image of the algorithm and allocating runtime resources based on the configuration information and the configuration input include: analyzing the dependencies of the algorithm package, and dynamically generating an image build file from the analysis result; and automatically building the runnable algorithm image from the image build file.

In one possible embodiment, the steps of loading and executing the image in the runtime resources, and responding to data and parameter requests during algorithm execution through the proxy component embedded in the algorithm to complete the algorithm run, include: intercepting logical data query statements issued by the algorithm; and converting each query statement into a physical data query statement based on the user configuration, then executing it to respond to data requests during algorithm execution.

In a possible embodiment, after the step of automatically constructing an image that an algorithm can run and