CN-121979528-A - Local-cloud collaborative RTL code optimization method for IP security

CN121979528A

Abstract

The invention discloses a local-cloud collaborative RTL code optimization method for IP security, comprising the steps of: 1, constructing K pairs of proprietary modules and first drafts using a local large language model; 2, extracting an optimization principle set from each first draft to its proprietary module using the local large language model; 3, performing IP leakage risk detection and verification on the optimization principle set using the local large language model; 4, feeding the leakage feedback value back to the local large language model and repeating steps 2 to 4 until the optimization principle set passes leakage verification; and 5, uploading the verified optimization principle set to the cloud large language model to optimize the target module to be optimized. The method achieves a 50.85% optimization success rate on the critical path delay optimization task and a 66.67% success rate on the power consumption optimization task. On the premise of ensuring the security of intellectual property, the method leverages the optimization capability of the cloud large model to effectively optimize hardware description language code.

Inventors

  • WANG JING
  • LI YAN
  • ZENG XIAOYANG
  • GUO YUFENG
  • LI ZHENG
  • LI LEI
  • CHEN LINYI
  • HE FAN
  • LIN LIYU
  • LAI YAO
  • YANG KEMENG
  • YAO JIAFEI

Assignees

  • 南京邮电大学
  • 南京邮电大学南通研究院有限公司

Dates

Publication Date
2026-05-05
Application Date
2025-12-05

Claims (10)

  1. A local-cloud collaborative RTL code optimization method for IP security, characterized by comprising the following steps: step 1, data set construction: using a local large language model M_local, generate a first draft D_i based on each selected proprietary module P_i, forming a proprietary code set of K proprietary-module/first-draft pairs (P_i, D_i), where 1 ≤ i ≤ K; step 2, local principle extraction: using the local large language model M_local, extract from each pair (P_i, D_i) the optimization principle set R_t from D_i to P_i; step 3, local IP leakage detection: using the local large language model M_local, perform IP leakage risk detection and verification on the optimization principle set R_t; if the verification passes, skip to step 5; otherwise judge the verification as failed, record a leakage feedback value, and enter step 4; step 4, feed the leakage feedback value from step 3 back to the local large language model in step 2, and repeat steps 2 to 4, extracting the optimization principle set R_t and performing local IP leakage detection again until the optimization principle set R_t passes leakage verification; step 5, upload the optimization principle set R_t that passed leakage verification to the cloud large language model M_cloud and optimize the target module to be optimized.
  2. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, wherein the data set construction comprises the following steps: step 1-1, select K representative proprietary modules P_i from a public data set; step 1-2, using the local large language model, generate from the K proprietary modules P_i, based on their functional specifications, K first drafts D_i in one-to-one correspondence, forming K module/first-draft code pairs (P_i, D_i); step 1-3, perform PPA analysis on the K module/first-draft code pairs through simulation to obtain the PPA indexes of each proprietary-module/first-draft pair.
  3. The method of claim 2, wherein in step 1-1 the proprietary modules P i are selected from a Resyn K subset of RTLCoder, each proprietary module P i includes a natural language design specification and a corresponding Verilog implementation, the number K of proprietary modules P i is not less than 1000, and the modules cover nine functional classes: multipliers, FIFOs, ALUs, shift registers, counters, decoders, state machines, arbiters, and bit-width configurations.
  4. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, 2 or 3, wherein in step 1 the K proprietary-module/first-draft pairs each comprise a maximum power consumption reduction sample and a maximum delay reduction sample, each sample comprising a functional specification, two Verilog implementations, and the corresponding PPA indexes.
  5. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, wherein in step 2 the expression for the optimization principle set R_t is: R_t = M_local(P_i, D_i, F_{t-1}, π_ext) (1), where F_{t-1} is the leakage feedback value from the previous iteration t-1, F_0 defaulting to empty (no leakage) at the initial iteration t = 1, and π_ext is the prompt text guiding the local large language model to compare the proprietary code with the first draft and extract optimization principles.
  6. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, wherein in step 3 the output model of the detection and verification is: (V_t, F_t) = M_local(R_t, π_ver) (2), where V_t is the decision result of the current iteration t, V_t = pass indicating that the verification is passed and V_t = fail indicating that it is not passed; F_t is the leakage feedback value of the current iteration t; and π_ver is the prompt text guiding the local large language model to verify whether the optimization principle set R_t contains proprietary information.
  7. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, wherein in step 5 the target module optimized by the cloud large language model is recorded as N*, with the specific expression: N* = M_cloud(N, R_t, π_opt) (3), where N is the target module to be optimized and π_opt is the prompt text guiding the cloud large language model to apply the verified optimization principle set R_t to generate optimized new code.
  8. The local-cloud collaborative RTL code optimization method for IP security according to claim 1, wherein in step 5 the cloud large language model is an Instruct + V3 mixed model or a Coder + R1 mixed model.
  9. A storage medium, wherein a computer program stored in the storage medium, when executed, performs the local-cloud collaborative RTL code optimization method for IP security according to any one of claims 1-8.
  10. An electronic device comprising the storage medium of claim 9.
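The iterative extract-verify-upload loop of claims 1-8 can be sketched in Python. The helper functions `extract_principles`, `detect_leakage`, and `cloud_optimize` below are hypothetical stand-ins for the local and cloud LLM calls described in the claims; they are not part of the patent and only illustrate the control flow (steps 2-5 with the step-4 feedback loop).

```python
# Sketch of the local-cloud loop from claim 1 (steps 2-5).
# All three helpers are hypothetical stand-ins for LLM calls.

def extract_principles(pair, feedback):
    """Local LLM (step 2): compare the proprietary module and the
    first draft, returning candidate optimization principles while
    avoiding anything the previous leakage check flagged."""
    candidates = [
        "use clock gating on idle registers",
        "pipeline the critical path",
        "reuse the proprietary FSM encoding",  # leaks a design detail
    ]
    return [p for p in candidates if p not in feedback]

def detect_leakage(principles):
    """Local LLM (step 3): return (passed, leaked_items).
    Here, any principle mentioning 'proprietary' is treated as a leak."""
    leaked = [p for p in principles if "proprietary" in p]
    return len(leaked) == 0, leaked

def cloud_optimize(target, principles):
    """Cloud LLM (step 5): apply the verified principles to the target."""
    return f"{target} optimized with {len(principles)} principles"

def optimize(pair, target, max_iters=10):
    feedback = []                                         # F_{t-1}, empty at t=1
    for _ in range(max_iters):
        principles = extract_principles(pair, feedback)   # step 2
        passed, leaked = detect_leakage(principles)       # step 3
        if passed:
            return cloud_optimize(target, principles)     # step 5
        feedback = leaked                                 # step 4: feed back
    raise RuntimeError("no leak-free principle set found")

print(optimize(("P_i", "D_i"), "target_module"))
# → target_module optimized with 2 principles
```

In this sketch the first iteration fails (the FSM-encoding principle is flagged), the leaked item is fed back, and the second iteration passes with the two leak-free principles, mirroring the repeat-until-verified behavior of claim 1.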

Description

Local-cloud collaborative RTL code optimization method for IP security

Technical Field

The invention relates to the technical field of electronic design automation (EDA), in particular to a local-cloud collaborative RTL code optimization method for IP security.

Background

Large language models (LLMs) are being applied in the field of electronic design automation, especially in digital integrated circuit design. These models can generate and optimize hardware description language (HDL) code, yielding register transfer level (RTL) implementations from functional specifications, and are widely used in industrial environments as aids for code development and design optimization.

The optimization effect of a large language model depends on the quality and quantity of its training data. However, existing open-source hardware data sets suffer from limited scale and a lack of diversity in optimization strategies, while high-quality training data is typically found in proprietary code libraries that contain tested solutions developed by engineers for specific application needs. For example, battery-powered Internet of Things devices reduce power consumption through clock gating, while high-performance accelerators rely on pipelining to minimize critical path delay. Proprietary IP code embodying such domain-specific strategies carries valuable contextual knowledge that can enhance the optimization capabilities of large language models.

Semiconductor intellectual property (IP), including proprietary circuit designs, RTL implementations, and design methodologies, is a strategic asset that supports competitive advantage. The most advanced large language models currently operate mainly as cloud services, but security vulnerabilities exist when they process such sensitive design data: proprietary Verilog IP can be memorized and retained during fine-tuning of a large language model, creating a leakage risk.
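The trade-off described above (clock gating for power, pipelining for delay) can be made concrete with a minimal sketch of PPA bookkeeping; all class names, field names, and numbers below are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class PPA:
    power_mw: float   # total power in milliwatts
    delay_ns: float   # critical path delay in nanoseconds
    area_um2: float   # cell area in square micrometers

def improvement(before: PPA, after: PPA, metric: str) -> float:
    """Relative reduction of one PPA metric (positive = improved)."""
    b, a = getattr(before, metric), getattr(after, metric)
    return (b - a) / b

# Illustrative numbers only: a clock-gated variant cuts power at a small
# area cost, while a pipelined variant cuts delay at a larger area cost.
baseline  = PPA(power_mw=12.0, delay_ns=4.0, area_um2=900.0)
gated     = PPA(power_mw=9.0,  delay_ns=4.0, area_um2=920.0)
pipelined = PPA(power_mw=12.5, delay_ns=2.5, area_um2=980.0)

print(improvement(baseline, gated, "power_mw"))       # → 0.25
print(improvement(baseline, pipelined, "delay_ns"))   # → 0.375
```

A record of this shape is what the PPA analysis of step 1-3 would attach to each proprietary-module/first-draft pair, letting the method compare which implementation wins on which metric.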
In addition, powerful cloud models enable power-performance-area (PPA) aware optimization, but require uploading proprietary code. Locally deployed large language models can address the privacy concern, but their performance is limited. Hybrid approaches employ retrieval-augmented generation techniques, but still expose design patterns to external services. These approaches focus mainly on optimization effect and fail to resolve the fundamental contradiction between privacy protection and performance.

In summary, the invention aims to provide a local-cloud collaborative RTL code optimization method for IP security that, on the premise of ensuring the security of semiconductor intellectual property, leverages the optimization capability of a cloud large language model to effectively optimize hardware description language code.

Disclosure of Invention

The technical problem to be solved by the invention is to provide a local-cloud collaborative RTL code optimization method for IP security which, through principle abstraction and verification, resolves the contradiction between protecting semiconductor intellectual property and optimization effect in the hardware description language code optimization process, and makes full use of the capabilities of both local and cloud large language models.

In order to solve the above technical problem, the invention adopts the following technical scheme: a local-cloud collaborative RTL code optimization method for IP security, comprising the following steps.

Step 1, data set construction: using a local large language model, generate a first draft D_i based on each selected proprietary module P_i, forming a proprietary code set of K proprietary-module/first-draft pairs (P_i, D_i), where 1 ≤ i ≤ K.
Step 2, local principle extraction: using the local large language model, extract from each pair (P_i, D_i) the optimization principle set R_t from D_i to P_i.

Step 3, local IP leakage detection: using the local large language model, perform IP leakage risk detection and verification on the optimization principle set R_t; if the verification passes, skip to step 5; otherwise judge the verification as failed, record a leakage feedback value, and enter step 4.

Step 4: feed the leakage feedback value from step 3 back to the local large language model in step 2, and repeat steps 2 to 4, extracting the optimization principle set and performing local IP leakage detection again until the optimization principle set R_t passes leakage verification.

Step 5: upload the optimization principle set R_t that passed leakage verification to the cloud large language model and optimize the target module to be optimized.

The data set construction in step 1 comprises the following steps.

Step 1-1: select K representative proprietary modules P_i from a public data set.

Step 1-2: using the local large language model, generate K one-to-one first drafts