US-12627690-B1 - Using generative artificial intelligence with asset tagging for code security
Abstract
A build artifact to be used in an application in a development pipeline and information associated with the build artifact are identified. Metadata including the information associated with the build artifact is generated. The metadata is associated with the build artifact, wherein the metadata conveys with the build artifact in the development pipeline.
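The patent itself contains no reference implementation. As a purely illustrative sketch of the abstract's flow — identify an artifact and its associated information, generate metadata from that information, and associate the metadata so it conveys with the artifact through the pipeline — the following Python uses entirely hypothetical function names and a hypothetical metadata schema (the provenance fields mirror those named in claim 3):

```python
import hashlib
import json


def generate_artifact_metadata(artifact_bytes, source, commit_id, committer):
    """Build a metadata record for a build artifact (hypothetical schema).

    Captures provenance fields such as the artifact's source, a commit
    identifier, and the user associated with that commit, plus a content
    digest so the metadata can be matched to the artifact downstream.
    """
    return {
        "artifact_sha256": hashlib.sha256(artifact_bytes).hexdigest(),
        "source": source,
        "commit_id": commit_id,
        "committer": committer,
    }


def associate_metadata(artifact_bytes, metadata):
    """Bundle the artifact with its metadata so the tag travels with it.

    Here the association is a simple in-memory envelope; a real pipeline
    might instead use a sidecar file, a registry annotation, or an
    embedded section, and might encrypt the metadata (cf. claim 5).
    """
    return {
        "artifact": artifact_bytes,
        "metadata_json": json.dumps(metadata, sort_keys=True),
    }
```

A downstream pipeline stage could then recompute the digest of the artifact it received and compare it against `artifact_sha256` before trusting the accompanying metadata.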
Inventors
- Christien R. Rioux
- Peter W. O'Hearn
- Sowmya Karmali
- Charles Y. Kim
- Yijou Chen
Assignees
- Fortinet, Inc.
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2023-10-12
Claims (19)
- 1. A method comprising: identifying a build artifact and information associated with the build artifact, the build artifact to be used in an application in a development pipeline; generating, by a processing device, metadata comprising the information associated with the build artifact; and associating the metadata with the build artifact, wherein the metadata conveys with the build artifact in the development pipeline; wherein identifying the build artifact and information associated with the build artifact includes: receiving a request to analyze the build artifact to identify security vulnerabilities in the build artifact, and analyzing the build artifact to determine whether the build artifact includes the security vulnerabilities, wherein analyzing the build artifact includes the processing device executing a generative artificial intelligence (AI) model that analyzes the build artifact.
- 2. The method of claim 1, wherein the development pipeline is a continuous integration and continuous delivery (CI/CD) pipeline.
- 3. The method of claim 1, wherein the information associated with the build artifact comprises one or more of a source of the build artifact, a commit identifier, or a user associated with the commit identifier.
- 4. The method of claim 1, wherein the information associated with the build artifact comprises a software bill of materials (SBOM) for the build artifact.
- 5. The method of claim 1, wherein generating the metadata comprising the information associated with the build artifact further comprises encrypting the metadata.
- 6. The method of claim 1, wherein identifying the build artifact for the application in the development pipeline further comprises: in response to determining that the build artifact lacks the security vulnerabilities, generating the metadata that includes an indication that the build artifact lacks the security vulnerabilities.
- 7. The method of claim 1, wherein identifying the build artifact for the application in the development pipeline further comprises: in response to determining that the build artifact has one or more security vulnerabilities, generating the metadata that includes an indication that the build artifact has the one or more security vulnerabilities.
- 8. The method of claim 7, further comprising: providing, by the generative AI model, one or more recommended solutions to the one or more security vulnerabilities.
- 9. The method of claim 7, further comprising: generating content describing context associated with the determination that the build artifact has the one or more security vulnerabilities; and providing the content to a sender of the request to analyze the build artifact.
- 10. The method of claim 1, wherein the information associated with the build artifact comprises one or more capabilities associated with the build artifact.
- 11. The method of claim 10, wherein the processing device executing the generative AI model further comprises: reading the metadata comprising the information associated with the build artifact, the metadata comprising the one or more capabilities associated with the build artifact; identifying current capabilities of the build artifact; determining that the current capabilities differ from the one or more capabilities indicated in the metadata; and generating an indication that the current capabilities differ from the one or more capabilities indicated in the metadata.
- 12. A non-transitory computer readable storage medium storing instructions which, when executed, cause a processing device to: identify a build artifact and information associated with the build artifact, the build artifact to be used in an application in a development pipeline; generate metadata comprising the information associated with the build artifact; and associate the metadata with the build artifact, wherein the metadata conveys with the build artifact in the development pipeline; wherein to identify the build artifact and information associated with the build artifact, the processing device is further to: receive a request to analyze the build artifact to identify security vulnerabilities in the build artifact, and analyze the build artifact to determine whether the build artifact includes the security vulnerabilities, wherein analyzing the build artifact includes the processing device to execute a generative artificial intelligence (AI) model that analyzes the build artifact.
- 13. The non-transitory computer readable storage medium of claim 12, wherein to identify the build artifact for the application in the development pipeline, the processing device is further to: in response to determining that the build artifact has one or more security vulnerabilities, generate the metadata that includes an indication that the build artifact has the one or more security vulnerabilities.
- 14. The non-transitory computer readable storage medium of claim 13, wherein the processing device executing the generative AI model is further to: provide one or more recommended solutions to the one or more security vulnerabilities.
- 15. The non-transitory computer readable storage medium of claim 13, wherein the processing device executing the generative AI model is further to: generate content describing context associated with the determination that the build artifact has the one or more security vulnerabilities; and provide the content to a sender of the request to analyze the build artifact.
- 16. The non-transitory computer readable storage medium of claim 13, wherein the processing device executing the generative AI model is further configured to generate a polygraph.
- 17. The non-transitory computer readable storage medium of claim 12, wherein the information associated with the build artifact comprises one or more capabilities associated with the build artifact.
- 18. The non-transitory computer readable storage medium of claim 17, wherein the processing device executing the generative AI model is further to: read the metadata comprising the information associated with the build artifact, the metadata comprising the one or more capabilities associated with the build artifact; identify current capabilities of the build artifact; determine that the current capabilities differ from the one or more capabilities indicated in the metadata; and generate an indication that the current capabilities differ from the one or more capabilities indicated in the metadata.
- 19. The non-transitory computer readable storage medium of claim 12, wherein to identify the build artifact for the application in the development pipeline, the processing device is further to: in response to determining that the build artifact lacks the security vulnerabilities, generate the metadata that includes an indication that the build artifact lacks the security vulnerabilities.
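Claims 10-11 and 17-18 describe comparing capabilities recorded in an artifact's metadata against the capabilities observed for the artifact at a later point, and generating an indication of any difference. A minimal Python sketch of that comparison step, assuming capabilities are represented as simple string labels (the function name, return schema, and capability encoding are all hypothetical and not from the patent):

```python
def check_capability_drift(metadata_capabilities, current_capabilities):
    """Compare capabilities recorded in artifact metadata against the
    capabilities identified for the artifact now, and return an
    indication of any drift (illustrative only)."""
    recorded = set(metadata_capabilities)
    current = set(current_capabilities)
    if recorded == current:
        # Current capabilities match the metadata: no indication needed.
        return {"drift": False}
    # Capabilities differ from those indicated in the metadata; report
    # which capabilities appeared and which disappeared.
    return {
        "drift": True,
        "added": sorted(current - recorded),
        "removed": sorted(recorded - current),
    }
```

For example, an artifact tagged at build time with only a network capability that later exhibits a process-execution capability would be flagged with `"drift": True` and the new capability listed under `"added"`.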
Description
BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.

- FIG. 1A shows an illustrative configuration in which a data platform is configured to perform various operations with respect to a cloud environment that includes a plurality of compute assets.
- FIG. 1B shows an illustrative implementation of the configuration of FIG. 1A.
- FIG. 1C illustrates an example computing device.
- FIG. 1D illustrates an example of an environment in which activities that occur within datacenters are modeled.
- FIG. 2A illustrates an example of a process, used by an agent, to collect and report information about a client.
- FIG. 2B illustrates a 5-tuple of data collected by an agent, physically and logically.
- FIG. 2C illustrates a portion of a polygraph.
- FIG. 2D illustrates a portion of a polygraph.
- FIG. 2E illustrates an example of a communication polygraph.
- FIG. 2F illustrates an example of a polygraph.
- FIG. 2G illustrates an example of a polygraph as rendered in an interface.
- FIG. 2H illustrates an example of a portion of a polygraph as rendered in an interface.
- FIG. 2I illustrates an example of a portion of a polygraph as rendered in an interface.
- FIG. 2J illustrates an example of a portion of a polygraph as rendered in an interface.
- FIG. 2K illustrates an example of a portion of a polygraph as rendered in an interface.
- FIG. 2L illustrates an example of an insider behavior graph as rendered in an interface.
- FIG. 2M illustrates an example of a privilege change graph as rendered in an interface.
- FIG. 2N illustrates an example of a user login graph as rendered in an interface.
- FIG. 2O illustrates an example of a machine server graph as rendered in an interface.
- FIG. 3A illustrates an example of a process for detecting anomalies in a network environment.
- FIG. 3B depicts a set of example processes communicating with other processes.
- FIG. 3C depicts a set of example processes communicating with other processes.
- FIG. 3D depicts a set of example processes communicating with other processes.
- FIG. 3E depicts two pairs of clusters.
- FIG. 3F is a representation of a user logging into a first machine, then into a second machine from the first machine, and then making an external connection.
- FIG. 3G is an alternate representation of actions occurring in FIG. 3F.
- FIG. 3H illustrates an example of a process for performing extended user tracking.
- FIG. 3I is a representation of a user logging into a first machine, then into a second machine from the first machine, and then making an external connection.
- FIG. 3J illustrates an example of a process for performing extended user tracking.
- FIG. 3K illustrates example records.
- FIG. 3L illustrates example output from performing an ssh connection match.
- FIG. 3M illustrates example records.
- FIG. 3N illustrates example records.
- FIG. 3O illustrates example records.
- FIG. 3P illustrates example records.
- FIG. 3Q illustrates an adjacency relationship between two login sessions.
- FIG. 3R illustrates example records.
- FIG. 3S illustrates an example of a process for detecting anomalies.
- FIG. 4A illustrates a representation of an embodiment of an insider behavior graph.
- FIG. 4B illustrates an embodiment of a portion of an insider behavior graph.
- FIG. 4C illustrates an embodiment of a portion of an insider behavior graph.
- FIG. 4D illustrates an embodiment of a portion of an insider behavior graph.
- FIG. 4E illustrates a representation of an embodiment of a user login graph.
- FIG. 4F illustrates an example of a privilege change graph.
- FIG. 4G illustrates an example of a privilege change graph.
- FIG. 4H illustrates an example of a user interacting with a portion of an interface.
- FIG. 4I illustrates an example of a dossier for an event.
- FIG. 4J illustrates an example of a dossier for a domain.
- FIG. 4K depicts an example of an Entity Join graph by FilterKey and FilterKey Group (implicit join).
- FIG. 4L illustrates an example of a process for dynamically generating and executing a query.
- FIG. 5 sets forth a flowchart illustrating an example method of dynamically generating monitoring tools for software applications in accordance with some embodiments of the present disclosure.
- FIG. 6 sets forth a flowchart illustrating an additional example method of dynamically generating monitoring tools for software applications in accordance with some embodiments of the present disclosure.
- FIG. 7 sets forth a flowchart illustrating an additional example method of dynamically generating monitoring tools for software applications in accordance with some embodiments of the present disclosure.
- FIG. 8 sets forth a flowchart illustrating an example method of generating the ancestry of a deployment in a cloud environment in accordance with some embodiments of the present disclosure.
- FI