US-12626143-B2 - Composite adversarial attack model training for neural networks
Abstract
Some embodiments of the present disclosure are directed to systems, computer-readable media, and computer-implemented methods for neural network training. Some embodiments are directed to determining an attack order schedule for a data sample that includes a plurality of adversarial perturbation attacks associated with the data sample, and performing a composite adversarial attack process against a data set that includes the data sample, using the determined attack order schedule, to generate a perturbed data sample. Other embodiments may be disclosed or claimed.
Inventors
- Pin-Yu Chen
- I-Hsin Chung
- Bo Wu
- Chuang Gan
- Lei Hsiung
- Yun-Yun Tsai
- Tsung-Yi Ho
Assignees
- INTERNATIONAL BUSINESS MACHINES CORPORATION
- NATIONAL TSING HUA UNIVERSITY
Dates
- Publication Date: 2026-05-12
- Application Date: 2023-06-08
Claims (20)
- 1 . A computer system comprising: a processor; and a memory coupled to the processor and storing instructions that, when executed by the processor, cause the computer system to: retrieve a data set for a neural network that includes a data sample stored in an electronic file format; determine, based on the data set, an attack order schedule for the data sample that includes a plurality of adversarial perturbation attacks associated with the data sample, wherein each respective attack: has a respective type, includes a respective attack power value, and includes a respective perturbation interval; optimize each respective attack and its respective ordering in the attack order schedule using an iterative gradient descent process; and perform a composite adversarial attack process against the data set using the determined attack order schedule to generate a perturbed data sample for the data sample, the perturbed data sample having the same electronic file format as the data sample.
- 2 . The computer system of claim 1 , wherein the data sample comprises at least one of: text, an image, audio, and video.
- 3 . The computer system of claim 1 , wherein optimizing each respective attack and its respective ordering in the attack order schedule is performed dynamically during the performing of the composite adversarial attack process.
- 4 . The computer system of claim 1 , wherein performing the composite adversarial attack process against the data set includes modifying the perturbed data sample after each subsequent attack in the plurality of adversarial perturbation attacks.
- 5 . The computer system of claim 1 , wherein determining the attack order schedule for the data sample includes optimizing an assignment function to yield a maximum classification error.
- 6 . The computer system of claim 1 , wherein determining the attack order schedule for the data sample includes generating a scheduling matrix that includes a respective surrogate composite adversarial data sample associated with a modification to the data sample from each respective attack.
- 7 . The computer system of claim 6 , wherein optimizing each respective attack and its respective ordering in the attack order schedule includes applying a Sinkhorn normalization operation to the scheduling matrix to generate an updated scheduling matrix.
- 8 . The computer system of claim 7 , wherein optimizing each respective attack and its respective ordering in the attack order schedule includes applying a Hungarian assignment process to the updated scheduling matrix.
- 9 . A computer-readable storage medium storing instructions that, when executed by a computer system, cause the computer system to: retrieve a data set for a neural network that includes a data sample stored in an electronic file format, wherein the data sample comprises at least one of: text, an image, audio, and video; determine, based on the data set, an attack order schedule for the data sample that includes a plurality of adversarial perturbation attacks associated with the data sample, wherein each respective attack: has a respective type, includes a respective attack power value, and includes a respective perturbation interval; optimize each respective attack and its respective ordering in the attack order schedule using an iterative gradient descent process; and perform a composite adversarial attack process against the data set using the determined attack order schedule to generate a perturbed data sample for the data sample, the perturbed data sample having the same electronic file format as the data sample.
- 10 . The computer-readable storage medium of claim 9 , wherein optimizing each respective attack and its respective ordering in the attack order schedule is performed dynamically during the performing of the composite adversarial attack process.
- 11 . The computer-readable storage medium of claim 9 , wherein performing the composite adversarial attack process against the data set includes modifying the perturbed data sample after each subsequent attack in the plurality of adversarial perturbation attacks.
- 12 . The computer-readable storage medium of claim 9 , wherein determining the attack order schedule for the data sample includes optimizing an assignment function to yield a maximum classification error.
- 13 . The computer-readable storage medium of claim 9 , wherein determining the attack order schedule for the data sample includes: generating a scheduling matrix that includes a respective surrogate composite adversarial data sample associated with a modification to the data sample from each respective attack; applying a Sinkhorn normalization operation to the scheduling matrix to generate an updated scheduling matrix; and applying a Hungarian assignment process to the updated scheduling matrix.
- 14 . A computer-implemented method comprising: retrieving, by a computer system, a data set for a neural network that includes a data sample stored in an electronic file format, wherein the data sample comprises at least one of: text, an image, audio, and video; determining, by the computer system based on the data set, an attack order schedule for the data sample that includes a plurality of adversarial perturbation attacks associated with the data sample, wherein each respective attack: has a respective type, includes a respective attack power value, and includes a respective perturbation interval; optimizing, by the computer system, each respective attack and its respective ordering in the attack order schedule using an iterative gradient descent process; and performing, by the computer system, a composite adversarial attack process against the data set using the determined attack order schedule to generate a perturbed data sample for the data sample, the perturbed data sample having the same electronic file format as the data sample.
- 15 . The computer-implemented method of claim 14 , wherein optimizing each respective attack and its respective ordering in the attack order schedule is performed dynamically during the performing of the composite adversarial attack process.
- 16 . The computer-implemented method of claim 14 , wherein performing the composite adversarial attack process against the data set includes modifying the perturbed data sample after each subsequent attack in the plurality of adversarial perturbation attacks.
- 17 . The computer-implemented method of claim 14 , wherein determining the attack order schedule for the data sample includes optimizing an assignment function to yield a maximum classification error.
- 18 . The computer-implemented method of claim 14 , wherein determining the attack order schedule for the data sample includes generating a scheduling matrix that includes a respective surrogate composite adversarial data sample associated with a modification to the data sample from each respective attack.
- 19 . The computer-implemented method of claim 18 , wherein optimizing each respective attack and its respective ordering in the attack order schedule includes applying a Sinkhorn normalization operation to the scheduling matrix to generate an updated scheduling matrix.
- 20 . The computer-implemented method of claim 19 , wherein optimizing each respective attack and its respective ordering in the attack order schedule includes applying a Hungarian assignment process to the updated scheduling matrix.
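As an illustrative sketch of the scheduling machinery recited in claims 6-8, 13, and 18-20: a soft scheduling matrix can be made approximately doubly stochastic with Sinkhorn normalization and then rounded to a concrete attack ordering with a Hungarian assignment. The following Python is a minimal, assumed implementation (the function names, the score convention, and the SciPy-based setup are illustrative, not taken from the patent):

```python
# Illustrative sketch only -- not code from the patent. Assumed convention:
# scheduling_matrix[i, j] scores running attack j at schedule position i.
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian method

def sinkhorn_normalize(matrix: np.ndarray, iterations: int = 20) -> np.ndarray:
    """Alternately rescale rows and columns so the matrix becomes
    approximately doubly stochastic (rows and columns each sum to 1)."""
    m = np.exp(matrix)  # exponentiate to guarantee strictly positive entries
    for _ in range(iterations):
        m /= m.sum(axis=1, keepdims=True)  # row normalization
        m /= m.sum(axis=0, keepdims=True)  # column normalization
    return m

def attack_order_from_schedule(scheduling_matrix: np.ndarray) -> list:
    """Round the soft schedule to a hard permutation: the Hungarian
    assignment picks the position/attack pairing with maximum total score."""
    soft = sinkhorn_normalize(scheduling_matrix)
    _, cols = linear_sum_assignment(-soft)  # negate scores to maximize
    return cols.tolist()  # cols[i] = attack index executed at step i

# Toy usage: a random 4x4 scheduling matrix over four attack types.
rng = np.random.default_rng(0)
print(attack_order_from_schedule(rng.normal(size=(4, 4))))  # a permutation of 0..3
```

The design intuition is that the Sinkhorn-normalized matrix stays differentiable, so it can be updated by the iterative gradient process of claim 1, while the Hungarian step commits to a single ordering only when the composite attack is actually performed.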
Description
STATEMENT REGARDING PRIOR DISCLOSURES BY THE INVENTOR OR A JOINT INVENTOR

The following disclosure is submitted under 35 U.S.C. 102(b)(1)(A): HSIUNG et al., “Toward Compositional Adversarial Robustness: Generalizing Adversarial Training to Composite Semantic Perturbations,” arXiv:2202.04235v3, 21 Mar. 2023, pp. 1-21.

BACKGROUND

Embodiments of the present invention generally relate to neural networks, and more specifically, to computer systems, computer-implemented methods, and computer program products for training neural network models using composite adversarial attacks.

Deep neural networks (DNNs) are machine learning systems that have been widely deployed in applications such as biometric authentication (e.g., facial image recognition), medical diagnosis (e.g., CT lung cancer detection), and autonomous driving systems (e.g., traffic sign classification). However, while these models can achieve outstanding performance on benign data points, recent research has shown that state-of-the-art models can be easily fooled by malicious data points crafted intentionally with adversarial perturbations. A simple example of such a perturbation is a data sample comprising an image stored in an electronic format (e.g., JPEG) being mislabeled (e.g., an image of a dog being labeled as a “cat”). Other features of such a data sample may likewise be subject to adversarial attacks that introduce perturbations affecting the image's hue, saturation, brightness, contrast, etc. Data items in other formats (e.g., text, video, etc.) may likewise be manipulated. Embodiments of the present invention address these and other issues by providing enhanced methods for neural network model training that produce robust models to help defend against such adversarial attacks.

SUMMARY

Embodiments of the present invention are directed to computer systems, computer-readable media, and computer-implemented methods for training neural network models using composite adversarial attacks. One exemplary embodiment includes a computer system comprising a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: retrieve a data set for a neural network that includes a data sample stored in an electronic file format; determine, based on the data set, an attack order schedule for the data sample that includes a plurality of adversarial perturbation attacks associated with the data sample, wherein each respective attack: has a respective type, includes a respective attack power value, and includes a respective perturbation interval; optimize each respective attack and its respective ordering in the attack order schedule using an iterative gradient descent process; and perform a composite adversarial attack process against the data set using the determined attack order schedule to generate a perturbed data sample for the data sample, the perturbed data sample having the same electronic file format as the data sample.

Additional technical features and benefits are realized through the techniques of the present invention. Embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed subject matter. For a better understanding, refer to the detailed description and to the drawings.
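By way of illustration, the semantic perturbations noted in the Background (e.g., brightness, contrast, saturation) can be composed in a scheduled order, with each attack modifying the output of the previous one (compare claim 4) and each power value clipped to its perturbation interval. The following is a minimal sketch under assumed conventions; the function names, intervals, and the NumPy image representation are illustrative, not taken from the patent:

```python
# Hypothetical sketch (not from the patent): applying a composite of
# semantic perturbations to an RGB image array in a scheduled order,
# each with its own power value bounded by a perturbation interval.
import numpy as np

def perturb_brightness(img: np.ndarray, power: float) -> np.ndarray:
    return np.clip(img + power, 0.0, 1.0)                     # additive shift

def perturb_contrast(img: np.ndarray, power: float) -> np.ndarray:
    return np.clip((img - 0.5) * (1.0 + power) + 0.5, 0.0, 1.0)

def perturb_saturation(img: np.ndarray, power: float) -> np.ndarray:
    gray = img.mean(axis=-1, keepdims=True)                   # per-pixel gray level
    return np.clip(gray + (img - gray) * (1.0 + power), 0.0, 1.0)

# Assumed attack table: name -> (attack function, perturbation interval).
ATTACKS = {
    "brightness": (perturb_brightness, (-0.3, 0.3)),
    "contrast":   (perturb_contrast,   (-0.5, 0.5)),
    "saturation": (perturb_saturation, (-0.7, 0.7)),
}

def composite_attack(img, schedule, powers):
    """Apply attacks in the scheduled order, each power clipped to its
    interval, so every attack perturbs the previous attack's output."""
    out = img
    for name in schedule:
        fn, (lo, hi) = ATTACKS[name]
        out = fn(out, float(np.clip(powers[name], lo, hi)))
    return out

# Toy usage on a random 32x32 RGB sample with values in [0, 1].
image = np.random.default_rng(1).random((32, 32, 3))
adv = composite_attack(image, ["contrast", "saturation", "brightness"],
                       {"contrast": 0.4, "saturation": -0.2, "brightness": 0.1})
```

Because every transform returns an array with the same shape and value range as its input, the perturbed sample keeps the same representation as the original and, once re-encoded, the same electronic file format, consistent with the last limitation of claim 1.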
BRIEF DESCRIPTION OF THE DRAWINGS

The specifics of the exclusive rights described herein are particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the embodiments of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:

- FIG. 1 is a block diagram illustrating an example of a computer system for use in conjunction with one or more embodiments of the present invention;
- FIG. 2 is a functional block diagram illustrating an example of training a neural network in accordance with one or more embodiments of the present invention;
- FIG. 3 illustrates an example of a process for training a neural network model in accordance with various embodiments of the disclosure; and
- FIG. 4 is a flowchart of a method for training a neural network model in accordance with one or more embodiments of the present invention.

DETAILED DESCRIPTION

Disclosed herein are methods, systems, and computer program products for neural network training. Among other things, embodiments of the present disclosure can train neural network models to identify adversarial perturbations in data sets more effectively and completely than conventional systems do, thereby improving the accuracy of machine-learning-based systems.

Various aspects of the present disclosure are described by narrative text, flowcharts, block diagrams of computer systems, and/or block diagrams of the machine logic included in computer program product (CPP) embodiments. With respect to any flowcharts, depending upon the technology involved, the operations can be performed in a different order than what is shown in a given flowchart.
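To make the iterative gradient process of claim 1 (and the maximization of classification error recited in claim 5) concrete, the following PyTorch-style sketch tunes the power value of each scheduled attack by signed gradient ascent on the classification loss. The model, attack functions, bounds, and hyperparameters are assumptions for illustration only, not the patent's implementation:

```python
# Illustrative, assumed sketch -- not code from the patent.
import torch
import torch.nn.functional as F

def optimize_attack_powers(model, x, y, attacks, powers, steps=10, lr=0.05):
    """Tune differentiable attack powers to maximize classification error:
    apply the scheduled attacks in order, backpropagate the loss to the
    power values, and take signed gradient-ascent steps within bounds."""
    powers = powers.clone().requires_grad_(True)
    for _ in range(steps):
        adv = x
        for i, attack in enumerate(attacks):      # scheduled order, claim 1
            adv = attack(adv, powers[i])          # each attack perturbs the last output
        loss = F.cross_entropy(model(adv), y)
        grad, = torch.autograd.grad(loss, powers)
        with torch.no_grad():
            powers += lr * grad.sign()            # ascent: increase the loss
            powers.clamp_(-1.0, 1.0)              # assumed perturbation interval
    return powers.detach()

# Toy usage with a hypothetical linear classifier and two attacks.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x, y = torch.rand(4, 3, 32, 32), torch.randint(0, 10, (4,))
brightness = lambda img, p: (img + p).clamp(0.0, 1.0)
contrast = lambda img, p: ((img - 0.5) * (1 + p) + 0.5).clamp(0.0, 1.0)
tuned = optimize_attack_powers(model, x, y, [contrast, brightness], torch.zeros(2))
```

In a full adversarial-training loop, the model's weights would then be updated on the resulting perturbed batch; that standard training step is omitted from the sketch.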