Investigating Gradient Regularization for Testing Neural Networks
Abstract
Convolutional Neural Networks (CNNs) have become ubiquitous in diverse applications, including safety-critical domains such as autonomous driving, where ensuring reliability is crucial. The reliability of CNNs can be jeopardized by hardware faults occurring during inference, with potentially severe consequences. In recent years, gradient regularization has garnered attention as a technique that improves generalization and robustness to Gaussian noise injected into the parameters of neural networks, but its effect on fault tolerance has not yet been studied. This paper analyzes the influence of gradient regularization on the reliability of CNNs for classification tasks in the presence of random hardware faults, exploring its impact on the network's performance and robustness. Our experiments involved simulating permanent stuck-at faults through statistical fault injection and assessing the reliability of CNNs trained with and without gradient regularization. Experimental results point out that regularization reduces the masking ability of neural networks, paving the way for efficient in-field fault detection techniques aimed at unveiling permanent faults. Specifically, it systematically reduces the percentage of masked faults by up to 15% while preserving high prediction accuracy.
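The abstract names two techniques without detailing them: gradient regularization during training, and permanent stuck-at fault injection into the stored weights. The two sketches below illustrate one common formulation of each; they are not the paper's implementation. The function names (`gradient_regularized_loss`, `inject_stuck_at`) and the regularization weight `lam` are hypothetical, and the penalty shown (squared L2 norm of the parameter gradients) is one standard choice from the gradient regularization literature, assumed here for illustration.

```python
import torch
import torch.nn as nn

def gradient_regularized_loss(model: nn.Module, criterion: nn.Module,
                              x: torch.Tensor, y: torch.Tensor,
                              lam: float = 0.01) -> torch.Tensor:
    """Task loss plus a penalty on the squared L2 norm of the parameter
    gradients -- one common form of gradient regularization (assumed form,
    not necessarily the paper's exact objective)."""
    loss = criterion(model(x), y)
    # First-order parameter gradients, kept in the autograd graph
    # (create_graph=True) so the penalty term is itself differentiable.
    grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
    penalty = sum(g.pow(2).sum() for g in grads)
    return loss + lam * penalty

# Hypothetical usage inside a training step:
#   loss = gradient_regularized_loss(model, nn.CrossEntropyLoss(), x, y)
#   optimizer.zero_grad(); loss.backward(); optimizer.step()
```

For the fault model, a minimal sketch of a permanent stuck-at fault on one bit of a single float32 weight, assuming a contiguous weight tensor; the injected fault stays in place for all subsequent inferences, matching the permanent-fault setting described above:

```python
import torch

def inject_stuck_at(weight: torch.Tensor, flat_index: int,
                    bit: int, stuck_at_one: bool) -> None:
    """Force one bit of a single float32 weight to a fixed value,
    emulating a stuck-at-0 / stuck-at-1 hardware fault."""
    mask = 1 << bit
    if mask >= 1 << 31:        # keep the mask within signed int32 range
        mask -= 1 << 32
    with torch.no_grad():
        # Reinterpret the selected weight's 32 bits as int32; in-place
        # bit operations on this view also mutate the weight itself.
        bits = weight.view(-1)[flat_index:flat_index + 1].view(torch.int32)
        if stuck_at_one:
            bits |= mask       # stuck-at-1: force the bit to 1
        else:
            bits &= ~mask      # stuck-at-0: force the bit to 0
```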