Artificial intelligence researchers at North Carolina State University (NC State) have enhanced deep neural network performance by integrating feature normalization and feature attention modules into a hybrid attentive normalization (AN) module.
The module improves accuracy while adding negligible computational overhead.
The team tested the AN module by plugging it into four popular neural network architectures: ResNets, DenseNets, MobileNetsV2, and AOGNets.
Testing these networks on the ImageNet-1000 classification benchmark and the MS-COCO 2017 object detection and instance segmentation benchmarks demonstrated improved performance.
NC State's Tianfu Wu said, "We have released the source code and hope our AN will lead to better integrative design of deep neural networks."
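The sketch below illustrates, in PyTorch, the general idea behind an attentive normalization layer as described above: features are standardized as usual, but the single learned affine transform is replaced by a mixture of several affine transforms whose mixture weights are predicted per input by a lightweight attention branch. This is a minimal illustration under those assumptions; the class and parameter names (e.g., `AttentiveNorm2d`, `num_mixtures`) are hypothetical and this is not the authors' released code.

```python
# Illustrative sketch of an attentive normalization layer.
# NOTE: names and design details here are assumptions for illustration,
# not the implementation released by the NC State team.
import torch
import torch.nn as nn


class AttentiveNorm2d(nn.Module):
    def __init__(self, num_channels: int, num_mixtures: int = 5):
        super().__init__()
        # Feature standardization without its own affine parameters.
        self.norm = nn.BatchNorm2d(num_channels, affine=False)
        # K candidate affine transforms (gamma_k, beta_k), one row each.
        self.gamma = nn.Parameter(torch.ones(num_mixtures, num_channels))
        self.beta = nn.Parameter(torch.zeros(num_mixtures, num_channels))
        # Lightweight attention branch: global pooling + FC + sigmoid
        # produces instance-specific mixture weights.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(num_channels, num_mixtures)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        normalized = self.norm(x)
        # Per-instance attention weights over the K mixtures: shape (N, K).
        weights = self.sigmoid(self.fc(self.pool(x).view(n, c)))
        # Instance-specific affine parameters: shape (N, C).
        gamma = weights @ self.gamma
        beta = weights @ self.beta
        return gamma.view(n, c, 1, 1) * normalized + beta.view(n, c, 1, 1)


if __name__ == "__main__":
    # Example: drop-in replacement for a BatchNorm2d layer in a conv block.
    layer = AttentiveNorm2d(64)
    out = layer(torch.randn(8, 64, 32, 32))
    print(out.shape)  # torch.Size([8, 64, 32, 32])
```

Because the attention branch is only a global pooling step and a small fully connected layer, such a layer can stand in for an existing normalization layer in architectures like ResNets or MobileNetsV2 with little added cost, which is consistent with the negligible overhead reported above.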
From NC State News