1. School of Sciences / School of Big Data, Zhejiang University of Science and Technology, Hangzhou 310023, China
2. Hikvision-Zhejiang University of Science and Technology Joint Laboratory of Edge Intelligence Security, Hangzhou 310023, China
3. School of Information Engineering and Art Design, Zhejiang University of Water Resources and Electric Power, Hangzhou 310023, China
4. College of Electrical Engineering, Zhejiang University, Hangzhou 310063, China
[ "陈靓(1995- ),男,浙江科技学院硕士生,主要研究方向为网络剪枝" ]
[ "钱亚冠(1976- ),男,博士,浙江科技学院副教授,主要研究方向为深度学习、人工智能安全、大数据处理" ]
[ "何志强(1996- ),男,浙江科技学院硕士生,主要研究方向为网络剪枝" ]
[ "关晓惠(1977- ),女,博士,浙江水利水电学院副教授,主要研究方向为数字图像处理与模式识别" ]
[ "王滨(1978- ),男,博士,浙江大学研究员,主要研究方向为人工智能安全、物联网安全、密码学" ]
[ "王星(1985- ),男,博士,浙江大学在站博士后,主要研究方向为机器学习与物联网安全" ]
Online publication date: 2022-01
Print publication date: 2022-01-20
Liang CHEN, Yaguan QIAN, Zhiqiang HE, et al. A flexible pruning on deep convolutional neural networks[J]. Telecommunications Science, 2022, 38(1): 83-94. DOI: 10.11959/j.issn.1000-0801.2022004.
Abstract: Despite the great success of deep convolutional neural networks in a wide range of applications, the redundancy of their structure results in excessive storage requirements and computational cost, making them difficult to deploy on resource-constrained edge devices. Network pruning is an effective way to eliminate such redundancy. To find the best network architecture under limited resources, an efficient flexible pruning strategy was proposed. On the one hand, the contribution of each channel is computed while taking the distribution of the channel scaling factors into account; on the other hand, the efficiency of the pruning process is improved by reasonably estimating and pre-simulating the pruning result before the network is actually pruned. Experimental results with VGG16 and ResNet56 on CIFAR-10 show that the flexible pruning strategy reduces FLOPs by 71.3% and 54.3%, respectively, while accuracy drops by only 0.15 and 0.20 percentage points compared with the baseline models.
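The abstract describes two ingredients: a channel contribution score that accounts for the distribution of channel scaling factors, and a pre-simulation of the pruning outcome before any weights are removed. The sketch below illustrates this idea in PyTorch, assuming network-slimming-style batch normalization scaling factors; the functions channel_contributions and simulate_pruning, and the per-layer normalization they apply, are hypothetical illustrations under those assumptions, not the paper's actual implementation.

```python
import torch.nn as nn

def channel_contributions(model):
    """Collect a contribution score for every BN channel.

    Hypothetical scoring: each channel's |gamma| is normalized by the sum of
    |gamma| within its own layer, so layers with very different scaling-factor
    distributions can be compared on a common footing. The paper's exact
    contribution measure may differ.
    """
    scores = []  # list of (layer_name, channel_index, score)
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            g = m.weight.detach().abs()              # BN scaling factors
            norm = g / g.sum().clamp(min=1e-12)      # per-layer normalization
            for i, s in enumerate(norm.tolist()):
                scores.append((name, i, s))
    return scores

def simulate_pruning(scores, keep_ratio):
    """Pre-simulate a pruning result without touching any weights.

    A global threshold is chosen on the contribution scores, and the function
    reports how many channels each BN layer would retain, mirroring the idea
    of estimating the pruned architecture in advance. A real pruner would also
    guard against emptying a layer entirely.
    """
    ranked = sorted(scores, key=lambda t: t[2])          # ascending by score
    n_prune = int(len(ranked) * (1.0 - keep_ratio))
    pruned = {(n, i) for n, i, _ in ranked[:n_prune]}
    remaining = {}
    for n, i, _ in scores:
        remaining.setdefault(n, 0)
        if (n, i) not in pruned:
            remaining[n] += 1
    return remaining

# Example on a toy network (illustrative only).
net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU(),
                    nn.Conv2d(16, 32, 3), nn.BatchNorm2d(32), nn.ReLU())
plan = simulate_pruning(channel_contributions(net), keep_ratio=0.5)
print(plan)  # e.g. {'1': 8, '4': 16} - channels each BN layer would keep
```

Because the simulation only reads the scaling factors, different FLOPs budgets can be evaluated cheaply by sweeping keep_ratio before committing to a single pruned architecture and fine-tuning it.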