E. E. Limonova "Fast and Gate-Efficient Approximated Activations for Bipolar Morphological Neural Networks"
Abstract. 

Bipolar morphological (BM) neural networks are aimed at efficient hardware implementation and contain no multiplications inside their convolutional layers. However, they use resource-demanding activation functions based on the binary logarithm and exponent. In this paper, computationally efficient approximations of the activation functions of bipolar morphological neural networks are considered. Mitchell's approximation of the binary logarithm gives a 12-fold decrease in the estimated logic gate count and latency, while Schraudolph's approximation of the exponent has roughly 3 times lower logic gate complexity and latency. The approximate activation functions provide a 12-40% latency decrease for BM convolutional layers with a small number of input channels and 3x3 filters compared to the standard ones. The experiments show that these approximations can be used in a BM ResNet trained for a classification task with only a minor decrease in recognition accuracy, from 99.08% to 98.90%.
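The two approximations named above can be illustrated in a few lines of Python. This is a minimal software sketch, not the paper's gate-level design: the function names `mitchell_log2` and `schraudolph_exp2` are hypothetical, and a hardware implementation would operate on the IEEE 754 bit patterns directly rather than via `struct`.

```python
import struct

def float_to_bits(x: float) -> int:
    """Reinterpret a binary32 float as its 32-bit integer pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def bits_to_float(b: int) -> float:
    """Reinterpret a 32-bit integer pattern as a binary32 float."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

def mitchell_log2(x: float) -> float:
    # Mitchell (1962): for x = 2^E * (1 + m), log2(x) = E + log2(1 + m)
    # is approximated by E + m, i.e. the raw bit pattern scaled down,
    # minus the exponent bias 127. Valid for positive normal floats.
    return float_to_bits(x) / 2.0**23 - 127.0

def schraudolph_exp2(y: float) -> float:
    # Schraudolph-style trick for 2^y: write (y + 127) * 2^23 into the
    # exponent/mantissa field, so the fractional part of y becomes a
    # linear mantissa approximation of 2^frac(y). Schraudolph's original
    # adds a small correction constant to reduce the average error;
    # it is omitted here for clarity.
    return bits_to_float(int(y * 2.0**23) + (127 << 23))
```

Both functions are exact at powers of two (e.g. `mitchell_log2(8.0)` gives `3.0`) and piecewise-linear in between, which is what makes them cheap: the "computation" reduces to an integer add/shift on the float's bit pattern.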

Keywords: 

bipolar morphological networks, approximations, computational efficiency.
 
Pp. 3-10.

DOI: 10.14357/20718632220201
 
 
 
References

1. Chernyshova Y. S., Sheshkus A. V., Arlazarov V. V. Two-step CNN framework for text line recognition in camera-captured images // IEEE Access. — 2020. — vol. 8. — pp. 32587-32600. — DOI: 10.1109/ACCESS.2020.2974051.
2. Kanaeva I. A., Ivanova Y. A., Spitsyn V. G. Deep convolutional generative adversarial network-based synthesis of datasets for road pavement distress segmentation // Computer Optics. — 2021. — vol. 45. — no. 6. — pp. 907-916.
3. Lobanov M. G., Sholomov D. L. Application of shared backbone DNNs in ADAS perception systems // Thirteenth International Conference on Machine Vision. — 2021. — vol. 11605. — p. 1160525.
4. Andreeva E. I., Arlazarov V. V., Gayer A. V., Dorokhov E. P., Sheshkus A. V., Slavin O. A. Document Recognition Method Based on Convolutional Neural Network Invariant to 180 Degree Rotation Angle // ITiVS. — 2019. — no. 4. — pp. 87-93. — DOI: 10.14357/20718632190408.
5. Jacob B., Kligys S., Chen B., Zhu M., Tang M., Howard A. G., Adam H., Kalenichenko D. Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference // 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. — 2018. — pp. 2704-2713.
6. Liu J., Zhuang B., Chen P., Tan M., Shen C. AQD: Towards Accurate Quantized Object Detection // 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). — 2021. — pp. 104-113.
7. Prazeres M., Li X., Oberman A., Nia V. EuclidNets: Combining Hardware and Architecture Design for Efficient Training and Inference // Proceedings of the 11th International Conference on Pattern Recognition Applications and Methods (ICPRAM). — 2022. — ISBN 978-989-758-549-4. — pp. 141-151. — DOI: 10.5220/0010988500003122.
8. You H., Chen X., Zhang Y., Li C., Li S., Liu Z., Wang Z., Lin Y. ShiftAddNet: A Hardware-Inspired Deep Network // Advances in Neural Information Processing Systems 33 (NeurIPS 2020). — 2020. — arXiv: abs/2010.12785.
9. Chen H., Wang Y., Xu C., Shi B., Xu C., Tian Q., Xu C. AdderNet: Do We Really Need Multiplications in Deep Learning? // 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). — 2020. — pp. 1465-1474.
10. Limonova E., Matveev D., Nikolaev D., Arlazarov V. V. Bipolar morphological neural networks: convolution without multiplication // ICMV 2019. — 2020. — vol. 11433. — 114333J. — pp. 1-8. — DOI: 10.1117/12.2559299.
11. Limonova E. E., Alfonso D. M., Nikolaev D. P., Arlazarov V. V. Bipolar Morphological Neural Networks: Gate-Efficient Architecture for Computer Vision // IEEE Access. — 2021. — vol. 9. — pp. 97569-97581. — DOI: 10.1109/ACCESS.2021.3094484.
12. IEEE Standard for Floating-Point Arithmetic. — 2019. — pp. 1-84. — DOI: 10.1109/IEEESTD.2019.8766229.
13. Limonova E. E., Alfonso D. M., Nikolaev D. P., Arlazarov V. V. ResNet-like Architecture with Low Hardware Requirements // Proceedings of ICPR 2020. — May 2021. — ISSN 1051-4651. — ISBN 978-17-28188-09-6. — pp. 6204-6211. — DOI: 10.1109/ICPR48806.2021.9413186.
14. Mitchell J.N. Computer Multiplication and Division Using Binary Logarithms // IRE Trans. Electron. Comput. — 1962. — Vol. 11. — pp. 512-517.
15. Reference Implementations for Intel Architecture Approximation Instructions VRCP14, VRSQRT14, VRCP28, VRSQRT28, and VEXP2, https://www.intel.com/content/www/us/en/developer/articles/code-sample/reference-implementations-for-iaapproximation-instructions-vrcp14-vrsqrt14-vrcp28-vrsqrt28-vexp2.html (accessed April 5, 2022).
16. Schraudolph N. N. A fast, compact approximation of the exponential function // Neural Computation. — 1999. — vol. 11. — no. 4. — pp. 853-862.
17. The MNIST database of handwritten digits, http://yann.lecun.com/exdb/mnist/ (accessed April 5, 2022).
 