Energy-Efficient Deep Learning through Memristive Neuromorphic Synapses: A Hardware Implementation Study



Authors

  • Baki Gökgöz, Gümüşhane University

Keywords:

AI Accelerators, Machine Learning, Memristors, Neuromorphic Computing, Synapses

Abstract

Advances in artificial intelligence and machine learning, especially in deep learning, have driven
rapid adoption across many fields. However, the high computational demands and extensive data-processing
needs of these algorithms pose major energy-efficiency challenges for traditional Von Neumann-based
computing systems. These issues are compounded by the slowing scalability of semiconductor
technology and the inefficiencies of parallel processing in multi-core architectures. To address these
limitations, neuromorphic computing systems, which unify memory and processing at the hardware level,
have emerged as a promising path toward energy-efficient AI. Among their key components, memristive
devices stand out by mimicking biological synaptic behavior at extremely low power consumption,
allowing synaptic weights in neural networks to be represented physically. This study explores the
hardware implementation of memristive synapses in deep neural networks. Although memristive systems may
require longer training times than software-based convolutional neural networks, they achieve
competitive accuracy (up to 90%) using gradient descent optimization while consuming roughly
100,000 times less energy. This dramatic improvement in energy efficiency makes memristive technology
a leading candidate for both current and future sustainable AI systems.
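
The abstract describes representing synaptic weights physically as memristor conductances and training them with gradient descent. The Python sketch below illustrates that general idea only: a single crossbar layer whose signed weights are stored as differential conductance pairs, with discrete conductance levels and a quantized gradient descent update. The device parameters (conductance range, number of levels), learning rate, and helper names are illustrative assumptions for exposition and are not taken from the study's hardware setup.

import numpy as np

# Illustrative device parameters (assumed, not from the paper).
G_MIN, G_MAX = 1e-6, 1e-4   # programmable conductance range in siemens
N_LEVELS = 64               # number of discrete conductance states per device

def quantize(g):
    """Snap conductances to the nearest of N_LEVELS discrete device states."""
    levels = np.linspace(G_MIN, G_MAX, N_LEVELS)
    idx = np.abs(levels[None, None, :] - g[..., None]).argmin(axis=-1)
    return levels[idx]

class MemristiveLayer:
    """One crossbar layer: each weight is a pair of conductances (G+, G-)."""
    def __init__(self, n_in, n_out, rng):
        # Two devices per weight so that weights can take either sign.
        self.g_pos = quantize(rng.uniform(G_MIN, G_MAX, (n_in, n_out)))
        self.g_neg = quantize(rng.uniform(G_MIN, G_MAX, (n_in, n_out)))

    @property
    def w(self):
        # Effective weight: scaled difference of the conductance pair.
        return (self.g_pos - self.g_neg) / (G_MAX - G_MIN)

    def forward(self, x):
        # Analog multiply-accumulate: input "voltages" times conductances,
        # with column currents summed (Kirchhoff's current law).
        return x @ self.w

    def apply_gradient(self, grad_w, lr):
        # Map a gradient descent step back onto the conductance pair
        # (half on each device), then quantize to the discrete states.
        delta_g = 0.5 * lr * grad_w * (G_MAX - G_MIN)
        self.g_pos = quantize(np.clip(self.g_pos - delta_g, G_MIN, G_MAX))
        self.g_neg = quantize(np.clip(self.g_neg + delta_g, G_MIN, G_MAX))

# Tiny usage example: fit a random linear target with mean-squared error.
rng = np.random.default_rng(0)
layer = MemristiveLayer(8, 1, rng)
X = rng.normal(size=(256, 8))
y = X @ rng.normal(size=(8, 1))
for step in range(200):
    pred = layer.forward(X)
    grad_w = X.T @ (pred - y) / len(X)   # gradient of MSE w.r.t. weights
    layer.apply_gradient(grad_w, lr=0.5)
print("final MSE:", float(np.mean((layer.forward(X) - y) ** 2)))

The differential pair is a common way to encode signed weights with devices whose conductance is strictly positive; the quantization and clipping steps stand in for the finite number of conductance states and the bounded conductance window of a physical memristor.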


Author Biography

Baki Gökgöz, Gümüşhane University

Department of Computer Technologies



Published

2025-05-09

How to Cite

Gökgöz, B. (2025). Energy-Efficient Deep Learning through Memristive Neuromorphic Synapses: A Hardware Implementation Study. International Journal of Advanced Natural Sciences and Engineering Researches, 9(5), 129–137. Retrieved from https://as-proceeding.com/index.php/ijanser/article/view/2654
