A1 Journal article (refereed)
Deep Belief Network with Fuzzy Parameters and Its Membership Function Sensitivity Analysis (2025)
Shukla, A. K., & Muhuri, P. K. (2025). Deep Belief Network with Fuzzy Parameters and Its Membership Function Sensitivity Analysis. Neurocomputing, 614, Article 128716. https://doi.org/10.1016/j.neucom.2024.128716
JYU authors or editors
Publication details
All authors or editors: Shukla, Amit K.; Muhuri, Pranab K.
Journal or series: Neurocomputing
ISSN: 0925-2312
eISSN: 1872-8286
Publication year: 2025
Publication date: 11/10/2024
Volume: 614
Article number: 128716
Publisher: Elsevier
Publication country: Netherlands
Publication language: English
DOI: https://doi.org/10.1016/j.neucom.2024.128716
Publication open access: Openly available
Publication channel open access: Partially open access channel
Abstract
Over the last few years, deep belief networks (DBNs) have been extensively utilized for efficient and reliable performance in several complex systems. One critical factor contributing to the enhanced learning of the DBN layers is the handling of network parameters, such as weights and biases. The efficient training of these parameters significantly influences the overall performance of the DBN. However, the initialization of these parameters is often random, and the data samples are normally corrupted by unwanted noise. This introduces uncertainty into the weights and biases of the DBN, which ultimately hinders the performance of the network. To address this challenge, we propose a novel DBN model in which weights and biases are represented using fuzzy sets. The approach systematically handles the inherent uncertainties in the parameters, resulting in a more robust and reliable training process. We demonstrate the working of the proposed algorithm on four widely used benchmark datasets: MNIST, n-MNIST (MNIST with additive white Gaussian noise (AWGN) and MNIST with motion blur), and CIFAR-10. The experimental results show that the proposed approach outperforms the classical DBN in terms of robustness and overall performance. Moreover, it can produce equivalent results with a smaller number of nodes in the hidden layer, thus reducing the computational complexity of the network architecture. Additionally, we perform a sensitivity analysis for stability and consistency by considering different membership functions to model the uncertain weights and biases. Further, we establish the statistical significance of the obtained results by conducting both one-way analysis of variance (ANOVA) and Kruskal-Wallis tests.
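The core idea of fuzzy network parameters can be sketched briefly. In this minimal illustration (not the paper's exact formulation), each weight of a restricted Boltzmann machine layer is a triangular type-1 fuzzy number (a, m, b) and is defuzzified by its centroid before the usual sigmoid activation; the function names and the symmetric spread are assumptions made for the sketch.

```python
import numpy as np

# Hypothetical sketch: each weight is a triangular fuzzy number (a, m, b)
# with a <= m <= b. The centroid defuzzification and the fixed symmetric
# spread below are illustrative assumptions, not the paper's method.

rng = np.random.default_rng(0)

def triangular_fuzzy(shape, spread=0.1):
    """Sample fuzzy weights: modal value m with a symmetric spread."""
    m = rng.normal(0.0, 0.01, size=shape)
    return m - spread, m, m + spread  # (a, m, b)

def defuzzify(a, m, b):
    """Centroid of a triangular membership function."""
    return (a + m + b) / 3.0

def rbm_hidden_prob(v, W_fuzzy, c):
    """P(h = 1 | v) for one RBM layer, using defuzzified weights."""
    W = defuzzify(*W_fuzzy)
    return 1.0 / (1.0 + np.exp(-(v @ W + c)))

# Tiny example: 4 visible units, 3 hidden units.
W_fuzzy = triangular_fuzzy((4, 3))
c = np.zeros(3)
v = np.array([1.0, 0.0, 1.0, 1.0])
p_h = rbm_hidden_prob(v, W_fuzzy, c)
print(p_h.shape)
```

Swapping in a different membership function (e.g. Gaussian) only changes `triangular_fuzzy` and `defuzzify`, which is what makes the membership-function sensitivity analysis in the paper possible to carry out layer by layer.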
Keywords: deep learning; fuzzy logic; neural computation
Free keywords: deep learning; deep belief networks; restricted Boltzmann machine; fuzzy sets; type-1 fuzzy sets; contrastive divergence
Contributing organizations
Ministry reporting: Yes
VIRTA submission year: 2024
JUFO rating: 2
Preliminary JUFO rating: 2