A1 Journal article (refereed)
Deep Belief Network with Fuzzy Parameters and Its Membership Function Sensitivity Analysis (2025)


Shukla, A. K., & Muhuri, P. K. (2025). Deep Belief Network with Fuzzy Parameters and Its Membership Function Sensitivity Analysis. Neurocomputing, 614, Article 128716. https://doi.org/10.1016/j.neucom.2024.128716


JYU authors or editors


Publication details

All authors or editors: Shukla, Amit K.; Muhuri, Pranab K.

Journal or series: Neurocomputing

ISSN: 0925-2312

eISSN: 1872-8286

Publication year: 2025

Publication date: 11/10/2024

Volume: 614

Article number: 128716

Publisher: Elsevier

Publication country: Netherlands

Publication language: English

DOI: https://doi.org/10.1016/j.neucom.2024.128716

Publication open access: Openly available

Publication channel open access: Partially open access channel


Abstract

Over the last few years, deep belief networks (DBNs) have been extensively utilized for efficient and reliable performance in several complex systems. One critical factor contributing to the learning of the DBN layers is the handling of network parameters, such as weights and biases, whose efficient training significantly influences the overall performance of the DBN. However, these parameters are typically initialized at random, and the data samples are often corrupted by unwanted noise. This introduces uncertainty into the weights and biases of the DBN, which ultimately hinders the performance of the network. To address this challenge, we propose a novel DBN model in which the weights and biases are represented using fuzzy sets. The approach systematically handles the inherent uncertainty in the parameters, resulting in a more robust and reliable training process. We demonstrate the proposed algorithm on four widely used benchmark datasets: MNIST, n-MNIST (MNIST with additive white Gaussian noise (AWGN) and MNIST with motion blur), and CIFAR-10. The experimental results show the superiority of the proposed approach over the classical DBN in terms of robustness and performance. Moreover, the proposed model can produce equivalent results with fewer nodes in the hidden layer, thus reducing the computational complexity of the network architecture. Additionally, we perform a sensitivity analysis for stability and consistency by considering different membership functions to model the uncertain weights and biases. Further, we establish the statistical significance of the obtained results using both one-way and Kruskal-Wallis analyses of variance.
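To make the idea concrete, below is a minimal, hypothetical sketch of one building block of such a network: a single restricted Boltzmann machine (RBM) layer whose weights and biases are modelled as triangular type-1 fuzzy numbers and defuzzified by their centroid before each CD-1 contrastive-divergence update. This is not the authors' implementation; the names (FuzzyRBM, make_fuzzy, centroid) and the choice of triangular membership functions with centroid defuzzification are illustrative assumptions only.

```python
# Hypothetical sketch (not the paper's code): an RBM layer with fuzzy parameters.
# Each weight/bias is a triangular type-1 fuzzy number stored as (lower, peak, upper)
# and defuzzified by its centroid before a CD-1 contrastive-divergence step.
import numpy as np

rng = np.random.default_rng(0)

def make_fuzzy(shape, spread=0.01):
    # Triangular fuzzy number around a small random peak: (peak - spread, peak, peak + spread).
    peak = rng.normal(0.0, 0.01, shape)
    return np.stack([peak - spread, peak, peak + spread])

def centroid(fz):
    # Centroid of a triangular fuzzy number (l + m + u) / 3.
    return fz.mean(axis=0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FuzzyRBM:
    def __init__(self, n_visible, n_hidden):
        self.W = make_fuzzy((n_visible, n_hidden))
        self.b = make_fuzzy((n_visible,))   # visible bias
        self.c = make_fuzzy((n_hidden,))    # hidden bias

    def cd1_step(self, v0, lr=0.05):
        W, b, c = centroid(self.W), centroid(self.b), centroid(self.c)
        ph0 = sigmoid(v0 @ W + c)                        # positive phase
        h0 = (rng.random(ph0.shape) < ph0).astype(float) # sample hidden units
        v1 = sigmoid(h0 @ W.T + b)                       # reconstruction
        ph1 = sigmoid(v1 @ W + c)                        # negative phase
        dW = (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        # Shift all three vertices of each triangular parameter by the same gradient,
        # so the peaks follow CD learning while the fuzzy spread is preserved.
        self.W += lr * dW
        self.b += lr * (v0 - v1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)                   # reconstruction error

# Toy usage on random binary data.
rbm = FuzzyRBM(n_visible=784, n_hidden=64)
batch = (rng.random((32, 784)) > 0.5).astype(float)
print(rbm.cd1_step(batch))
```

Under these assumptions, greedily stacking several such layers would give a DBN, and replacing the triangular membership function and centroid defuzzification with other choices is the kind of variation the paper's membership-function sensitivity analysis examines.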


Keywords: deep learning; fuzzy logic; neural computation

Free keywords: deep learning; deep belief networks; restricted Boltzmann machine; fuzzy sets; type-1 fuzzy sets; contrastive divergence


Contributing organizations


Ministry reporting: Yes

VIRTA submission year: 2024

JUFO rating: 2

Preliminary JUFO rating: 2


Last updated on 2024-11-20 at 20:05