Publication:
On the importance of hidden bias and hidden entropy in representational efficiency of the Gaussian-Bipolar Restricted Boltzmann Machines

dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.kuauthorIsabekov, Altynbek
dc.contributor.kuauthorErzin, Engin
dc.contributor.kuprofileFaculty Member
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.yokidN/A
dc.contributor.yokid34503
dc.date.accessioned2024-11-09T13:49:22Z
dc.date.issued2018
dc.description.abstractIn this paper, we analyze the role of the hidden bias in the representational efficiency of Gaussian-Bipolar Restricted Boltzmann Machines (GBPRBMs), which are similar to the widely used Gaussian-Bernoulli RBMs. Our experiments show that the hidden bias plays an important role in shaping the probability density function of the visible units. We define the hidden entropy and propose it as a measure of the representational efficiency of the model. By using this measure, we investigate the effect of the hidden bias on the hidden entropy and provide a full analysis of the hidden entropy as a function of the hidden bias for small models with up to three hidden units. We also provide insight into the representational efficiency of larger-scale models. Furthermore, we introduce the Normalized Empirical Hidden Entropy (NEHE) as an alternative to the hidden entropy that can be computed for large models. Experiments on the MNIST, CIFAR-10 and Faces data sets show that NEHE can serve as a measure of representational efficiency and gives insight into the minimum number of hidden units required to represent the data.
dc.description.fulltextYES
dc.description.indexedbyWoS
dc.description.indexedbyScopus
dc.description.indexedbyPubMed
dc.description.openaccessYES
dc.description.publisherscopeInternational
dc.description.sponsoredbyTubitakEuN/A
dc.description.sponsorshipN/A
dc.description.versionAuthor's final manuscript
dc.description.volume105
dc.formatpdf
dc.identifier.doi10.1016/j.neunet.2018.06.002
dc.identifier.embargoNO
dc.identifier.filenameinventorynoIR01587
dc.identifier.issn0893-6080
dc.identifier.linkhttps://doi.org/10.1016/j.neunet.2018.06.002
dc.identifier.quartileN/A
dc.identifier.scopus2-s2.0-85048838032
dc.identifier.urihttps://hdl.handle.net/20.500.14288/3864
dc.identifier.wos441874700033
dc.keywordsRBM
dc.keywordsHidden entropy
dc.keywordsHidden bias
dc.keywordsRepresentational efficiency
dc.keywordsAutoencoder
dc.keywordsDeep learning
dc.languageEnglish
dc.publisherElsevier
dc.relation.grantnoNA
dc.relation.urihttp://cdm21054.contentdm.oclc.org/cdm/ref/collection/IR/id/8389
dc.sourceNeural Networks
dc.subjectComputer science
dc.subjectNeurosciences and neurology
dc.titleOn the importance of hidden bias and hidden entropy in representational efficiency of the Gaussian-Bipolar Restricted Boltzmann Machines
dc.typeJournal Article
dspace.entity.typePublication
local.contributor.authoridN/A
local.contributor.authorid0000-0002-2715-2368
local.contributor.kuauthorIsabekov, Altynbek
local.contributor.kuauthorErzin, Engin
relation.isOrgUnitOfPublication89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery89352e43-bf09-4ef4-82f6-6f9d0174ebae

Files

Original bundle

Name:
8389.pdf
Size:
1.43 MB
Format:
Adobe Portable Document Format
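
Illustration (not part of the record): the abstract above introduces hidden entropy and NEHE but does not spell out how they are computed. The sketch below is only a rough approximation of the general idea under assumed conventions, not the paper's actual definition or code: it samples bipolar hidden states of an RBM given data, estimates the entropy of the resulting hidden configurations, and normalizes by the number of hidden units. All names (sample_hidden, W, b_h, etc.) are placeholders introduced here.

    # Illustrative sketch only; the exact NEHE definition is given in the paper.
    import numpy as np

    def sample_hidden(V, W, b_h, rng=np.random.default_rng()):
        """Sample bipolar hidden states h in {-1, +1} given visible data.

        V   : (N, D) array of visible vectors
        W   : (D, H) weight matrix
        b_h : (H,)  hidden bias
        (All names are placeholders assumed for this sketch.)
        """
        p = 1.0 / (1.0 + np.exp(-2.0 * (V @ W + b_h)))   # P(h_j = +1 | v)
        return np.where(rng.random(p.shape) < p, 1, -1)

    def normalized_empirical_hidden_entropy(H_samples):
        """Empirical entropy of hidden configurations, normalized to [0, 1]."""
        n, num_hidden = H_samples.shape
        # Treat each sampled hidden configuration as one discrete symbol.
        _, counts = np.unique(H_samples, axis=0, return_counts=True)
        p = counts / n
        entropy = -np.sum(p * np.log2(p))   # empirical entropy in bits
        return entropy / num_hidden         # maximum is num_hidden bits

In this reading, a value near 1 would mean the hidden configurations are used almost uniformly, while a value near 0 would mean only a few configurations are ever active, suggesting redundant hidden units; the paper's own experiments should be consulted for the precise measure and its interpretation.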