A New Energy-Based Latent-Variable Model for Unsupervised Feature Learning 


Vol. 48,  No. 5, pp. 509-516, May  2023
10.7840/kics.2023.48.5.509


  Abstract

This paper proposes a new energy-based latent-variable model (EBLVM) for unsupervised feature learning. The joint probability density function of the EBLVM is defined by a new energy function over continuous visible and hidden variables, in which the visible variable is transformed by a deep neural network. The parameters of the EBLVM are trained with a gradient-based contrastive divergence algorithm. Because the EBLVM has a deep structure and is trained by coupling all hidden layers, effective features can be extracted from each layer. In comparative feature-learning experiments on the Fashion-MNIST and CIFAR-10 datasets, the proposed method shows better recognition performance than the existing stacked RBM, DBN, DBM, and DEM models.
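
The abstract describes the training procedure only at a high level, so the following is a minimal sketch of how such a model could be set up: an energy function over continuous visible and hidden variables in which the visible variable passes through a deep network, with parameters updated by a contrastive-divergence-style gradient step. The specific energy form (a Gaussian-Bernoulli-RBM-like free energy on top of a learned feature map), the layer sizes, and the short-run Langevin sampler used for negative samples are illustrative assumptions, not the paper's exact formulation.

    # Sketch of an EBLVM trained with a contrastive-divergence-style objective.
    # Energy form, architecture, and sampler are illustrative assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EBLVM(nn.Module):
        def __init__(self, visible_dim=784, feature_dim=128, hidden_dim=64):
            super().__init__()
            # Deep transform of the continuous visible variable v -> f(v)
            self.f = nn.Sequential(
                nn.Linear(visible_dim, 256), nn.SiLU(),
                nn.Linear(256, feature_dim), nn.SiLU(),
            )
            self.W = nn.Parameter(0.01 * torch.randn(feature_dim, hidden_dim))
            self.b = nn.Parameter(torch.zeros(hidden_dim))

        def free_energy(self, v):
            # Free energy after analytically summing out binary hidden units,
            # analogous to a Gaussian-Bernoulli RBM but on the learned features f(v).
            phi = self.f(v)
            quad = 0.5 * (v ** 2).sum(dim=1)          # quadratic term in v
            hidden = F.softplus(phi @ self.W + self.b).sum(dim=1)
            return quad - hidden                       # shape: (batch,)

    def langevin_negatives(model, v_init, steps=20, step_size=0.01):
        # Short-run Langevin dynamics to draw approximate negative samples.
        v = v_init.clone().detach().requires_grad_(True)
        for _ in range(steps):
            energy = model.free_energy(v).sum()
            grad, = torch.autograd.grad(energy, v)
            v = (v - 0.5 * step_size * grad
                 + (step_size ** 0.5) * torch.randn_like(v))
            v = v.detach().requires_grad_(True)
        return v.detach()

    def cd_step(model, optimizer, v_data):
        # One contrastive-divergence-style update:
        # lower the energy of data, raise the energy of model samples.
        v_neg = langevin_negatives(model, torch.randn_like(v_data))
        loss = model.free_energy(v_data).mean() - model.free_energy(v_neg).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Usage (hypothetical shapes for Fashion-MNIST-sized inputs):
    # model = EBLVM(visible_dim=784)
    # opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    # loss = cd_step(model, opt, torch.randn(32, 784))

After training, the hidden-layer activations (here, phi @ W + b passed through a sigmoid, or the features from each layer of f) would serve as the learned representation for downstream classification, which is how the paper evaluates feature quality against stacked RBM, DBN, DBM, and DEM baselines.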

  Cite this article

[IEEE Style]

Guo Peng and Dong Kook Kim, "A New Energy-Based Latent-Variable Model for Unsupervised Feature Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 5, pp. 509-516, 2023. DOI: 10.7840/kics.2023.48.5.509.

[ACM Style]

Guo Peng and Dong Kook Kim. 2023. A New Energy-Based Latent-Variable Model for Unsupervised Feature Learning. The Journal of Korean Institute of Communications and Information Sciences, 48, 5, (2023), 509-516. DOI: 10.7840/kics.2023.48.5.509.

[KICS Style]

Guo Peng and Dong Kook Kim, "A New Energy-Based Latent-Variable Model for Unsupervised Feature Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 5, pp. 509-516, 5. 2023. (https://doi.org/10.7840/kics.2023.48.5.509)