Novel deep generative simultaneous recurrent model for efficient representation learning (2018)

M Alam, L Vidyaratne, and K M Iftekharuddin
The Vision Lab, Department of Electrical and Computer Engineering, Old Dominion University, Norfolk, VA 23529, United States. Electronic address: malam001@odu.edu.

Representation learning plays an important role in building effective deep neural network models. Deep generative probabilistic models have been shown to be efficient at data representation learning, a task usually carried out in an unsupervised fashion. Over the past decade, the focus has been almost exclusively on learning algorithms that improve the representation capability of generative models. However, effective data representation requires improvement in both the learning algorithm and the architecture of the generative model, so architectural improvements are critical for better data representation capability in deep generative models. Furthermore, the prevailing deep generative models, such as the deep belief network (DBN), the deep Boltzmann machine (DBM), and the deep sigmoid belief network (DSBN), are inherently unidirectional and lack the recurrent connections that are ubiquitous in biological neuronal structures. Introducing recurrent connections may therefore further improve the representation learning performance of deep generative models. Consequently, for the first time in the literature, this work proposes a deep recurrent generative model, the deep simultaneous recurrent belief network (D-SRBN), to efficiently learn representations from unlabeled data. Experiments on four benchmark datasets (MNIST, Caltech 101 Silhouettes, OCR letters, and Omniglot) show that the proposed D-SRBN achieves superior representation learning performance while using fewer computing resources than four state-of-the-art generative models: the DBN, the DBM, the DSBN, and the variational autoencoder (VAE).
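The abstract does not specify the D-SRBN architecture in detail. As a rough intuition only, the minimal NumPy sketch below shows how a generative layer with recurrent (lateral) connections differs from the purely feed-forward layers of a DBN or DSBN: the hidden state is settled by iterating a hidden-to-hidden feedback connection before the top-down generative pass. All names, layer sizes, weight initializations, and the number of settling iterations here are hypothetical illustrations and not the authors' D-SRBN implementation.

    # Minimal illustrative sketch (assumptions only, not the D-SRBN of the paper):
    # a sigmoid belief-network-style layer whose hidden activations are settled
    # by iterating a recurrent (lateral) connection before top-down generation.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class SimultaneousRecurrentLayer:
        def __init__(self, n_visible, n_hidden, n_settle_steps=5, seed=0):
            rng = np.random.default_rng(seed)
            self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # bottom-up weights
            self.R = 0.01 * rng.standard_normal((n_hidden, n_hidden))   # recurrent (lateral) weights
            self.b = np.zeros(n_hidden)                                 # hidden bias
            self.n_settle_steps = n_settle_steps

        def infer(self, v):
            # Settle the hidden state by repeatedly feeding it back through R.
            h = sigmoid(v @ self.W + self.b)
            for _ in range(self.n_settle_steps):
                h = sigmoid(v @ self.W + h @ self.R + self.b)
            return h

        def generate(self, h):
            # Top-down generative pass back to the visible units.
            return sigmoid(h @ self.W.T)

    # Usage with a binarized MNIST-sized input (784 visible units, 200 hidden units):
    layer = SimultaneousRecurrentLayer(n_visible=784, n_hidden=200)
    v = (np.random.default_rng(1).random(784) > 0.5).astype(float)
    h = layer.infer(v)
    v_recon = layer.generate(h)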

UI | MeSH Term | Description | Entries
D009474 | Neurons | The basic cellular units of nervous tissue. Each neuron consists of a body, an axon, and dendrites. Their purpose is to receive, conduct, and transmit impulses in the NERVOUS SYSTEM. | Nerve Cells; Cell, Nerve; Cells, Nerve; Nerve Cell; Neuron
D003071 | Cognition | Intellectual or mental process whereby an organism obtains knowledge. | Cognitive Function; Cognitions; Cognitive Functions; Function, Cognitive; Functions, Cognitive
D000465 | Algorithms | A procedure consisting of a sequence of algebraic formulas and/or logical steps to calculate or determine a given task. | Algorithm
D015233 | Models, Statistical | Statistical formulations or analyses which, when applied to data and found to fit the data, are then used to verify the assumptions and parameters used in the analysis. Examples of statistical models are the linear model, binomial model, polynomial model, two-parameter model, etc. | Probabilistic Models; Statistical Models; Two-Parameter Models; Model, Statistical; Models, Binomial; Models, Polynomial; Statistical Model; Binomial Model; Binomial Models; Model, Binomial; Model, Polynomial; Model, Probabilistic; Model, Two-Parameter; Models, Probabilistic; Models, Two-Parameter; Polynomial Model; Polynomial Models; Probabilistic Model; Two Parameter Models; Two-Parameter Model
D016571 | Neural Networks, Computer | A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming. | Computational Neural Networks; Connectionist Models; Models, Neural Network; Neural Network Models; Neural Networks (Computer); Perceptrons; Computational Neural Network; Computer Neural Network; Computer Neural Networks; Connectionist Model; Model, Connectionist; Model, Neural Network; Models, Connectionist; Network Model, Neural; Network Models, Neural; Network, Computational Neural; Network, Computer Neural; Network, Neural (Computer); Networks, Computational Neural; Networks, Computer Neural; Networks, Neural (Computer); Neural Network (Computer); Neural Network Model; Neural Network, Computational; Neural Network, Computer; Neural Networks, Computational; Perceptron
