Conceptual Constraint-based Neural Networks – In this paper, we propose a new network architecture for classifying categorical images in a generative fashion. The proposed system learns conditional representations generatively, with prior knowledge stored in memory. At the core of the architecture is a model of the image domain of the discriminatively generated images. The network is trained on a combination of a large set of discriminatively generated categorical images, which we call a categorical model, and a large set of unlabeled images. We demonstrate that the proposed architecture achieves superior classification accuracy compared with state-of-the-art deep-learning architectures on a variety of benchmark datasets.
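The abstract does not describe the architecture or training objective in detail; the following is a minimal, hypothetical sketch of one common way to combine a labeled ("categorical") image set with unlabeled images, namely a supervised cross-entropy term plus an entropy penalty on unlabeled predictions. The model, loss weighting, and data here are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: supervised loss on labeled (categorical) images plus an
# entropy-minimization term on unlabeled images. All names and hyperparameters
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallClassifier(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def combined_loss(model, labeled_x, labels, unlabeled_x, unlabeled_weight=0.1):
    # Supervised term on the labeled batch.
    sup = F.cross_entropy(model(labeled_x), labels)
    # Unsupervised term: encourage confident predictions on unlabeled images.
    probs = F.softmax(model(unlabeled_x), dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    return sup + unlabeled_weight * entropy

if __name__ == "__main__":
    model = SmallClassifier()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    labeled_x = torch.randn(8, 3, 32, 32)
    labels = torch.randint(0, 10, (8,))
    unlabeled_x = torch.randn(16, 3, 32, 32)
    loss = combined_loss(model, labeled_x, labels, unlabeled_x)
    loss.backward()
    opt.step()
    print(float(loss))
```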
We present a framework for learning the optimal model for an unknown large-scale data distribution. We develop a method for learning this model efficiently from the data and cast it as a Bayesian model. The model is built for online Gaussian processes and can be viewed as a multivariate logistic regression model. The Bayesian model is formulated as a multivariate conditional random process and is validated on the task of finding a maximally informative latent variable. Extensive experiments on several public datasets demonstrate that our method improves the generalization performance of several commonly used models.
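The abstract equates the model with multivariate (multinomial) logistic regression but gives no equations; the sketch below shows a plain maximum-likelihood fit of a multinomial logistic regression by gradient descent on synthetic data. The data, learning rate, and iteration count are assumptions made only to keep the example self-contained.

```python
# Hypothetical sketch: multinomial logistic regression fit by gradient descent.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_multinomial_logreg(X, y, num_classes, lr=0.1, iters=500):
    n, d = X.shape
    W = np.zeros((d, num_classes))
    Y = np.eye(num_classes)[y]            # one-hot targets
    for _ in range(iters):
        P = softmax(X @ W)                # predicted class probabilities
        grad = X.T @ (P - Y) / n          # gradient of the average NLL
        W -= lr * grad
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # two linearly separable classes
    W = fit_multinomial_logreg(X, y, num_classes=2)
    acc = (softmax(X @ W).argmax(axis=1) == y).mean()
    print(f"training accuracy: {acc:.2f}")
```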
Cortical activations and novelty-promoting effects in reward-based learning
Show full PR text via iterative learning
Conceptual Constraint-based Neural Networks
Convolutional Recurrent Neural Networks for Pain Intensity Classification in Nodule Speech
Bounds for Multiple Sparse Gaussian Process Regression with Application to Big Topic Modeling