Poster
Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning
Mehdi Sajjadi · Mehran Javanmardi · Tolga Tasdizen
Area 5+6+7+8 #51
Keywords: [ Deep Learning or Neural Networks ] [ (Other) Unsupervised Learning Methods ] [ (Application) Object and Pattern Recognition ] [ Regularization and Large Margin Methods ] [ Semi-Supervised Learning ]
Effective convolutional neural networks are trained on large sets of labeled data. However, creating large labeled datasets is a costly and time-consuming task. Semi-supervised learning uses unlabeled data to train a model with higher accuracy when only a limited set of labeled data is available. In this paper, we consider the problem of semi-supervised learning with convolutional neural networks. Techniques such as randomized data augmentation, dropout, and random max-pooling provide better generalization and stability for classifiers that are trained using gradient descent. Multiple passes of an individual sample through the network might lead to different predictions due to the non-deterministic behavior of these techniques. We propose an unsupervised loss function that takes advantage of the stochastic nature of these methods and minimizes the difference between the predictions of multiple passes of a training sample through the network. We evaluate the proposed method on several benchmark datasets.
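To make the idea concrete, here is a minimal sketch in PyTorch, assuming a classifier whose stochastic components (dropout, randomized pooling) are active in training mode. The function name `augment` stands in for an unspecified random input transformation and is purely illustrative, as is the choice of a pairwise squared difference between softmax outputs; this is a sketch of the stability idea described in the abstract, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def transform_stability_loss(model, x, augment, n_passes=4):
    """Unsupervised stability loss on an unlabeled batch x: penalize
    disagreement between predictions from several stochastic passes.

    `augment` is a hypothetical function applying a random transformation
    (e.g. crop or flip); dropout and random pooling inside `model` add
    further stochasticity while the model is in training mode.
    """
    model.train()  # keep dropout / stochastic pooling active

    # Softmax predictions from n_passes independent stochastic passes.
    preds = [F.softmax(model(augment(x)), dim=1) for _ in range(n_passes)]

    # Sum of squared differences over all pairs of passes,
    # averaged over the batch.
    loss = torch.zeros((), device=x.device)
    for i in range(n_passes):
        for j in range(i + 1, n_passes):
            loss = loss + ((preds[i] - preds[j]) ** 2).sum(dim=1).mean()
    return loss
```

During training, a term of this form would typically be weighted and added to the ordinary supervised cross-entropy loss computed on the labeled subset of the data.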