Invited talk
in
Workshop: Information-Theoretic Principles in Cognitive Systems (InfoCog)
Information Theory for Representation Learning
Alemi
Abstract:
I'll give an overview of how information-theoretic principles have been used to motivate and advance representation learning. By combining variational bounds on information-theoretic quantities, such as mutual information, with the expressiveness and learnability of modern deep neural networks, information theory can guide the search for useful representations in a wide array of settings, including unsupervised learning, supervised learning, Bayesian inference, and prediction. The emphasis will be on how the modern tools of deep learning can turn principled, information-theoretically motivated objectives into practical methods across a broad range of interdisciplinary fields.
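As a concrete illustration of the kind of variational bound the abstract refers to: for a stochastic Gaussian encoder q(z|x), the KL divergence to a fixed prior r(z) gives a tractable upper bound on the mutual information I(X; Z). This is a minimal sketch (not from the talk itself); the encoder outputs `mu` and `sigma` below are hypothetical stand-ins for what a neural network would produce.

```python
import numpy as np

def gaussian_kl(mu, sigma):
    """Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ).

    Averaged over the data, this KL is a variational upper bound on
    I(X; Z) for the encoder q(z|x) = N(mu(x), diag(sigma(x)^2)).
    """
    return 0.5 * np.sum(sigma**2 + mu**2 - 1.0 - 2.0 * np.log(sigma), axis=-1)

# Hypothetical encoder outputs for a batch of 4 inputs, latent dimension 2.
mu = np.array([[0.0, 0.0], [1.0, -1.0], [0.5, 0.5], [2.0, 0.0]])
sigma = np.ones_like(mu)

rate = gaussian_kl(mu, sigma)  # per-example "rate" (information cost) term
avg_rate = rate.mean()         # bound on I(X; Z) for this batch
```

In objectives like the variational information bottleneck, this rate term is traded off against a task term (e.g., classification log-likelihood), with a Lagrange multiplier controlling how much information the representation retains about the input.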