Message passing algorithms are distributed algorithms that operate on graphs: each node uses only the information available locally at the node and its incident edges, and sends information only to its neighbouring nodes. They are often highly effective in machine learning and are relatively easy to parallelise. Examples include approximate inference algorithms for probabilistic graphical models, the value iteration algorithm for Markov decision processes, graph neural networks, and attention networks.
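As a small illustration of the message passing view (a hypothetical toy example, not taken from the tutorial material), value iteration on a tiny Markov decision process can be written so that each state repeatedly aggregates value estimates from its successor states, its "neighbours" in the transition graph:

```python
# Toy 3-state MDP: transitions[s][a] is a list of (prob, next_state, reward).
# The states, actions, and numbers below are illustrative assumptions.
transitions = {
    0: {"stay": [(1.0, 0, 0.0)], "go": [(1.0, 1, 1.0)]},
    1: {"stay": [(1.0, 1, 0.0)], "go": [(1.0, 2, 2.0)]},
    2: {"stay": [(1.0, 2, 0.0)]},
}
gamma = 0.9  # discount factor

# Each round, every state recomputes its value from its successors'
# current values -- a synchronous message passing update (Bellman backup).
V = {s: 0.0 for s in transitions}
for _ in range(200):
    V = {
        s: max(
            sum(p * (r + gamma * V[ns]) for p, ns, r in outcomes)
            for outcomes in actions.values()
        )
        for s, actions in transitions.items()
    }

print(V)  # converged state values
```

On this toy chain the values converge after a couple of rounds: state 2 is absorbing with no reward, state 1 collects the reward for moving to state 2, and state 0 additionally collects the (discounted) reward for reaching state 1.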
This tutorial presents commonly used approximate inference algorithms for probabilistic graphical models and the value iteration algorithm for Markov decision processes, focusing on the objectives that these algorithms optimise. We then consider more flexible but less interpretable message passing algorithms, including graph neural networks and attention networks. We discuss how these more flexible networks can simulate the more interpretable algorithms, which sheds light on their inductive biases through algorithmic alignment and allows this understanding to inform network design.
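To make the common structure concrete, here is a minimal sketch of one graph neural network message passing layer in NumPy: each node sums its neighbours' feature vectors, then combines the aggregate with its own features through linear maps and a nonlinearity. All names (`W_self`, `W_nbr`) and the random weights are illustrative assumptions, not part of the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 4-node undirected graph given by its adjacency matrix (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

H = rng.normal(size=(4, 8))       # node features, 8 per node
W_self = rng.normal(size=(8, 8))  # transform for a node's own features
W_nbr = rng.normal(size=(8, 8))   # transform for aggregated neighbour messages

def gnn_layer(H, A, W_self, W_nbr):
    messages = A @ H  # row s holds the sum of node s's neighbours' features
    return np.maximum(0.0, H @ W_self + messages @ W_nbr)  # ReLU update

H1 = gnn_layer(H, A, W_self, W_nbr)
print(H1.shape)  # one round of message passing: same shape, updated features
```

Sum aggregation followed by a learned update is one simple choice; replacing the aggregation or the update changes which hand-designed message passing algorithms the network aligns with.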
Schedule
Mon 5:00 p.m. - 5:40 p.m. | Part 1: Message Passing Overview and Probabilistic Graphical Models (Talk) | Wee Sun Lee
Mon 5:40 p.m. - 5:50 p.m. | Q&A
Mon 5:50 p.m. - 6:00 p.m. | Break
Mon 6:00 p.m. - 6:16 p.m. | Part 2: Markov Decision Process (Talk) | Wee Sun Lee
Mon 6:16 p.m. - 6:25 p.m. | Q&A
Mon 6:25 p.m. - 6:30 p.m. | Break
Mon 6:30 p.m. - 7:20 p.m. | Part 3: Graph Neural Networks and Attention Networks (Talk) | Wee Sun Lee
Mon 7:20 p.m. - 7:25 p.m. | Break
Mon 7:35 p.m. - 8:50 p.m. | Overall tutorial Q&A and discussion: 45 minutes or until discussion ends (Q&A)
Mon 8:30 p.m. - 9:00 p.m. | Part 4: Appendix: Proofs and Derivations (recorded video, not to be played live but to be viewed offline) | Wee Sun Lee