Tutorial
Pay Attention to What You Need: Do Structural Priors Still Matter in the Age of Billion Parameter Models?
Irina Higgins · Antonia Creswell · Sébastien Racanière
Moderators: Søren Hauberg · Mark van der Wilk
The last few years have seen the emergence of billion-parameter models trained on 'infinite' data that achieve impressive performance on many tasks, suggesting that big data and big models may be all we need. But how far can this approach take us, in particular in domains where data is more limited? In many situations, adding structured architectural priors to models may be key to achieving faster learning, better generalisation, and learning from less data. Structure can be added at the level of perception and at the level of reasoning, the latter being the long-standing goal of GOFAI research. In this tutorial we use the idea of symmetries and symbolic reasoning as an overarching theoretical framework to describe many of the common structural priors that have proven successful for building more data-efficient and generalisable perceptual models, as well as models that support better reasoning in neuro-symbolic approaches. A minimal illustration of what such a prior looks like in code is given below.
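As a concrete illustration of a structural prior built into an architecture, the sketch below shows a Deep Sets-style permutation-invariant encoder: because pooling is done with a sum, the output is unchanged when the input elements are reordered, so the symmetry does not have to be learned from data. This is an illustrative example, not material from the tutorial; all names, weights, and dimensions are assumptions chosen for the demo.

```python
# Illustrative sketch (not from the tutorial): a permutation-invariant
# "Deep Sets"-style encoder as one example of a structural prior.
import numpy as np

rng = np.random.default_rng(0)

# Random weights for a tiny per-element MLP followed by sum pooling.
W1 = rng.normal(size=(4, 16))   # element feature dim 4 -> hidden dim 16
W2 = rng.normal(size=(16, 8))   # hidden dim 16 -> output dim 8

def encode_set(x):
    """Encode a set of elements with shape (n, 4) into a single vector (8,).

    A per-element transform is applied, then the results are summed:
    the sum makes the output invariant to the order of the elements,
    so the symmetry is built into the architecture rather than learned.
    """
    h = np.tanh(x @ W1)                  # per-element transform
    return np.tanh(h.sum(axis=0) @ W2)   # permutation-invariant pooling

x = rng.normal(size=(5, 4))             # a "set" of 5 elements
x_shuffled = x[rng.permutation(5)]      # same set, different order
assert np.allclose(encode_set(x), encode_set(x_shuffled))
```

The same idea underlies other architectural priors discussed in the tutorial, such as convolutions (translation symmetry) and group-equivariant networks: the symmetry of the task is encoded in the computation itself rather than estimated from examples.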
Schedule
Mon 1:00 a.m. - 1:45 a.m. | Why do we Need Structure and Where does it Come From? (Talk) | Irina Higgins
Mon 1:45 a.m. - 1:55 a.m. | Q&A
Mon 1:55 a.m. - 2:00 a.m. | Break
Mon 2:00 a.m. - 2:45 a.m. | Symmetries (Talk) | Sébastien Racanière
Mon 2:45 a.m. - 2:55 a.m. | Q&A
Mon 2:55 a.m. - 3:00 a.m. | Break
Mon 3:00 a.m. - 3:45 a.m. | Balancing Structure In NeuroSymbolic Methods (Talk) | Antonia Creswell
Mon 3:45 a.m. - 3:50 a.m. | Break
Mon 3:50 a.m. - 4:30 a.m. | Q&A