Workshop
The pre-registration workshop: an alternative publication model for machine learning research
Samuel Albanie · João Henriques · Luca Bertinetto · Alex Hernandez-Garcia · Hazel Doughty · Gul Varol
Mon 13 Dec, 4 a.m. PST
Machine learning research has benefited considerably from the adoption of standardised public benchmarks. While the importance of these benchmarks is undisputed, we argue against the current incentive system and its heavy reliance on performance as a proxy for scientific progress. The status quo incentivises researchers to “beat the state of the art”, potentially at the expense of deep scientific understanding and rigorous experimental design. Because typically only positive results are rewarded, the negative results inevitably encountered during research are often omitted, leaving many other groups to unknowingly and wastefully repeat the same experiments.
Pre-registration is a publishing and reviewing model that aims to address these issues by changing the incentive system. A pre-registered paper is a regular paper that is submitted for peer review without any experimental results, describing instead an experimental protocol to be followed after the paper is accepted. This means authors must make compelling arguments from theory or from previously published evidence. Reviewers, in turn, assess these arguments together with the quality of the experimental design, rather than comparing numerical results. While pre-registration has been widely adopted in fields such as medicine and psychology, there is little such experience in the machine learning community. In this workshop, we propose to conduct a full pre-registration review cycle for machine learning. Our proposal follows an initial small-scale trial of pre-registration in computer vision (Henriques et al., 2019) and builds on a successful pilot study in pre-registration at NeurIPS 2020 (Bertinetto et al., 2020). We have already received a number of requests to repeat the workshop, indicating strong community interest.
Schedule
Mon 4:00 a.m. - 4:10 a.m. | Opening remarks (Talk)
Mon 4:10 a.m. - 4:40 a.m. | Invited Talk - Sarahanne Field
Mon 4:40 a.m. - 5:00 a.m. | PCA Retargeting: Encoding Linear Shape Models as Convolutional Mesh Autoencoders - Eimear O'Sullivan (Talk)
Mon 5:00 a.m. - 5:20 a.m. | Spotlights 1 (5 x 3 minutes) (Short videos)
Mon 5:20 a.m. - 5:40 a.m. | Unsupervised Resource Allocation with Graph Neural Networks - Miles Cranmer (Talk)
Mon 5:40 a.m. - 6:10 a.m. | Break
Mon 6:10 a.m. - 6:40 a.m. | Invited Talk - Dima Damen
Mon 6:40 a.m. - 7:10 a.m. | Invited Talk - Hugo Larochelle
Mon 7:10 a.m. - 7:30 a.m. | Spotlights 2 (5 x 3 minutes) (Short videos)
Mon 7:30 a.m. - 8:30 a.m. | Poster Session (Virtual posters)
Mon 8:30 a.m. - 9:00 a.m. | Break
Mon 9:00 a.m. - 9:30 a.m. | Invited Talk - Paul Smaldino
Mon 9:30 a.m. - 9:50 a.m. | Confronting Domain Shift in Trained Neural Networks - Carianne Martinez (Talk)
Mon 9:50 a.m. - 10:05 a.m. | Discussion Panel - 2020 authors' experience
Mon 10:05 a.m. - 11:05 a.m. | Open Discussion
Mon 11:05 a.m. - 11:10 a.m. | Closing Remarks