Differentiable Programming Workshop
Ludger Paehler · William Moses · Maria I Gorinova · Assefaw H. Gebremedhin · Jan Hueckelheim · Sri Hari Krishna Narayanan
Mon 13 Dec, 6 a.m. PST
Differentiable programming allows derivatives of functions to be computed automatically within a high-level language. It has become increasingly popular within the machine learning (ML) community: differentiable programming underlies backpropagation in neural networks, probabilistic programming, and Bayesian inference. Fundamentally, differentiable programming frameworks empower machine learning and its applications: the availability of efficient and composable automatic differentiation (AD) tools has led to advances in optimization, differentiable simulators, engineering, and science.
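As a minimal illustration of the core idea (a toy sketch in plain Python, not any of the tools featured below), forward-mode AD can be implemented with dual numbers: every value carries its derivative alongside it, and the usual arithmetic rules propagate both:

```python
# Toy forward-mode automatic differentiation via dual numbers.
# Each Dual carries (primal value, tangent); arithmetic propagates both.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def grad(f):
    """Return a function computing df/dx at a scalar x."""
    return lambda x: f(Dual(x, 1.0)).dot


# An ordinary Python function, differentiated without symbolic rewriting:
f = lambda x: x * x * x + 2.0 * x   # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2
print(grad(f)(2.0))                 # prints 14.0
```

Production AD frameworks (such as those discussed in the talks below) apply the same principle, plus reverse mode and compiler-level transformations, to whole programs rather than scalar expressions.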
While AD tools have greatly increased the productivity of ML scientists and practitioners, many problems remain unsolved. Crucially, there is little communication between the broad group of AD users, programming languages researchers, and differentiable programming developers, leaving them to work in isolation. We propose a Differentiable Programming workshop as a forum to narrow the gaps between the design of differentiable and probabilistic languages, efficient automatic differentiation engines, and higher-level applications of differentiable programming. We hope this workshop will foster closer collaboration between language designers and domain scientists by bringing together a diverse cross-section of the differentiable programming community, including people working on core automatic differentiation tools, higher-level frameworks that rely upon AD (such as probabilistic programming and differentiable simulators), and applications that use differentiable programs to solve scientific problems.
The explicit goals of the workshop are to:
1. Foster closer collaboration and synergies between the individual communities;
2. Evaluate the merits of differentiable design constructs and the impact they have on the algorithm design space and usability of the language;
3. Highlight differentiable techniques of individual domains, and the potential they hold for other fields.
Schedule
Mon 6:00 a.m. - 6:05 a.m. | Welcome: Short Introduction & Welcome to the Workshop
Mon 6:05 a.m. - 6:35 a.m. | Invited Talk: Parallel-Friendly Automatic Differentiation in Dex and JAX (Adam Paszke)
Mon 6:35 a.m. - 7:05 a.m. | Invited Talk: SYMPAIS: SYMbolic Parallel Adaptive Importance Sampling for Probabilistic Program Analysis (Yuan Zhou)
Mon 7:05 a.m. - 7:20 a.m. | Oral: Differentiable Scripting (Uwe Naumann)
Mon 7:20 a.m. - 7:35 a.m. | Oral: A research framework for writing differentiable PDE discretizations in JAX (Antonio Stanziola · Simon Arridge)
Mon 7:35 a.m. - 7:50 a.m. | Break
Mon 7:50 a.m. - 8:20 a.m. | Invited Talk: Differentiable Programming in Molecular Physics (Frank Noe)
Mon 8:20 a.m. - 8:50 a.m. | Invited Talk: Diffractor.jl: High Level, High Performance AD for Julia (Keno Fischer)
Mon 8:50 a.m. - 9:05 a.m. | Oral: Equinox: neural networks in JAX via callable PyTrees and filtered transformations (Patrick Kidger)
Mon 9:05 a.m. - 9:20 a.m. | Oral: A fully-differentiable compressible high-order computational fluid dynamics solver (Deniz Bezgin)
Mon 9:20 a.m. - 9:25 a.m. | Short Break
Mon 9:25 a.m. - 10:40 a.m. | Poster Session:
- Extended Abstract: Enzyme.jl: Low-level auto-differentiation meets high-level language (Valentin Churavy)
- GPU Accelerated Automatic Differentiation with Clad (Vassil Vassilev · David Lange)
- Unbiased Reparametrisation Gradient via Smoothing and Diagonalisation (Dominik Wagner · Luke Ong)
- Gradients of the Big Bang: Solving the Einstein-Boltzmann Equations with Automatic Differentiation (James Sullivan)
- Differentiable Parametric Optimization Approach to Power System Load Modeling (Jan Drgona · Andrew August · Elliott Skomski)
- On automatic differentiation for the Matérn covariance (Oana Marin · Paul Hovland)
- Neural Differentiable Predictive Control (Jan Drgona · Aaron Tuor · Draguna Vrabie)
- AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia (Frank Schäfer · Mohamed Tarek · Lyndon White · Christopher Rackauckas)
- Aggregated type handling in AD tape implementations (Max Sagebaum)
- Backpropagation through Back substitution with a Backslash (Ekin Akyürek · Alan Edelman · Bernie Wang)
Mon 10:40 a.m. - 10:45 a.m. | Short Break
Mon 10:45 a.m. - 11:15 a.m. | Invited Talk: Learning from Data through the Lens of Ocean Models, Surrogates, and their Derivatives (Patrick Heimbach)
Mon 11:15 a.m. - 11:45 a.m. | Invited Talk: Learnable Physics Models (Karen Liu)
Mon 11:45 a.m. - 12:00 p.m. | Oral: Escaping the abstraction: a foreign function interface for the Unified Form Language [UFL] (Nacime Bouziani)
Mon 12:00 p.m. - 12:15 p.m. | Oral: Towards Denotational Semantics of AD for Higher-Order, Recursive, Probabilistic Languages (Alexander Lew · Mathieu Huot · Vikash Mansinghka)
Mon 12:15 p.m. - 12:30 p.m. | Break
Mon 12:30 p.m. - 1:00 p.m. | Invited Talk: Differentiable Programming for Protein Sequences and Structure (Sergey Ovchinnikov)
Mon 1:00 p.m. - 1:30 p.m. | Invited Talk: Approximate High Performance Computing Guided by Automatic Differentiation (Harshitha Menon)
Mon 1:30 p.m. - 1:45 p.m. | Oral: A Complete Axiomatization of Forward Differentiation (Gordon Plotkin)
Mon 1:45 p.m. - 2:00 p.m. | Oral: Generalizability of density functionals learned from differentiable programming on weakly correlated spin-polarized systems (Bhupalee Kalita · Ryan Pederson · Li Li · Kieron Burke)
Mon 2:00 p.m. - 3:00 p.m. | Social