Poster in Affinity Workshop: Black in AI Workshop
Combining Recurrent, Convolutional, and Continuous-Time Models with Structured Learnable Linear State-Space Layers
Abstract:
The Linear State-Space Layer (LSSL) is a model family that combines the strengths of sequential modeling paradigms such as recurrence, convolution, and differential equations. For example, LSSLs generalize convolutions to continuous time, explain common RNN heuristics, and share features of neural differential equations (NDEs), such as time-scale adaptation. Although naive LSSLs struggle to model long-range dependencies, we introduce a class of LSSLs, the Structured Learnable LSSL (SLLSSL), which overcomes these limitations by utilizing a trainable set of structured matrices that endow it with long-range memory.
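As a rough illustration of the recurrent and convolutional views the abstract refers to, the sketch below implements a plain discrete-time linear state-space layer in NumPy and checks that both views give the same output. The formulation x_k = A x_{k-1} + B u_k, y_k = C x_k + D u_k, along with the matrix and function names, is an assumption for illustration only; it is not the paper's structured, continuous-time parameterization.

```python
# Minimal sketch of a discrete-time linear state-space layer (illustrative only).
# Assumes the standard SSM form: x_k = A x_{k-1} + B u_k,  y_k = C x_k + D u_k.
import numpy as np

def lssl_recurrent(A, B, C, D, u):
    """Run the state-space model as an RNN-style recurrence over a 1-D input u."""
    x = np.zeros(A.shape[0])
    outputs = []
    for u_k in u:
        x = A @ x + B * u_k              # state update (recurrent view)
        outputs.append(C @ x + D * u_k)  # readout
    return np.array(outputs)

def lssl_convolutional(A, B, C, D, u):
    """Compute the same output via the equivalent convolution kernel K_i = C A^i B."""
    L = len(u)
    kernel = np.array([C @ np.linalg.matrix_power(A, i) @ B for i in range(L)])
    return np.convolve(u, kernel)[:L] + D * u  # causal convolution plus skip term

# The two views agree (up to numerical error) for a stable state matrix A.
rng = np.random.default_rng(0)
n, L = 4, 32
A = 0.9 * np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthogonal matrix scaled to be stable
B, C, D = rng.standard_normal(n), rng.standard_normal(n), 0.5
u = rng.standard_normal(L)
assert np.allclose(lssl_recurrent(A, B, C, D, u), lssl_convolutional(A, B, C, D, u))
```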