Poster

Unsupervised Discovery of Temporal Structure in Noisy Data with Dynamical Components Analysis

David Clark · Jesse Livezey · Kristofer Bouchard

East Exhibition Hall B, C #5

Keywords: [ Neural Coding ] [ Neuroscience and Cognitive Science ] [ Algorithms ] [ Components Analysis (e.g., CCA, ICA, LDA, PCA) ]


Abstract:

Linear dimensionality reduction methods are commonly used to extract low-dimensional structure from high-dimensional data. However, popular methods disregard temporal structure, rendering them prone to extracting noise rather than meaningful dynamics when applied to time series data. At the same time, many successful unsupervised learning methods for temporal, sequential and spatial data extract features which are predictive of their surrounding context. Combining these approaches, we introduce Dynamical Components Analysis (DCA), a linear dimensionality reduction method which discovers a subspace of high-dimensional time series data with maximal predictive information, defined as the mutual information between the past and future. We test DCA on synthetic examples and demonstrate its superior ability to extract dynamical structure compared to commonly used linear methods. We also apply DCA to several real-world datasets, showing that the dimensions extracted by DCA are more useful than those extracted by other methods for predicting future states and decoding auxiliary variables. Overall, DCA robustly extracts dynamical structure in noisy, high-dimensional data while retaining the computational efficiency and geometric interpretability of linear dimensionality reduction methods.
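The core objective described above — the mutual information between past and future windows of the projected time series — can be sketched for the Gaussian case, where mutual information reduces to log-determinants of window covariances. The sketch below is illustrative, not the authors' implementation; the function names (`sliding_windows`, `predictive_information`) and the synthetic example (one slow AR(1) dimension plus high-variance white-noise dimensions) are my own assumptions for demonstration.

```python
import numpy as np

def sliding_windows(X, L):
    # Stack all length-L windows of the (time, dim) series X into rows.
    T, d = X.shape
    n = T - L + 1
    idx = np.arange(L)[None, :] + np.arange(n)[:, None]
    return X[idx].reshape(n, L * d)

def predictive_information(X, V, T_win):
    """Gaussian estimate of I(past; future) for the projection X @ V,
    using T_win past steps and T_win future steps (in nats)."""
    Y = X @ V                          # project onto the candidate subspace
    W = sliding_windows(Y, 2 * T_win)  # joint past+future windows
    S = np.cov(W, rowvar=False)        # empirical window covariance
    h = T_win * V.shape[1]             # size of the "past" block
    _, ld_joint = np.linalg.slogdet(S)
    _, ld_past = np.linalg.slogdet(S[:h, :h])
    _, ld_fut = np.linalg.slogdet(S[h:, h:])
    # I(past; future) = H(past) + H(future) - H(joint) for a Gaussian.
    return 0.5 * (ld_past + ld_fut - ld_joint)

rng = np.random.default_rng(0)
n = 20000
# Synthetic data: one slow AR(1) dynamic plus two high-variance noise dims.
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.95 * ar[t - 1] + 0.1 * rng.standard_normal()
X = np.column_stack([ar, 3.0 * rng.standard_normal((n, 2))])

v_dyn = np.array([[1.0], [0.0], [0.0]])    # aligned with the dynamics
v_noise = np.array([[0.0], [1.0], [0.0]])  # aligned with the noise
pi_dyn = predictive_information(X, v_dyn, T_win=5)
pi_noise = predictive_information(X, v_noise, T_win=5)
```

Because the noise dimensions have the largest variance, PCA would rank them first, while this objective scores the low-variance AR(1) direction far higher — the behavior the abstract attributes to DCA. A full method would additionally optimize `V` over the Stiefel manifold rather than evaluate fixed projections.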
