

Poster

Asynchronous Perception Machine for Test Time Training

Rajat Modi · Yogesh Rawat

Poster Room - TBD
[ Project Page ]
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

In this work, we propose the Asynchronous Perception Machine (APM), a computationally efficient architecture for test-time training (TTT). APM can process patches of an image one at a time, in any order, asymmetrically, and still encode semantic awareness. We demonstrate APM's ability to recognize out-of-distribution images without dataset-specific pre-training, and its competitive classification performance over existing TTT approaches. To perform TTT, APM distills the test sample's representation just once. APM possesses a unique property: it can learn using just this single representation and start predicting semantically aware features. We also demonstrate APM's potential applications beyond test-time training: APM can scale up to a dataset of 2D images and yield semantic clusterings in a single forward pass. APM further provides the first empirical evidence for GLOM's insight that a percept is really a field. APM therefore helps us converge towards an implementation that can do both interpolation and perception on shared connectionist hardware. Our codebase has been provided for review and will be made publicly available.

"It now appears that some of the ideas in GLOM could be made to work."
https://www.technologyreview.com/2021/04/16/1021871/geoffrey-hinton-glom-godfather-ai-neural-networks/
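To make the described TTT procedure concrete, here is a minimal, hypothetical sketch in PyTorch: a frozen pretrained encoder distills the test sample's representation once, and a small network is then trained to predict that representation from one patch location at a time, visited in arbitrary order. All module names, shapes, and hyperparameters below are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the single-sample test-time-training loop described
# in the abstract. "teacher" stands in for a frozen pretrained encoder and
# "apm" for a small network mapping a patch location to a feature vector.
# Dimensions, architectures, and optimizer settings are assumed for illustration.

feat_dim, grid = 64, 14                       # assumed feature size and patch grid
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, feat_dim))  # stand-in encoder
apm = nn.Sequential(nn.Linear(2, 256), nn.GELU(), nn.Linear(256, feat_dim))
opt = torch.optim.Adam(apm.parameters(), lr=1e-3)

image = torch.randn(1, 3, 224, 224)           # a single test sample
with torch.no_grad():
    target = teacher(image)                   # distill the test sample's representation once

# Normalized (row, col) locations of each patch; they can be visited one at a
# time, in any order, which is the asynchronous/asymmetric aspect.
locs = torch.stack(torch.meshgrid(
    torch.linspace(0, 1, grid), torch.linspace(0, 1, grid), indexing="ij"),
    dim=-1).reshape(-1, 2)

for step in range(100):
    idx = torch.randint(0, locs.shape[0], (1,))   # pick one patch location at random
    pred = apm(locs[idx])                         # predict a feature for that location alone
    loss = (pred - target).pow(2).mean()          # match the distilled representation
    opt.zero_grad(); loss.backward(); opt.step()
```

After this loop, querying `apm` at any patch location yields a feature tied to the single distilled representation; this is only meant to convey the shape of the idea, not its exact training objective or architecture.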
