Poster in Workshop: Machine Learning with New Compute Paradigms
Unleashing Hyperdimensional Computing with Nyström Method based Data Adaptive Encoding
Quanling Zhao · Anthony Thomas · Xiaofan Yu · Tajana S Rosing
Hyperdimensional Computing (HDC) performs machine learning tasks by first encoding data into high-dimensional distributed representations called hypervectors. Learning tasks can then be performed on those hypervectors with a set of simple, computationally efficient operations. HDC has gained significant attention in recent years due to its excellent hardware efficiency. At the core of every HDC algorithm is the encoding function, which determines the expressive power of the hypervectors and is therefore the critical bottleneck for performance. However, existing HDC encoding methods are task dependent and often capture only a very basic notion of similarity, which can limit the accuracy of HDC models. To unleash the potential of HDC on arbitrary tasks, we propose a novel encoding method inspired by the Nyström method for kernel approximation. Our approach generates an encoding function that approximates any user-defined positive-definite similarity function on the data via dot products between encodings in HD-space. This allows HDC to tackle a broader range of tasks with better learning accuracy while still retaining its hardware efficiency. We empirically evaluate our proposed encoding method against existing HDC encoding methods commonly used in various classification tasks. Our results show that the proposed encoding method achieves better accuracy across a variety of learning tasks: on graph and string datasets, it achieves 10%-37% and 3%-18% better classification accuracy, respectively.
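The core idea (an encoding whose dot products approximate a user-defined positive-definite kernel) can be illustrated with a standard Nyström feature construction. The sketch below is a minimal, generic illustration, not the authors' implementation: it picks landmark points, forms the landmark kernel matrix, and maps each input x to K(x, landmarks) @ K_mm^{-1/2}, so that dot products between encodings approximate the kernel. The RBF kernel and all parameter choices here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Example positive-definite similarity: k(x, y) = exp(-gamma * ||x - y||^2).
    # Any user-defined PSD kernel could be substituted here.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_encoder(landmarks, kernel):
    """Return an encoding function phi such that phi(x) . phi(y) ~= k(x, y)."""
    # Kernel matrix among the m landmark points.
    K_mm = kernel(landmarks, landmarks)
    # Symmetric inverse square root K_mm^{-1/2} via eigendecomposition,
    # clipping tiny eigenvalues for numerical stability.
    w, V = np.linalg.eigh(K_mm)
    w = np.clip(w, 1e-10, None)
    K_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T

    def encode(X):
        # phi(X) = K(X, landmarks) @ K_mm^{-1/2}
        return kernel(X, landmarks) @ K_inv_sqrt

    return encode

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                         # toy dataset
landmarks = X[rng.choice(200, 64, replace=False)]     # sampled landmarks
encode = nystrom_encoder(landmarks, rbf_kernel)

H = encode(X[:10])            # encodings in "HD-space"
approx = H @ H.T              # dot products between encodings
exact = rbf_kernel(X[:10], X[:10])
```

For points that are themselves landmarks the approximation is exact; for other points the quality improves with the number of landmarks, which trades encoding dimensionality against kernel fidelity.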