A probability-inspired normalization for fixed-precision Hyper-Dimensional Computing

Abstract: 

Hyper-Dimensional Computing (HDC), a promising nano-scalable paradigm for low-energy predictions and lightweight learned models, has seen a surge of interest from the hardware accelerator community. However, the classical single-bit-per-vector-element approach for HDC seldom achieves higher classification accuracy than multi-bit alternatives, and is inadequate to support the rapidly growing application space. A great challenge for multi-bit HDC hardware is to negotiate the enormous increase in logic vis-à-vis the single-bit hardware. Key to minimizing this cost is to limit bits per vector element, which is potentially unbounded without transformation, and can be very large for some applications. This work proposes a hardware-friendly numerical transformation on an HDC vector where the result has fixed bits per element. Under a reasonable assumption on the vector's distribution, it is proven that the transformation guarantees at most a small, known error in associative search. Verification experiments indicate the theoretical guarantee is very pessimistic; the actual error is less than 18% of the theoretical upper bound. Estimates predict 3.8X hardware savings with a 0.04% accuracy drop. We believe emerging stochastic approaches like HDC offer exciting new opportunities for employing high-dimensional probability theory in accelerator design.
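The abstract does not describe the transformation itself, but the underlying problem can be sketched: bundling many bipolar hypervectors produces per-element sums whose bit width grows with the number of bundled vectors, and a fixed-precision transformation must bound that width while only mildly perturbing associative-search scores. The following is a minimal illustrative sketch, not the paper's method; the simple clipping scheme and all function names here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 4096  # hypervector dimensionality
N = 100   # number of bipolar vectors bundled into a class prototype

# Bundling N random bipolar (+1/-1) vectors gives element values in
# [-N, N]; without a transformation the bits needed per element grow
# with N, which is the cost problem the abstract describes.
prototype = rng.choice([-1, 1], size=(N, D)).sum(axis=0)

def fixed_precision(v, bits):
    """Clip each element to the range of `bits` signed bits.
    Illustrative stand-in only -- the paper's probability-inspired
    normalization is not specified in the abstract."""
    limit = 2 ** (bits - 1) - 1
    return np.clip(v, -limit, limit)

query = rng.choice([-1, 1], size=D)  # a bipolar query vector

exact = query @ prototype
approx = query @ fixed_precision(prototype, bits=4)

# Since prototype elements are sums of N=100 +/-1 values, most lie
# within a few standard deviations (sqrt(N) = 10) of zero, so clipping
# to 4 signed bits (range [-7, 7]) changes the score only mildly.
```

Under a distributional assumption of this kind (elements concentrated near zero), one can bound the associative-search error introduced by the clipping, which mirrors the style of guarantee the abstract claims for the actual transformation.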

Author: 
Sohum Datta
Publication date: 
January 1, 2022
Publication type: 
Conference Paper