Poster in Workshop: ML with New Compute Paradigms
Energy-Efficient Random Number Generation Using Stochastic Magnetic Tunnel Junctions
Nicolas Alder · Shivam Kajale · Milin Tunsiricharoengul · Deblina Sarkar · Ralf Herbrich
(Pseudo)random sampling is a costly yet widely used operation in machine learning. We introduce an energy-efficient algorithm for uniform Float16 sampling that uses a room-temperature stochastic magnetic tunnel junction device to generate truly random floating-point numbers. By avoiding expensive symbolic computation and mapping the device's physical randomness directly onto the statistical properties of the floating-point format and the uniform distribution, our approach improves energy efficiency over the state-of-the-art Mersenne Twister algorithm by a factor of at least 9721, and over the more energy-efficient PCG algorithm by a factor of 5649. We provide measurements of the potential accumulated approximation errors, demonstrating the effectiveness of our method.
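The abstract does not spell out the bit-to-float mapping, so the following is only a minimal illustrative sketch of the general idea of mapping raw random bits directly onto the Float16 format: the stored exponent is chosen geometrically (one coin flip per binade) and the ten mantissa bits are filled with independent random bits, yielding a uniform sample over [0, 1) without multiplications or divisions. The `random_bit` function here is a hypothetical pseudo-random stand-in for the stochastic magnetic tunnel junction read-out described in the paper; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng()


def random_bit():
    """Hypothetical stand-in for the true-random bit source.

    In the paper, each bit would come from reading a room-temperature
    stochastic magnetic tunnel junction; here a pseudo-random coin flip
    is used so the sketch runs on its own.
    """
    return int(rng.integers(0, 2))


def sample_uniform_float16():
    """Draw one uniform Float16 sample in [0, 1) directly from random bits.

    Float16 layout: 1 sign bit, 5 exponent bits (bias 15), 10 mantissa bits.
    The binade [0.5, 1) holds half the probability mass, [0.25, 0.5) a
    quarter, and so on, so the stored exponent is selected by a run of
    coin flips (geometric distribution) and the mantissa is filled with
    independent random bits.
    """
    exponent = 14                      # stored exponent for the binade [0.5, 1)
    while exponent > 0 and random_bit() == 0:
        exponent -= 1                  # drop one binade with probability 1/2
    mantissa = 0
    for _ in range(10):
        mantissa = (mantissa << 1) | random_bit()
    bits = np.array([(exponent << 10) | mantissa], dtype=np.uint16)
    return bits.view(np.float16)[0]    # reinterpret the bit pattern, sign = 0


if __name__ == "__main__":
    samples = np.array([sample_uniform_float16() for _ in range(10_000)],
                       dtype=np.float32)
    print(samples.mean())              # should be close to 0.5
```

In this construction, a sample lands in the binade [2^(-k-1), 2^(-k)) with probability 2^(-k-1), matching the density of representable Float16 values near zero; if all fourteen coin flips come up zero, the sample falls through to the subnormal range, so the full interval [0, 1) is covered.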