Expo Talk
East Meeting Room 11, 12

Innovations in number systems, such as logarithmic math, and their co-designed hardware can accelerate AI adoption. We explore practical hardware implementations and provide quantitative examples at both the operation and system levels.

  • Impact of Inference System Design on AI Adoption: How system design affects trust in AI outputs, cost, and user experience.

  • Trends in Low-Precision Data Types: Why low-precision data types are so effective at reducing AI compute cost, with a review of the types commonly used in AI models.
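As one concrete illustration of why narrow data types cut cost, here is a minimal sketch of symmetric per-tensor int8 quantization (the helper names are ours, for illustration only; this is not the speakers' method):

```python
def quantize_int8(xs):
    # Symmetric per-tensor int8 quantization: choose a scale so that
    # the largest magnitude in the tensor maps to 127.
    scale = max(abs(v) for v in xs) / 127.0
    q = [max(-127, min(127, round(v / scale))) for v in xs]
    return q, scale

def dequantize_int8(q, scale):
    # Recover approximate float values; the per-element error is at
    # most half a quantization step (scale / 2).
    return [v * scale for v in q]

weights = [0.3, -1.0, 0.6]
q, s = quantize_int8(weights)
approx = dequantize_int8(q, s)
# Each value now occupies 8 bits instead of 32, a 4x memory saving,
# and integer multiply-accumulate units are far cheaper in silicon
# than floating-point ones.
```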

  • Logarithmic Math as an Alternative: Presenting our research on logarithmic number systems, which replace multiplications with additions, reducing chip area and power by roughly 4x. We address the challenges of mapping between logarithmic and linear space within multiply-accumulate operations, and compare approaches such as lookup tables (LUTs), Taylor series, and the Mitchell Approximation in terms of accuracy, feasibility, and efficiency.
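To make the idea concrete, here is a minimal Python sketch of how Mitchell's approximation turns a multiply into an addition (a textbook illustration of the classic approximation, not the speakers' implementation):

```python
import math

def mitchell_log2(x):
    # Mitchell's approximation: for x = 2**k * (1 + f) with f in [0, 1),
    # log2(x) ~= k + f -- the mantissa fraction is used directly.
    k = math.floor(math.log2(x))  # in hardware: a leading-one detector
    f = x / (2.0 ** k) - 1.0
    return k + f

def mitchell_antilog2(y):
    # Inverse mapping back to linear space: 2**(k + f) ~= 2**k * (1 + f).
    k = math.floor(y)
    f = y - k
    return (2.0 ** k) * (1.0 + f)

def mitchell_mul(a, b):
    # A multiplication becomes a single addition in the log domain.
    return mitchell_antilog2(mitchell_log2(a) + mitchell_log2(b))

# Example: 3 * 5 = 15 exactly, while mitchell_mul(3, 5) gives 14
# (error ~6.7%; the classic approximation always underestimates,
# with error bounded at about 11.1%).
```

Closing that accuracy gap while keeping the cheap add-only datapath is exactly the challenge the LUT, Taylor-series, and improved-Mitchell approaches trade off differently.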

  • Co-Designing Logarithmic Math and AI Hardware: Introducing an improvement to the Mitchell Approximation that makes logarithmic math Pareto-optimal in power vs. precision, ideal for large multimodal models. We demonstrate improved trust and user experience over traditional linear math, with quantitative results for large models showing sub-0.1% accuracy loss relative to baseline IEEE 32/16-bit models, at cost and power comparable to 4-bit precision on floating-point hardware.

  • Finally, reduced chip area and power offer secondary benefits in silicon design, such as greater flexibility in datapath design and more generic compute for better utilization, leading to even lower AI inference costs. We also share our experience co-designing software and chips, emphasizing the importance of integrating the two disciplines.

With seven years of foundational innovations and proven hardware systems in logarithmic math, we are pioneers in this field and eager to share our insights.
