Hardware and Software for Sparse ML

Speaker: Prof. Fredrik Kjolstad, Stanford University
Date: October 5, 2022

Sparse neural networks, such as attention-sparse transformers and graph neural networks, are promising for increasing both efficiency and the range of data types to which we can apply neural networks. I will discuss key challenges in developing software and hardware for sparse machine learning. I will briefly survey sources of sparsity in neural networks and show that, unlike systems for dense neural networks, systems for sparse neural networks cannot be efficiently reduced to a single operation such as matrix multiplication. I will then discuss how compilers and hardware should be designed for sparse neural networks, drawing on work from the AHA project.
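
To make the contrast with dense computation concrete, below is a minimal illustrative sketch (not taken from the talk) of a sparse matrix-vector product over the compressed sparse row (CSR) format; the function name csr_spmv and the toy matrix are assumptions for illustration. Unlike a dense matrix multiply, whose loop bounds are fixed by the matrix dimensions, the inner loop here follows data-dependent index arrays, which hints at why sparse kernels resist reduction to one uniform operation.

```python
def csr_spmv(row_ptr, col_idx, vals, x):
    """Compute y = A @ x where A is stored in CSR form.

    row_ptr[i]..row_ptr[i+1] delimits the stored entries of row i;
    col_idx[k] and vals[k] give the column and value of entry k.
    """
    n = len(row_ptr) - 1
    y = [0.0] * n
    for i in range(n):                               # iterate over rows
        for k in range(row_ptr[i], row_ptr[i + 1]):  # only stored (nonzero) entries
            y[i] += vals[k] * x[col_idx[k]]
    return y

# Toy example:
# A = [[2, 0, 0],
#      [0, 0, 3],
#      [1, 4, 0]]
row_ptr = [0, 1, 2, 4]
col_idx = [0, 2, 0, 1]
vals    = [2.0, 3.0, 1.0, 4.0]
print(csr_spmv(row_ptr, col_idx, vals, [1.0, 2.0, 3.0]))  # [2.0, 9.0, 9.0]
```

The data-dependent traversal is only one point in a large design space: other sparse formats (CSC, COO, blocked variants) induce different loop structures, which is part of why compilers, rather than a single hand-tuned kernel, are attractive for sparse workloads.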
