
Survey of AI/ML Accelerators

Since the mid-2010s, my team and I have been tracking accelerator technologies. At that time a trend was beginning to form, one that has since been documented by many researchers and industry leaders [Theis and Wong; Horowitz; Leiserson, Thompson, et al.; Thompson and Spanuth; Hennessy and Patterson; Dally, Turakhia, and Han]: clock frequencies, core counts, chip power densities, and related scaling drivers were all hitting physical and/or economic walls, and accelerators would be the next avenue for delivering greater performance in computing systems. That trend has definitely come to pass.

The first application area to see an explosion in processing accelerators is deep neural networks (DNNs), a subset of artificial intelligence and machine learning (AI/ML). These accelerators have been developed and brought to market for a variety of applications, for both training and inference tasks. A few colleagues and I at the MIT Lincoln Laboratory Supercomputing Center (LLSC) have been closely following, studying, and analyzing the development of these AI/ML accelerators. We observed that there was plenty of press coverage and many surveys chronicling the venture funding and technology announcements of these AI/ML accelerators, but we found only partial surveys of them from a computational performance point of view. Hence the genesis of our survey papers: we have published a series of synoptic survey papers at the IEEE High Performance Extreme Computing (HPEC) Conference. As we have released these annual papers, we have received numerous requests for some or all of the datasets that we compiled. This git repository is where we are collecting and making available open datasets from this research work.

Papers and Datasets

So far we have published two papers at the IEEE-HPEC Conference, and a third paper has been accepted at IEEE-HPEC 2021. Each of the papers is available on IEEE Xplore and arXiv.org. The datasets compiled for these papers are available here on subpages, and additional fields are available as CSV files, which can be loaded directly with standard data-analysis tools (see the sketch below).
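For example, a minimal sketch of loading one of the survey CSV files with pandas is shown below. The file name and column handling here are placeholders, not the actual dataset layout; check the dataset subpages in this repository for the real file and field names.

```python
# Minimal sketch: load one of the survey CSV datasets and inspect it.
# NOTE: "ai_accelerators.csv" is a hypothetical file name used for
# illustration; substitute the actual CSV file from the dataset pages.
import pandas as pd

df = pd.read_csv("ai_accelerators.csv")

print(df.columns.tolist())  # list the available fields
print(df.head())            # preview the first few accelerator entries
```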

2021:

A. Reuther, P. Michaleas, M. Jones, V. Gadepally, S. Samsi and J. Kepner, "AI Accelerator Survey and Trends," Accepted at 2021 IEEE High Performance Extreme Computing Conference (HPEC), 2021, pp. 1-10, [IEEE Xplore doi: coming in October 2021] [ArXiv.org/abs/2109.08957] [data].

2020:

A. Reuther, P. Michaleas, M. Jones, V. Gadepally, S. Samsi and J. Kepner, "Survey of Machine Learning Accelerators," 2020 IEEE High Performance Extreme Computing Conference (HPEC), 2020, pp. 1-12, [IEEE Xplore doi: 10.1109/HPEC43674.2020.9286149] [ArXiv.org/abs/2009.00993] [data].

2019:

A. Reuther, P. Michaleas, M. Jones, V. Gadepally, S. Samsi and J. Kepner, "Survey and Benchmarking of Machine Learning Accelerators," 2019 IEEE High Performance Extreme Computing Conference (HPEC), 2019, pp. 1-9, [IEEE Xplore doi: 10.1109/HPEC.2019.8916327] [ArXiv.org/abs/1908.11348] [data].

Please acknowledge this work by citing one or more of the papers above.

Copyright 2021 MIT, Albert I. Reuther

About

CSV spreadsheets and other material for AI accelerator survey papers
