Abstract
The emergence of quantum computers as a new computational paradigm has been accompanied by speculation concerning the scope and timeline of their anticipated revolutionary changes. While quantum computing is still in its infancy, the variety of architectures used to implement quantum computations makes it difficult to reliably measure and compare performance. This problem motivates our introduction of SupermarQ, a scalable, hardware-agnostic quantum benchmark suite that uses application-level metrics to measure performance. SupermarQ is the first attempt to systematically apply techniques from classical benchmarking methodology to the quantum domain. We define a set of feature vectors to quantify coverage, select applications from a variety of domains to ensure the suite is representative of real workloads, and collect benchmark results from the IBM, IonQ, and AQT@LBNL platforms. Looking forward, we envision that quantum benchmarking will encompass a large cross-community effort built on open source, constantly evolving benchmark suites. We introduce SupermarQ as an important step in this direction.
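To make the feature-vector idea concrete, the sketch below computes simplified versions of two of the paper's application-level features, program communication and parallelism, for a Qiskit circuit. This is an illustrative assumption, not the official open-source SupermarQ implementation; the function names and the GHZ example circuit are our own choices.

```python
# Minimal illustration (assumed simplification, not the official SupermarQ code):
# two application-level circuit features in the spirit of the paper's feature
# vectors, computed for a Qiskit QuantumCircuit.
from qiskit import QuantumCircuit


def program_communication(circuit: QuantumCircuit) -> float:
    """Normalized average degree of the qubit-interaction graph
    (0 = no two-qubit gates, 1 = every qubit interacts with every other)."""
    n = circuit.num_qubits
    if n < 2:
        return 0.0
    edges = set()
    for inst in circuit.data:
        qubits = [circuit.find_bit(q).index for q in inst.qubits]
        if len(qubits) == 2:
            edges.add(frozenset(qubits))
    # Each undirected edge contributes 2 to the total degree.
    return 2 * len(edges) / (n * (n - 1))


def parallelism(circuit: QuantumCircuit) -> float:
    """How evenly gates are packed across the circuit depth
    (0 = fully sequential, 1 = maximally parallel)."""
    n, depth = circuit.num_qubits, circuit.depth()
    if n < 2 or depth == 0:
        return 0.0
    return max(0.0, (len(circuit.data) / depth - 1) / (n - 1))


# Example: a 4-qubit GHZ-state circuit with a line-like interaction pattern.
ghz = QuantumCircuit(4)
ghz.h(0)
for i in range(3):
    ghz.cx(i, i + 1)

print("communication:", program_communication(ghz))  # 0.5
print("parallelism:", parallelism(ghz))              # 0.0 (purely sequential)
```

Stacking several such normalized features per benchmark yields a vector that can be compared across applications, which is how the suite argues for coverage of distinct workload behaviors.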
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 2022 IEEE International Symposium on High-Performance Computer Architecture, HPCA 2022 |
Publisher | IEEE Computer Society |
Pages | 587-603 |
Number of pages | 17 |
ISBN (Electronic) | 9781665420273 |
State | Published - 2022 |
Event | 28th Annual IEEE International Symposium on High-Performance Computer Architecture, HPCA 2022 - Virtual, Online, Korea, Republic of (Duration: Apr 2 2022 → Apr 6 2022)
Publication series
Name | Proceedings - International Symposium on High-Performance Computer Architecture |
---|---|
Volume | 2022-April |
ISSN (Print) | 1530-0897 |
Conference
Conference | 28th Annual IEEE International Symposium on High-Performance Computer Architecture, HPCA 2022 |
---|---|
Country/Territory | Korea, Republic of |
City | Virtual, Online |
Period | 4/2/22 → 4/6/22 |
Funding
This material is based upon work supported by the U.S. Department of Energy, Office of Science, National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA) under contract number DE-SC0012704, the Office of Advanced Scientific Computing Research under Award Number DE-SC0021526, and the National Science Foundation under Grant #2030859 to the Computing Research Association for the CIFellows Project. Funding was also provided in part by EPiQC, an NSF Expedition in Computing, under grants CCF-1730082 / 1730449, by the NSF STAQ project under grant NSF Phy-1818914, by DOE grants DE-SC0020289 and DE-SC0020331, and by NSF CNS-1763743. GSR is supported as a Computing Innovation Fellow at the University of Chicago. KNS is supported by IBM as a Postdoctoral Scholar at the University of Chicago and the Chicago Quantum Exchange. We also acknowledge support from the U.S. Department of Energy, Advanced Manufacturing Office (CRADA No. 2020-20099). This research used resources of the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725. We acknowledge the use of IBM Quantum services for this work. The views expressed are those of the authors and do not reflect the official policy or position of IBM or the IBM Quantum team.
Keywords
- Benchmarking
- Program Characterization
- Quantum Computing
ASJC Scopus subject areas
- Hardware and Architecture