The Standard Performance Evaluation Corporation (SPEC) was founded in 1988 by a handful of workstation vendors who recognized the desperate need for realistic, standardized performance tests.
SPEC has grown to become one of the world's leading computing performance standardization bodies, with more than 120 member organizations worldwide, including computer hardware and software vendors, educational institutions, research organizations, and government agencies. Each quarter, SPEC publishes hundreds of performance results spanning a variety of system performance disciplines.
SPEC believes that the user community benefits greatly from an objective series of applications-oriented tests that serve as common reference points during system evaluation. By making these benchmarks relatively easy to use, SPEC encourages the widespread publication of performance results.
To ensure accurate and comparable results, SPEC provides users with a standardized suite of source code based on real-world applications. This code, already ported to numerous platforms by SPEC members, enables users to compile and run the benchmarks on their own systems, optimizing performance for their specific needs.
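For example, with the SPEC CPU® 2017 suite a user describes the build environment in a configuration file and lets the harness compile and run the selected benchmarks. The sketch below is illustrative only; variable names and section syntax differ between suite versions, and the suite ships with documented example configs:

    # mygcc.cfg -- illustrative SPEC CPU 2017-style config (assumes GCC on Linux)
    default:
       label    = mytest         # tag appended to build and run directories
       tune     = base           # use only the conservative "base" tuning flags
       CC       = gcc -std=c99   # C compiler for the benchmark sources
       CXX      = g++            # C++ compiler
       FC       = gfortran       # Fortran compiler
       OPTIMIZE = -O2            # optimization flags applied across languages

The harness is then invoked with a command such as runcpu --config=mygcc.cfg intrate, which builds the integer-rate benchmarks with these settings and runs them according to SPEC's run rules.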
SPEC's Board of Directors
SPEC is fortunate to have well-qualified people stepping forward to help direct our organization.
SPEC Groups
The Embedded Group (EG)
In 2023, EEMBC®, the Embedded Microprocessor Benchmark Consortium, became SPEC's new Embedded Group (SPEC EG). EG's focus is industry-standard benchmarks for measuring the performance and energy efficiency of embedded processor hardware and software used in autonomous driving, mobile imaging, the Internet of Things (IoT), mobile devices, and more.
The Graphics and Workstation Performance Group (GWPG)
SPEC/GWPG is the umbrella organization for committees that develop consistent and repeatable graphics and workstation performance benchmarks and reporting procedures. SPEC/GWPG benchmarks are worldwide standards for evaluating performance in a way that reflects user experiences with popular applications.
Current GWPG Project Groups:
SPECapc: The Application Performance Characterization (SPECapc®) committee was formed in 1997 to provide a broad-ranging set of standardized benchmarks for graphics and workstation applications. The group's current benchmarks span popular CAD/CAM, digital content creation, and visualization applications.
SPECgpc: The Graphics Performance Characterization (SPECgpc®) committee, formed in 1993, establishes performance benchmarks for graphics systems running under OpenGL and other application programming interfaces (APIs). The group's SPECviewperf® benchmark is the most widely used standardized software for evaluating performance based on popular graphics applications.
SPECwpc: The SPEC Workstation Performance Characterization (SPECwpc®) committee has created a benchmark that measures the performance of workstations running algorithms used in popular applications, but without requiring the full application and associated licensing to be installed on the system under test. The SPECwpc® benchmark is easy to install and run, but still rigorous enough to provide meaningful, repeatable data for performance evaluation.
The High-Performance Group (HPG)
The HPG is a forum for establishing, maintaining, and endorsing a suite of benchmarks that represent high-performance computing applications for standardized, cross-platform performance evaluation.
These benchmarks target high-performance system architectures, such as symmetric multiprocessor systems, workstation clusters, distributed-memory parallel systems, and traditional vector and vector-parallel supercomputers.
The International Standards Group (ISG)
The ISG was established to oversee the development of standardized benchmarks intended primarily for use in government regulations and programs; it collaborates with national and international standards development organizations to enhance global standards.
Current ISG Committees:
Server Efficiency: In 2021, the Server Efficiency Committee took over the maintenance and further development of the SERT® suite.
The Open Systems Group (OSG)
The OSG is the original SPEC committee. This group focuses on benchmarks for desktop systems, high-end workstations, and servers running open systems environments.
Current OSG Committees:
CLOUD: OSG's Cloud Committee develops industry-standard benchmarks in areas related to cloud computing infrastructure, application spaces, and emerging technologies.
CPU: The people who brought you SPECmarks and the other SPEC CPU® benchmarks.
JAVA: The Java Committee developed SPEC's client- and server-side Java benchmarks, including the SPECjvm® 98 and SPECjvm® 2008 benchmarks, the SPECjbb® benchmarks, the SPECjAppServer® and SPECjEnterprise® benchmarks for Java Enterprise Application Servers, and the SPEC JMS® benchmark.
ML: The ML Committee was formed in 2021 to develop practical methodologies to benchmark machine learning (ML) performance in the context of real-world platforms and environments. The Committee also works with other SPEC committees to update their benchmarks for ML environments.
POWER: The SPECpower Committee tackles the task of creating industry-standard benchmarks and tools to measure the power and performance characteristics of server-class compute equipment. It released the SPECpower_ssj® 2008 benchmark in 2007, which exercises the CPUs, caches, memory hierarchy, and the scalability of shared-memory processors at multiple load levels (a simplified sketch of the resulting overall ssj_ops/watt metric appears after this list). Work on the SERT® suite started around 2009 with the ground-breaking design of the Chauffeur® tool, followed by the design and implementation of the workload, automated hardware and software discovery, and a GUI. In parallel with these efforts, the SPECpower Committee, in collaboration with the more academically focused SPEC Power Research Working Group, conducted large-scale experiments to determine the server efficiency metric for SERT Suite 2.0, which enabled its release in 2017.
STORAGE: The Storage Committee developed the SPEC SFS® 93 (LADDIS) benchmark, the SPEC SFS® 97 benchmark, the SPEC SFS® 97_R1 benchmark, the SPEC SFS® 2008 benchmark, the SPEC SFS® 2014 benchmark, and the SPECstorage® Solution 2020 benchmark.
VIRTUALIZATION: The Virtualization Committee developed the SPEC VIRT_SC® 2010 benchmark, the first-generation SPEC benchmark for comparing virtualization performance for data center servers, its successor the SPEC VIRT_SC® 2013 benchmark, and the SPECvirt® Datacenter 2021 benchmark.
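Following up on the POWER item above: the SPECpower_ssj® 2008 headline metric, overall ssj_ops/watt, divides the throughput summed across all target load levels by the average power summed across those same levels plus active idle, which draws power but performs no operations. A simplified sketch in Python, with invented numbers purely for illustration:

    # Simplified aggregation behind SPECpower_ssj2008's overall ssj_ops/watt.
    # Each tuple is (ssj_ops, average_watts) for one target load level;
    # all values here are invented for illustration, not measured results.
    load_levels = [
        (305_000, 230.0),  # 100% target load
        (276_000, 212.0),  #  90%
        (244_000, 196.0),  #  80%
        # ... target loads continue stepping down in 10% increments ...
        (31_000, 98.0),    #  10%
    ]
    active_idle_watts = 72.0  # active idle: power counts, throughput is zero

    total_ops = sum(ops for ops, _ in load_levels)
    total_watts = sum(watts for _, watts in load_levels) + active_idle_watts
    print(f"overall ssj_ops/watt = {total_ops / total_watts:.1f}")

Because idle power sits in the denominator with no offsetting throughput, systems that idle efficiently score better, which is part of the metric's design intent.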
SPEC Research Group (RG)
The Research Group was created to promote innovative research on benchmarking methodologies and tools, facilitating the development of benchmark suites and performance analysis frameworks for established and newly emerging technologies. It is designed to encourage exchange among representatives from academia, industry, and research institutes. The scope of these research efforts includes techniques and tools for performance measurement, load testing, profiling, workload characterization, and dependability and efficiency evaluation of computing systems. While the focus is on performance, other extra-functional system properties such as scalability, availability, cost, and energy efficiency are considered as well.
A major component of the RG's work is the development of standard scenarios and workloads, called research benchmarks, for emerging technologies and applications. Benchmarks from the Research Group are intended primarily for in-depth analysis and evaluation of early prototypes and research results. This differentiates them from conventional benchmarks used for direct comparison and marketing of existing products.
Other activities of the RG include publishing a newsletter, recognizing outstanding contributions to benchmarking research, and organizing conferences and workshops.