SPEC

Open Systems Group

The Open Systems Group is the oldest group within SPEC and carries on the work of SPEC's original founders. The OSG is governed by the policies and principles described in the SPEC Open Systems Group Policies and Procedures Document. The Open Systems Steering Committee (OSSC) provides oversight and management for a number of technical subcommittees that investigate and develop the benchmarks, metrics, run and reporting rules, and so on.

The OSG focuses on component- and system-level benchmarks for desktop systems, workstations, and servers running open operating system environments. This is the group responsible for the processor metrics SPECint2017 and SPECfp2017 (and their predecessors from 2006, 2000, 1995, and 1992, as well as the original SPECmarks from 1989).

In addition to the SPEC CPU 2017 suite, the group has also developed Java benchmarks, a SIP benchmark, web server benchmarks, a mail server benchmark, file server benchmarks, a power and performance benchmark, and virtualization benchmarks. More in-depth information may be found in the benchmark-specific links below.

The Current Benchmarks and Tools

  • SPEC Cloud IaaS 2018
    SPEC's infrastructure-as-a-service (IaaS) performance benchmark for public and private cloud platforms.
  • SPEC CPU 2017
    The current release of SPEC's popular processor performance tests and the successor to SPEC CPU 2006. Its overall metrics are geometric means of per-benchmark runtime ratios (see the sketch after this list).
  • SPECjbb 2015
    Server-side Java benchmark, developed from the ground up to measure performance based on the latest Java application features.
  • SPECjEnterprise 2018 Web Profile
    SPECjEnterprise 2018 Web Profile measures full-system performance for Java Enterprise Edition (Java EE) Web Profile Version 7 or later application servers, databases and supporting infrastructure.
  • SPECjEnterprise 2010
    A full-system benchmark that allows performance measurement and characterization of Java EE 5.0 servers and supporting infrastructure such as the JVM, database, CPU, disk, and servers.
  • SPECjvm 2008
    The current Java virtual machine (JVM) benchmark, with multithreaded workloads that represent a broad range of application areas.
  • SPECpower_ssj 2008
    SPECpower_ssj 2008 is the first industry-standard SPEC benchmark to evaluate the power and performance characteristics of volume server-class computers. The first subset of server workloads addresses the performance of server-side Java; additional workloads are planned. Its overall metric combines throughput and power across a range of load levels (see the sketch after this list).
  • SPECstorage Solution 2020
    A storage benchmark suite measuring file server throughput and response time, providing a standardized method for comparing performance across different vendor platforms.
  • SPECvirt Datacenter 2021
    The next generation of virtualization benchmarking for measuring performance of a scaled-out datacenter. It is a multi-host benchmark using simulated and real-life workloads to measure the overall efficiency of virtualization solutions and their management environments.
  • SPEC VIRT_SC 2013
    The benchmark addresses performance evaluation of datacenter servers used in virtualized server consolidation, measuring end-to-end performance of all system components including the hardware, virtualization platform, and the virtualized guest operating system and application software. In addition to major workload upgrades, the SPECvirt web server workload has been modified to require SSL (HTTPS) between the client system and the web server.
  • Chauffeur Worklet Development Kit (WDK)
    The Chauffeur WDK 2.0 simplifies workload development for researchers analyzing computer performance and power consumption. Targeting both industry and academia, the WDK is based on the SERT 2.0 infrastructure and provides run-time and platform configuration support to simplify worklet development.
  • SPEC PTDaemon
    The SPEC PTDaemon software is used to control power analyzers in benchmarks that contain a power measurement component.
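
As a rough illustration of how the SPEC CPU metrics are composed, here is a minimal Python sketch: each benchmark's score is the ratio of a reference runtime to the measured runtime, and the overall metric is the geometric mean of those ratios. The runtimes below are hypothetical, not actual SPEC reference-machine values.

    from math import prod

    def spec_ratio_score(ref_times, measured_times):
        """Geometric mean of per-benchmark ratios (reference time / measured
        time), the general scheme behind the SPEC CPU overall metrics."""
        ratios = [ref / meas for ref, meas in zip(ref_times, measured_times)]
        return prod(ratios) ** (1.0 / len(ratios))

    # Hypothetical runtimes in seconds for three benchmarks; a real run uses
    # the SPEC-published reference times for every benchmark in the suite.
    reference = [1000.0, 1600.0, 2000.0]
    measured  = [250.0, 500.0, 400.0]
    print(f"overall score: {spec_ratio_score(reference, measured):.2f}")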
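
Similarly, a minimal sketch of the SPECpower_ssj 2008 overall metric: throughput (ssj_ops) and average power are measured at a series of target load levels plus active idle, and the overall ssj_ops/watt figure divides the summed throughput by the summed power. The measurement values below are hypothetical.

    def overall_ssj_ops_per_watt(intervals):
        """Overall SPECpower_ssj-style efficiency: the sum of throughput
        across all measurement intervals divided by the sum of average power
        across the same intervals (active idle adds power but zero ops)."""
        total_ops = sum(ops for ops, _watts in intervals)
        total_watts = sum(watts for _ops, watts in intervals)
        return total_ops / total_watts

    # Hypothetical (ssj_ops, average watts) pairs for a few target load
    # levels plus active idle; a real run measures ten levels, 100% to 10%.
    measurements = [
        (300_000, 260.0),  # 100% target load
        (150_000, 180.0),  #  50% target load
        (30_000, 120.0),   #  10% target load
        (0, 95.0),         # active idle
    ]
    print(f"overall ssj_ops/watt: {overall_ssj_ops_per_watt(measurements):.1f}")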

Future Benchmarks

These are the projects that SPEC currently has under active development for future release.

  • SPEC ML
    The OSG ML Committee was formed in 2021 to develop practical methodologies to benchmark machine learning (ML) performance in the context of real-world platforms and environments. The Committee also works with other SPEC committees to update their benchmarks for ML environments. The OSG ML Committee's first benchmark, SPEC ML, will measure end-to-end performance of a system under test (SUT) handling ML training and inference tasks.

Retired Benchmarks

Results for benchmarks that have been retired remain on the site, but SPEC no longer accepts new results submissions for these benchmarks.

Submitting results

Benchmark licensees can submit their results for publication on the SPEC web site.

Joining SPEC/OSG

We welcome your interest in joining the Open Systems Group. For more detailed information on benefits, membership fees, and the application process, please go to our OSG membership page.

The SPEC Newsletter

The SPEC Newsletter was the official voice of the OSG, carrying news and analysis along with each quarter's performance results. Results are now published electronically on the SPEC web site.